Sample records for quantitative risk evaluation

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the contents were divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  2. Quantitative Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helms, J.

    2017-02-10

    The US energy sector is vulnerable to multiple hazards, including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or to evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  3. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk and vulnerability. For a quantitative or semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, and provides better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Mountain Ramnefjell is heavily jointed, leading to the formation of vertical rock slices 400-450 m in height and 7-10 m in width. These slices threaten the settlements around Loen Valley and tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. In the past, several rock slides were recorded from mountain Ramnefjell between 1905 and 1950. Among them
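
    The risk computation step in the record above is commonly implemented as the product of hazard probability, vulnerability and the value of elements at risk. The sketch below uses that standard decomposition; the multiplicative form and all numbers are assumptions for illustration, not values from the Ramnefjell case:

```python
# Illustrative sketch of the generic risk decomposition:
# Risk = Hazard (annual probability) x Vulnerability (0-1) x Elements at risk (value).
# All numbers are hypothetical, not data from the Ramnefjell case.

def quantitative_risk(hazard_prob, vulnerability, elements_at_risk):
    """Expected annual loss for one rock-slide scenario."""
    return hazard_prob * vulnerability * elements_at_risk

# Compare two hypothetical slope scenarios to support ranking decisions.
scenarios = {
    "upper slice": quantitative_risk(0.01, 0.8, 5_000_000),    # higher frequency
    "lower slice": quantitative_risk(0.002, 0.9, 20_000_000),  # higher exposure
}
ranked = sorted(scenarios, key=scenarios.get, reverse=True)
print(ranked)
```

    Ranking the resulting expected annual losses is what enables the comparison with other natural or human-made hazards that the abstract describes.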

  4. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...

  5. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently the case with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
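
    The separation of variability and uncertainty that "mc2d" implements can be illustrated with a short Python analogue (Python stands in for R here; the nested-loop structure is the point, while the exponential dose-response form and all parameter values are illustrative assumptions):

```python
# Minimal sketch of a second-order (two-dimensional) Monte Carlo, analogous
# in spirit to the "mc2d" R package: the outer loop samples *uncertain*
# parameters, the inner loop samples *variability* across exposures.
# Distributions and parameter values are illustrative assumptions.
import math
import random
import statistics

random.seed(1)

def risk_model(r, doses):
    # Exponential dose-response: P(ill) = 1 - exp(-r * dose)
    return statistics.mean(1 - math.exp(-r * d) for d in doses)

outer = []
for _ in range(200):                                      # uncertainty dimension
    r = random.lognormvariate(math.log(1e-3), 0.5)        # uncertain parameter
    doses = [random.expovariate(1 / 50) for _ in range(500)]  # variable exposure
    outer.append(risk_model(r, doses))

outer.sort()
print(f"median risk {outer[100]:.3g}, "
      f"95% uncertainty interval [{outer[4]:.3g}, {outer[194]:.3g}]")
```

    Each outer iteration yields one plausible risk estimate; the spread of the outer sample expresses parameter uncertainty separately from the population variability captured inside each iteration.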

  6. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  7. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention.

    PubMed

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2010-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk of a rabid animal penetrating current border control measures and entering Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures.

  8. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    PubMed Central

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan’s current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk of a rabid animal penetrating current border control measures and entering Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
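
    The stochastic structure of such a border-control model can be sketched as a Monte Carlo over a chain of release probabilities. The pathway below and every parameter value are hypothetical stand-ins, not the authors' actual model:

```python
# Hedged sketch of a stochastic border-control model: the probability that
# one imported animal introduces rabies is the product of the chances of
# being infected, evading vaccination, and evading quarantine detection.
# Pathway structure and all distributions are hypothetical assumptions.
import random

random.seed(7)

def one_iteration():
    prevalence = random.betavariate(1, 10_000)       # infected fraction at origin
    p_vacc_fail = random.uniform(0.01, 0.05)         # vaccination failure
    p_undetected = random.uniform(0.001, 0.01)       # slips past quarantine
    return prevalence * p_vacc_fail * p_undetected   # P(import introduces rabies)

draws = sorted(one_iteration() for _ in range(10_000))
median = draws[5_000]
p95 = draws[9_500]
print(f"median {median:.2e}, 95th percentile {p95:.2e}")
```

    Reporting the median alongside an upper percentile, as the study does, conveys both the central estimate and the tail risk produced by the stochastic inputs.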

  9. Development of quantitative risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesmeyer, J. M.; Okrent, D.

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  10. IWGT report on quantitative approaches to genotoxicity risk ...

    EPA Pesticide Factsheets

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast

  11. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.

  12. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
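
    Reading the definition above as the common logarithm of (accessible records ÷ authentication steps), maximised over users, reproduces the figures quoted for a one-million-record database. A minimal sketch under that reading (the interpretation, not the code, is the assumption here):

```python
# Breach risk magnitude (BRM), read as: for each user, take the common
# logarithm of (accessible records / authentication steps), then report the
# maximum over users. This reproduces the quoted range for a one-million-
# record relational database: log10(1e6/1) = 6 and log10(1e6/3) ~= 5.52.
import math

def breach_risk_magnitude(users):
    """users: list of (accessible_records, authentication_steps) per user."""
    return max(math.log10(records / steps) for records, steps in users)

# Conventional relational database: a user can reach all 1e6 records.
print(round(breach_risk_magnitude([(1_000_000, 1)]), 2))  # 6.0
print(round(breach_risk_magnitude([(1_000_000, 3)]), 2))  # 5.52
```

    Under the same reading, the lower BRM of the record-separated architecture follows from each user reaching far fewer records per set of authentication steps.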

  13. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  14. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  15. Quantitative Risks

    DTIC Science & Technology

    2015-02-24

    Quantitative Risks. Technical Report SERC-2015-TR-040-4, February 24, 2015. Principal Investigator: Dr. Gary Witus, Wayne State... The Systems Engineering Research Center (SERC) is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.

  16. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.

  17. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
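
    The RV-NNT decision rule described above can be sketched with made-up numbers: treatment is favoured when the number needed to treat (NNT) is smaller than the relative-value adjusted number needed to harm (RV-NNH), where the harm side is weighted by patient-derived utilities. The event probabilities and relative value below are hypothetical:

```python
# Sketch of the RV-NNT decision rule. All probabilities and the
# relative value are hypothetical illustration figures.
def nnt(p_benefit_treated, p_benefit_control):
    """Number needed to treat for one additional patient to benefit."""
    return 1 / (p_benefit_treated - p_benefit_control)

def rv_nnh(p_harm_treated, p_harm_control, relative_value):
    # relative_value < 1 means patients judge the harm less important
    # than the benefit, which inflates the tolerable NNH.
    return 1 / ((p_harm_treated - p_harm_control) * relative_value)

# Hypothetical rheumatoid-arthritis-style figures:
benefit = nnt(0.60, 0.40)          # NNT = 5
harm = rv_nnh(0.05, 0.03, 0.5)     # RV-NNH = 100
print("favour treatment:", benefit < harm)  # True
```

    The comparison makes the trade-off explicit: patients accept the adverse-event risk because far fewer treatments are needed per benefit than per relative-value-weighted harm.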

  18. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. 
Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data
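
    The four phases can be chained in a toy farm-to-table calculation. The exponential dose-response form is one standard choice in microbial risk assessment, and every number below is an illustrative assumption:

```python
# Toy farm-to-table sketch of the QRA phases (hazard identification is
# implicit in the choice of pathogen model). All parameters are illustrative.
import math

# (2) Exposure assessment: pathogen dose surviving each stage of the chain.
conc_raw = 1000.0            # organisms per serving at production
log_reduction_cooking = 2.0  # hypothetical processing/cooking effect
dose = conc_raw * 10 ** (-log_reduction_cooking)  # 10 organisms ingested

# (3) Dose-response assessment: exponential model, P = 1 - exp(-r * dose).
r = 0.01                     # hypothetical per-organism infection probability
p_infection = 1 - math.exp(-r * dose)

# (4) Risk characterization: annual risk over repeated exposures.
servings_per_year = 50
annual_risk = 1 - (1 - p_infection) ** servings_per_year
print(f"per-serving risk {p_infection:.3f}, annual risk {annual_risk:.3f}")
```

    In a full assessment each fixed number above would itself carry a distribution, which is exactly why reproducing microorganisms make the exposure phase the hard part.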

  19. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
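
    Crossing fragility curves with PSHA outcomes, as described above, amounts to integrating P(failure | ground motion) against the annual ground-motion probabilities. A discretised sketch with a lognormal fragility curve and hypothetical hazard numbers (not the Italian study-case values):

```python
# Sketch of crossing a fragility curve with seismic hazard:
# annual failure probability = sum over ground-motion bins of
# P(failure | PGA) x annual probability of that PGA.
# Curve parameters and hazard numbers are illustrative assumptions.
import math

def lognormal_fragility(pga, median, beta):
    """P(tank failure | peak ground acceleration), standard normal CDF form."""
    z = math.log(pga / median) / beta
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical hazard: annual probability of PGA falling in each bin (g).
hazard_bins = [(0.1, 1e-2), (0.2, 3e-3), (0.4, 5e-4), (0.8, 5e-5)]

annual_failure = sum(
    lognormal_fragility(pga, median=0.5, beta=0.6) * p_annual
    for pga, p_annual in hazard_bins
)
print(f"annual probability of seismic loss of containment: {annual_failure:.2e}")
```

    The resulting failure probability would then feed the consequence analysis for the fire and explosion events that loss of containment can trigger.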

  20. A quantitative risk-assessment system (QR-AS) evaluating operation safety of Organic Rankine Cycle using flammable mixture working fluid.

    PubMed

    Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan

    2017-09-15

    A mixture of hydrocarbon and carbon dioxide shows excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but the unavoidable leakage in practical applications is a threat to safety due to its flammability. In this work, a quantitative risk assessment system (QR-AS) is established with the aim of providing a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. According to the assessment results, a proper ventilation speed, safe mixture ratio and location of gas-detecting devices have been proposed to guarantee security in the event of leakage. The results revealed that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
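
    The TNT equivalent method named above converts the mass of a leaked flammable cloud into an equivalent TNT charge via its heat of combustion and a yield factor. A minimal sketch with textbook-style assumed values (not the paper's CFD-based inputs):

```python
# TNT equivalent method sketch: W_TNT = yield * m_fuel * dHc / E_TNT.
# The yield factor, fuel mass and heat of combustion are assumed
# illustration values, not the paper's inputs.
E_TNT = 4.68e6  # J/kg, blast energy of TNT (commonly cited value)

def tnt_equivalent_mass(fuel_mass_kg, heat_of_combustion_j_per_kg, yield_factor):
    """Mass of TNT releasing the same blast energy as the burned cloud."""
    return yield_factor * fuel_mass_kg * heat_of_combustion_j_per_kg / E_TNT

# 10 kg of leaked propane (~46.3 MJ/kg) with a 3% explosion yield factor:
w_tnt = tnt_equivalent_mass(10, 46.3e6, 0.03)
print(f"equivalent TNT charge: {w_tnt:.1f} kg")
```

    The equivalent charge mass then feeds standard blast-overpressure scaling laws, which is what makes the method convenient for risk screening despite its simplifications.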

  1. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
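
    The consequence step described above, comparing occupant evacuation time against the onset of untenable conditions, can be sketched as a simple Monte Carlo. All distributions below are illustrative assumptions, not the case-study inputs:

```python
# Sketch of the consequence evaluation: a fire scenario harms occupants
# when required evacuation time (pre-movement + movement) exceeds the
# onset time of untenable conditions. Distributions are illustrative.
import random

random.seed(3)

def p_unsafe(trials=20_000):
    failures = 0
    for _ in range(trials):
        untenable = random.lognormvariate(5.0, 0.3)   # onset of untenability, s
        pre_move = random.lognormvariate(3.5, 0.5)    # pre-movement time, s
        movement = random.uniform(30, 90)             # movement time, s
        if pre_move + movement > untenable:
            failures += 1
    return failures / trials

print(f"probability occupants fail to evacuate in time: {p_unsafe():.3f}")
```

    Weighting such consequence probabilities by the occurrence probability of each fire scenario, as the abstract describes, yields the overall life-safety risk.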

  2. Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.

    PubMed

    Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans

    2017-01-01

    The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.

  3. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while the outcomes of the quantitative method are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the available basic data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  4. Quantitative influence of risk factors on blood glucose level.

    PubMed

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and for setting intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first, and the cohort is then divided into nine groups by gender and age. According to the minimum error principle, nine BP models are trained separately. Quantitative values for the influence of each risk factor on blood glucose change are obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449); the next most influential factors are cholesterol, age and triglyceride. Together these four factors account for 77% of the influence of the nine screened risk factors, and the sensitivity rankings provide a basis for individual intervention decisions. This method can be applied to the quantitative analysis of risk factors in other diseases, and could potentially be used by clinical practitioners to identify populations at high risk for type 2 diabetes as well as other diseases.
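A sensitivity calculation of the kind reported here can be approximated by perturbing one input of a trained model at a time and normalising the resulting gradient magnitudes, so the factors' shares sum to 1 (making them comparable to the 0.2449 share reported for weight). A sketch using central finite differences on a stand-in model function, not the authors' trained BP networks:

```python
def sensitivities(model, x, eps=1e-4):
    """Normalised first-order sensitivities of a scalar model output to
    each input feature, via central finite differences at the point x.
    Returns non-negative fractions that sum to 1."""
    grads = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        grads.append(abs(model(xp) - model(xm)) / (2 * eps))
    total = sum(grads)
    return [g / total for g in grads]
```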

  5. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved by using different soap formulations after diaper changing, using a quantitative microbial risk assessment approach. To achieve this, a probability-of-infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used the non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used the 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, which can be reduced by a further 20% by the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
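Probability-of-infection models in quantitative microbial risk assessment are commonly built on the approximate beta-Poisson dose-response relation P = 1 - (1 + d/β)^(-α). A sketch of that relation; the α and N50 values in the usage below are illustrative placeholders, not the study's fitted parameters:

```python
def beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson dose-response model: probability of
    infection for an ingested dose, parameterised by alpha and the
    median infectious dose N50."""
    beta = n50 / (2.0 ** (1.0 / alpha) - 1.0)
    return 1.0 - (1.0 + dose / beta) ** (-alpha)
```

By construction, a dose equal to N50 yields an infection probability of exactly 0.5.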

  6. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economic efficiency, safety is a very important factor in evaluating the use of construction robots on construction sites; however, the quantitative evaluation of safety is difficult compared with that of economic efficiency. In this study, we propose a safety evaluation methodology based on two risk factors: the 'worker' factor, defined as posture load, and the 'work conditions' factor, defined as the work environment and the risk exposure time. The posture load evaluation reflects both the risk of musculoskeletal disorders that can be caused by work posture and the risk of accidents that can be caused by reduced concentration. We evaluated the risk factors that may cause accidents such as falling, colliding, capsizing and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. From the evaluations of each factor, we calculated the overall operational risk and derived the improvement ratio in operational safety obtained by introducing a construction robot. To verify these results, we compared the safety of the existing manual construction method and the proposed robotic construction method for manipulating large glass panels. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  7. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated, and resources can be allocated in a manner that manages risk to an acceptable level.
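The quantitative step, estimating a probability of failure inside the design space, can be sketched as a plain Monte Carlo simulation: sample the process parameters from their distributions, evaluate a response model, and count specification failures. The response model, parameter distributions and specification limit below are invented for illustration and are not the paper's ciprofloxacin formulation model:

```python
import random

def probability_of_failure(n=100_000, seed=1):
    """Monte Carlo estimate of P(failure) for a toy tablet attribute:
    the response is modelled as a linear function of two normally
    distributed process parameters plus noise, and 'failure' means
    the response falls below a specification limit of 80."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        speed = rng.gauss(50.0, 5.0)   # hypothetical press speed
        force = rng.gauss(15.0, 2.0)   # hypothetical compression force
        response = 100.0 - 0.2 * speed - 0.5 * force + rng.gauss(0.0, 2.0)
        if response < 80.0:
            failures += 1
    return failures / n
```

A Bayesian variant would additionally sample the model coefficients from their posterior distributions, propagating parameter uncertainty into the failure probability.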

  8. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  9. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 22-53%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures with models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to the prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  10. Evaluation of New Zealand's high-seas bottom trawl closures using predictive habitat models and quantitative risk assessment.

    PubMed

    Penney, Andrew J; Guinotte, John M

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine systems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost : benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost : benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas.

  11. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  12. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  13. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  14. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 3 2014-04-01 2014-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  15. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  16. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, it argues that one should strive to make one's QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
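The conjugate form of Bayes' theorem most often used in QRA for event frequencies is the Gamma-Poisson update: a Gamma(α₀, β₀) prior on an occurrence rate, combined with an observed event count over an exposure period, yields a Gamma posterior. A minimal sketch of that standard update (the numbers in the usage are illustrative, not from the paper):

```python
def gamma_poisson_update(alpha0, beta0, events, exposure_years):
    """Bayesian update of an event frequency (events per year):
    Gamma(alpha0, beta0) prior + Poisson evidence of `events` in
    `exposure_years` gives a Gamma(alpha0 + events,
    beta0 + exposure_years) posterior; returns the posterior mean."""
    alpha = alpha0 + events
    beta = beta0 + exposure_years
    return alpha / beta
```

For example, a diffuse prior Gamma(1, 10) updated with 2 events in 90 plant-years gives a posterior mean rate of 3/100 = 0.03 per year, a result driven by the evidence rather than by the analyst's personality, in the spirit of the paper.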

  17. Benchmarking on the evaluation of major accident-related risk assessment.

    PubMed

    Fabbri, Luciano; Contini, Sergio

    2009-03-15

    This paper summarises the main results of a European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study associated with a certain chemical establishment could differ from each other and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree on the interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This effort was then followed by a study of the impact of the different interpretations on the estimate of the overall risk on the area concerned. In order to improve the inter-comparability of the results, this exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for the implementation of the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), that aims at providing scientific and technological support for promoting integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union.

  18. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  19. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, which is the first step designed to find out what the problems are, and which comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper; a fourth step should address the need for feedback and for post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, a new index based on hydrometeorological parameters such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the drought risk assessment process comprises the following steps: 1. Risk identification: drought quantification and monitoring based on remotely sensed RDI, with extraction of several features such as severity, duration, areal extent, and onset and end times; this step also involves a drought early warning system based on the above parameters. 2. Risk estimation: analysis of drought severity, frequency and their relationships. 3. Risk evaluation: drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece.
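The RDI's initial value is the ratio of cumulative precipitation to cumulative potential evapotranspiration over a reference period, which can then be normalised against its long-term mean (negative normalised values indicating drought). A minimal sketch of these two steps, assuming this standard formulation of the index:

```python
def rdi_alpha(precip, pet):
    """Initial RDI value for one period: ratio of cumulative
    precipitation to cumulative potential evapotranspiration."""
    if len(precip) != len(pet):
        raise ValueError("series must have the same length")
    return sum(precip) / sum(pet)

def rdi_normalised(alpha_k, alpha_mean):
    """Normalised RDI for period k against the long-term mean of the
    initial values; values below 0 indicate drought conditions."""
    return alpha_k / alpha_mean - 1.0
```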

  20. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine systems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost : benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost : benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162

  1. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10⁻⁴ per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. Copyright 2009 Elsevier B.V. All rights reserved.
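Models of this type typically chain treatment log-removals into an ingested daily dose, apply an exponential dose-response model, and compound the daily risk over a year for comparison against the 1 × 10⁻⁴ annual benchmark. A deterministic single-scenario sketch (a full model would sample these inputs by Monte Carlo; the dose-response parameter r below is a commonly quoted illustrative value for Cryptosporidium, not the paper's fitted input):

```python
import math

def daily_dose(raw_oocysts_per_litre, log_removal, litres_per_day=1.0):
    """Oocysts ingested per day after treatment achieving a given
    total log10 removal/inactivation credit."""
    return raw_oocysts_per_litre * 10.0 ** (-log_removal) * litres_per_day

def annual_infection_risk(dose_per_day, r=0.004, days=365):
    """Exponential dose-response P_day = 1 - exp(-r * dose), compounded
    over `days` independent daily exposures."""
    p_day = 1.0 - math.exp(-r * dose_per_day)
    return 1.0 - (1.0 - p_day) ** days
```

For instance, raw water at 10 oocysts/L with only 3 log removal yields an annual risk above the 1 × 10⁻⁴ benchmark, illustrating why treatment failure frequency dominates the simulated risk.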

  2. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is an item of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and hence the plant, to stop operating, and could furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing the risk assessment activities.

  3. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is an item of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and hence the plant, to stop operating, and could furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing the risk assessment activities.
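Rankings such as 4C and 3C come from a probability-consequence matrix: a numeric probability category combined with a lettered consequence category maps to a qualitative risk level. A simplified illustrative lookup is sketched below; the additive scoring and cut-offs are hypothetical and are not the exact matrix of the API 581 standard:

```python
def risk_level(prob_cat, cons_cat):
    """Map a probability category (1-5) and a consequence category
    (A-E) to a qualitative risk level on a 5x5 matrix, in the spirit
    of semi-quantitative RBI ranking. Illustrative cut-offs only."""
    p = int(prob_cat)
    c = "ABCDE".index(cons_cat.upper()) + 1
    score = p + c  # simple additive ranking across the matrix diagonal
    if score <= 4:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium-high"
    return "high"
```

With these illustrative cut-offs, 4C falls in "medium-high" and 3C in "medium", consistent with the categories reported for the superheater and economizer.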

  4. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and establishing an effective quantitative risk assessment model of third party damage is very important for reducing the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third party damage are identified exactly; the weights of the factors are then determined via an improved AHP, and finally the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
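The AHP-plus-FCE pipeline reduces to two numeric steps: derive factor weights from a pairwise-comparison matrix (its principal eigenvector), then compose the weights with a fuzzy membership matrix to score the risk grades. A sketch under those standard definitions, not the paper's improved AHP variant:

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Priority weights from an AHP pairwise-comparison matrix via the
    principal eigenvector, computed by power iteration."""
    a = np.asarray(pairwise, dtype=float)
    w = np.ones(a.shape[0]) / a.shape[0]
    for _ in range(iters):
        w = a @ w
        w /= w.sum()
    return w

def fuzzy_evaluate(weights, membership):
    """Weighted fuzzy composition: membership[i][j] is the degree to
    which factor i belongs to risk grade j; returns grade scores."""
    return np.asarray(weights) @ np.asarray(membership, dtype=float)
```

A consistency check on the pairwise matrix (the consistency ratio) would normally follow the weight calculation before the fuzzy composition is trusted.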

  5. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was

  6. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

    Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defect would greatly aid in quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of a correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, and to obtain objective scores and more detailed information with which to predict fall risk. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
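The reported figures (~82% accuracy, ~83% sensitivity) are standard binary-classification metrics computed from predicted and true labels. A minimal sketch of how such metrics are derived (the labels in the usage below are toy data, not the study's subjects):

```python
def accuracy_and_sensitivity(y_true, y_pred):
    """Accuracy and sensitivity (recall on the positive, 'at risk' = 1
    class) for a binary fall-risk classifier."""
    pairs = list(zip(y_true, y_pred))
    accuracy = sum(t == p for t, p in pairs) / len(pairs)
    positives = [(t, p) for t, p in pairs if t == 1]
    sensitivity = sum(p == 1 for _, p in positives) / len(positives)
    return accuracy, sensitivity
```

High sensitivity is the priority in this setting: missing a subject who is actually at risk of falling is costlier than a false alarm.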

  8. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    PubMed Central

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background: Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and to an evolving understanding of risk. Methods: This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors (device, structure, load and special operation), a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion: Distribution networks are exposed to the environment and can be affected by many factors. The topology and the operating mode of a distribution network are dynamic, so faults and their consequences are probabilistic. PMID:25789859
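
    The entropy machinery can be illustrated with a toy example: two raw readings series are symbolized by threshold bins and their symbol distributions compared via Kullback-Leibler relative entropy. The series, thresholds, and three-symbol alphabet are invented, not taken from the paper's operators.

```python
import math
from collections import Counter

def symbolize(series, thresholds):
    """Map raw readings to discrete symbols by threshold bins (a coarse
    stand-in for the paper's symbolic-dynamics operators)."""
    return [sum(v > t for t in thresholds) for v in series]

def distribution(symbols, n_symbols):
    counts = Counter(symbols)
    return [counts.get(s, 0) / len(symbols) for s in range(n_symbols)]

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler relative entropy D(p || q), in bits."""
    return sum(pi * math.log2((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

sub_factor  = [0.10, 0.40, 0.90, 0.95, 0.50, 0.20]  # e.g., a device risk signal
main_factor = [0.15, 0.35, 0.85, 0.90, 0.55, 0.25]
thresholds = [0.3, 0.7]
p = distribution(symbolize(sub_factor, thresholds), 3)
q = distribution(symbolize(main_factor, thresholds), 3)
d = kl_divergence(p, q)  # near zero: the sub-factor tracks the main factor
```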

  9. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on a biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  10. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  11. Communicating radon risk effectively: a mid-course evaluation. Interim report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, V.K.; Desvousges, W.H.; Fisher, A.

    A panel of 2300 homeowners was divided into subgroups to test the effectiveness of six alternative ways of explaining the risk from naturally occurring radon gas. The research design focused on two dimensions: qualitative vs. quantitative and directive vs. evaluative. These characteristics led to 4 experimental booklets, which were compared with EPA's Citizen's Guide and a one-page fact sheet. The evaluation examined how much people learned about radon; whether they could form risk perceptions consistent with their home's measured radon level; and whether they felt they had enough information to make a decision about mitigation. The fact sheet did not perform well on any of these evaluation criteria. None of the five booklets was clearly best for all 3 evaluation criteria; the report discusses the implications for designing an effective radon-risk communication program.

  12. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
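
    The fault-tree roll-up can be sketched in a few lines: if independent step failures are assumed (an assumption of this sketch, not the study's claim), per-step failure estimates combine into the probability that at least one step of a medication-entry sequence fails. The rates are invented.

```python
# Mean estimated failure probabilities for four steps of a (hypothetical)
# medication-entry task, as elicited from experienced users.
step_failure = [0.02, 0.05, 0.01, 0.03]

p_all_ok = 1.0
for p in step_failure:
    p_all_ok *= 1.0 - p          # steps assumed independent
any_step_fails = 1.0 - p_all_ok  # probability that at least one step fails
```

    Even when overall system failure is rare, a per-pass process failure rate on the order of 10% is what makes a system inefficient, since each failure triggers a workaround.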

  13. Evaluation of Historical and Projected Agricultural Climate Risk Over the Continental US

    NASA Astrophysics Data System (ADS)

    Zhu, X.; Troy, T. J.; Devineni, N.

    2016-12-01

    Food demands are rising due to an increasing population with changing food preferences, which places pressure on agricultural systems. In addition, in the past decade climate extremes have highlighted the vulnerability of our agricultural production to climate variability. Quantitative analyses in the climate-agriculture research field have been performed in many studies. However, climate risk remains difficult to evaluate at large scales, yet such evaluation shows great potential for helping us better understand historical climate change impacts and for evaluating future risk under projected climates. In this study, we developed a framework to evaluate climate risk quantitatively by applying statistical methods such as Bayesian regression, distribution fitting, and Monte Carlo simulation. We applied the framework over different climate regions in the continental US both historically and for modeled climate projections. The relative importance of each major growing season climate index, such as maximum dry period or heavy precipitation, was evaluated to determine which climate indices play a role in affecting crop yields. The statistical modeling framework was applied using county yields, with irrigated and rainfed yields separated to evaluate the differing risks. This framework provides estimates of the climate risk facing agricultural production in the near-term that account for the full uncertainty of climate occurrences, range of crop response, and spatial correlation in climate. In particular, the method provides robust estimates of the importance of irrigation in mitigating agricultural climate risk. The results of this study can contribute to decision making about crop choice and water use in an uncertain climate.
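
    A stripped-down version of such a framework can be sketched as a Monte Carlo simulation: draw a growing-season climate index, map it to yield through a fitted regression, and estimate the probability of falling below a threshold. All coefficients below are invented for illustration; the study fitted Bayesian regressions to county yield data.

```python
import random

random.seed(1)

def simulated_yield(max_dry_days):
    """Yield (t/ha) from an illustrative linear response to the season's
    maximum dry period, plus regression noise."""
    base, slope, sigma = 9.0, -0.05, 0.8
    return base + slope * max_dry_days + random.gauss(0.0, sigma)

# Propagate climate variability (dry-period index) through the crop response.
draws = [simulated_yield(random.gauss(20.0, 5.0)) for _ in range(10_000)]
risk = sum(y < 7.0 for y in draws) / len(draws)  # P(yield below 7 t/ha)
```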

  14. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.

  15. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252
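
    The rate-dominance finding is easy to reproduce in miniature: the EIQ Field Use Rating is EIQ multiplied by application rate (and fraction active ingredient), so when rates span orders of magnitude the rating's rank order simply follows the rate. The EIQ values and rates below are invented, and Spearman's rho is computed with the tie-free rank formula.

```python
def spearman_rho(x, y):
    """Spearman rank correlation, assuming no ties: 1 - 6*sum(d^2)/(n(n^2-1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Hypothetical herbicides: EIQ varies ~2-fold, use rate ~250-fold (kg ai/ha).
eiq  = [20.0, 30.5, 18.2, 45.0, 25.3]
rate = [0.01, 0.10, 0.50, 1.20, 2.50]
field_use_rating = [e * r for e, r in zip(eiq, rate)]
rho = spearman_rho(rate, field_use_rating)  # rank order driven by rate alone
```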

  17. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and costs savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the necessary information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk
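
    The ranking idea, scoring each failure mode on the probability, severity, detection-difficulty, and fatality factors named above and sorting by their product, can be sketched as follows. The failure modes, 1-10 scores, and the simple product rule are illustrative; the QFMEA model's actual scoring and weighting live in its Excel workbook.

```python
# Illustrative failure modes for a CO2 storage project, scored 1-10 on each
# factor (all numbers invented).
risks = {
    "wellbore leakage":   dict(prob=4, severity=8, detection=6, fatality=2),
    "pipeline rupture":   dict(prob=2, severity=9, detection=3, fatality=5),
    "brine displacement": dict(prob=5, severity=4, detection=7, fatality=1),
}

def risk_priority(f):
    """Risk priority number: product of the four factor scores."""
    return f["prob"] * f["severity"] * f["detection"] * f["fatality"]

ranked = sorted(risks, key=lambda name: risk_priority(risks[name]), reverse=True)
```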

  18. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge; in practice, there is overreliance on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system. PMID:25574159

  20. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    PubMed

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience, making subjective decisions on risk assessment without systematic decision making. Understanding the underlying principles of risk assessment is therefore significant. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions of it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of work, their recognition of the occurrence of different types of accidents, and the inter-relationships of these perceptions with safety performance in terms of accident rates. In Stage 2, the deficiencies of the current general practice of risk assessment are first identified. Based on the findings from Stage 1 and historical accident data from 15 large-scale construction projects averaged over 3 years, a risk evaluation model is developed quantitatively, prioritizing the risk levels of different trades of work and the types of site accidents they cause due to various accident causes. With the suggested systematic accident recording techniques, this model can be implemented in the construction industry at both the project and organizational levels. The model (Q2REM) not only acts as a useful supplementary guideline of risk assessment for construction safety professionals, but also assists them in pinpointing the potential risks on site for construction workers under the respective trades of work through safety training and education. It, in turn, raises workers' awareness of safety risk. As the Q2REM can clearly show the potential accident causes leading to

  1. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ...] Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review AGENCY: Food and Drug... availability of a draft report entitled ``Quantitative Summary of the Benefits and Risks of Prescription Drugs... ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review.'' A literature...

  2. Quantitative Microbial Risk Assessment and Infectious Disease Transmission Modeling of Waterborne Enteric Pathogens.

    PubMed

    Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S

    2018-04-20

    Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominately LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.

  3. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
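
    For a concrete (invented) example, a limiting conditional distribution can be computed by power iteration on the transition matrix restricted to the surviving states, renormalizing each step to condition on survival:

```python
# 3 disease states plus an implicit absorbing death state. Rows of Q sum to
# less than 1; the deficit is that state's per-step death probability.
# Patients in the earliest state have the lowest combined risk of
# progression or death (row 1), so a positive limiting conditional
# distribution exists. All probabilities are invented.
Q = [[0.92, 0.04, 0.00],
     [0.00, 0.85, 0.10],
     [0.00, 0.00, 0.80]]

def limiting_conditional(Q, iters=2000):
    """Iterate p <- pQ and renormalize (condition on survival)."""
    n = len(Q)
    p = [1.0 / n] * n
    for _ in range(iters):
        nxt = [sum(p[i] * Q[i][j] for i in range(n)) for j in range(n)]
        survive = sum(nxt)           # probability of surviving this step
        p = [v / survive for v in nxt]
    return p

p = limiting_conditional(Q)  # equilibrium mix of disease states among survivors
```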

  4. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  5. Development of a semi-quantitative risk assessment model for evaluating environmental threat posed by the three first EU watch-list pharmaceuticals to urban wastewater treatment plants: An Irish case study.

    PubMed

    Tahar, Alexandre; Tiedeken, Erin Jo; Clifford, Eoghan; Cummins, Enda; Rowan, Neil

    2017-12-15

    Contamination of receiving waters with pharmaceutical compounds is of pressing concern. This constitutes the first study to report on the development of a semi-quantitative risk assessment (RA) model for evaluating the environmental threat posed by three EU watch list pharmaceutical compounds, namely diclofenac, 17-beta-estradiol and 17-alpha-ethinylestradiol, to aquatic ecosystems using Irish data as a case study. This RA model adopts the Irish Environmental Protection Agency Source-Pathway-Receptor concept to define relevant parameters for calculating a low, medium or high risk score for each wastewater treatment plant (WWTP) agglomeration, including catchment, treatment, operational and management factors. This RA model may potentially be used on a national scale to (i) identify WWTPs that pose a particular risk as regards releasing disproportionately high levels of these pharmaceutical compounds, and (ii) help identify priority locations for introducing or upgrading control measures (e.g. tertiary treatment, source reduction). To assess risks for these substances of emerging concern, the model was applied to 16 urban WWTPs located in different regions in Ireland, which were scored for the three different compounds and ranked as low, medium or high risk. As a validation proxy, this case study used limited monitoring data recorded at some of these plants' receiving waters. It is envisaged that this semi-quantitative RA approach may aid other EU countries in investigating and screening for potential risks where limited measured or predicted environmental pollutant concentrations and/or hydrological data are available. This model is semi-quantitative, as other factors such as the influence of climate change and drug usage or prescription data will need to be considered at a future point when estimating and predicting risks. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Quantitative prediction of oral cancer risk in patients with oral leukoplakia.

    PubMed

    Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng

    2017-07-11

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We have developed an oral cancer risk index using DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia and cancer subjects (training set and validation set). Peaks were defined on the basis of first derivatives with positives, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model with high sensitivity (100%) and specificity (99.2%). Using the Peaks-Random Forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 less than 0.5 developed cancer (32 ± 31 months). OCRI2 is better than other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.
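
    The stratification quoted above can be checked arithmetically; the counts come directly from the abstract, while the relative-risk calculation is added here for illustration.

```python
# Leukoplakia patients split at the OCRI2 = 0.5 cutoff, with the number who
# progressed to cancer during follow-up, as reported in the abstract.
high = {"n": 11, "progressed": 4}
low  = {"n": 57, "progressed": 3}

rate_high = high["progressed"] / high["n"]  # 36.4% progressed
rate_low  = low["progressed"] / low["n"]    # 5.3% progressed
relative_risk = rate_high / rate_low        # roughly 7-fold higher risk
```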

  7. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  8. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data

  9. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has made the concepts obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  10. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. Here we demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows a specification of an ALOP that, with defined confidence, corresponds to current industry practices.
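
    The two-dimensional (second-order) structure described above, with serving-to-serving variability nested inside parameter uncertainty, can be sketched as follows. The distributions, parameters, and the toy linear dose-response are hypothetical placeholders, not values from the C. perfringens model.

```python
import random

def illness_prob(slope, dose):
    # Toy linear dose-response, capped at 1 (hypothetical, for illustration).
    return min(1.0, slope * dose)

def two_dimensional_qmra(n_outer=200, n_inner=500, seed=1):
    """Outer loop samples *uncertainty* (the dose-response slope);
    inner loop samples *variability* (dose per serving). Returns the
    sorted uncertainty distribution of the per-serving illness rate."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_outer):
        slope = rng.lognormvariate(-14.0, 0.5)   # hypothetical uncertain slope
        mean_p = sum(
            illness_prob(slope, rng.lognormvariate(9.0, 1.0))  # hypothetical dose
            for _ in range(n_inner)
        ) / n_inner
        rates.append(mean_p)
    return sorted(rates)

rates = two_dimensional_qmra()
# Read an ALOP off the cumulative uncertainty distribution with a defined
# confidence level, e.g. the 95th percentile of the illness rate:
alop = rates[int(0.95 * len(rates))]
```

    Separating the two loops is what lets the abstract's "defined confidence" statement be made: the percentile is taken over uncertainty only, not over the pooled variability.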

  11. Asbestos exposure--quantitative assessment of risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, J.M.; Weill, H.

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
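
    A back-of-envelope version of the extrapolation logic is a linear (no-threshold) unit risk per fiber/ml-year, scaled from the worker estimate. This is only a sketch: the models reviewed also weight age at exposure and latency, which is why the simple scaling below does not reproduce the published student estimate of 5 per million.

```python
# Toy linear cumulative-exposure extrapolation from the worker figure.
worker_risk = 82 / 10_000          # lifetime excess cancers per exposed worker
worker_exposure = 0.5 * 40         # f/ml x years of cumulative exposure

unit_risk = worker_risk / worker_exposure   # excess risk per f/ml-year

school_exposure = 0.001 * 6        # f/ml x years for an average student
school_risk_per_million = unit_risk * school_exposure * 1_000_000
# ~2.5 per million from naive scaling, versus the review's 5 per million,
# illustrating how much the age-at-exposure assumptions matter.
```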

  12. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  13. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for
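
    The LMS functional form can be made concrete with a short sketch. The coefficients below are hypothetical, chosen only to show the property the abstract relies on: at low dose the extra risk collapses onto the linear term q1*d.

```python
import math

def lms_extra_risk(d, q):
    """Linearized multistage: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2 + ...)).
    Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    poly = sum(qi * d**i for i, qi in enumerate(q))
    p_d = 1.0 - math.exp(-poly)
    p_0 = 1.0 - math.exp(-q[0])
    return (p_d - p_0) / (1.0 - p_0)

q = [0.01, 0.3, 2.0]           # hypothetical fitted coefficients q0, q1, q2
low = lms_extra_risk(1e-4, q)  # extra risk at a low dose
approx = q[1] * 1e-4           # linear-term approximation q1*d
```

    In regulatory use it is the upper confidence bound on q1, not the point estimate, that drives the low-dose slope; that statistical step is omitted here.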

  14. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  15. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  16. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  17. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Our method may be useful within precision medicine for monitoring high-risk populations.
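
    Steps (b) and (c) of the pipeline can be sketched with a minimal Otsu threshold and a voxel ratio. The synthetic voxel intensities below are placeholders (dense tissue assumed darker than fat, as on non-contrast T1-weighted images), not data from the study, and the fuzzy c-means breast mask of step (a) is skipped.

```python
import numpy as np

def otsu_threshold(values, bins=64):
    """Minimal Otsu's method: pick the threshold that maximizes
    between-class variance of the intensity histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0   # class means
        m1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[k]
    return best_t

rng = np.random.default_rng(0)
# Synthetic breast voxels: 70% fatty (bright, ~200), 30% dense (dark, ~50).
voxels = np.concatenate([rng.normal(200, 10, 7000), rng.normal(50, 10, 3000)])
t = otsu_threshold(voxels)
density = (voxels < t).mean()   # volumetric density: dense / total voxels
```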

  18. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator.

    PubMed

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J; Zhang, Li-Qun

    2011-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative or they are quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at further stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke.

  19. Quantitative Assessment the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor, which can arrest cell proliferation and serve as a tumor suppressor. Although many studies have been published assessing the relationship between the p21 rs1059234 polymorphism and various cancer risks, there has been no definite conclusion on this association. To derive a more precise quantitative assessment of the relationship, a large scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with overall cancer risk under either the dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] or the recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15]. However, further stratified analysis showed that rs1059234 was strongly associated with the risk of squamous cell carcinoma of the head and neck (SCCHN). Thus, larger scale primary studies are still required to further evaluate the interaction of the p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.
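
    The pooling step behind such a meta-analysis is commonly fixed-effect inverse-variance weighting on the log-OR scale, with each study's standard error recovered from its confidence interval. The per-study ORs below are hypothetical, not the 16 studies analyzed here.

```python
import math

def pooled_or(studies):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each study is (OR, lower 95% CI, upper 95% CI); the SE is recovered
    from the CI width on the log scale: se = (ln hi - ln lo) / (2*1.96)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se**2          # inverse-variance weight
        num += w * math.log(or_)
        den += w
    return math.exp(num / den)

# Hypothetical per-study ORs for one genetic-model contrast:
studies = [(1.10, 0.85, 1.42), (0.95, 0.80, 1.13), (1.00, 0.78, 1.28)]
pooled = pooled_or(studies)
```

    A real analysis would also test heterogeneity and fall back to a random-effects model where appropriate.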

  20. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  1. Characterizing health risks associated with recreational swimming at Taiwanese beaches by using quantitative microbial risk assessment.

    PubMed

    Jang, Cheng-Shin; Liang, Ching-Ping

    2018-01-01

    Taiwan is surrounded by oceans, and therefore numerous pleasure beaches attract millions of tourists annually to participate in recreational swimming activities. However, impaired water quality because of fecal pollution poses a potential threat to the tourists' health. This study probabilistically characterized the health risks associated with recreational swimming engendered by waterborne enterococci at 13 Taiwanese beaches by using quantitative microbial risk assessment. First, data on enterococci concentrations at coastal beaches monitored by the Taiwan Environmental Protection Administration were reproduced using nonparametric Monte Carlo simulation (MCS). The ingestion volumes of recreational swimming based on uniform and gamma distributions were subsequently determined using MCS. Finally, after combining the distributions of the two parameters, the beta-Poisson dose-response function was employed to quantitatively estimate health risks to recreational swimmers. Moreover, various levels of risk to recreational swimmers were classified and spatially mapped to explore feasible recreational and environmental management strategies at the beaches. The study results revealed that although the health risks associated with recreational swimming did not exceed an acceptable benchmark of 0.019 illnesses daily at any beach, they approached this benchmark at certain beaches. Beaches with relatively high risks are located in Northwestern Taiwan owing to the movements of ocean currents.
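
    The Monte Carlo combination of concentration and ingestion volume, fed through a beta-Poisson dose-response, can be sketched as below. The distributions and the alpha/beta parameters are illustrative placeholders, not the fitted values from this study.

```python
import random

def beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson dose-response: P(ill) = 1 - (1 + dose/beta)^-alpha."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def swim_risk(n=10_000, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        conc = rng.lognormvariate(2.0, 1.0)       # hypothetical CFU/100 mL enterococci
        volume = rng.uniform(10, 50) / 100.0      # hypothetical ingestion, 10-50 mL,
                                                  # expressed in units of 100 mL
        dose = conc * volume
        total += beta_poisson(dose, alpha=0.45, beta=2.8)  # illustrative parameters
    return total / n

risk = swim_risk()   # mean per-event illness probability
```

    A per-beach run of this loop, compared against the 0.019 benchmark, is what drives the risk classification and mapping described above.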

  2. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for “Version 1” of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  3. Quantitative Gait Markers and Incident Fall Risk in Older Adults

    PubMed Central

    Holtzer, Roee; Lipton, Richard B.; Wang, Cuiling

    2009-01-01

    Background Identifying quantitative gait markers of falls in older adults may improve diagnostic assessments and suggest novel intervention targets. Methods We studied 597 adults aged 70 and older (mean age 80.5 years, 62% women) enrolled in an aging study who received quantitative gait assessments at baseline. Association of speed and six other gait markers (cadence, stride length, swing, double support, stride length variability, and swing time variability) with incident fall rate was studied using generalized estimation equation procedures adjusted for age, sex, education, falls, chronic illnesses, medications, cognition, disability as well as traditional clinical tests of gait and balance. Results Over a mean follow-up period of 20 months, 226 (38%) of the 597 participants fell. Mean fall rate was 0.44 per person-year. Slower gait speed (risk ratio [RR] per 10 cm/s decrease 1.069, 95% confidence interval [CI] 1.001–1.142) was associated with higher risk of falls in the fully adjusted models. Among six other markers, worse performance on swing (RR 1.406, 95% CI 1.027–1.926), double-support phase (RR 1.165, 95% CI 1.026–1.321), swing time variability (RR 1.007, 95% CI 1.004–1.010), and stride length variability (RR 1.076, 95% CI 1.030–1.111) predicted fall risk. The associations remained significant even after accounting for cognitive impairment and disability. Conclusions Quantitative gait markers are independent predictors of falls in older adults. Gait speed and other markers, especially variability, should be further studied to improve current fall risk assessments and to develop new interventions. PMID:19349593
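
    The reported risk ratios are per unit of change (e.g., per 10 cm/s decrease in gait speed), so under the usual log-linear reading they compound multiplicatively for larger changes. This is an illustrative interpretation of the published estimates, not a calculation from the study data.

```python
# RR per 10 cm/s decrease in gait speed, from the abstract's fully
# adjusted model (point estimate only; the CI is ignored here).
rr_per_10cm = 1.069

def rr_for_decrease(decrease_cm_s):
    """Compound the per-unit RR multiplicatively on the log scale."""
    return rr_per_10cm ** (decrease_cm_s / 10.0)

rr_20 = rr_for_decrease(20)   # implied RR for a 20 cm/s slowing
```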

  4. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  5. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  6. Evaluating Risk Communication After the Fukushima Disaster Based on Nudge Theory.

    PubMed

    Murakami, Michio; Tsubokura, Masaharu

    2017-03-01

    Using nudge theory and some examples of risk communication that followed the Fukushima disaster, this article discusses the influences and justifications of risk communication, in addition to how risk communication systems are designed. To assist people in making decisions based on their own value systems, we provide three suggestions, keeping in mind that people can be influenced (ie, "nudged") depending on how risk communication takes place: (1) accumulate knowledge on the process of evaluating how the method of risk communication and a system's default design could impact people; (2) clarify the purpose and outcomes of risk communication; and (3) see what risk communication might be ethically unjustifiable. Quantitative studies on risk communication and collective narratives will provide some ideas for how to design better risk communication systems and to help people make decisions. Furthermore, we have shown examples of unjustifiable risk communication.

  7. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    NASA Astrophysics Data System (ADS)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories at high risk from the geomorphological processes with a high frequency of occurrence, and represent a useful tool in the process of spatial planning.
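
    The USLE component of the vulnerability model is a cell-by-cell raster product, A = R · K · LS · C · P, followed by reclassification into vulnerability classes. The factor grids and class breaks below are synthetic placeholders, not values from the Maramureş study.

```python
import numpy as np

# Toy USLE on small synthetic factor rasters (all values hypothetical).
shape = (4, 4)
rng = np.random.default_rng(7)
R  = np.full(shape, 80.0)              # rainfall erosivity
K  = rng.uniform(0.2, 0.5, shape)      # soil erodibility
LS = rng.uniform(0.5, 4.0, shape)      # slope length/steepness factor
C  = rng.uniform(0.05, 0.4, shape)     # cover-management factor
P  = np.ones(shape)                    # support-practice factor

A = R * K * LS * C * P                 # estimated soil loss per cell

# Reclassify into vulnerability classes for the vulnerability/land-use matrix:
classes = np.digitize(A, bins=[5.0, 20.0])   # 0 = low, 1 = medium, 2 = high
```

    In the study these classes are then crossed with land-use classes (and land price) in an attribute-table matrix to obtain the quantitative risk.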

  8. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    NASA Astrophysics Data System (ADS)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2018-06-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories at high risk from the geomorphological processes with a high frequency of occurrence, and represent a useful tool in the process of spatial planning.

  9. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  10. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
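
    The abstract does not give the indicator's formula; one plausible reading is the fractional reduction of baseline reliability once the identified residual risks are folded in. Both the formula and the reliability values below are hypothetical illustrations, not the RRET definition.

```python
# Hypothetical sketch of a reliability impact indicator in the spirit of RRET.
def reliability_impact(r_baseline, r_residual):
    """r_residual is the system reliability with identified residual
    risks included; the indicator measures how far it falls below the
    baseline, as a fraction of the baseline."""
    return (r_baseline - r_residual) / r_baseline

impact = reliability_impact(0.995, 0.988)   # hypothetical mission values
```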

  11. Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.

    PubMed

    Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K

    2017-02-01

    Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
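
    The farm-to-fork chain with an intervention step can be sketched as log-scale bookkeeping feeding a dose-response function. All numbers below (initial contamination, growth, wash log-reduction, dose-response parameter) are hypothetical, and the simple exponential dose-response stands in for the model actually used in the study.

```python
import math

def final_log_conc(initial_log_cfu, growth_log, intervention_log_reduction):
    # Contamination accumulates and is reduced on the log10 scale.
    return initial_log_cfu + growth_log - intervention_log_reduction

def prob_illness(log_cfu, r=1e-8):
    """Exponential dose-response with illustrative parameter r."""
    dose = 10 ** log_cfu
    return 1 - math.exp(-r * dose)

no_intervention = prob_illness(final_log_conc(2.0, 1.0, 0.0))
with_chlorine   = prob_illness(final_log_conc(2.0, 1.0, 1.2))  # ~1.2-log wash
fold_reduction  = no_intervention / with_chlorine
```

    Because the dose-response is nearly linear at low dose, a 1.2-log reduction in concentration translates almost exactly into a 10^1.2-fold (about 16-fold) reduction in per-serving risk, the same kind of fold-reduction comparison the study reports for its interventions.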

  12. Quantitative microbial risk assessment of microbial source tracking markers in recreational water contaminated with fresh untreated and secondary treated sewage.

    PubMed

    Ahmed, Warish; Hamilton, Kerry A; Lobos, Aldo; Hughes, Bridie; Staley, Christopher; Sadowsky, Michael J; Harwood, Valerie J

    2018-05-14

    Microbial source tracking (MST) methods have provided the means to identify sewage contamination in recreational waters, but the risk associated with elevated levels of MST targets such as sewage-associated Bacteroides HF183 and other markers is uncertain. Quantitative microbial risk assessment (QMRA) modeling allows interpretation of MST data in the context of the risk of gastrointestinal (GI) illness caused by exposure to known reference pathogens. In this study, five sewage-associated, quantitative PCR (qPCR) MST markers [Bacteroides HF183 (HF183), Methanobrevibacter smithii nifH (nifH), human adenovirus (HAdV), human polyomavirus (HPyV) and pepper mild mottle virus (PMMoV)] were evaluated to determine at what concentration these nucleic acid markers reflected a significant health risk from exposure to fresh untreated or secondary treated sewage in beach water. The QMRA models were evaluated for a target probability of illness of 36 GI illnesses/1000 swimming events (i.e., risk benchmark 0.036) for the reference pathogens norovirus (NoV) and human adenovirus 40/41 (HAdV 40/41). Sewage markers at several dilutions exceeded the risk benchmark for reference pathogens NoV and HAdV 40/41. HF183 concentrations of 3.22 × 10³ (for both NoV and HAdV 40/41) gene copies (GC)/100 mL of water contaminated with fresh untreated sewage represented risk >0.036. Similarly, HF183 concentrations of 3.66 × 10³ (for NoV and HAdV 40/41) GC/100 mL of water contaminated with secondary treated sewage represented risk >0.036. An HAdV concentration as low as 4.11 × 10¹ GC/100 mL of water represented risk >0.036 when water was contaminated with secondary treated sewage. Results of this study provide a valuable context for water quality managers to evaluate human health risks associated with contamination from fresh sewage. The approach described here may also be useful in the future for evaluating health risks from contamination with aged or treated sewage or feces from other

  13. Evaluation of bacterial pathogen diversity, abundance and health risks in urban recreational water by amplicon next-generation sequencing and quantitative PCR.

    PubMed

    Cui, Qijia; Fang, Tingting; Huang, Yong; Dong, Peiyan; Wang, Hui

    2017-07-01

    The microbial quality of urban recreational water is of great concern to public health. The monitoring of indicator organisms and several pathogens alone is not sufficient to accurately and comprehensively identify microbial risks. To assess the levels of bacterial pathogens and health risks in urban recreational water, we analyzed pathogen diversity and quantified four pathogens in 46 water samples collected from waterbodies in Beijing Olympic Forest Park in one year. The pathogen diversity revealed by 16S rRNA gene targeted next-generation sequencing (NGS) showed that 16 of 40 genera and 13 of 76 reference species were present. The most abundant species were Acinetobacter johnsonii, Mycobacterium avium and Aeromonas spp. Quantitative polymerase chain reaction (qPCR) of Escherichia coli (uidA), Aeromonas (aerA), M. avium (16S rRNA), Pseudomonas aeruginosa (oaa) and Salmonella (invA) showed that the aerA genes were the most abundant, occurring in all samples with concentrations of 10⁴-10⁶ genome copies/100 mL, followed by oaa, invA and M. avium. In total, 34.8% of the samples harbored all genes, indicating the prevalence of these pathogens in this recreational waterbody. Based on the qPCR results, a quantitative microbial risk assessment (QMRA) showed that the annual infection risks of Salmonella, M. avium and P. aeruginosa in five activities were mostly greater than the U.S. EPA risk limit for recreational contacts, and children playing with water may be exposed to the greatest infection risk. Our findings provide a comprehensive understanding of bacterial pathogen diversity and pathogen abundance in urban recreational water by applying both NGS and qPCR. Copyright © 2016. Published by Elsevier B.V.

  14. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  15. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products or beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hairy pixels on the skin are accurately detected according to the gray mean value and color information. Bilinear interpolation is then used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that evaluating the skin condition with this method agrees both with evaluation methods based on biochemical indicators of the skin and with human visual experience. The method overcomes the skin damage and long waiting times of the biochemical evaluation method, as well as the subjectivity and fuzziness of visual evaluation, achieving a non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of the skin condition, and can quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.
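
    The texture pipeline described here (a GLCM followed by second moment, contrast, entropy and correlation, averaged over directions 45° apart) can be sketched in plain Python. The toy image and the number of gray levels are hypothetical:

```python
import math

def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for one pixel displacement."""
    h, w = len(img), len(img[0])
    m = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y][x]][img[y2][x2]] += 1
                total += 1
    return [[v / total for v in row] for row in m]

def haralick(p):
    """Angular second moment, contrast, entropy and correlation of a GLCM."""
    n = len(p)
    asm = sum(p[i][j] ** 2 for i in range(n) for j in range(n))
    contrast = sum((i - j) ** 2 * p[i][j] for i in range(n) for j in range(n))
    entropy = -sum(p[i][j] * math.log2(p[i][j])
                   for i in range(n) for j in range(n) if p[i][j] > 0)
    mu_i = sum(i * p[i][j] for i in range(n) for j in range(n))
    mu_j = sum(j * p[i][j] for i in range(n) for j in range(n))
    sd_i = math.sqrt(sum((i - mu_i) ** 2 * p[i][j] for i in range(n) for j in range(n)))
    sd_j = math.sqrt(sum((j - mu_j) ** 2 * p[i][j] for i in range(n) for j in range(n)))
    cov = sum((i - mu_i) * (j - mu_j) * p[i][j] for i in range(n) for j in range(n))
    corr = cov / (sd_i * sd_j) if sd_i > 0 and sd_j > 0 else 1.0
    return asm, contrast, entropy, corr

# Toy 4-level image patch; GLCMs at 0, 45, 90 and 135 degree displacements.
img = [[0, 1, 1, 2], [1, 2, 2, 3], [2, 3, 3, 0], [3, 0, 0, 1]]
offsets = [(1, 0), (1, -1), (0, -1), (-1, -1)]
feats = [haralick(glcm(img, dx, dy, 4)) for dx, dy in offsets]
means = [sum(f[k] for f in feats) / len(feats) for k in range(4)]
print(dict(zip(["ASM", "contrast", "entropy", "correlation"],
               (round(v, 3) for v in means))))
```

    Averaging the four directional feature vectors, as in the last step, is what the abstract means by computing the parameters "at 45° intervals" and taking their mean value.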

  16. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    PubMed

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to find effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using data on the poultry process parameters and the Salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with @Risk software. The concentration of Salmonella on the carcass after chilling calculated by the model was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the Salmonella concentrations after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making these the primary factors affecting the concentration of Salmonella on the carcass after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control the contamination of Salmonella on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
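
    A modular process risk model of this kind chains the processing modules so that each one adds or removes log units of contamination, propagates uncertainty by Monte Carlo, and then ranks inputs by their correlation with the output, which is what @Risk's sensitivity analysis reports. The module parameters below are illustrative, not the surveillance values from Jinan:

```python
import math
import random

random.seed(3)

def pearson(xs, ys):
    """Pearson correlation coefficient, used here as a sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

defeather, pool, final = [], [], []
for _ in range(20_000):
    d = random.gauss(1.0, 0.5)        # log10 MPN/g after defeathering (illustrative)
    e = d + random.gauss(0.2, 0.1)    # evisceration module: slight cross-contamination
    p = random.gauss(0.5, 0.3)        # log10 contamination level of the chilling pool
    f = e - 1.0 + 0.4 * p             # chilling module: net reduction, offset by the pool
    defeather.append(d)
    pool.append(p)
    final.append(f)

mean_mpn = sum(10 ** f for f in final) / len(final)
print(f"mean concentration after chilling ~ {mean_mpn:.2f} MPN/g")
print(f"sensitivity (after defeathering) r = {pearson(defeather, final):.2f}")
print(f"sensitivity (chilling pool)      r = {pearson(pool, final):.2f}")
```

    With these made-up variances the defeathering level dominates the output, mirroring the structure (though not the exact 0.84/0.34 values) of the paper's sensitivity result.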

  17. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    DTIC Science & Technology

    2017-02-01

    COMPARATIVE EVALUATION OF QUANTITATIVE TEST METHODS FOR GASES ON A HARD SURFACE (ECBC-TR-1426). Vipin Rastogi... Members of the U.S. Environmental... Each quantitative method was performed three times on three consecutive days. For the CD runs, three

  18. 76 FR 19311 - Update of the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From Foodborne... quantitative targets established in ``Healthy People 2010.'' In 2005, FoodNet data showed 0.30 L. monocytogenes... 4). In 2003, FDA and FSIS published a quantitative assessment of the relative risk to public health...

  19. Nuclear medicine and quantitative imaging research (instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1990-09-01

    This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418 entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I. and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.

  20. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  1. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties, such as security, safety, survivability, fault tolerance, and real-time behavior.

  2. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
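
    The mechanism behind the overestimation can be reproduced in a small simulation: in the integer model, a unit whose last bacterium is inactivated stays sterile, while in the concentration model a fraction of a cell "survives" and regrows. All parameters below are illustrative, chosen to mimic the paper's drastic inactivation-then-growth scenario:

```python
import math
import random

random.seed(7)

def poisson(lam):
    """Knuth's algorithm; adequate for the small means used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        p *= random.random()
        k += 1
    return k - 1

def p_ill(dose, r=1e-3):
    """Exponential dose-response with an illustrative parameter r."""
    return 1.0 - math.exp(-r * dose)

# Illustrative "drastic" scenario: 4-log inactivation followed by 5-log growth.
lam, survive, growth, n_units = 5.0, 1e-4, 1e5, 200_000

risk_counts = risk_conc = 0.0
for _ in range(n_units):
    n0 = poisson(lam)                              # bacteria in one contaminated unit
    # Integer model: each cell independently survives inactivation ...
    survivors = sum(1 for _ in range(n0) if random.random() < survive)
    risk_counts += p_ill(survivors * growth)       # ... then the survivors grow
    # Concentration model: inactivation and growth rescale a continuous level,
    # so a fraction of a cell can survive and regrow.
    risk_conc += p_ill(n0 * survive * growth)

risk_counts /= n_units
risk_conc /= n_units
print(f"mean risk, counts: {risk_counts:.2e}; concentrations: {risk_conc:.2e}; "
      f"ratio ~ {risk_conc / risk_counts:.0f}x")
```

    Under these made-up settings the concentration model overstates the mean risk by well over an order of magnitude, consistent with the >10-fold overestimation the regression-tree analysis associates with such scenarios.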

  3. Risk-Based Evaluation of Total Petroleum Hydrocarbons in Vapor Intrusion Studies

    PubMed Central

    Brewer, Roger; Nagashima, Josh; Kelley, Michael; Heskett, Marvin; Rigby, Mark

    2013-01-01

    This paper presents a quantitative method for the risk-based evaluation of Total Petroleum Hydrocarbons (TPH) in vapor intrusion investigations. Vapors from petroleum fuels are characterized by a complex mixture of aliphatic and, to a lesser extent, aromatic compounds. These compounds can be measured and described in terms of TPH carbon ranges. Toxicity factors published by USEPA and other parties allow development of risk-based, air and soil vapor screening levels for each carbon range in the same manner as done for individual compounds such as benzene. The relative, carbon range makeup of petroleum vapors can be used to develop weighted, site-specific or generic screening levels for TPH. At some critical ratio of TPH to a targeted, individual compound, the overwhelming proportion of TPH will drive vapor intrusion risk over the individual compound. This is particularly true for vapors associated with diesel and other middle distillate fuels, but can also be the case for low-benzene gasolines or even for high-benzene gasolines if an adequately conservative, target risk is not applied to individually targeted chemicals. This necessitates a re-evaluation of the reliance on benzene and other individual compounds as a stand-alone tool to evaluate vapor intrusion risk associated with petroleum. PMID:23765191

  4. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL subjects without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates between a given subject and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.

  5. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm 3 , 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  6. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.

  7. A framework and case studies for evaluation of enzyme ontogeny in children's health risk evaluation.

    PubMed

    Ginsberg, Gary; Vulimiri, Suryanarayana V; Lin, Yu-Sheng; Kancherla, Jayaram; Foos, Brenda; Sonawane, Babasaheb

    2017-01-01

    Knowledge of the ontogeny of Phase I and Phase II metabolizing enzymes may be used to inform children's vulnerability based upon likely differences in internal dose from xenobiotic exposure. This might provide a qualitative assessment of toxicokinetic (TK) variability and uncertainty pertinent to early lifestages and help scope a more quantitative physiologically based toxicokinetic (PBTK) assessment. Although much is known regarding the ontogeny of metabolizing systems, this is not commonly utilized in scoping and problem formulation stage of human health risk evaluation. A framework is proposed for introducing this information into problem formulation which combines data on enzyme ontogeny and chemical-specific TK to explore potential child/adult differences in internal dose and whether such metabolic differences may be important factors in risk evaluation. The framework is illustrated with five case study chemicals, including some which are data rich and provide proof of concept, while others are data poor. Case studies for toluene and chlorpyrifos indicate potentially important child/adult TK differences while scoping for acetaminophen suggests enzyme ontogeny is unlikely to increase early-life risks. Scoping for trichloroethylene and aromatic amines indicates numerous ways that enzyme ontogeny may affect internal dose which necessitates further evaluation. PBTK modeling is a critical and feasible next step to further evaluate child-adult differences in internal dose for a number of these chemicals.

  8. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    PubMed Central

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  9. Quantitative phase-contrast digital holographic microscopy for cell dynamic evaluation

    NASA Astrophysics Data System (ADS)

    Yu, Lingfeng; Mohanty, Samarendra; Berns, Michael W.; Chen, Zhongping

    2009-02-01

    The laser microbeam uses lasers to alter and/or to ablate intracellular organelles and cellular and tissue samples, and, today, has become an important tool for cell biologists to study the molecular mechanism of complex biological systems by removing individual cells or sub-cellular organelles. However, absolute quantitation of the localized alteration/damage to transparent phase objects, such as the cell membrane or chromosomes, was not possible using conventional phase-contrast or differential interference contrast microscopy. We report the development of phase-contrast digital holographic microscopy for quantitative evaluation of cell dynamic changes in real time during laser microsurgery. Quantitative phase images are recorded during the process of laser microsurgery and thus, the dynamic change in phase can be continuously evaluated. Out-of-focus organelles are re-focused by numerical reconstruction algorithms.

  10. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  11. Supply chain risk management of newspaper industry: A quantitative study

    NASA Astrophysics Data System (ADS)

    Sartika, Viny; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    The newspaper industry has several distinctive features that make it stand out from other industries. The strict delivery deadline and zero inventory leave a very short time frame for production and distribution. There is pressure from the newsroom to delay the start of production as long as possible in order to include late-breaking news, while there is pressure from production and distribution to start production as early as possible. Supply chain risk management is needed to determine the best strategy for dealing with possible risks in the newspaper industry. In a case study of a newspaper in Surakarta, a quantitative approach to newspaper supply chain risk management is taken by calculating the expected cost of each risk from the magnitude of its impact and the probability of the risk event. The calculation shows that the five risks with the highest value are newspaper delays to the end customer, broken plates, misprints, machine downtime, and delayed delivery of newspaper content. Appropriate mitigation strategies to cope with these risk events are then analyzed.
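
    The ranking step described in this abstract reduces to expected cost = probability of the risk event × magnitude of its impact for each register entry. The register below is hypothetical; the study's actual probabilities and costs are not given in the abstract:

```python
# Hypothetical risk register for a newspaper supply chain; all scores are illustrative.
# Each entry maps a risk event to (probability per issue, impact cost if it occurs).
risks = {
    "newspaper delayed to end customer":     (0.30, 9_000),
    "broken plate":                          (0.05, 25_000),
    "miss print":                            (0.10, 8_000),
    "down machine":                          (0.02, 60_000),
    "delayed delivery of newspaper content": (0.20, 5_000),
    "paper stockout":                        (0.01, 15_000),
}

# Expected cost of risk = probability x impact; rank and keep the top five.
expected = {name: p * cost for name, (p, cost) in risks.items()}
for name, cost in sorted(expected.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{name}: expected cost {cost:,.0f}")
```

    Ranking by expected cost rather than by probability or impact alone is what lets low-frequency, high-impact events such as machine downtime compete with everyday delays.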

  12. Methods of quantitative risk assessment: The case of the propellant supply system

    NASA Astrophysics Data System (ADS)

    Merz, H. A.; Bienz, A.

    1984-08-01

    As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level, where an accidental ignition would no longer lead to a detonation, and this would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for the safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.

  13. Quantitative Assessment of Cancer Risk from Exposure to Diesel Engine Emissions

    EPA Science Inventory

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. uman target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-hum...

  14. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and the three risk assessment models were applied: the EPA inhalation risk assessment model, HQ = EC/RfC; the Singapore semi-quantitative risk assessment model, Risk = (HR × ER)^(1/2); and the occupational hazards risk assessment index, Index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method and printing workshops were 3.5 (high), 3.5 (high) and 2.8 (general), and the risk levels of the pasting, burdening, unreeling, rolling and assisting positions were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method gave risk indices for the pasting, burdening, unreeling, rolling and assisting positions of 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions
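
    The three scoring rules quoted in this abstract are simple enough to compute directly. The ratings passed in below, and the risk-band comments, are illustrative rather than values from the surveyed enterprises:

```python
import math

def epa_hazard_quotient(ec, rfc):
    """US EPA inhalation model: HQ = EC / RfC; HQ > 1 is read as high risk."""
    return ec / rfc

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = (HR x ER)^(1/2)."""
    return math.sqrt(hazard_rating * exposure_rating)

def hazard_risk_index(health_effect_level, exposure_ratio, operation_condition_level):
    """Occupational hazards risk assessment index:
    2^(health effect level) x 2^(exposure ratio) x operation condition level."""
    return 2 ** health_effect_level * 2 ** exposure_ratio * operation_condition_level

# Hypothetical ratings for one DMF-exposed work position (not the surveyed values).
print(epa_hazard_quotient(ec=8.0, rfc=2.0))   # 4.0 -> HQ > 1, high risk
print(singapore_risk(4, 3))                   # ~3.46 -> high under an illustrative cutoff
print(hazard_risk_index(3, 2, 1.5))           # 48.0 -> high under an illustrative cutoff
```

    The exponential weighting in the index method is what makes it spread positions across high/middle bands where the EPA hazard quotient flags everything as high risk.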

  15. [Evaluation of medication risk in pregnant women: methodology of evaluation and risk management].

    PubMed

    Eléfant, E; Sainte-Croix, A

    1997-01-01

    This round table discussion was devoted to the description of the tools currently available for the evaluation of drug risks and management during pregnancy. Five topics were submitted for discussion: pre-clinical data, methodological tools, benefit/risk ratio before prescription, teratogenic or fetal risk evaluation, legal comments.

  16. Advances in Imaging Approaches to Fracture Risk Evaluation

    PubMed Central

    Manhard, Mary Kate; Nyman, Jeffry S.; Does, Mark D.

    2016-01-01

    Fragility fractures are a growing problem worldwide, and current methods for diagnosing osteoporosis do not always identify individuals who require treatment to prevent a fracture and may misidentify those not at risk. Traditionally, fracture risk is assessed using dual-energy X-ray absorptiometry, which provides measurements of areal bone mineral density (BMD) at sites prone to fracture. Recent advances in imaging show promise in adding new information that could improve the prediction of fracture risk in the clinic. As reviewed herein, advances in quantitative computed tomography (QCT) predict hip and vertebral body strength; high-resolution peripheral QCT (HR-pQCT) and micro-magnetic resonance imaging (μMRI) assess the micro-architecture of trabecular bone; quantitative ultrasound (QUS) measures the modulus or tissue stiffness of cortical bone; and quantitative ultra-short echo time MRI methods quantify the concentrations of bound water and pore water in cortical bone, which reflect a variety of mechanical properties of bone. Each of these technologies provides unique characteristics of bone and may improve fracture risk diagnoses and reduce the prevalence of fractures by helping to guide treatment decisions. PMID:27816505

  17. Using a quantitative risk register to promote learning from a patient safety reporting system.

    PubMed

    Mansfield, James G; Caplan, Robert A; Campos, John S; Dreis, David F; Furman, Cathie

    2015-02-01

    Patient safety reporting systems are now used in most health care delivery organizations. These systems, such as the one in use at Virginia Mason (Seattle) since 2002, can provide valuable reports of risk and harm from the front lines of patient care. In response to the challenge of how to quantify and prioritize safety opportunities, a risk register system was developed and implemented. Basic risk register concepts were refined to provide a systematic way to understand risks reported by staff. The risk register uses a comprehensive taxonomy of patient risk and algorithmically assigns each patient safety report to 1 of 27 risk categories in three major domains (Evaluation, Treatment, and Critical Interactions). For each category, a composite score was calculated on the basis of event rate, harm, and cost. The composite scores were used to identify the "top five" risk categories, and patient safety reports in these categories were analyzed in greater depth to find recurrent patterns of risk and associated opportunities for improvement. The top five categories of risk were easy to identify and had distinctive "profiles" of rate, harm, and cost. The ability to categorize and rank risks across multiple dimensions yielded insights not previously available. These results were shared with leadership and served as input for planning quality and safety initiatives. This approach provided actionable input for the strategic planning process, while at the same time strengthening the Virginia Mason culture of safety. The quantitative patient safety risk register serves as one solution to the challenge of extracting valuable safety lessons from large numbers of incident reports and could profitably be adopted by other organizations.
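
    The composite-scoring idea can be sketched as follows. This is a hypothetical illustration, not Virginia Mason's actual algorithm: five invented categories stand in for the 27-category taxonomy, each dimension (rate, harm, cost) is normalised by its maximum, and the normalised values are summed and ranked.

```python
# Invented example data: per-category event rate (per year), mean harm
# score (0-1), and mean cost per event (USD).
categories = {
    "medication": {"rate": 120, "harm": 0.3, "cost": 20_000},
    "falls":      {"rate": 80,  "harm": 0.6, "cost": 30_000},
    "diagnosis":  {"rate": 40,  "harm": 0.9, "cost": 90_000},
    "handoffs":   {"rate": 150, "harm": 0.2, "cost": 10_000},
    "equipment":  {"rate": 20,  "harm": 0.5, "cost": 50_000},
}

def composite_scores(cats):
    # Normalise each dimension to [0, 1] by its maximum, then sum the three.
    maxima = {k: max(c[k] for c in cats.values()) for k in ("rate", "harm", "cost")}
    return {name: sum(c[k] / maxima[k] for k in maxima)
            for name, c in cats.items()}

# Rank categories by composite score to surface the top risks.
ranked = sorted(composite_scores(categories).items(),
                key=lambda kv: kv[1], reverse=True)
for name, score in ranked:
    print(f"{name}: {score:.2f}")
```

    With these invented numbers, a low-rate but high-harm, high-cost category ("diagnosis") outranks a high-rate, low-harm one ("handoffs"), which is exactly the kind of insight a rate-only ranking would miss.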

  18. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques such as Bayesian Networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) revising previous beliefs in the light of new information; and (3) making predictions even with incomplete data. In this work, we propose a comparison between Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between the purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, particularly where product design is involved: quantifying the risk associated with product characteristics can be of great use in informing product safety design regulation. Data from the European Registry of Foreign Bodies Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretability and accuracy in making predictions, even if simpler models such as logistic regression still performed well. © The Author(s) 2013.
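
    Capability (2), revising a prior belief as new evidence arrives, is the elementary operation underlying Bayesian Network inference and can be shown with a one-step Bayes update. All probabilities here are invented for illustration.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H | E) from a prior P(H) and the two likelihoods."""
    evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / evidence

# Invented scenario: 1% of recorded incidents with a product class lead to
# severe injury; observing a risky product feature (detected in 80% of
# severe cases, 10% of the rest) sharply raises that belief.
posterior = bayes_update(0.01, 0.8, 0.1)
print(round(posterior, 3))  # 0.075
```

    A full Bayesian Network chains many such updates across a graph of conditional dependencies, which is what lets it combine expert-elicited and data-derived probabilities in one model.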

  19. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, B.D.; Toole, A.P.; Callahan, B.G.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption is predicted for alkylphenols and aspirin, based on estimates of hydrophobicity and the fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.

  20. Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi-Directional Road Tunnel.

    PubMed

    Caliendo, Ciro; De Guglielmo, Maria Luisa

    2017-01-01

    A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), the percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP, the percentage of HGVs, and failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air, depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary before stating that an itinerary running completely in the open air should be used for the transport of dangerous products if the latter passes through a populated area. The QRA may help decision makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs. © 2016 Society for Risk Analysis.
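
    An F/N (societal risk) curve of the kind reported plots, for each consequence level N, the cumulative annual frequency F of accidents causing N or more fatalities. The scenario frequencies and consequences below are invented, not the paper's values.

```python
# (annual frequency, expected fatalities) for hypothetical tunnel scenarios
scenarios = [
    (1e-3, 2),   # small fire
    (5e-4, 10),  # large fire
    (1e-5, 50),  # explosion / toxic release
]

def fn_curve(scenarios):
    """Return (N, F) pairs: F = cumulative frequency of N or more fatalities."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

for n, f in fn_curve(scenarios):
    print(n, f)
```

    Comparing such curves for the tunnel and the open-air route, scenario by scenario, is what allows one alignment to dominate for fires while the other dominates for explosions and toxic releases.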

  1. EVALUATING TOOLS AND MODELS USED FOR QUANTITATIVE EXTRAPOLATION OF IN VITRO TO IN VIVO DATA FOR NEUROTOXICANTS*

    EPA Science Inventory

    There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...

  2. Quantitative metrics for evaluating the phased roll-out of clinical information systems.

    PubMed

    Wong, David; Wu, Nicolas; Watkinson, Peter

    2017-09-01

    We introduce a novel quantitative approach for evaluating the order of roll-out during phased introduction of clinical information systems. Such roll-outs are associated with unavoidable risk due to patients transferring between clinical areas using both the old and new systems. We proposed a simple graphical model of patient flow through a hospital. Using a simple instance of the model, we showed how a roll-out order can be generated by minimising the flow of patients from the new system to the old system. The model was applied to admission and discharge data acquired from 37,080 patient journeys at the Churchill Hospital, Oxford, between April 2013 and April 2014. The resulting orders were evaluated empirically and found to be acceptable. The development of data-driven approaches to clinical information system roll-out provides insights that may not necessarily be ascertained through clinical judgment alone. Such methods could make a significant contribution to the smooth running of an organisation during the roll-out of a potentially disruptive technology. Unlike previous approaches, which are based on clinical opinion, the approach described here quantitatively assesses the appropriateness of competing roll-out strategies. The data-driven approach was shown to produce strategies that matched clinical intuition and provides a flexible framework that may be used to plan and monitor clinical information system roll-out. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
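
    The core idea, choosing the roll-out order that minimises patient flow from the new system back to the old, can be sketched as a search over orderings of a small transfer matrix. The ward names and flow figures are hypothetical, and an exhaustive search stands in for whatever optimisation the paper actually used.

```python
import itertools

# flow[(a, b)] = patients/year transferring from ward a to ward b (invented)
flow = {
    ("ED", "Surgery"): 50, ("ED", "Medicine"): 70,
    ("Surgery", "Medicine"): 10, ("Medicine", "Surgery"): 5,
}
wards = ["ED", "Surgery", "Medicine"]

def new_to_old_flow(order):
    # If ward a migrates before ward b, transfers a -> b cross from the
    # new system back to the old one during the phased roll-out.
    pos = {w: i for i, w in enumerate(order)}
    return sum(f for (a, b), f in flow.items() if pos[a] < pos[b])

best = min(itertools.permutations(wards), key=new_to_old_flow)
print(best, new_to_old_flow(best))
```

    Migrating "downstream" (discharge-side) wards first minimises backward transfers here, matching the clinical intuition that admission-side areas such as the emergency department should move last.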

  3. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help society deal actively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude, and the probability of occurrence can be calculated using the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested, and the method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase (88.3%), the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, giving a risk of 2.2%. For the maximum precipitation decrease (35.2%), the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, giving a risk of 7.9%. For the comprehensive impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
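
    The paper's definition, risk as the product of the degree of loss and its probability of occurrence, reproduces the reported figures directly (inputs below are the percentages quoted in the abstract; the temperature case rounds to 2.3% here against a reported 2.2%, presumably because the published inputs are themselves rounded):

```python
def risk(loss_pct, probability_pct):
    """Risk (%) = degree of loss (%) x probability of occurrence (%)."""
    return loss_pct * probability_pct / 100.0

print(round(risk(14.1, 56.1), 1))  # 7.9 (precipitation impact)
print(round(risk(17.6, 53.4), 1))  # 9.4 (combined impact)
```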

  4. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which alternative is best for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
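
    A minimal sketch of a hierarchical weighted average: leaf criteria are scored per alternative, and scores roll up through weighted levels to a single figure of merit. The criteria, weights, and scores below are invented, not taken from the report.

```python
# Two top-level criteria, each with weighted sub-criteria (weights sum to 1).
criteria = {
    "performance": (0.6, {"speed": 0.7, "capacity": 0.3}),
    "cost":        (0.4, {"purchase": 0.5, "maintenance": 0.5}),
}

def hierarchical_weighted_average(scores):
    """Roll leaf scores (0-10) up through the weighted hierarchy."""
    return sum(top_w * sum(w * scores[c] for c, w in subs.items())
               for top_w, subs in criteria.values())

alt_a = {"speed": 8, "capacity": 6, "purchase": 5, "maintenance": 7}
alt_b = {"speed": 6, "capacity": 9, "purchase": 8, "maintenance": 6}
print(round(hierarchical_weighted_average(alt_a), 2))  # 6.84
print(round(hierarchical_weighted_average(alt_b), 2))  # 6.94
```

    Because the weights are explicit, the rationale behind the ranking is documented and can be challenged criterion by criterion, which is exactly the transparency benefit the abstract describes.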

  5. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-01-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recent safety incidents in which pharmaceutical excipients posed serious threats to patients highlight the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed, based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection use is a higher-risk ingredient in the supply chain than that for oral use; thus, measures should be taken to control and minimize those risks to achieve safe application in the clinic.
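
    The AHP-fuzzy combination can be sketched as follows: AHP supplies the criterion weight vector W, and fuzzy comprehensive evaluation aggregates a membership matrix R (rows = criteria, columns = risk grades) into a grade vector B = W · R. The criteria, weights, and memberships below are invented, not the paper's values.

```python
W = [0.5, 0.3, 0.2]          # AHP weights: supplier, transport, quality control
R = [                        # membership of each criterion in (low, medium, high)
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
    [0.4, 0.4, 0.2],
]

# Fuzzy comprehensive evaluation: B = W . R (weighted-sum operator).
B = [sum(W[i] * R[i][j] for i in range(len(W))) for j in range(3)]
grade = ("low", "medium", "high")[B.index(max(B))]
print([round(b, 2) for b in B], grade)  # [0.21, 0.42, 0.37] medium
```

    The final grade follows the maximum-membership principle; a supply chain for injectable use would typically carry stricter memberships in the "high" column, shifting the verdict upward.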
  7. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  8. Using quantitative risk information in decisions about statins: a qualitative study in a community setting.

    PubMed

    Polak, Louisa; Green, Judith

    2015-04-01

    A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. To understand the role of quantitative risk information in patients' accounts of decisions about taking statins. This was a qualitative study, with participants recruited and interviewed in community settings. Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as 'necessary' either to treat test results, or because of personalised, unequivocal advice from a doctor. This study's findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. © British Journal of General Practice 2015.

  9. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  10. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  11. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard × exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  13. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
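
    A hedged sketch of the frequency/severity pattern described: annual claim counts drawn from a Poisson distribution, claim sizes from a lognormal, with Monte Carlo simulation yielding the expected loss and tail percentiles. The distributional choices and all parameters are assumptions for illustration, not those fitted in the paper.

```python
import math
import random
import statistics

random.seed(42)

def sample_poisson(lam):
    # Knuth's multiplication algorithm; adequate for small lambda.
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def simulate_annual_losses(lam=25, mu=9.0, sigma=1.2, n_years=5000):
    """Aggregate loss per simulated year: a Poisson count of lognormal claims."""
    return [sum(random.lognormvariate(mu, sigma)
                for _ in range(sample_poisson(lam)))
            for _ in range(n_years)]

losses = sorted(simulate_annual_losses())
expected = statistics.mean(losses)       # expected annual loss
p95 = losses[int(0.95 * len(losses))]    # 95th percentile (unexpected loss)
print(round(expected), round(p95))
```

    The gap between the expected value and a high percentile is what the abstract calls unexpected loss; stratifying the fit by department, as the paper does with a Bayesian hierarchical model, refines both figures per unit.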

  14. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  15. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
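
    The distribution-fitting-plus-bootstrap pattern described can be sketched as follows; the storage-time responses are invented, and a simple mean stands in for the survey's actual fitted distributions.

```python
import random
import statistics

random.seed(0)

# Invented survey responses: reported refrigerator storage time (days).
storage_days = [1, 1, 2, 2, 2, 3, 3, 4, 5, 7, 8, 10]

def bootstrap_statistic(data, stat, n_boot=2000):
    """Resample with replacement to describe uncertainty in a fitted statistic."""
    return sorted(stat(random.choices(data, k=len(data))) for _ in range(n_boot))

boot_means = bootstrap_statistic(storage_days, statistics.mean)
ci_95 = (boot_means[49], boot_means[1949])  # ~95% percentile interval
print(round(statistics.mean(storage_days), 2), ci_95)
```

    In a QMRA the fitted distribution itself (not just its mean) feeds the exposure model, so the same resampling loop would be run over the distribution's parameters.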

  16. Comparing listeriosis risks in at-risk populations using a user-friendly quantitative microbial risk assessment tool and epidemiological data.

    PubMed

    Falk, L E; Fader, K A; Cui, D S; Totton, S C; Fazil, A M; Lammerding, A M; Smith, B A

    2016-10-01

    Although infection by the pathogenic bacterium Listeria monocytogenes is relatively rare, consequences can be severe, with a high case-fatality rate in vulnerable populations. A quantitative, probabilistic risk assessment tool was developed to compare estimates of the number of invasive listeriosis cases in vulnerable Canadian subpopulations given consumption of contaminated ready-to-eat delicatessen meats and hot dogs, under various user-defined scenarios. The model incorporates variability and uncertainty through Monte Carlo simulation. Processes considered within the model include cross-contamination, growth, risk factor prevalence, subpopulation susceptibilities, and thermal inactivation. Hypothetical contamination events were simulated. Results demonstrated varying risk depending on the consumer risk factors and implicated product (turkey delicatessen meat without growth inhibitors ranked highest for this scenario). The majority (80%) of listeriosis cases were predicted in at-risk subpopulations comprising only 20% of the total Canadian population, with the greatest number of predicted cases in the subpopulation with dialysis and/or liver disease. This tool can be used to simulate conditions and outcomes under different scenarios, such as a contamination event and/or outbreak, to inform public health interventions.

  17. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of a lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  18. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  19. Quantitative assessment of the risk of rabies entering Japan through the importation of dogs and cats from the USA.

    PubMed

    Kamakawa, H; Koiwai, M; Satomura, S; Eto, M; Sugiura, K

    2009-08-01

Up to October 2004, dogs and cats imported into Japan were subjected to a quarantine regimen which consisted of vaccination and a 30- to 365-day waiting period in the country of origin and a 14-day quarantine period upon arrival in Japan. In November 2004, this regimen was replaced by a new one, consisting of vaccination, antibody level titration and a 180-day waiting period in the country of origin. To evaluate the effect of this policy change, a quantitative risk assessment was undertaken: the risk of rabies entering Japan through the importation of dogs and cats from the USA under the old and new regimens was quantitatively assessed and compared. Under the new regimen, rabies is expected to enter Japan once every 4932 years (90% confidence interval 1812-13 412 years) through the importation of dogs and cats from the USA. Under the old regimen, rabies would enter Japan once every 70 years (39-205 years), 83 years (45-267 years) or 190 years (104-609 years), assuming that the animal departs the country of origin 30 days, 180 days or 365 days after vaccination, respectively. This indicates that the policy change reduced the risk to between 1/25 and 1/70 of its previous level.
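The "once every N years" figures quoted in assessments like this one are the reciprocal of the annual probability of at least one introduction. A hedged sketch with illustrative inputs (the per-animal probability and import volume below are invented, not the paper's values):

```python
def years_between_introductions(p_animal, imports_per_year):
    """Expected interval between introductions: reciprocal of the annual
    probability of at least one release event (binomial approximation)."""
    p_year = 1.0 - (1.0 - p_animal) ** imports_per_year
    return 1.0 / p_year

# Illustrative inputs: p = 2e-8 per imported animal, 10,000 imports per year.
print(years_between_introductions(2e-8, 10_000))  # roughly 5000 years
```

In the paper's probabilistic version, p_animal is itself a distribution reflecting rabies prevalence, vaccination failure and incubation, which is what produces the reported confidence intervals.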

  20. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581

  1. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and along different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might thus allow a quantitative approach to the planning and follow-up of maxillo-facial surgery in OAVS patients.

  2. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
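The idea of letting storage times emerge from queueing rather than sampling them independently can be sketched with a toy discrete-event model (all parameters below are invented for illustration; this is not the authors' model):

```python
# Toy FIFO retail shelf: a case arrives every few days, daily demand is
# random, and each unit's storage time is the gap between its arrival day
# and the day it is sold.
import random
from collections import deque

random.seed(1)
shelf = deque()          # arrival day of each unit, oldest first (FIFO)
storage_times = []

for day in range(365):
    if day % 3 == 0:                    # replenish a 12-unit case every 3 days
        shelf.extend([day] * 12)
    demand = random.randint(2, 6)       # units sold this day
    for _ in range(min(demand, len(shelf))):
        arrived = shelf.popleft()
        storage_times.append(day - arrived)

print(max(storage_times), sum(storage_times) / len(storage_times))
```

Because each unit's storage time depends on when its case arrived and how fast the stock ahead of it sold, successive storage times are correlated; this is exactly the dependence, and the resulting distribution tail, that independent sampling misses.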

  3. Quantitative assessment of human health risk posed by polycyclic aromatic hydrocarbons in urban road dust.

    PubMed

    Ma, Yukun; Liu, An; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2017-01-01

    Among the numerous pollutants present in urban road dust, polycyclic aromatic hydrocarbons (PAHs) are among the most toxic chemical pollutants and can pose cancer risk to humans. The primary aim of the study was to develop a quantitative model to assess the cancer risk from PAHs in urban road dust based on traffic and land use factors and thereby to characterise the risk posed by PAHs in fine (<150μm) and coarse (>150μm) particles. The risk posed by PAHs was quantified as incremental lifetime cancer risk (ILCR), which was modelled as a function of traffic volume and percentages of different urban land uses. The study outcomes highlighted the fact that cancer risk from PAHs in urban road dust is primarily influenced by PAHs associated with fine solids. Heavy PAHs with 5 to 6 benzene rings, especially dibenzo[a,h]anthracene (D[a]A) and benzo[a]pyrene (B[a]P) in the mixture contribute most to the risk. The quantitative model developed based on traffic and land use factors will contribute to informed decision making in relation to the management of risk posed by PAHs in urban road dust. Copyright © 2016 Elsevier B.V. All rights reserved.
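The ILCR metric named in this record follows the standard USEPA-style form ILCR = (C x IR x EF x ED x CSF) / (BW x AT); the paper's regression of ILCR on traffic volume and land use is not reproduced here. A generic ingestion-route sketch with illustrative parameter values:

```python
def ilcr(c, ir, ef, ed, csf, bw, at):
    """Incremental lifetime cancer risk = (C * IR * EF * ED * CSF) / (BW * AT).
    C: concentration (mg/kg), IR: intake rate (kg/day), EF: exposure frequency
    (days/yr), ED: exposure duration (yr), CSF: cancer slope factor
    ((mg/kg-day)^-1), BW: body weight (kg), AT: averaging time (days)."""
    return (c * ir * ef * ed * csf) / (bw * at)

# Illustrative values only: BaP-equivalent concentration 0.5 mg/kg in dust,
# 50 mg/day dust ingestion (5e-5 kg/day), 350 days/yr for 30 years,
# oral slope factor 7.3 (mg/kg-day)^-1, 70 kg adult, 70-year averaging time.
risk = ilcr(0.5, 5e-5, 350, 30, 7.3, 70, 70 * 365)
print(f"{risk:.2e}")  # prints 1.07e-06
```

Values near or above the conventional 1e-6 screening level are what studies like this one flag as a potential concern.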

  4. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  5. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  6. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    PubMed

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

This work reports a quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils marketed in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in oil samples was 54.37 μg kg(-1). Low molecular weight PAH compounds were the predominant contaminants. In particular, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg(-1), which is below the limits set by the European Union and China. A preliminary human health risk assessment for PAHs was carried out using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10(-6), indicating a potentially high carcinogenic risk for populations exposed through the diet. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Estimating the incremental net health benefit of requirements for cardiovascular risk evaluation for diabetes therapies.

    PubMed

    Chawla, Anita J; Mytelka, Daniel S; McBride, Stephan D; Nellesen, Dave; Elkins, Benjamin R; Ball, Daniel E; Kalsekar, Anupama; Towse, Adrian; Garrison, Louis P

    2014-03-01

To evaluate the advantages and disadvantages of pre-approval requirements for safety data to detect cardiovascular (CV) risk contained in the December 2008 U.S. Food and Drug Administration (FDA) guidance for developing type 2 diabetes drugs compared with the February 2008 FDA draft guidance from the perspective of diabetes population health. We applied the incremental net health benefit (INHB) framework to quantify the benefits and risks of investigational diabetes drugs using a common survival metric (life-years [LYs]). We constructed a decision analytic model for clinical program development consistent with the requirements of each guidance and simulated diabetes drugs, some of which had elevated CV risk. Assuming constant research budgets, we estimate the impact of increased trial size on drugs investigated. We aggregate treatment benefit and CV risks for each approved drug over a 35-year horizon under each guidance. The quantitative analysis suggests that the December 2008 guidance adversely impacts diabetes population health. INHB was -1.80 million LYs, attributable to delayed access to diabetes therapies (-0.18 million LYs) and fewer drugs (-1.64 million LYs), but partially offset by reduced CV risk exposure (0.02 million LYs). Results were robust in sensitivity analyses. The health outcomes impact of all potential benefits and risks should be evaluated in a common survival measure, including health gain from avoided adverse events, lost health benefits from delayed or forgone efficacious products, and the impact of alternative policy approaches. Quantitative analysis of the December 2008 FDA guidance for diabetes therapies indicates that a negative impact on patient health will result. Copyright © 2014 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd.

  8. Quantitative Risk Mapping of Urban Gas Pipeline Networks Using GIS

    NASA Astrophysics Data System (ADS)

    Azari, P.; Karimi, M.

    2017-09-01

Natural gas is considered an important source of energy in the world. With the increasing growth of urbanization, urban gas pipelines, which transmit natural gas from transmission pipelines to consumers, are becoming a dense network. The increase in the density of urban pipelines raises the probability of serious accidents in urban areas. These accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in urban planning and management of large cities in order to decrease the probability of accidents and to control them. Therefore, it is important to assess risk values and determine their locations on an urban map using an appropriate method. In previous risk analyses of urban natural gas pipeline networks, pipelines have always been considered one by one, and their density in the urban area has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, where the EGIG historical data are used, and fatal length calculation, which involves calculation of gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is an individual risk, shown as a risk map.
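The record's decomposition of individual risk at a grid point into failure rate times fatal length, summed over the several pipelines within reach, can be sketched as follows (the rates and lengths below are illustrative, not EGIG values):

```python
def individual_risk(pipelines):
    """Individual risk at one grid point: sum over nearby pipelines of
    failure rate (per km-year) times fatal length (km), i.e. the pipeline
    stretch whose failure would be lethal at the point."""
    return sum(p["failure_rate_per_km_yr"] * p["fatal_length_km"] for p in pipelines)

# Illustrative contributions from two pipelines affecting one grid point.
grid_point = [
    {"failure_rate_per_km_yr": 3.5e-4, "fatal_length_km": 0.12},  # distribution main
    {"failure_rate_per_km_yr": 3.5e-4, "fatal_length_km": 0.05},  # service line
]
print(f"{individual_risk(grid_point):.2e}")  # fatalities per year at this point
```

Repeating this calculation over every cell of a GIS grid, with fatal lengths derived from jet fire, fireball and explosion consequence models, yields the risk map the study produces.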

  9. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  10. Skin sensitisation quantitative risk assessment (QRA) based on aggregate dermal exposure to methylisothiazolinone in personal care and household cleaning products.

    PubMed

    Ezendam, J; Bokkers, B G H; Bil, W; Delmaar, J E

    2018-02-01

Contact allergy to preservatives is an important public health problem. Ideally, new substances should be evaluated for the risk of skin sensitisation before market entry, for example by using a quantitative risk assessment (QRA) as developed for fragrances. As a proof of concept, this QRA was applied to the preservative methylisothiazolinone (MI), a common cause of contact allergy. MI is used in different consumer products, including personal care products (PCPs) and household cleaning products (HCPs). Aggregate exposure to MI in PCPs and HCPs was therefore assessed with the Probabilistic Aggregated Consumer Exposure Model (PACEM). Two exposure scenarios were evaluated: scenario 1 calculated aggregate exposure based on actual MI product concentrations before the restricted use in PCPs, and scenario 2 calculated aggregate exposure using the restrictions for MI in PCPs. The QRA for MI showed that in scenarios 1 and 2, the proportion of the population at risk of skin sensitisation is 0.7% and 0.5%, respectively. The restricted use of MI in PCPs therefore does not seem very effective in lowering the risk of skin sensitisation. To conclude, it is important to take aggregate exposure from the most important consumer products into consideration in the risk assessment. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Quantitative Evaluation of Performance during Robot-assisted Treatment.

    PubMed

    Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G

    2016-01-01

This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with the Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.

  12. A new method to evaluate image quality of CBCT images quantitatively without observers

    PubMed Central

    Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori

    2017-01-01

    Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343

  13. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  14. A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.

    PubMed

    Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui

    2017-10-01

Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI) contaminated sites; ii) crude oil contaminated seawater collected after the Jiaozhou Bay oil spill, which occurred in 2013. The chromium(VI) contaminated soils were pretreated by water extraction and exposed directly to the bioreporter in two phases: aqueous soil extract (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil-particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity around the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was expressed as a mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples. Copyright © 2017. Published by Elsevier Ltd.

  15. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  16. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve derived specifically for our case study area were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/msq) with building area and number of floors

  17. A Systematic Quantitative-Qualitative Model: How To Evaluate Professional Services

    ERIC Educational Resources Information Center

    Yoda, Koji

    1973-01-01

    The proposed evaluation model provides for the assignment of relative weights to each criterion, and establishes a weighting system for calculating a quantitative-qualitative raw score for each service activity of a faculty member being reviewed. (Author)

  18. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  19. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions to what is known regarding the 2006 E. coli O157:H7 spinach outbreak, in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of the best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of lag time in E. coli O157:H7 growth models for leafy greens, and validation of the importance of cross-contamination during the washing process.
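A Monte Carlo growth-and-exposure loop of the kind run in @RISK can be sketched in plain Python. The distributions, growth rule and dose threshold below are invented for illustration; only the -1 log CFU/g starting level and the up-to-1-log-per-day growth cap come from the record:

```python
import random

random.seed(42)
SERVING_G = 85.0        # assumed serving size, grams
trials = 10_000
high_dose = 0

for _ in range(trials):
    log_conc = -1.0                     # log CFU/g at harvest (from the record)
    days = random.uniform(1, 5)         # illustrative retail storage time
    temp = random.uniform(4, 25)        # illustrative storage temperature, deg C
    rate = max(0.0, (temp - 5) / 20.0)  # toy growth rate, 0..1 log/day above 5 C
    log_conc += min(rate, 1.0) * days   # capped at 1 log CFU/day
    cfu_per_serving = (10 ** log_conc) * SERVING_G
    if cfu_per_serving > 1000:          # arbitrary dose threshold for counting
        high_dose += 1

print(high_dose / trials)  # fraction of contaminated servings above threshold
```

Because the risk sits in the distribution tail, the simulated fraction is sensitive to exactly the data gaps the record lists: storage times, time-temperature correlation, and lag time.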

  20. Towards a Quantitative Framework for Evaluating Vulnerability of Drinking Water Wells to Contamination from Unconventional Oil & Gas Development

    NASA Astrophysics Data System (ADS)

    Soriano, M., Jr.; Deziel, N. C.; Saiers, J. E.

    2017-12-01

    The rapid expansion of unconventional oil and gas (UO&G) production, made possible by advances in hydraulic fracturing (fracking), has triggered concerns over risks this extraction poses to water resources and public health. Concerns are particularly acute within communities that host UO&G development and rely heavily on shallow aquifers as sources of drinking water. This research aims to develop a quantitative framework to evaluate the vulnerability of drinking water wells to contamination from UO&G activities. The concept of well vulnerability is explored through application of backwards travel time probability modeling to estimate the likelihood that capture zones of drinking water wells circumscribe source locations of UO&G contamination. Sources of UO&G contamination considered in this analysis include gas well pads and documented sites of UO&G wastewater and chemical spills. The modeling approach is illustrated for a portion of Susquehanna County, Pennsylvania, where more than one thousand shale gas wells have been completed since 2005. Data from a network of eight multi-level groundwater monitoring wells installed in the study site in 2015 are used to evaluate the model. The well vulnerability concept is proposed as a physically based quantitative tool for policy-makers dealing with the management of contamination risks of drinking water wells. In particular, the model can be used to identify adequate setback distances of UO&G activities from drinking water wells and other critical receptors.

  1. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the end-of-life (EOL) stage of its lifecycle. The objectivity and accuracy of the evaluation are the keys to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 type CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  2. Estimating the incremental net health benefit of requirements for cardiovascular risk evaluation for diabetes therapies

    PubMed Central

    Chawla, Anita J; Mytelka, Daniel S; McBride, Stephan D; Nellesen, Dave; Elkins, Benjamin R; Ball, Daniel E; Kalsekar, Anupama; Towse, Adrian; Garrison, Louis P

    2014-01-01

    Purpose To evaluate the advantages and disadvantages of pre-approval requirements for safety data to detect cardiovascular (CV) risk contained in the December 2008 U.S. Food and Drug Administration (FDA) guidance for developing type 2 diabetes drugs compared with the February 2008 FDA draft guidance from the perspective of diabetes population health. Methods We applied the incremental net health benefit (INHB) framework to quantify the benefits and risks of investigational diabetes drugs using a common survival metric (life-years [LYs]). We constructed a decision analytic model for clinical program development consistent with the requirements of each guidance and simulated diabetes drugs, some of which had elevated CV risk. Assuming constant research budgets, we estimated the impact of increased trial size on the number of drugs investigated. We aggregated treatment benefit and CV risks for each approved drug over a 35-year horizon under each guidance. Results The quantitative analysis suggests that the December 2008 guidance adversely impacts diabetes population health. INHB was −1.80 million LYs, attributable to delayed access to diabetes therapies (−0.18 million LYs) and fewer drugs (−1.64 million LYs), but partially offset by reduced CV risk exposure (0.02 million LYs). Results were robust in sensitivity analyses. Conclusion The health outcomes impact of all potential benefits and risks should be evaluated in a common survival measure, including health gain from avoided adverse events, lost health benefits from delayed or forgone efficacious products, and the impact of alternative policy approaches. Quantitative analysis of the December 2008 FDA guidance for diabetes therapies indicates that a negative impact on patient health will result. © 2014 The Authors. Pharmacoepidemiology and Drug Safety published by John Wiley & Sons, Ltd. PMID:24892175
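    The reported INHB decomposition is a simple additive balance that can be checked directly from the figures in the abstract:

```python
# Components of incremental net health benefit reported in the abstract,
# in millions of life-years (LYs)
delayed_access = -0.18
fewer_drugs_developed = -1.64
reduced_cv_risk_exposure = 0.02

inhb = delayed_access + fewer_drugs_developed + reduced_cv_risk_exposure
print(f"INHB = {inhb:.2f} million LYs")  # INHB = -1.80 million LYs
```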

  3. A Framework for Quantitative Evaluation of Care Coordination Effectiveness

    ERIC Educational Resources Information Center

    Liu, Wei

    2017-01-01

    The U.S. healthcare system lacks incentives and quantitative evaluation tools to assess coordination in a patient's care transition process. This is needed because poor care coordination has been identified by many studies as one of the major root causes for the U.S. health system's inefficiency, for poor outcomes, and for high cost. Despite…

  4. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen at a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposed vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential cost of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum, over all passing vehicles, of the crash probability multiplied by the corresponding crash consequences. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
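    The ranking criterion can be sketched as below. All site figures are hypothetical; in the study, crash probabilities come from an empirical Bayesian estimate and indirect losses are monetized delays from the deterministic queuing diagram method.

```python
def potential_crash_cost(vehicles_per_day, p_crash_per_vehicle,
                         direct_loss, indirect_loss):
    """Expected daily crash cost at a site:
    exposure x crash probability x (direct + indirect monetized losses)."""
    return vehicles_per_day * p_crash_per_vehicle * (direct_loss + indirect_loss)

# Hypothetical sites (all figures illustrative, not from the study)
sites = {
    "A": potential_crash_cost(50_000, 1e-6, 20_000, 5_000),
    "B": potential_crash_cost(80_000, 5e-7, 20_000, 30_000),
    "C": potential_crash_cost(30_000, 2e-6, 15_000, 2_000),
}
hotspots = sorted(sites, key=sites.get, reverse=True)
print(hotspots)  # ['B', 'A', 'C']
```

    Note that site B ranks first despite its lower crash probability, because its indirect (delay) losses dominate; this is the point of monetizing both loss types.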

  5. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.
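    A probabilistic allergen risk assessment of this kind compares a simulated ingested dose against a distribution of individual threshold doses. The sketch below uses hypothetical lognormal parameters and serving size, not the study's fitted distributions; only the 8.6% contamination rate is taken from the abstract.

```python
import random

random.seed(7)  # reproducible sketch

def reaction_risk(n=50_000, p_contaminated=0.086, serving_g=40.0):
    """Toy probabilistic sketch: a reaction occurs when the ingested peanut
    dose exceeds an individual's threshold. Distribution parameters are
    hypothetical, not the study's fitted values."""
    reactions = 0
    for _ in range(n):
        if random.random() >= p_contaminated:
            continue  # this serving carries no detectable peanut
        residue_ppm = random.lognormvariate(1.0, 1.0)   # mg peanut per kg food
        dose_mg = residue_ppm * serving_g / 1000.0       # mg whole peanut ingested
        threshold_mg = random.lognormvariate(4.0, 1.5)   # individual threshold dose
        if dose_mg > threshold_mg:
            reactions += 1
    return reactions / n

risk = reaction_risk()
print(risk)
```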

  6. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed, and an attempt is made to apply the reasoning behind these approaches to the issue of the acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  7. Quantitative Microbial Risk Assessment of Pharmaceutical Products.

    PubMed

    Eissa, Mostafa Essam

    2017-01-01

    Monitoring of microbiological quality in the pharmaceutical industry is an important criterion that is required to justify safe product release to the drug market. Good manufacturing practice and efficient control on bioburden level of product components are critical parameters that influence the microbiological cleanliness of medicinal products. However, because microbial dispersion through the samples follows a Poisson distribution, the rate of detection of microbiologically defective samples lambda (λ) decreases when the number of defective units per batch decreases. When integrating a dose-response model of infection (Pinf) of a specific objectionable microbe with a contamination module, the overall probability of infection from a single batch of pharmaceutical product can be estimated. The combination of Pinf with the detectability chance of the test (Pdet) will yield a value that could be used as a quantitative measure of the possibility of passing contaminated batch units of product with a certain load of a specific pathogen and infecting the final consumer without being detected in the firm. The simulation study can be used to assess the risk of contamination and infection from objectionable microorganisms for sterile and non-sterile products. LAY ABSTRACT: Microbial contamination of pharmaceutical products is a global problem that may lead to infection and possibly death. While reputable pharmaceutical companies strive to deliver microbiologically safe products, it would be helpful to apply an assessment system for the current risk associated with pharmaceutical batches delivered to the drug market. The current methodology may be helpful also in determining the degree of improvement or deterioration on the batch processing flow until reaching the final consumer. 
Moreover, the present system is flexible and can be applied to other industries such as food, cosmetics, or medical devices manufacturing and processing fields to assess the microbiological risk of
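    The detection logic in this abstract can be illustrated with a Poisson sampling sketch. The functional forms and numbers below are assumptions; in practice Pinf would come from a pathogen-specific dose-response model rather than a fixed constant.

```python
import math

def p_detect(n_units_tested, defect_rate):
    """Poisson sketch: chance a QC sample of n units contains at least one
    microbiologically defective unit (lambda = n x defect rate)."""
    return 1.0 - math.exp(-n_units_tested * defect_rate)

def p_escape_and_infect(n_units_tested, defect_rate, p_inf):
    """Chance a contaminated batch passes QC undetected AND a consumer is infected."""
    return (1.0 - p_detect(n_units_tested, defect_rate)) * p_inf

# Hypothetical: 20 units tested, 1% defective units, 5% infection probability per exposure
print(round(p_escape_and_infect(20, 0.01, 0.05), 4))  # 0.0409
```

    As the defect rate falls, Pdet falls with it, which is the abstract's point: rarely contaminated batches are the hardest to catch by sampling alone.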

  8. User embracement with risk classification in an emergency care unit: an evaluative study.

    PubMed

    Hermida, Patrícia Madalena Vieira; Nascimento, Eliane Regina Pereira do; Echevarría-Guanilo, Maria Elena; Brüggemann, Odaléa Maria; Malfussi, Luciana Bihain Hagemann de

    2018-01-01

    Objective To describe the evaluation of the Structure, Process and Outcome of User Embracement with Risk Classification of an Emergency Care Unit from the perspective of physicians and nurses. Method An evaluative, descriptive, quantitative study developed in Santa Catarina. Data were collected using a validated and adapted instrument consisting of 21 items distributed in the dimensions of Structure (facilities), Process (activities and relationships in providing care) and Outcome (care effects). In the analysis, descriptive statistics and the Mean Ranking and Mean Score calculations were applied. Results The sample consisted of 37 participants. Of the 21 evaluated items, 11 (52.4%) had a Mean Ranking between 3 and 4, and none of them reached the maximum ranking (5 points). "Prioritization of severe cases" and "Primary care according to the severity of the case" reached the highest Mean Ranking (4.5), while "Flowchart discussion" had the lowest (2.1). The dimensions of Structure, Process and Outcome reached mean scores of 23.9, 21.9 and 25.5, respectively, indicating a Precarious evaluation (17.5 to 26.1 points). Conclusion User Embracement with Risk Classification is precarious, especially regarding the Process, which obtained a lower satisfaction level from the participants.

  9. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    PubMed

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm(3) (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
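    The two-step point counting reduces to a Cavalieri-type volume estimate plus per-category point fractions. The sketch below uses hypothetical counts, grid spacing, and section spacing; a(p) denotes the area associated with each grid point.

```python
def cavalieri_volume(points_per_section, area_per_point_mm2, section_spacing_mm):
    """Cavalieri estimator: V = (total points) x a(p) x section spacing."""
    return sum(points_per_section) * area_per_point_mm2 * section_spacing_mm

def tissue_fractions(point_counts):
    """Step 2 of the two-step point counting: fraction of points per category."""
    total = sum(point_counts.values())
    return {tissue: n / total for tissue, n in point_counts.items()}

# Hypothetical counts from 8 parallel sections (not the paper's data)
volume_mm3 = cavalieri_volume([9, 14, 19, 21, 17, 11, 7, 4],
                              area_per_point_mm2=0.05, section_spacing_mm=1.0)
fractions = tissue_fractions({"hyaline": 60, "fibrocartilage": 25,
                              "fibrous": 10, "bone": 5})
print(round(volume_mm3, 2), fractions["hyaline"])
```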

  10. Quantitative assessment of background parenchymal enhancement in breast magnetic resonance images predicts the risk of breast cancer.

    PubMed

    Hu, Xiaoxin; Jiang, Luan; Li, Qiang; Gu, Yajia

    2017-02-07

    The objective of this study was to evaluate the association between the quantitative assessment of background parenchymal enhancement rate (BPER) and breast cancer. From 14,033 consecutive patients who underwent breast MRI in our center, we randomly selected 101 normal controls. Then, we selected 101 women with benign breast lesions and 101 women with breast cancer who were matched for age and menstruation status. We evaluated BPER at early (2 minutes), medium (4 minutes) and late (6 minutes) enhanced time phases of breast MRI for quantitative assessment. Odds ratios (ORs) for risk of breast cancer were calculated using the receiver operating curve. The BPER increased in a time-dependent manner after enhancement in both premenopausal and postmenopausal women. Premenopausal women had higher BPER than postmenopausal women at early, medium and late enhanced phases. In the normal population, the OR for probability of breast cancer for premenopausal women with high BPER was 4.1 (95% CI: 1.7-9.7) and 4.6 (95% CI: 1.7-12.0) for postmenopausal women. The OR of breast cancer morbidity in premenopausal women with high BPER was 2.6 (95% CI: 1.1-6.4) and 2.8 (95% CI: 1.2-6.1) for postmenopausal women. The BPER was found to be a predictive factor of breast cancer morbidity. Different time phases should be used to assess BPER in premenopausal and postmenopausal women.

  11. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  12. A quantitative assessment of risks of heavy metal residues in laundered shop towels and their use by workers.

    PubMed

    Connor, Kevin; Magee, Brian

    2014-10-01

    This paper presents a risk assessment of exposure to metal residues in laundered shop towels by workers. The concentrations of 27 metals measured in a synthetic sweat leachate were used to estimate the releasable quantity of metals which could be transferred to workers' skin. Worker exposure was evaluated quantitatively with an exposure model that focused on towel-to-hand transfer and subsequent hand-to-food or -mouth transfers. The exposure model was based on conservative, but reasonable assumptions regarding towel use and default exposure factor values from the published literature or regulatory guidance. Transfer coefficients were derived from studies representative of the exposures to towel users. Contact frequencies were based on assumed high-end use of shop towels, but constrained by a theoretical maximum dermal loading. The risk estimates for workers developed for all metals were below applicable regulatory risk benchmarks. The risk assessment for lead utilized the Adult Lead Model and concluded that predicted lead intakes do not constitute a significant health hazard based on potential worker exposures. Uncertainties are discussed in relation to the overall confidence in the exposure estimates developed for each exposure pathway and the likelihood that the exposure model is under- or overestimating worker exposures and risk. Copyright © 2014 Elsevier Inc. All rights reserved.
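    The paper's towel-specific exposure model is not reproduced in the abstract, but assessments of this kind typically rest on the standard average-daily-dose and hazard-quotient forms. The sketch below uses those generic forms with hypothetical parameter values, not the study's transfer coefficients or contact frequencies.

```python
def average_daily_dose(conc_mg_per_kg, intake_kg_per_day, exposure_freq_days_per_yr,
                       exposure_duration_yr, body_weight_kg, averaging_time_days):
    """Generic ADD = (C x IR x EF x ED) / (BW x AT), in mg per kg-day."""
    return (conc_mg_per_kg * intake_kg_per_day * exposure_freq_days_per_yr *
            exposure_duration_yr) / (body_weight_kg * averaging_time_days)

def hazard_quotient(add_mg_per_kg_day, reference_dose_mg_per_kg_day):
    """HQ < 1 means the estimated dose is below the health-protective benchmark."""
    return add_mg_per_kg_day / reference_dose_mg_per_kg_day

# Hypothetical: transferable metal expressed as an ingestion-equivalent intake
add = average_daily_dose(conc_mg_per_kg=5.0, intake_kg_per_day=1e-5,
                         exposure_freq_days_per_yr=250, exposure_duration_yr=25,
                         body_weight_kg=70, averaging_time_days=25 * 365)
print(hazard_quotient(add, reference_dose_mg_per_kg_day=3e-4) < 1.0)
```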

  13. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools to evaluate the reliability of systems. Although the single failure mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates this source of error in the reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in aeroengines is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural

  14. Quantitative evaluation of the voice range profile in patients with voice disorder.

    PubMed

    Ikeda, Y; Masuda, T; Manako, H; Yamashita, H; Yamamoto, T; Komiyama, S

    1999-01-01

    In 1953, Calvet first displayed the fundamental frequency (pitch) and sound pressure level (intensity) of a voice on a two-dimensional plane and created a voice range profile. This profile has been used to evaluate clinically various vocal disorders, although such evaluations to date have been subjective without quantitative assessment. In the present study, a quantitative system was developed to evaluate the voice range profile utilizing a personal computer. The area of the voice range profile was defined as the voice volume. This volume was analyzed in 137 males and 175 females who were treated for various dysphonias at Kyushu University between 1984 and 1990. Ten normal subjects served as controls. The voice volume in cases with voice disorders significantly decreased irrespective of the disease and sex. Furthermore, cases having better improvement after treatment showed a tendency for the voice volume to increase. These findings illustrated the voice volume as a useful clinical test for evaluating voice control in cases with vocal disorders.

  15. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported combination therapy development of solifenacin and mirabegron for phase III clinical development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
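    In its simplest form, an MCDA clinical utility score is a weighted sum of normalized attribute scores. The sketch below uses hypothetical attributes, weights, and 0-1 scores; it is not the model or data from the paper.

```python
def clinical_utility(scores, weights):
    """MCDA weighted sum: attribute scores on a 0-1 scale combined with
    normalized importance weights."""
    total_w = sum(weights.values())
    return sum(scores[attr] * weights[attr] / total_w for attr in scores)

weights = {"efficacy": 5, "safety": 3, "tolerability": 2}  # hypothetical weights
arms = {  # hypothetical 0-1 attribute scores per treatment arm
    "solifenacin 5 mg": {"efficacy": 0.55, "safety": 0.90, "tolerability": 0.85},
    "mirabegron 50 mg": {"efficacy": 0.60, "safety": 0.85, "tolerability": 0.90},
    "combination 5+50": {"efficacy": 0.80, "safety": 0.80, "tolerability": 0.80},
}
best = max(arms, key=lambda a: clinical_utility(arms[a], weights))
print(best)
```

    The transparency the abstract refers to comes from making the weights explicit: changing the efficacy/safety trade-off changes the ranking in a way anyone can audit.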

  16. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  17. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
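    The pass criterion described above reduces to a log10 ratio of initial to surviving counts; a minimal sketch with hypothetical plate counts:

```python
import math

def log10_reduction(initial_cfu_per_ml, surviving_cfu_per_ml):
    """Bactericidal kill expressed as log10(N0 / N)."""
    return math.log10(initial_cfu_per_ml / surviving_cfu_per_ml)

def meets_en1276_criterion(initial, surviving, required=5.0):
    """Pass criterion used in the study: at least a 5 log10 reduction."""
    return log10_reduction(initial, surviving) >= required

# Hypothetical counts: a 10^7 CFU/mL challenge reduced to 50 CFU/mL within 5 min
print(meets_en1276_criterion(1e7, 50))   # True  (~5.3 log10 reduction)
print(meets_en1276_criterion(1e7, 1e3))  # False (4.0 log10 reduction)
```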

  18. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires implementation of quantitative measures for evaluation of the treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5° C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for an update time of 5 seconds both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3° C, while temperature uncertainty of 5° C leads to noticeable reduction in spatial accuracy and increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainty of 0° C and 1° C, while temperature uncertainties of 3° C and 5° C led to reduced spatial accuracy, increased potential damage to the rectal wall, and

  19. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 % with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
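    The reported performance figures follow from standard confusion-matrix definitions. The 2x2 counts below are hypothetical, chosen only to be consistent with the reported percentages for 305 subjects; they are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a binary screening test."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts consistent with the abstract's percentages (305 subjects)
m = diagnostic_metrics(tp=68, fp=8, tn=229, fn=0)
print({k: round(v * 100, 1) for k, v in m.items()})
```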

  20. Risk in Enterprise Cloud Computing: Re-Evaluated

    ERIC Educational Resources Information Center

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  1. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    PubMed

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations-the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) were in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with that from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs-the magnitude of which may vary across cultures-constitutes a mechanism by which background genetic liability

  2. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
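    The proposed exposure adjustment can be sketched as follows. The function names, the normalization of nectar sugar concentration against a reference crop, and all numeric values are hypothetical illustrations for this record, not values taken from the paper.

```python
def crop_attractiveness_factor(crop_sugar, reference_sugar):
    """Hypothetical attractiveness factor: the crop's nectar sugar
    concentration relative to a highly attractive reference crop,
    capped at 1.0 so the factor only ever reduces exposure."""
    return min(crop_sugar / reference_sugar, 1.0)


def tier1_risk_quotient(exposure, toxicity_endpoint, attractiveness=1.0):
    """Generic tier I risk quotient RQ = exposure / toxicity endpoint;
    the refinement proposed in the abstract scales the exposure term
    by the crop attractiveness factor."""
    return (exposure * attractiveness) / toxicity_endpoint


caf = crop_attractiveness_factor(0.3, 0.6)   # crop half as sugar-rich as reference
rq = tier1_risk_quotient(exposure=27.5, toxicity_endpoint=50.0, attractiveness=caf)
print(caf, round(rq, 4))  # 0.5 0.275
```

    An unattractive crop thus yields a proportionally lower RQ than the unadjusted tier I calculation would suggest.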

  3. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (Pinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed us to assess the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. © 2015 Blackwell Verlag GmbH.
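    A minimal Monte Carlo sketch of such a farm-to-fork chain is shown below. All distributions and parameter values (larval load per serving, die-off rate during storage, exponential dose-response coefficient) are illustrative assumptions, not the values fitted in the paper.

```python
import math
import random


def simulate_mean_risk(n_iter=20000, seed=42):
    """Monte Carlo sketch of a QMRA chain: larvae in a contaminated
    serving -> survival through storage -> exponential dose-response.
    Every distribution and constant here is an assumption for
    illustration only."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_iter):
        larvae = rng.lognormvariate(1.0, 1.5)        # larvae per serving (assumed)
        storage_days = rng.uniform(5.0, 60.0)        # storage before consumption
        surviving = larvae * math.exp(-0.05 * storage_days)  # assumed die-off rate
        total += 1.0 - math.exp(-0.01 * surviving)   # assumed dose-response parameter
    return total / n_iter


print(f"mean per-serving risk: {simulate_mean_risk():.3e}")
```

    Sensitivity coefficients such as the r values quoted in the abstract are then obtained by correlating each sampled input with the per-iteration risk.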

  4. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  5. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
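    The two-step point counting described above can be sketched numerically. The Cavalieri volume estimator and the volume-fraction calculation are standard stereology; the section spacing, grid constant, and counts below are invented for illustration.

```python
def cavalieri_volume(points_per_section, section_spacing_mm, area_per_point_mm2):
    """Step 1, Cavalieri estimator: V = t * a(p) * sum(P), with section
    spacing t, grid area associated with each point a(p), and point
    counts P hitting the repair tissue on each section."""
    return section_spacing_mm * area_per_point_mm2 * sum(points_per_section)


def tissue_fractions(category_counts):
    """Step 2: volume fraction of each tissue category = points assigned
    to that category / total points counted."""
    total = sum(category_counts.values())
    return {name: count / total for name, count in category_counts.items()}


volume = cavalieri_volume([50, 60, 55], section_spacing_mm=0.4, area_per_point_mm2=0.02)
fractions = tissue_fractions({"hyaline": 120, "fibrocartilage": 60, "fibrous": 20})
print(round(volume, 2), round(fractions["hyaline"], 2))  # 1.32 0.6
```

    Because sections and counting windows are sampled systematically, both estimators are design-unbiased regardless of defect shape.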

  6. Quantitative framework for prospective motion correction evaluation.

    PubMed

    Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2016-02-01

    To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI while accounting for motion variability between scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement in image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance when comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.

  7. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boquerón ...

  8. EVALUATING QUANTITATIVE FORMULAS FOR DOSE-RESPONSE ASSESSMENT OF CHEMICAL MIXTURES

    EPA Science Inventory

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment d...

  9. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. 
Comparable benefits

  10. Quantitative evaluation research of glare from automotive headlamps

    NASA Astrophysics Data System (ADS)

    Wang, Tiecheng; Qian, Rui; Cao, Ye; Gao, Mingqiu

    2018-01-01

    This study concerns the quantitative evaluation of glare from automotive headlamps. Current regulations specify only one point on the test screen for judging whether a driver can tolerate the light from the headlamps of an oncoming vehicle. To evaluate the practical effect of glare, we adopted a glare zone incorporating the probability distribution of the oncoming driver's eye position. Within this focus area, the glare level of a headlamp is represented by a weighted luminous flux. To determine the illuminance value most comfortable to human eyes at 50 m, we used test point B50L as the observation position and collected 1,000 subjective evaluation data points from 20 test subjects of different ages over two months. Based on these assessments, we calculated 0.60 lx as the recommended value for the standardized testing procedure at 25 m. We then derived 0.38 lm as the optimum value, and 0.25 / 1.20 lm as the limiting values under the regulations. We tested 40 sample vehicles of different classes to verify the sectional nonlinear quantitative evaluation method we designed, and analyzed the typical test results.

  11. Visual and Quantitative Assessment of Coronary Stenoses at Angiography Versus Fractional Flow Reserve: The Impact of Risk Factors.

    PubMed

    Adjedj, Julien; Xaplanteris, Panagiotis; Toth, Gabor; Ferrara, Angela; Pellicano, Mariano; Ciccarelli, Giovanni; Floré, Vincent; Barbato, Emanuele; De Bruyne, Bernard

    2017-07-01

    The correlation between angiographic assessment of coronary stenoses and fractional flow reserve (FFR) is weak. Whether and how risk factors impact the diagnostic accuracy of angiography is unknown. We sought to evaluate the diagnostic accuracy of angiography by visual estimate and by quantitative coronary angiography when compared with FFR and to evaluate the influence of risk factors (RF) on this accuracy. In 1382 coronary stenoses (1104 patients), percent diameter stenosis by visual estimation (DSVE) and by quantitative coronary angiography (DSQCA) was compared with FFR. Patients were divided into 4 subgroups according to the presence of RFs, and the relationship between DSVE, DSQCA, and FFR was analyzed. Overall, DSVE was significantly higher than DSQCA (P<0.0001); nonetheless, when examined by strata of DS, DSVE was significantly smaller than DSQCA in mild stenoses, although the reverse held true for severe stenoses. Compared with FFR, a large scatter was observed for both DSVE and DSQCA. When using a dichotomous FFR value of 0.80, the C statistic was significantly higher for DSVE than for DSQCA (0.712 versus 0.640, respectively; P<0.001). C statistics for DSVE decreased progressively as RFs accumulated (0.776 for ≤1 RF, 0.750 for 2 RFs, 0.713 for 3 RFs and 0.627 for ≥4 RFs; P=0.0053). In addition, in diabetics, the relationship between FFR and angiographic indices was particularly weak (C statistics: 0.524 for DSVE and 0.511 for DSQCA). Overall, DSVE has better diagnostic accuracy than DSQCA in predicting the functional significance of coronary stenosis. The predictive accuracy of angiography is moderate in patients with ≤1 RFs, but weakens as RFs accumulate, especially in diabetics. © 2017 American Heart Association, Inc.
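    The C statistic reported in such studies is the area under the ROC curve; for a dichotomous outcome it can be computed directly as a pairwise win fraction. The stenosis scores and labels below are invented for illustration, not the study's data.

```python
def c_statistic(scores, labels):
    """C statistic (ROC AUC) by pairwise comparison: the probability that
    a functionally significant lesion (label 1, e.g. FFR <= 0.80) receives
    a higher diameter-stenosis score than a non-significant lesion
    (label 0), with ties counted as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


# Illustrative data: visual %DS for six lesions and their FFR-based labels.
auc = c_statistic([90, 75, 70, 60, 50, 40], [1, 1, 0, 1, 0, 0])
print(round(auc, 3))  # 0.889
```

    A value of 0.5 corresponds to chance discrimination, which is why the C statistics near 0.51-0.52 reported for diabetics indicate that angiography carried almost no information about functional significance in that subgroup.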

  12. The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  13. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  14. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  15. 75 FR 25239 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ... human health assessment program that evaluates quantitative and qualitative risk information on effects... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...

  16. Methods for the field evaluation of quantitative G6PD diagnostics: a review.

    PubMed

    Ley, Benedikt; Bancone, Germana; von Seidlein, Lorenz; Thriemer, Kamala; Richards, Jack S; Domingo, Gonzalo J; Price, Ric N

    2017-09-11

    Individuals with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk of severe haemolysis following the administration of 8-aminoquinoline compounds. Primaquine is the only widely available 8-aminoquinoline for the radical cure of Plasmodium vivax. Tafenoquine is under development with the potential to simplify treatment regimens, but point-of-care (PoC) tests will be needed to provide quantitative measurement of G6PD activity prior to its administration. There is currently a lack of appropriate G6PD PoC tests, but a number of new tests are in development and are likely to enter the market in the coming years. As these are implemented, they will need to be validated in field studies. This article outlines the technical details for the field evaluation of novel quantitative G6PD diagnostics, such as sample handling, reference testing and statistical analysis. Field evaluation is based on the comparison of paired samples, including one sample tested by the new assay at point of care and one sample tested by the gold-standard reference method, UV spectrophotometry, in an established laboratory. Samples can be collected as capillary or venous blood; the existing literature suggests that potential differences between capillary and venous blood are unlikely to affect results substantially. The collection and storage of samples is critical to ensure preservation of enzyme activity; it is recommended that samples are stored at 4 °C and tested within 4 days of collection. Test results can be presented visually as a scatter plot, a Bland-Altman plot, and a histogram of the G6PD activity distribution of the study population. Calculating the adjusted male median allows results to be categorized according to G6PD activity, standard performance indicators to be calculated, and receiver operating characteristic (ROC) analysis to be performed.
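    The adjusted-male-median normalization mentioned above can be sketched as follows. The 10% exclusion threshold and the 30%/70% category cut-offs follow common practice in G6PD field studies, and the activity values are invented for illustration; they are assumptions, not figures from this article.

```python
import statistics


def adjusted_male_median(male_activities, exclusion_fraction=0.10):
    """Adjusted male median (AMM): exclude severely deficient males
    (activity below exclusion_fraction of the crude male median), then
    take the median of the remainder; the AMM defines 100% activity
    for the study population."""
    crude = statistics.median(male_activities)
    retained = [a for a in male_activities if a >= exclusion_fraction * crude]
    return statistics.median(retained)


def categorize(activity, amm):
    """Conventional cut-offs: <30% of AMM = deficient, 30-70% = intermediate."""
    pct = 100.0 * activity / amm
    return "deficient" if pct < 30 else "intermediate" if pct < 70 else "normal"


males = [0.2, 7.8, 8.0, 8.2, 9.0, 10.0]   # U/g Hb, illustrative values
amm = adjusted_male_median(males)
print(amm, categorize(2.0, amm))  # 8.2 deficient
```

    Categorized results then feed directly into the performance indicators and ROC analysis the article describes.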

  17. A quantitative evaluation of the high elbow technique in front crawl.

    PubMed

    Suito, Hiroshi; Nunome, Hiroyuki; Ikegami, Yasuo

    2017-07-01

    Many coaches instruct swimmers to keep the elbow in a high position (high elbow position) during the early phase of the underwater stroke motion (pull phase) in front crawl; however, the high elbow position has never been quantitatively evaluated. The aims of this study were (1) to quantitatively evaluate the "high elbow" position, (2) to clarify the relationship between the high elbow position and the required upper limb configuration and (3) to examine the efficacy of the high elbow position on the resultant swimming velocity. Sixteen highly skilled and 6 novice male swimmers performed 25 m front crawl with maximal effort and their 3-dimensional arm stroke motion was captured at 60 Hz. An attempt was made to develop a new index to evaluate the high elbow position (Ihe: high elbow index) using 3-dimensional coordinates of the shoulder, elbow and wrist joints. Ihe of skilled swimmers moderately correlated with the average shoulder internal rotation angle (r = -0.652, P < 0.01) and swimming velocity (r = -0.683, P < 0.01) during the pull phase. These results indicate that Ihe is a useful index for evaluating high elbow arm stroke technique during the pull phase in front crawl.

  18. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  19. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    NASA Astrophysics Data System (ADS)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depend highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to certain long-term natural processes, sudden events and human impact (macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements due to physical and chemical procedures. Obviously, these factors cover different scales of the problem. The deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical variations of material properties and characteristics. Therefore risk evaluation should be directed at revealing the specific active decay and damage mechanism at both the mesoscale [type of decay and damage] and microscale [decay phenomenon mechanism] level. A prerequisite for the identification and development of risk indicators is the existence of an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through the diagnosis of the conservation state of building materials and building structure, through the identification of all potential hazards affecting these and through mapping of possible alterations over the asset's entire lifetime. In this framework the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help

  20. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  1. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers--specific application to Listeria monocytogenes and ready-to-eat meat products.

    PubMed

    Mataragas, M; Zwietering, M H; Skandamis, P N; Drosinos, E H

    2010-07-31

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support the growth of pathogens (high initial pH = 6.2-6.4 and water activity = 0.98-0.99) and have a relatively long period of storage at chilled temperatures, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in case of post-process contamination, even with a low number of cells, the microorganism is able to reach unacceptable levels at the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk posed by the presence of L. monocytogenes in RTE meat products. This may help risk managers to make decisions and apply control measures with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk management strategies based on the results obtained from the QMRA model specifically developed for this pathogen/food product combination. Copyright 2010 Elsevier B.V. All rights reserved.

  2. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication), to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  3. Quantitative evaluation of morphological changes in activated platelets in vitro using digital holographic microscopy.

    PubMed

    Kitamura, Yutaka; Isobe, Kazushige; Kawabata, Hideo; Tsujino, Tetsuhiro; Watanabe, Taisuke; Nakamura, Masayuki; Toyoda, Toshihisa; Okudera, Hajime; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2018-06-18

    Platelet activation and aggregation have conventionally been evaluated using an aggregometer. However, this method is suitable for short-term but not long-term quantitative evaluation of platelet aggregation, morphological changes, and/or adhesion to specific materials. The recently developed digital holographic microscopy (DHM) has enabled the quantitative evaluation of cell size and morphology without labeling or destruction. Here, we aimed to validate its applicability to the quantitative evaluation of changes in cell morphology, especially the aggregation and spreading of activated platelets, modifying typical image analysis procedures to suit aggregated platelets. Freshly prepared platelet-rich plasma was washed with phosphate-buffered saline and treated with 0.1% CaCl2. Platelets were then fixed and subjected to DHM, scanning electron microscopy (SEM), atomic force microscopy, optical microscopy, and flow cytometry (FCM). Tightly aggregated platelets were identified as single cells. Data obtained from time-course experiments were plotted two-dimensionally as average optical thickness versus attachment area and divided into four regions. The majority of the control platelets, which supposedly contained small and round platelets, were distributed in the lower left region. As activation time increased, however, this population dispersed toward the upper right region. The distribution shift demonstrated by DHM was essentially consistent with data obtained from SEM and FCM. Therefore, DHM was validated as a promising device for testing platelet function, given that it allows the quantitative evaluation of activation-dependent morphological changes in platelets. DHM technology will be applicable to the quality assurance of platelet concentrates, as well as diagnosis and drug discovery related to platelet functions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. [Quantitative risk model for verocytotoxigenic Escherichia coli cross-contamination during homemade hamburger preparation].

    PubMed

    Signorini, M L; Frizzo, L S

    2009-01-01

    The objective of this study was to develop a quantitative risk model for verocytotoxigenic Escherichia coli (VTEC) cross-contamination during hamburger preparation at home. Published scientific information about the disease was considered in the elaboration of the model, which included a number of routines performed during food preparation in kitchens. The probabilities of bacterial transfer between food items and kitchen utensils which best described each stage of the process were incorporated into the model using @Risk software. Handling raw meat before preparing ready-to-eat foods (odds ratio, OR = 6.57), as well as hand-washing (OR = 12.02) and cutting-board-washing (OR = 5.02) habits, were the major risk factors for VTEC cross-contamination from meat to vegetables. The information provided by this model should be considered when designing public information campaigns on hemolytic uremic syndrome risk directed at food handlers, in order to stress the importance of the above-mentioned factors in disease transmission.
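    The transfer-probability structure of such a model can be illustrated with a minimal Monte Carlo sketch. All probabilities below are hypothetical placeholders, not values from the published model, and washing is crudely assumed to be fully effective when it occurs:

```python
import random

def simulate_cross_contamination(n_iter=10_000,
                                 p_transfer_hand=0.3,   # hypothetical transfer probabilities,
                                 p_transfer_board=0.2,  # not taken from the published model
                                 p_wash_hands=0.6,
                                 p_wash_board=0.5):
    """Toy Monte Carlo: fraction of meal preparations in which bacteria
    reach ready-to-eat vegetables via hands or cutting board."""
    contaminated = 0
    for _ in range(n_iter):
        # A pathway contaminates only if transfer happens AND washing did not
        via_hands = random.random() < p_transfer_hand and not (random.random() < p_wash_hands)
        via_board = random.random() < p_transfer_board and not (random.random() < p_wash_board)
        if via_hands or via_board:
            contaminated += 1
    return contaminated / n_iter

risk = simulate_cross_contamination()
print(f"Estimated cross-contamination probability: {risk:.3f}")
```

    With these placeholder inputs the analytic answer is 1 − (1 − 0.3·0.4)(1 − 0.2·0.5) ≈ 0.21, so the simulated value should land near that; varying the washing probabilities shows directly why hand and board hygiene dominate the risk.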

  5. A quantitative risk assessment for the safety of carcase storage systems for scrapie infected farms.

    PubMed

    Adkin, A; Jones, D L; Eckford, R L; Edwards-Jones, G; Williams, A P

    2014-10-01

    To determine the risk associated with the use of carcase storage vessels on a scrapie-infected farm, a stochastic quantitative risk assessment was developed to determine the rate of accumulation and fate of scrapie in a novel low-input storage system. For an example farm infected with classical scrapie, a mean of 10^3.6 Ovine Oral ID50s was estimated to accumulate annually. Research indicates that the degradation of any prions present may range from insignificant to one or two logs over several months of storage. For infected farms, the likely partitioning of remaining prion into the sludge phase would necessitate the safe operation and removal of resulting materials from these systems. If complete mixing could be assumed, on average, the concentrations of infectivity are estimated to be slightly lower than those measured in placenta from infected sheep at lambing. This is the first quantitative assessment of the scrapie risk associated with fallen stock on farm; it provides guidance to policy makers on the safety of one type of storage system and the relative risk compared with other materials present on an infected farm. © 2014 Crown Copyright. Journal of Applied Microbiology © 2014 Society for Applied Microbiology. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.

  6. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  7. Estimation and evaluation of management options to control and/or reduce the risk of not complying with commercial sterility.

    PubMed

    Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2015-11-20

    In a previous study, a modular process risk model, from raw material reception to final product storage, was built to estimate the risk of a UHT-aseptic line not complying with commercial sterility (Pujol et al., 2015). The present study focused on demonstrating how the model (updated with uncertainty and variability separated and a 2nd-order Monte Carlo procedure run) could be used to quantitatively assess the influence of management options. This assessment was done in three steps: pinpoint which process step had the highest influence on the risk, identify which management option(s) could be the most effective to control and/or reduce the risk, and finally evaluate quantitatively the influence of changing process setting(s) on the risk. For Bacillus cereus, it was identified that during post-process storage in an aseptic tank there was potential air re-contamination due to filter efficiency loss (caused by successive in-place sterilizations after cleaning operations), followed by B. cereus growth. Two options were then evaluated: i) reducing the number of filter sterilizations before filter renewal by one fifth, and ii) designing new UHT-aseptic lines without an aseptic tank, i.e. without a storage period between the thermal process and filling. Considering the uncertainty in the model, it was not possible to confirm whether these options had a significant influence on the risk associated with B. cereus. On the other hand, for Geobacillus stearothermophilus, combinations of heat-treatment time and temperature enabling the control or reduction of risk by a factor of ca. 100 were determined; for ease of operational implementation, they were presented graphically in the form of iso-risk curves. For instance, it was established that a heat treatment of 138°C for 31 s (instead of 138°C for 25 s) enabled a reduction in risk to 18×10⁻⁸ (95% CI = [10; 34]×10⁻⁸), instead of 578×10⁻⁸ (95% CI = [429; 754]×10
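    The time-temperature trade-off behind such iso-risk curves follows from the classical D/z thermal-lethality model: survivors drop by one log for every D-value of hold time, and D shrinks tenfold for every z degrees of temperature increase. A minimal sketch, using an illustrative reference D-value and z-value rather than the study's fitted parameters:

```python
def log_reduction(temp_c, time_s, d_ref_s=4.0, t_ref_c=138.0, z_c=10.0):
    """Decimal (log10) reductions achieved by an isothermal hold,
    via the classical D/z model. D_ref, T_ref and z are illustrative
    placeholders, not the study's fitted values."""
    d_at_temp = d_ref_s * 10 ** ((t_ref_c - temp_c) / z_c)  # D-value at process temperature
    return time_s / d_at_temp  # number of log10 reductions

# Same temperature, longer hold -> proportionally more log reductions
print(log_reduction(138.0, 25.0))  # 6.25 logs
print(log_reduction(138.0, 31.0))  # 7.75 logs
```

    Under these placeholder parameters, extending the hold from 25 s to 31 s buys 1.5 extra log reductions, i.e. roughly a 30-fold drop in surviving spores, which is the same order as the roughly 30-fold risk reduction (578 to 18 ×10⁻⁸) quoted above.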

  8. Evaluation of Projected Agricultural Climate Risk over the Contiguous US

    NASA Astrophysics Data System (ADS)

    Zhu, X.; Troy, T. J.; Devineni, N.

    2017-12-01

    Food demands are rising due to an increasing population with changing food preferences, which places pressure on agricultural production. Additionally, climate extremes have recently highlighted the vulnerability of our agricultural system to climate variability. This study seeks to fill two important gaps in current knowledge: how the widespread response of irrigated crops differs from that of rainfed crops, and how we can best account for uncertainty in yield responses. We developed a stochastic approach to evaluate climate risk quantitatively, to better understand the historical impacts of climate change and to estimate the future impacts it may bring to the agricultural system. Our model combines Bayesian regression, distribution fitting, and Monte Carlo simulation to simulate rainfed and irrigated crop yields at the US county level. The model was fit using historical data for 1970-2010 and was then applied over different climate regions in the contiguous US using the CMIP5 climate projections. The relative importance of major growing-season climate indices, such as consecutive dry days without rainfall or heavy precipitation, was evaluated to determine which climate indices play a role in affecting future crop yields. The statistical modeling framework also evaluated the impact of irrigation by using county-level irrigated and rainfed yields separately. Furthermore, projected years with negative yield anomalies were specifically evaluated in terms of magnitude, trend and potential climate drivers. This framework provides estimates of agricultural climate risk for the 21st century that account for the full uncertainty of climate occurrences, the range of crop response, and spatial correlation in climate. The results of this study can contribute to decision making about crop choice and water use in an uncertain future climate.
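    The regression-plus-Monte-Carlo chain described above can be sketched in a toy form: sample a growing-season climate index, map it to a yield anomaly through a fitted slope, and add residual noise. All parameter values below (slope, noise level, climate-index distribution) are invented for illustration, not fitted coefficients from the study:

```python
import random

def simulate_yield_anomalies(n_sims=5000, beta_cdd=-0.04, sigma=0.3,
                             cdd_mean=20.0, cdd_sd=6.0):
    """Toy regression + Monte Carlo chain: sample a growing-season
    climate index (consecutive dry days, CDD), map it to a yield
    anomaly via a fitted slope, add residual noise. All parameters
    are illustrative placeholders."""
    anomalies = []
    for _ in range(n_sims):
        cdd = random.gauss(cdd_mean, cdd_sd)          # sampled climate occurrence
        anomaly = beta_cdd * (cdd - cdd_mean) + random.gauss(0.0, sigma)
        anomalies.append(anomaly)
    p_negative = sum(a < 0 for a in anomalies) / n_sims
    return anomalies, p_negative

_, p_neg = simulate_yield_anomalies()
print(f"P(negative yield anomaly) = {p_neg:.2f}")
```

    In a full analysis the slope would come from a Bayesian posterior (so parameter uncertainty is propagated too) and the climate index from CMIP5 projections rather than a fixed Gaussian.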

  9. [Biological evaluation within a risk management process].

    PubMed

    Zhuang, Fei; Ding, Biao

    2007-07-01

    Bio-evaluation within the medical device quality/risk management system is a risk analysis and assessment process. On the basis of data from characterization of materials, scientific literature, application history, bio-toxicology testing and so on, and weighing benefit against risk, bio-evaluation reaches a conclusion to accept or reject the product design. There is no "zero risk", although "no toxicity" is always the most desirable conclusion in a testing report. Application history data are the most comprehensive among the information available, since no testing system can "clone" the human body. In addition, capital cost has to be taken into account when bringing sophisticated testing technologies into the evaluation system. Examining FDA CDRH's #G95-1 and the changes in ISO 10993-1, the trend toward integrating bio-evaluation into a quality/risk management process can be discerned.

  10. Characterizing the risk of infection from Mycobacterium tuberculosis in commercial passenger aircraft using quantitative microbial risk assessment.

    PubMed

    Jones, Rachael M; Masago, Yoshifumi; Bartrand, Timothy; Haas, Charles N; Nicas, Mark; Rose, Joan B

    2009-03-01

    Quantitative microbial risk assessment was used to predict the likelihood and spatial organization of Mycobacterium tuberculosis (Mtb) transmission in a commercial aircraft. Passenger exposure was predicted via a multizone Markov model in four scenarios: seated or moving infectious passengers, with or without filtration of recirculated cabin air. The traditional exponential (k = 1) and a new exponential (k = 0.0218) dose-response function were used to compute infection risk. Emission variability was included by Monte Carlo simulation. Infection risks were higher nearer to and aft of the source; steady-state airborne concentration levels were not attained. Expected incidence was low to moderate, with the central 95% ranging from 10⁻⁶ to 10⁻¹ per 169 passengers across the four scenarios. The emission rates used were low compared to measurements from active TB patients in wards; thus a "superspreader" emitting 44 quanta/h could produce 6.2 cases or more under these scenarios. Use of respiratory protection by the infectious source and/or susceptible passengers reduced infection incidence by up to one order of magnitude.
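    The two dose-response functions compared above share the same exponential form, P(infection) = 1 − exp(−k·dose), differing only in the parameter k. A minimal sketch (the dose value is illustrative):

```python
import math

def infection_risk(dose_quanta, k=0.0218):
    """Exponential dose-response: P(infection) = 1 - exp(-k * dose)."""
    return 1.0 - math.exp(-k * dose_quanta)

# Traditional Wells-Riley form uses k = 1 (every inhaled quantum infects);
# the alternative k = 0.0218 makes each quantum far less potent.
dose = 0.5  # illustrative inhaled dose, in quanta
print(infection_risk(dose, k=1.0))
print(infection_risk(dose, k=0.0218))
```

    For the same exposure the k = 1 model predicts a risk roughly 36 times higher than the k = 0.0218 model at this dose, which is why the choice of dose-response parameter dominates the predicted incidence.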

  11. Evaluation of self-combustion risk in tire derived aggregate fills.

    PubMed

    Arroyo, Marcos; San Martin, Ignacio; Olivella, Sebastian; Saaltink, Maarten W

    2011-01-01

    Lightweight tire derived aggregate (TDA) fills are a proven recycling outlet for waste tires, requiring relatively low-cost waste processing and being competitively priced against other lightweight fill alternatives. However, their value has been marred because several TDA fills self-combusted during early applications of the technique. An empirical review of these cases led to prescriptive guidelines from ASTM aimed at avoiding this problem. This approach has been successful in preventing further incidents of self-combustion. However, there remains no rational method to quantify self-combustion risk in TDA fills, so it is not clear which aspects of the ASTM guidelines are essential and which are accessory. This hinders the practical use of TDA fills despite their inherent advantages as lightweight fill. Here a quantitative approach to self-combustion risk evaluation is developed and illustrated with a parametric analysis of an embankment case, later particularized to model a reported field self-combustion case. The approach is based on the available experimental observations and incorporates well-tested methodological (ISO corrosion evaluation) and theoretical tools (finite element analysis of coupled heat and mass flow). The results obtained offer clear insights into the critical aspects of the problem, already allowing some meaningful recommendations for guideline revision. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Quantitative risk assessment using empirical vulnerability functions from debris flow event reconstruction

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami

    2010-05-01

    For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but also to analyze its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard of a given magnitude and its vulnerability. A quantifiable integrated approach to both hazard and risk is becoming required practice in risk reduction management. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results are then applied to vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical alpine valley lying in northern Italy (Lombardy Region). On 13 July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno. One of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. Also within the Valtellina valley, between 22 and 23 May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage. At the same location, on 26 November 2002, another debris flow caused significant damage. For the quantification of a new scenario, the results obtained from the Selvetta event were applied in Tresenda. The Selvetta and Tresenda events were modelled with the FLO2D program. FLO2D is an Eulerian formulation with a finite-differences numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of

  13. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides initiating from cut slopes along the railway and road alignments were catalogued. The landslides were grouped into three magnitude classes based on landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using the Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed from limited historical incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed as annual probability of death. Indirect specific loss (US$) derived from traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to local business, and loss of revenue to the railway department.
The results indicate that the total loss, including both direct and indirect loss, from 1 to 50 years return period, varies from US 90 840 to US
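    The Gumbel step of such an analysis, relating a return period to the landslide count expected to be exceeded once per period, can be sketched with a method-of-moments fit. The annual-maximum counts below are invented for illustration, not data from the Nilgiri inventory:

```python
import math
import statistics

def gumbel_return_level(annual_maxima, return_period_yr):
    """Method-of-moments Gumbel fit to annual maxima, then the level
    exceeded on average once per return period."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi   # Gumbel scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period_yr      # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual maxima of landslides per km of cut slope
counts = [2, 5, 3, 8, 4, 6, 3, 7, 5, 4]
for t in (5, 25, 50):
    print(t, round(gumbel_return_level(counts, t), 1))
```

    Longer return periods map to higher expected counts, which is how the three magnitude classes and six return periods combine into the 18 hazard scenarios described above.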

  14. [Quantitative risk assessment of the polycyclic aromatic hydrocarbons dietary exposure from edible fats and oils in China].

    PubMed

    Cao, Mengsi; Wang, Jun; Zhang, Lishi; Yan, Weixing

    2016-02-01

    To assess the quantitative risk of dietary exposure to polycyclic aromatic hydrocarbons (PAHs) from edible fats and oils in China. One hundred samples of edible fats and oils were collected from supermarkets and farmers' markets in 11 provinces of China between December 2013 and May 2014. They were tested for the EU15+1 PAHs (16 PAHs prioritized for control by the European Food Safety Authority) by two methods, QuEChERS-GC-MS-MS and GPC-HPLC-FLD. PAH concentration data were combined with edible fats and oils consumption data from the 2002 Chinese National Nutrition and Health Survey to evaluate the carcinogenic risk of PAHs in edible fats and oils by the margin of exposure (MOE) method. The population was divided into 6 groups, namely male adults (older than 18 years), female adults (older than 18), male youths (13-17), female youths (13-17), school-agers (6-12) and preschoolers (2-5), and carcinogenicity was taken as the critical toxicity endpoint of PAHs. Two quantitative risk assessment methods, i.e. point assessment and probability assessment, were used to evaluate the dietary exposure and MOEs. The EU15+1 PAHs were not detected in one of the 100 samples; the other samples were contaminated to varying degrees, with detection rates of 3%-98% and average contents of 0.26-3.26 μg/kg. The PAH dietary exposure results from point assessment and probability assessment were consistent. The average exposures of PAH8 were as follows: male adults, 10.03 and (9.34 ± 12.61) ng·kg⁻¹·d⁻¹ (the former from point assessment and the latter from probability assessment, likewise below); female adults, 9.95 and (9.60 ± 15.04) ng·kg⁻¹·d⁻¹; male youths, 11.09 and (10.84 ± 16.54) ng·kg⁻¹·d⁻¹; female youths, 10.06 and (9.58 ± 12.87) ng·kg⁻¹·d⁻¹; school-agers, 15.29 and (15.62 ± 25.54) ng·kg⁻¹·d⁻¹; preschoolers, 19.27 and (19.22 ± 28.91) ng·kg⁻¹·d⁻¹. MOEs
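    The MOE calculation itself is a one-line ratio of a benchmark dose to the estimated exposure. A minimal sketch, using a commonly cited PAH8 BMDL10 of 0.49 mg/kg bw/day as an illustrative benchmark (the study's exact benchmark value is not given in this abstract):

```python
def margin_of_exposure(bmdl10_mg_kg_day, exposure_ng_kg_day):
    """MOE = benchmark dose (BMDL10) / dietary exposure, after unit
    conversion. An MOE of 10,000 or more is conventionally taken to
    indicate low health concern for genotoxic carcinogens."""
    exposure_mg = exposure_ng_kg_day * 1e-6  # ng -> mg
    return bmdl10_mg_kg_day / exposure_mg

# Illustrative BMDL10 for PAH8 vs. the preschooler point estimate
# reported above (19.27 ng/kg bw/day, the highest exposure group)
moe = margin_of_exposure(0.49, 19.27)
print(f"MOE = {moe:,.0f}")
```

    Because MOE is inversely proportional to exposure, the preschooler group, with the highest intake per kg body weight, necessarily yields the smallest (most concerning) MOE of the six groups.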

  15. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  16. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  17. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    USDA-ARS?s Scientific Manuscript database

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  18. Quantitative evaluation of the CEEM soil sampling intercomparison.

    PubMed

    Wagner, G; Lischer, P; Theocharopoulos, S; Muntau, H; Desaules, A; Quevauviller, P

    2001-01-08

    The aim of the CEEM soil project was to compare and to test the soil sampling and sample preparation guidelines used in the member states of the European Union and Switzerland for investigations of background and large-scale contamination of soils, soil monitoring and environmental risk assessments. The results of the comparative evaluation of the sampling guidelines demonstrated that, in soil contamination studies carried out with different sampling strategies and methods, comparable results can hardly be expected. Therefore, a reference database (RDB) was established by the organisers, which acted as a basis for the quantitative comparison of the participants' results. The detected deviations were related to the methodological details of the individual strategies. The comparative evaluation concept consisted of three steps: The first step was a comparison of the participants' samples (which were both centrally and individually analysed) between each other, as well as with the reference data base (RDB) and some given soil quality standards on the level of concentrations present. The comparison was made using the example of the metals cadmium, copper, lead and zinc. As a second step, the absolute and relative deviations between the reference database and the participants' results (both centrally analysed under repeatability conditions) were calculated. The comparability of the samples with the RDB was categorised on four levels. Methods of exploratory statistical analysis were applied to estimate the differential method bias among the participants. The levels of error caused by sampling and sample preparation were compared with those caused by the analytical procedures. As a third step, the methodological profiles of the participants were compiled to concisely describe the different procedures used. They were related to the results to find out the main factors leading to their incomparability. The outcome of this evaluation process was a list of strategies and

  19. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  20. Quantitative nondestructive evaluation of ceramic matrix composite by the resonance method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watanabe, T.; Aizawa, T.; Kihara, J.

    The resonance method was developed to enable quantitative nondestructive evaluation of mechanical properties without laborious procedures. Since the method is insensitive to specimen geometry, both monolithic and ceramic matrix composite materials in process can be evaluated nondestructively. Al₂O₃, Si₃N₄, SiC/Si₃N₄, and various C/C composite materials are employed to demonstrate the validity and effectiveness of the method.

  1. The effects of infographics and several quantitative versus qualitative formats for cardiovascular disease risk, including heart age, on people's risk understanding.

    PubMed

    Damman, Olga C; Vonk, Suzanne I; van den Haak, Maaike J; van Hooijdonk, Charlotte M J; Timmermans, Danielle R M

    2018-03-11

    To study how comprehension of cardiovascular disease (CVD) risk is influenced by: (1) infographics about qualitative risk information, with/without risk numbers; (2) which qualitative risk dimension is emphasized; (3) heart age vs. a traditional risk format. For aim 1, a 2 (infographics versus text) × 2 (risk number versus no risk number) between-subjects design was used. For aim 2, three pieces of information were tested within-subjects. Aim 3 used a simple comparison group. Participants (45-65 years old) were recruited through an online access panel; less-educated people were oversampled. They received hypothetical risk information (20%/61 yrs). Primary outcomes: recall, risk appraisals, subjective/objective risk comprehension; secondary outcomes: behavioral intentions, information evaluations. Infographics of qualitative risk dimensions negatively affected recall, subjective risk comprehension and information evaluations. No effect of the type of risk dimension was found on risk perception. Heart age influenced recall, comprehension, evaluations and affective risk appraisals. Infographics of hypothetical CVD risk information had detrimental effects on measures related to risk perception/comprehension, but effects were mainly seen in less-educated participants. Heart age influenced perceptions/comprehension of hypothetical risk in a way that seemed to support understanding. Heart age seems a fruitful risk communication approach for disease risk calculators. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log₁₀ units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from the QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
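    The cross-validated Q² statistic quoted above is defined as 1 − PRESS/TSS, where PRESS sums squared errors of held-out predictions. A minimal sketch with invented toy data (the observed values and held-out predictions below are not from the study's database):

```python
def q_squared(y_true, y_cv_pred):
    """Cross-validated Q^2 = 1 - PRESS / TSS, where predictions come
    from held-out (cross-validation) folds rather than the training fit."""
    mean_y = sum(y_true) / len(y_true)
    press = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_cv_pred))  # prediction error SS
    tss = sum((yt - mean_y) ** 2 for yt in y_true)                    # total SS
    return 1.0 - press / tss

# Hypothetical log10 toxicity values vs. held-out QSAR predictions
y = [1.2, 0.8, 2.1, 1.5, 0.3]
y_hat = [1.0, 1.1, 1.8, 1.6, 0.6]
print(round(q_squared(y, y_hat), 2))  # -> 0.83
```

    Unlike training-set R², Q² penalizes overfitting: a model that merely memorizes its training data produces poor held-out predictions and a Q² near or below zero, which is why modest values such as 0.25-0.45 can still be meaningful for noisy regulatory endpoints.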

  3. Qualitative and quantitative evaluation of avian demineralized bone matrix in heterotopic beds.

    PubMed

    Reza Sanaei, M; Abu, Jalila; Nazari, Mojgan; A B, Mohd Zuki; Allaudin, Zeenathul N

    2013-11-01

    To evaluate the osteogenic potential of avian demineralized bone matrix (DBM) in the context of implant geometry. Experimental. Rock pigeons (n = 24). Tubular and chipped forms of DBM were prepared by acid demineralization of long bones from healthy allogeneic donors and implanted bilaterally into the pectoral region of 24 pigeons. After euthanasia at 1, 4, 6, 8, 10, and 12 weeks, explants were evaluated histologically and compared by means of quantitative (bone area) and semiquantitative (scores) measures. All explants had new bone at retrieval, with the exception of tubular implants at the end of week 1. The most reactive part of both implants was the interior region between the periosteal and endosteal surfaces, followed by the area at the implant-muscle interface. Quantitative measurements demonstrated a significantly (P = .012) greater percentage of new bone formation induced by tubular implants (80.28 ± 8.94) compared with chip implants (57.64 ± 3.12). There was minimal inflammation. Avian DBM initiates heterotopic bone formation in allogeneic recipients with low immunogenicity. Implant geometry affects this phenomenon, as osteoconduction appeared to augment the magnitude of the effects in larger tubular implants. © Copyright 2013 by The American College of Veterinary Surgeons.

  4. Application of Quantitative Microbial Risk Assessment to analyze the public health risk from poor drinking water quality in a low income area in Accra, Ghana.

    PubMed

    Machdar, E; van der Steen, N P; Raschid-Sally, L; Lens, P N L

    2013-04-01

    In Accra, Ghana, a majority of inhabitants live in overcrowded areas with limited access to piped water supply, which is often also intermittent. This study assessed, in a densely populated area, the risk from microbial contamination of various sources of drinking water by conducting a Quantitative Microbial Risk Assessment (QMRA), which estimates the risk to human health from microorganism exposure using dose-response relationships. Furthermore, the cost-effectiveness of targeted interventions in reducing the disease burden was evaluated. Five risk pathways for drinking water were identified through a survey (110 families), namely household storage, private yard taps, communal taps, communal wells and water sachets. Samples from each source were analyzed for Escherichia coli and Ascaris contamination. Published ratios between E. coli and other pathogens were used for the QMRA and disease burden calculations. The major part of the disease burden originated from E. coli O157:H7 (78%), and the least important contributor was Cryptosporidium (0.01%). Other pathogens contributed 16% (Campylobacter), 5% (Rotavirus) and 0.3% (Ascaris). The sum of the disease burden of these pathogens was 0.5 DALYs per person per year, much higher than the WHO reference level. The major contamination pathway was found to be household storage. Disinfection of water at household level was the most cost-effective intervention (<5 USD/DALY averted), together with hygiene education. Water supply network improvements were significantly less cost-effective. Copyright © 2013 Elsevier B.V. All rights reserved.
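    The QMRA chain behind such DALY figures runs dose → per-exposure infection probability → annual infection probability → illness → disease burden. A minimal sketch; every parameter value below is a placeholder for illustration, not a value from the Accra study:

```python
import math

def annual_disease_burden(dose_per_exposure, exposures_per_year,
                          r=0.57, p_ill=0.25, daly_per_case=0.005):
    """QMRA chain: exponential dose-response -> annual infection
    probability -> illness -> DALYs per person per year.
    r, p_ill and daly_per_case are illustrative placeholders."""
    p_inf_single = 1.0 - math.exp(-r * dose_per_exposure)          # per-exposure risk
    p_inf_year = 1.0 - (1.0 - p_inf_single) ** exposures_per_year  # at least one infection/year
    return p_inf_year * p_ill * daly_per_case

# e.g. a small daily dose ingested with stored drinking water
burden = annual_disease_burden(dose_per_exposure=0.01, exposures_per_year=365)
print(f"{burden:.5f} DALYs per person per year")
```

    Dividing an intervention's annual cost by the burden it averts gives the USD/DALY-averted figure used above to rank household disinfection against network improvements.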

  5. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. 
This phantom provides a platform for the optimization and standardization of

  6. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  7. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    PubMed

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

    There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations for which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10⁶ Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. 
This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
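
    The simulation logic described above (a hand-washing log reduction, transfer to food, then dose-response) can be sketched as below. All names and parameter values are illustrative assumptions, not the study's fitted model:

```python
import math
import random

def simulate_illness_cases(n_exposed=100, hand_load=1.0e6, log_reduction=2.0,
                           transfer_fraction=1.0e-3, r=0.005, seed=1):
    """Monte Carlo sketch: number of illness cases among n_exposed people
    served food handled by a worker carrying hand_load Shigella, after a
    wash achieving the given log reduction."""
    rng = random.Random(seed)
    # Dose per serving: surviving organisms times fraction transferred to food.
    dose = hand_load * 10.0 ** (-log_reduction) * transfer_fraction
    p_ill = 1.0 - math.exp(-r * dose)  # exponential dose-response
    return sum(1 for _ in range(n_exposed) if rng.random() < p_ill)
```

    With these assumed parameters, a larger log reduction can only lower the simulated case count, mirroring the qualitative contrast between the 2-log nonantibacterial and >3-log antibacterial treatments.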

  8. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  9. Improved cancer risk stratification and diagnosis via quantitative phase microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Uttam, Shikhar; Pham, Hoa V.; Hartman, Douglas J.

    2017-02-01

    Pathology remains the gold standard for cancer diagnosis and in some cases prognosis, in which trained pathologists examine abnormality in tissue architecture and cell morphology characteristic of cancer cells with a bright-field microscope. The limited resolution of the conventional microscope can result in intra-observer variation, missed early-stage cancers, and indeterminate cases that often result in unnecessary invasive procedures in the absence of cancer. Assessment of nanoscale structural characteristics via quantitative phase imaging represents a promising strategy for identifying pre-cancerous or cancerous cells, due to its nanoscale sensitivity to optical path length, simple sample preparation (i.e., label-free) and low cost. I will present the development of a quantitative phase microscopy system in transmission and reflection configurations to detect structural changes in nuclear architecture that are not easily identifiable by conventional pathology. Specifically, we will present the use of transmission-mode quantitative phase imaging to improve the diagnostic accuracy of urine cytology, showing that nuclear dry mass correlates progressively with negative, atypical, suspicious and positive cytological diagnoses. In a second application, we will present the use of reflection-mode quantitative phase microscopy for depth-resolved nanoscale nuclear architecture mapping (nanoNAM) of clinically prepared formalin-fixed, paraffin-embedded tissue sections. We demonstrated that the quantitative phase microscopy system detects a gradual increase in the density alteration of nuclear architecture during malignant transformation in animal models of colon carcinogenesis and in human patients with ulcerative colitis, even in tissue that appears histologically normal according to pathologists. We evaluated the ability of nanoNAM to predict "future" cancer progression in patients with ulcerative colitis.

  10. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that have actually turned away from nuclear energy are reconsidering the advisability of this decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  11. EQUIFAT: A novel scoring system for the semi-quantitative evaluation of regional adipose tissues in Equidae.

    PubMed

    Morrison, Philippa K; Harris, Patricia A; Maltin, Charlotte A; Grove-White, Dai; Argo, Caroline McG

    2017-01-01

    Anatomically distinct adipose tissues represent variable risks to metabolic health in man and some other mammals. Quantitative imaging of internal adipose depots is problematic in large animals, and associations between regional adiposity and health are poorly understood. This study aimed to develop and test a semi-quantitative system (EQUIFAT) which could be applied to regional adipose tissues. Anatomically-defined, photographic images of adipose depots (omental, mesenteric, epicardial, rump) were collected from 38 animals immediately post-mortem. Images were ranked and depot-specific descriptors were developed (1 = no fat visible; 5 = excessive fat present). Nuchal-crest and ventro-abdominal-retroperitoneal adipose depot depths (cm) were transformed to categorical 5-point scores. The repeatability and reliability of EQUIFAT was independently tested by 24 observers. When half scores were permitted, inter-observer agreement was substantial (average κw: mesenteric, 0.79; omental, 0.79; rump, 0.61) or moderate (average κw: epicardial, 0.60). Intra-observer repeatability was tested by 8 observers on 2 occasions. Kappa analysis indicated perfect (omental and mesenteric) and substantial agreement (epicardial and rump) between attempts. A further 207 animals were evaluated ante-mortem (age, height, breed-type, gender, body condition score [BCS]) and again immediately post-mortem (EQUIFAT scores, carcass weight). Multivariable, random effect linear regression models were fitted (breed as random effect; BCS as outcome variable). Only height, carcass weight, omental and retroperitoneal EQUIFAT scores remained as explanatory variables in the final model. The EQUIFAT scores developed here demonstrate clear functional differences between regional adipose depots and future studies could be directed towards describing associations between adiposity and disease risk in surgical and post-mortem situations.
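
    Inter-observer agreement statistics like the κw values reported above can be computed with a weighted Cohen's kappa. A minimal sketch with linear weights follows (the study's exact weighting scheme is not specified in the abstract, so this is an assumed variant):

```python
def weighted_kappa(rater_a, rater_b, categories):
    """Linearly weighted Cohen's kappa for paired ordinal scores,
    e.g., two observers' 1-5 adipose scores for the same images."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)
    # Joint distribution of the two raters' scores.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    pa = [sum(obs[i][j] for j in range(k)) for i in range(k)]  # rater A marginals
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    # Linear agreement weights: 1 on the diagonal, falling off with distance.
    w = [[1.0 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    p_obs = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    p_exp = sum(w[i][j] * pa[i] * pb[j] for i in range(k) for j in range(k))
    return (p_obs - p_exp) / (1.0 - p_exp)
```

    Perfect agreement yields κw = 1; chance-level agreement yields κw near 0.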

  12. EQUIFAT: A novel scoring system for the semi-quantitative evaluation of regional adipose tissues in Equidae

    PubMed Central

    Morrison, Philippa K.; Harris, Patricia A.; Maltin, Charlotte A.; Grove-White, Dai; Argo, Caroline McG.

    2017-01-01

    Anatomically distinct adipose tissues represent variable risks to metabolic health in man and some other mammals. Quantitative imaging of internal adipose depots is problematic in large animals, and associations between regional adiposity and health are poorly understood. This study aimed to develop and test a semi-quantitative system (EQUIFAT) which could be applied to regional adipose tissues. Anatomically-defined, photographic images of adipose depots (omental, mesenteric, epicardial, rump) were collected from 38 animals immediately post-mortem. Images were ranked and depot-specific descriptors were developed (1 = no fat visible; 5 = excessive fat present). Nuchal-crest and ventro-abdominal-retroperitoneal adipose depot depths (cm) were transformed to categorical 5-point scores. The repeatability and reliability of EQUIFAT was independently tested by 24 observers. When half scores were permitted, inter-observer agreement was substantial (average κw: mesenteric, 0.79; omental, 0.79; rump, 0.61) or moderate (average κw: epicardial, 0.60). Intra-observer repeatability was tested by 8 observers on 2 occasions. Kappa analysis indicated perfect (omental and mesenteric) and substantial agreement (epicardial and rump) between attempts. A further 207 animals were evaluated ante-mortem (age, height, breed-type, gender, body condition score [BCS]) and again immediately post-mortem (EQUIFAT scores, carcass weight). Multivariable, random effect linear regression models were fitted (breed as random effect; BCS as outcome variable). Only height, carcass weight, omental and retroperitoneal EQUIFAT scores remained as explanatory variables in the final model. The EQUIFAT scores developed here demonstrate clear functional differences between regional adipose depots and future studies could be directed towards describing associations between adiposity and disease risk in surgical and post-mortem situations. PMID:28296956

  13. Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.

    PubMed

    Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R

    2014-08-01

    To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
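
    The method-agreement figures quoted above (mean difference and Bland-Altman spread) come from a standard computation that can be sketched as follows. The values below are synthetic stand-ins, not the volunteers' PBF data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods, as used in a Bland-Altman analysis."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic PBF-like values (mL/min/100 mL); illustrative only:
bias, (low, high) = bland_altman([150, 160, 140], [151, 158, 141])
```

    A bias near zero with narrow limits of agreement, as reported in the abstract, indicates no systematic difference between the methods.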

  14. Quantitative risk assessment for a glass fiber insulation product.

    PubMed

    Fayerweather, W E; Bender, J R; Hadley, J G; Eastes, W

    1997-04-01

    California Proposition 65 (Prop65) provides a mechanism by which the manufacturer may perform a quantitative risk assessment to be used in determining the need for cancer warning labels. This paper presents a risk assessment under this regulation for professional and do-it-yourself insulation installers. It determines the level of insulation glass fiber exposure (specifically Owens Corning's R-25 PinkPlus with Miraflex) that, assuming a working lifetime exposure, poses no significant cancer risk under Prop65's regulations. "No significant risk" is defined under Prop65 as a lifetime risk of no more than one additional cancer case per 100,000 exposed persons, and nonsignificant exposure is defined as a working lifetime exposure associated with "no significant risk." This determination can be carried out despite the fact that the relevant underlying studies (i.e., chronic inhalation bioassays) of comparable glass wool fibers do not show tumorigenic activity. Nonsignificant exposures are estimated from (1) the most recent RCC chronic inhalation bioassay of nondurable fiberglass in rats; (2) intraperitoneal fiberglass injection studies in rats; (3) a distributional, decision analysis approach applied to four chronic inhalation rat bioassays of conventional fiberglass; (4) an extrapolation from the RCC chronic rat inhalation bioassay of durable refractory ceramic fibers; and (5) an extrapolation from the IOM chronic rat inhalation bioassay of durable E glass microfibers. When the EPA linear nonthreshold model is used, central estimates of nonsignificant exposure range from 0.36 fibers/cc (for the RCC chronic inhalation bioassay of fiberglass) through 21 fibers/cc (for the i.p. fiberglass injection studies). Lower 95% confidence bounds on these estimates vary from 0.17 fibers/cc through 13 fibers/cc. 
Estimates derived from the distributional approach or from applying the EPA linear nonthreshold model to chronic bioassays of durable fibers such as refractory ceramic fiber

  15. 75 FR 76982 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...), Office of Research and Development, U.S. Environmental Protection Agency, Washington, DC 20460; telephone... human health assessment program that evaluates quantitative and qualitative risk information on effects...

  16. Agreement between quantitative microbial risk assessment and epidemiology at low doses during waterborne outbreaks of protozoan disease

    USDA-ARS?s Scientific Manuscript database

    Quantitative microbial risk assessment (QMRA) is a valuable complement to epidemiology for understanding the health impacts of waterborne pathogens. The approach works by extrapolating available data in two ways. First, dose-response data are typically extrapolated from feeding studies, which use ...

  17. Risk management in technovigilance: construction and validation of a medical-hospital product evaluation instrument.

    PubMed

    Kuwabara, Cleuza Catsue Takeda; Evora, Yolanda Dora Martinez; de Oliveira, Márcio Mattos Borges

    2010-01-01

    With the continuous incorporation of health technologies, hospital risk management should be implemented to systemize the monitoring of adverse effects, performing actions to control and eliminate their damage. As part of these actions, Technovigilance is active in the procedures of acquisition, use and quality control of health products and equipment. This study aimed to construct and validate an instrument to evaluate medical-hospital products. This is a quantitative, exploratory, longitudinal and methodological development study, based on the Six Sigma quality management model, which has as its principal basis the component stages of the DMAIC Cycle. For data collection and content validation, the Delphi technique was used with professionals from the Brazilian Sentinel Hospital Network. It was concluded that the instrument developed permitted the evaluation of the product, differentiating between the results of the tested brands, in line with the initial study goal of qualifying the evaluations performed.

  18. Fluorescent proteins for quantitative microscopy: important properties and practical evaluation.

    PubMed

    Shaner, Nathan Christopher

    2014-01-01

    More than 20 years after their discovery, fluorescent proteins (FPs) continue to be the subject of massive engineering efforts yielding continued improvements. Among these efforts are many aspects that should be of great interest to quantitative imaging users. With new variants frequently introduced into the research community, "tried and true" FPs that have been relied on for many years may now be due for upgrades to more modern variants. However, the dizzying array of FPs now available can make the initial act of narrowing down the potential choices an intimidating prospect. This chapter describes the FP properties that most strongly impact their performance in quantitative imaging experiments, along with their physical origins as they are currently understood. A workflow for evaluating a given FP in the researcher's chosen experimental system (e.g., a specific cell line) is described. © 2014 Elsevier Inc. All rights reserved.

  19. Quantitative landslide risk assessment and mapping on the basis of recent occurrences

    NASA Astrophysics Data System (ADS)

    Remondo, Juan; Bonachea, Jaime; Cendrero, Antonio

    A quantitative procedure for mapping landslide risk is developed from considerations of hazard, vulnerability and valuation of exposed elements. The approach, based on former work by the authors, is applied in the Bajo Deba area (northern Spain), where a detailed study of landslide occurrence and damage in the recent past (last 50 years) was carried out. Analyses and mapping are implemented in a Geographic Information System (GIS). The method is based on a susceptibility model developed previously from statistical relationships between past landslides and terrain parameters related to instability. Extrapolations based on past landslide behaviour were used to calculate failure frequency for the next 50 years. A detailed inventory of direct damage due to landslides during the study period was carried out and the main elements at risk in the area identified and mapped. Past direct (monetary) losses per type of element were estimated and expressed as an average 'specific loss' for events of a given magnitude (corresponding to a specified scenario). Vulnerability was assessed by comparing losses with the actual value of the elements affected and expressed as a fraction of that value (0-1). From hazard, vulnerability and monetary value, risk was computed for each element considered. Direct risk maps (€/pixel/year) were obtained and indirect losses from the disruption of economic activities due to landslides assessed. The final result is a risk map and table combining all losses per pixel for a 50-year period. Total monetary value at risk for the Bajo Deba area in the next 50 years is about 2.4 × 10⁶ Euros.
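
    The per-pixel risk computation described above combines hazard, vulnerability (0-1), and exposed monetary value. A generic sketch of that product, with illustrative names and values:

```python
def pixel_risk(hazard_prob, vulnerability, value_eur):
    """Expected annual loss for one pixel (EUR/pixel/year):
    annual failure probability x vulnerability (0-1) x exposed value."""
    return hazard_prob * vulnerability * value_eur

def total_risk(pixels):
    """Total expected loss over a map, where each pixel is a tuple of
    (annual hazard probability, vulnerability, exposed value in EUR)."""
    return sum(pixel_risk(h, v, val) for h, v, val in pixels)
```

    Summing the per-pixel expected losses over the map and over the 50-year horizon yields a total value at risk of the kind reported for the Bajo Deba area.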

  20. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practice workflow of ERLPS was explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and in quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. 77 FR 20817 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-06

    ... quantitative and qualitative risk information on effects that may result from exposure to specific chemical... Deputy Director, National Center for Environmental Assessment, (mail code: 8601D), Office of Research and... program that evaluates quantitative and qualitative risk information on effects that may result from...

  2. 77 FR 41784 - Integrated Risk Information System (IRIS); Announcement of Availability of Literature Searches...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-16

    ... health assessment program that evaluates quantitative and qualitative risk information on effects that..., National Center for Environmental Assessment, (mail code: 8601P), Office of Research and Development, U.S... quantitative and qualitative risk information on effects that may result from exposure to specific chemical...

  3. Determining quantitative immunophenotypes and evaluating their implications

    NASA Astrophysics Data System (ADS)

    Redelman, Douglas; Hudig, Dorothy; Berner, Dave; Castell, Linda M.; Roberts, Don; Ensign, Wayne

    2002-05-01

    Quantitative immunophenotypes varied widely among > 100 healthy young males but were maintained at characteristic levels within individuals. The initial results (SPIE Proceedings 4260:226) that examined cell numbers and the quantitative expression of adhesion and lineage-specific molecules, e.g., CD2 and CD14, have now been confirmed and extended to include the quantitative expression of inducible molecules such as HLA-DR and perforin (Pf). Some properties, such as the ratio of T helper (Th) to T cytotoxic/suppressor (Tc/s) cells, are known to be genetically determined. Other properties, e.g., the T:B cell ratio, the amount of CD19 per B cell, etc., behaved similarly and may also be inherited traits. Since some patterns observed in these healthy individuals resembled those found in pathological situations, we tested whether the patterns could be associated with the occurrence of disease. The current study shows that there were associations between quantitative immunophenotypes and the subsequent incidence and severity of disease. For example, individuals with characteristically low levels of HLA-DR or B cells or reduced numbers of Pf+ Tc/s cells had more frequent and/or more severe upper respiratory infections. Quantitative immunophenotypes will be more widely measured if the necessary standards are available and if appropriate procedures are made more accessible.

  4. Quantitative systems toxicology

    PubMed Central

    Bloomingdale, Peter; Housand, Conrad; Apgar, Joshua F.; Millard, Bjorn L.; Mager, Donald E.; Burke, John M.; Shah, Dhaval K.

    2017-01-01

    The overarching goal of modern drug development is to optimize therapeutic benefits while minimizing adverse effects. However, inadequate efficacy and safety concerns remain the major causes of drug attrition in clinical development. For the past 80 years, toxicity testing has consisted of evaluating the adverse effects of drugs in animals to predict human health risks. The U.S. Environmental Protection Agency recognized the need to develop innovative toxicity testing strategies and asked the National Research Council to develop a long-range vision and strategy for toxicity testing in the 21st century. The vision aims to reduce the use of animals and drug development costs through the integration of computational modeling and in vitro experimental methods that evaluate the perturbation of toxicity-related pathways. Towards this vision, collaborative quantitative systems pharmacology and toxicology modeling endeavors (QSP/QST) have been initiated amongst numerous organizations worldwide. In this article, we discuss how quantitative structure-activity relationship (QSAR), network-based, and pharmacokinetic/pharmacodynamic modeling approaches can be integrated into the framework of QST models. Additionally, we review the application of QST models to predict cardiotoxicity and hepatotoxicity of drugs throughout their development. Cell and organ specific QST models are likely to become an essential component of modern toxicity testing, and provide a solid foundation towards determining individualized therapeutic windows to improve patient safety. PMID:29308440

  5. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  6. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  7. Evaluation of end-user satisfaction among employees participating in a web-based health risk assessment with tailored feedback.

    PubMed

    Vosbergen, Sandra; Laan, Eva K; Colkesen, Ersen B; Niessen, Maurice A J; Kraaijenhagen, Roderik A; Essink-Bot, Marie-Louise; Peek, Niels

    2012-10-30

    Web technology is increasingly being used to provide individuals with health risk assessments (HRAs) with tailored feedback. End-user satisfaction is an important determinant of the potential impact of HRAs, as this influences program attrition and adherence to behavioral advice. The aim of this study was to evaluate end-user satisfaction with a web-based HRA with tailored feedback applied in worksite settings, using mixed (quantitative and qualitative) methods. Employees of seven companies in the Netherlands participated in a commercial, web-based, HRA with tailored feedback. The HRA consisted of four components: 1) a health and lifestyle assessment questionnaire, 2) a biometric evaluation, 3) a laboratory evaluation, and 4) tailored feedback consisting of a personal health risk profile and lifestyle behavior advice communicated through a web portal. HRA respondents received an evaluation questionnaire after six weeks. Satisfaction with different parts of the HRA was measured on 5-point Likert scales. A free-text field provided the opportunity to make additional comments. In total, 2289 employees participated in the HRA program, of which 637 (27.8%) completed the evaluation questionnaire. Quantitative analysis showed that 85.6% of the respondents evaluated the overall HRA positively. The free-text field was filled in by 29.7% of the respondents (189 out of 637), who made 315 separate remarks. Qualitative evaluation of these data showed that these respondents made critical remarks. Respondents felt restricted by the answer categories of the health and lifestyle assessment questionnaire, which resulted in the feeling that the corresponding feedback could be inadequate. Some respondents perceived the personal risk profile as unnecessarily alarming or suggested providing more explanations, reference values, and a justification of the behavioral advice given. Respondents also requested the opportunity to discuss the feedback with a health professional. Most people

  8. Establishment of a Quantitative Medical Technology Evaluation System and Indicators within Medical Institutions.

    PubMed

    Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji

    2018-06-05

    The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective means of upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, promote the management of medical technologies and their quality, and ensure the medical safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of selected experts. The Delphi method was applied to identify the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build judgment matrices, from which the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) were obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits) and 13 secondary indicators were included in the evaluation system. Judgment matrices were built to compute the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and those of the secondary indicators were calculated accordingly. As the two-round questionnaire survey of experts and statistical analysis were performed and the credibility of the results was verified through the consistency evaluation test, the
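
    The consistency check described above can be sketched as follows; the 4×4 judgment matrix is hypothetical, chosen only to illustrate how λmax, CI, CR, and the weight coefficients are derived (a sketch, not the survey's actual data):

```python
import numpy as np

# Random consistency index (RI) values for matrix orders 1..10 (Saaty).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(A):
    """Return priority weights and consistency statistics for a
    pairwise comparison matrix A (n x n, positive, reciprocal)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # The principal eigenvector gives the weight coefficients.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    lam_max = eigvals.real[k]
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    CI = (lam_max - n) / (n - 1)            # consistency index
    CR = CI / RI[n] if RI[n] > 0 else 0.0   # random consistency ratio
    return w, lam_max, CI, CR

# Hypothetical judgment matrix over safety, effectiveness,
# innovativeness, and benefits (values for illustration only).
A = [[1,   2,   2,   3],
     [1/2, 1,   1,   3],
     [1/2, 1,   1,   3],
     [1/3, 1/3, 1/3, 1]]
w, lam_max, CI, CR = ahp_weights(A)
print(w.round(2), round(CR, 3))  # CR < 0.1 passes the consistency test
```

    A CR below 0.1 is the conventional threshold for accepting an expert's judgments; inconsistent matrices are returned to the expert for revision.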

  9. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative risk analysis for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict not only the marginal distributions of inflows but also their persistence via scenarios. This motivates us to analyze reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages: the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up that quantifies the entire flood risk as the ratio of the number of scenarios exceeding the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the actual operation and a scenario optimization, are evaluated for flood risk and hydropower profit. For the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most of the risk arises within the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts, with less bias, for reservoir operational purposes.
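
    The lead-time stage of the risk computation, counting the fraction of ensemble scenarios that exceed a critical level, might be sketched as follows (the ensemble here is synthetic; the levels, threshold, and array shapes are illustrative assumptions):

```python
import numpy as np

def lead_time_risk(scenarios, critical_level):
    """Risk within the forecast lead-time: the fraction of ensemble
    scenarios whose simulated reservoir water level ever exceeds the
    critical value.  `scenarios` has shape (n_scenarios, n_steps)."""
    scenarios = np.asarray(scenarios)
    failures = (scenarios > critical_level).any(axis=1)
    return failures.mean()

# Hypothetical ensemble of simulated water-level trajectories (m).
rng = np.random.default_rng(0)
levels = 170.0 + rng.normal(0, 2, size=(1000, 24)).cumsum(axis=1) * 0.1
risk = lead_time_risk(levels, critical_level=175.0)
print(f"lead-time flood risk: {risk:.3f}")
```

    The unpredicted-time stage would then route the design floods from the water level at the forecast horizon point, which is not reproduced in this sketch.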

  10. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

    The clinical implication of hepatitis B surface antigen (HBsAg) concentrations in HBV-infected individuals remains unclear. The aim of this study was to evaluate a novel fully automated chemiluminescence enzyme immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of reference serum samples against two independent commercial assays (Lumipulse f and Architect HBsAg QT). Furthermore, its clinical usefulness was assessed for monitoring serum HBsAg levels during antiviral therapy. A dilution test using 5 reference serum samples showed a linear correlation over the range from 0.03 to 2,360 IU/ml. HBsAg was measured in a total of 400 serum samples, and 99.8% had consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare the Architect and Sysmex assays, both methods were applied to quantify HBsAg in serum samples with different HBV genotypes/subgenotypes, as well as in sera containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed for the escape mutants and for the genotypes (A, B, C) common in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations preceded the alanine aminotransferase (ALT) elevation associated with the emergence of drug-resistant HBV variants (breakthrough hepatitis). In conclusion, the reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluating the response to lamivudine treatment and diagnosing breakthrough hepatitis.

  11. At-Risk Youth Appearance and Job Performance Evaluation

    ERIC Educational Resources Information Center

    Freeburg, Beth Winfrey; Workman, Jane E.

    2008-01-01

    The goal of this study was to identify the relationship of at-risk youth workplace appearance to other job performance criteria. Employers (n = 30; each employing from 1 to 17 youths) evaluated 178 at-risk high school youths who completed a paid summer employment experience. Appearance evaluations were significantly correlated with evaluations of…

  12. How is an Electronic Screening and Brief Intervention Tool on Alcohol Use Received in a Student Population? A Qualitative and Quantitative Evaluation

    PubMed Central

    Van Royen, Paul; Vriesacker, Bart; De Mey, Leen; Van Hal, Guido

    2012-01-01

    Background A previous study among Antwerp college and university students showed that more male (10.2%–11.1%) than female (1.8%–6.2%) students are at risk for problematic alcohol use. The current literature shows promising results in terms of feasibility and effectiveness for the use of brief electronic interventions to address this health problem in college and university students. We evaluated this type of intervention and cite existing literature on the topic. Objective To develop a website, www.eentjeteveel.be, to motivate college and university students with problematic alcohol use to reduce alcohol consumption and increase their willingness to seek help. Method The website contained a questionnaire (Alcohol Use Disorders Identification Test [AUDIT]) for students to test their alcohol use. According to their answers, the students immediately received personalized feedback (personal AUDIT score and additional information on risks associated with alcohol use) and a suggestion for further action. Afterward, students could send an email to a student counselor for questions, guidance, or advice. To obtain in-depth qualitative information on the opinions and experiences of students, we held 5 focus group discussions. The topics were publicity, experiences, impressions, and effects of the website. We analyzed the quantitative results of the online test in SPSS 15.0. Results More than 3500 students visited www.eentjeteveel.be; over half were men (55.0%). A total of 34 students participated in the focus group discussions. The mixture of quantitative and qualitative methods to evaluate the intervention allowed a thorough analysis and provided complementary results. The intervention was well received by the student population. However, some minor aspects should be reconsidered, such as website publicity and providing students with options that were added after intermediate evaluation. The intervention increased the motivation of students to think about their alcohol

  13. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... on the sources of L. monocytogenes contamination, the effects of individual manufacturing and/or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-1182] Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

  14. Human-Associated Fecal Quantitative Polymerase Chain Reaction Measurements and Simulated Risk of Gastrointestinal Illness in Recreational Waters Contaminated with Raw Sewage

    EPA Science Inventory

    We used quantitative microbial risk assessment (QMRA) to estimate the risk of gastrointestinal (GI) illness associated with swimming in recreational waters containing different concentrations of human-associated fecal qPCR markers from raw sewage, HF183 and HumM2. The volume/volu...

  15. A novel approach for evaluating the risk of health care failure modes.

    PubMed

    Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang

    2012-12-01

    Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and prioritizing actions for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the "slack-based measure" (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs to DEA. The practicality and usefulness of the proposed approach are illustrated with a health care case. As a systematic approach for improving the service quality of health care, it can offer quantitative corrective information on the risk indexes, thereby reducing failure possibility. For safety improvement, these new target values of the risk indexes can be used for management by objectives. FMEA alone cannot provide quantitative corrective information on risk indexes; the proposed approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, two goals, increased patient safety and reduced medical costs, can be achieved together.
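
    For context, the conventional FMEA ranking that the DEA-based approach refines is the risk priority number RPN = S × O × D; a minimal sketch with hypothetical failure modes (the paper's DEA/SBM step, which additionally yields target values for each index, is not reproduced here):

```python
# Traditional FMEA ranking via the risk priority number RPN = S * O * D,
# where S, O, D are the severity, occurrence, and detection indexes.
# The failure modes and scores below are hypothetical.
failure_modes = {
    "wrong-dose dispensing":     {"S": 8, "O": 4, "D": 6},
    "patient misidentification": {"S": 9, "O": 2, "D": 3},
    "delayed lab result":        {"S": 5, "O": 6, "D": 4},
}

def rpn(idx):
    """Risk priority number of one failure mode."""
    return idx["S"] * idx["O"] * idx["D"]

# Rank failure modes from highest to lowest risk.
ranking = sorted(failure_modes, key=lambda m: rpn(failure_modes[m]),
                 reverse=True)
for mode in ranking:
    print(mode, rpn(failure_modes[mode]))
```

    The DEA/SBM reformulation treats S, O, and D as inputs to an efficiency model instead of multiplying them, which is what lets it return quantitative corrective targets rather than a bare ranking.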

  16. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194
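
    Step (d), extracting the most highly exposed individuals from a large simulated population, can be illustrated as below; the dose distribution and toxicity reference value are invented for the sketch and are not the study's parameters:

```python
import numpy as np

# Minimal sketch of the exposure-simulation step: draw PAH doses for a
# large number of modeled sea otters, take the 99.9% quantile to find
# the most highly exposed individuals, and compare against a chronic
# toxicity reference value (TRV).  The lognormal parameters and TRV
# are illustrative only.
rng = np.random.default_rng(42)
n_otters = 500_000
doses = rng.lognormal(mean=-3.0, sigma=1.0, size=n_otters)  # mg/kg-day
q999 = np.quantile(doses, 0.999)
TRV = 5.0
print(f"99.9% quantile dose = {q999:.3f} mg/kg-day; "
      f"{'below' if q999 < TRV else 'above'} the TRV of {TRV}")
```

    If even the 99.9%-quantile individuals fall below the chronic toxicity reference value, as in the study's conclusion, no plausible individual-level or subpopulation-level effect is indicated.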

  17. Meeting the Needs of USGS's Science Application for Risk Reduction Group through Evaluation Research

    NASA Astrophysics Data System (ADS)

    Ritchie, L.; Campbell, N. M.; Vickery, J.; Madera, A.

    2016-12-01

    The U.S. Geological Survey's (USGS) Science Application for Risk Reduction (SAFRR) group aims to support innovative collaborations in hazard science by uniting a broad range of stakeholders to produce and disseminate knowledge in ways that are useful for decision-making in hazard mitigation, planning, and preparedness. Since 2013, an evaluation team at the Natural Hazards Center (NHC) has worked closely with the SAFRR group to assess these collaborations and communication efforts. In contributing to the nexus between academia and practice, or "pracademia," we use evaluation research to provide the USGS with useful feedback for crafting relevant information for practitioners and decision-makers. This presentation will highlight how the NHC team has varied our methodological and information design approaches according to the needs of each project, which in turn assist the SAFRR group in meeting the needs of practitioners and decision-makers. As the foci of our evaluation activities with SAFRR have evolved, so have our efforts to ensure that our work appropriately matches the information needs of each scenario project. We draw upon multiple projects, including evaluation work on the SAFRR Tsunami Scenario, "The First Sue Nami" tsunami awareness messaging, and their most recent project concerning a hypothetical M7 earthquake on the Hayward fault in the Bay Area (HayWired scenario). We have utilized various qualitative and quantitative methodologies—including telephone interviews, focus groups, online surveys, nonparticipant observation, and in-person survey distribution. The findings generated from this series of evaluations highlight the ways in which evaluation research can be used by researchers and academics to more appropriately address the needs of practitioners. Moreover, they contribute to knowledge enhancement surrounding disaster preparedness and risk communication, and, more generally, the limited body of knowledge about evaluation-focused disaster

  18. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy's Hanford Site

    NASA Astrophysics Data System (ADS)

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David

    2017-03-01

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining resource levels from 0 to 5. We also developed a risk rating scale from non-discernable to very high. Field assessment is the critical step to determine resource levels or to verify whether current conditions match those previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared with previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes, and other stakeholders. Achieving consistency across Department of Energy sites in the valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.

  19. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy's Hanford Site.

    PubMed

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David

    2017-03-01

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining resource levels from 0 to 5. We also developed a risk rating scale from non-discernable to very high. Field assessment is the critical step to determine resource levels or to verify whether current conditions match those previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared with previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes, and other stakeholders. Achieving consistency across Department of Energy sites in the valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.

  20. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology

    PubMed Central

    Zhang, Wen; Cao, Jieer

    2017-01-01

    How to evaluate drivers’ spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. First, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle, and contact position on a finite element (FE) computation platform. Second, a driver cabin model is extracted and described in a well-validated multi-rigid-body (MB) model to compute the value of a weighted injury criterion and thus quantitatively assess a driver’s overall injury under given circumstances. Furthermore, based on the coupling of FE and MB, parametric studies of various crash scenarios are conducted. It is revealed that the variation law of the WIC (Weighted Injury Criteria) value under high impact velocities is quite distinct from that under low impact velocities. In addition, the coupling effect is elucidated by the fact that the difference in WIC value among the three impact velocities tends to be distinctly higher under smaller impact angles than under larger ones. Meanwhile, high impact velocity also increases the sensitivity of WIC to collision position and impact angle. The results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant step towards collision avoidance for autonomous driving vehicles. PMID:29240789

  1. How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.

    PubMed

    Zhang, Wen; Cao, Jieer; Xu, Jun

    2017-01-01

    How to evaluate drivers' spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. First, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle, and contact position on a finite element (FE) computation platform. Second, a driver cabin model is extracted and described in a well-validated multi-rigid-body (MB) model to compute the value of a weighted injury criterion and thus quantitatively assess a driver's overall injury under given circumstances. Furthermore, based on the coupling of FE and MB, parametric studies of various crash scenarios are conducted. It is revealed that the variation law of the WIC (Weighted Injury Criteria) value under high impact velocities is quite distinct from that under low impact velocities. In addition, the coupling effect is elucidated by the fact that the difference in WIC value among the three impact velocities tends to be distinctly higher under smaller impact angles than under larger ones. Meanwhile, high impact velocity also increases the sensitivity of WIC to collision position and impact angle. The results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant step towards collision avoidance for autonomous driving vehicles.
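
    A weighted injury criterion of the kind described combines normalized injury metrics into one score; the metric names, limits, and weights below are illustrative placeholders, not the WIC coefficients used in the paper:

```python
def weighted_injury_criterion(metrics, weights, limits):
    """Combine injury metrics, each normalized by its tolerance limit,
    into a single weighted score.  Higher means more severe injury."""
    return sum(weights[k] * metrics[k] / limits[k] for k in metrics)

# Hypothetical dummy readings from one simulated crash scenario.
metrics = {"HIC": 650.0, "chest_deflection_mm": 38.0, "femur_load_kN": 5.2}
# Illustrative tolerance limits and weights (assumptions, not the paper's).
limits  = {"HIC": 1000.0, "chest_deflection_mm": 63.0, "femur_load_kN": 10.0}
weights = {"HIC": 0.6, "chest_deflection_mm": 0.3, "femur_load_kN": 0.1}

wic = weighted_injury_criterion(metrics, weights, limits)
print(round(wic, 3))
```

    Sweeping such a score over impact velocity, impact angle, and contact position is what produces the parametric maps the paper analyzes.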

  2. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.

  3. Evaluation of HDL-modulating interventions for cardiovascular risk reduction using a systems pharmacology approach

    PubMed Central

    Gadkar, Kapil; Lu, James; Sahasranaman, Srikumar; Davis, John; Mazer, Norman A.; Ramanujan, Saroja

    2016-01-01

    The recent failures of cholesteryl ester transport protein inhibitor drugs to decrease CVD risk, despite raising HDL cholesterol (HDL-C) levels, suggest that pharmacologic increases in HDL-C may not always reflect elevations in reverse cholesterol transport (RCT), the process by which HDL is believed to exert its beneficial effects. HDL-modulating therapies can affect HDL properties beyond total HDL-C, including particle numbers, size, and composition, and may contribute differently to RCT and CVD risk. The lack of validated easily measurable pharmacodynamic markers to link drug effects to RCT, and ultimately to CVD risk, complicates target and compound selection and evaluation. In this work, we use a systems pharmacology model to contextualize the roles of different HDL targets in cholesterol metabolism and provide quantitative links between HDL-related measurements and the associated changes in RCT rate to support target and compound evaluation in drug development. By quantifying the amount of cholesterol removed from the periphery over the short-term, our simulations show the potential for infused HDL to treat acute CVD. For the primary prevention of CVD, our analysis suggests that the induction of ApoA-I synthesis may be a more viable approach, due to the long-term increase in RCT rate. PMID:26522778

  4. [Reconsidering evaluation criteria regarding health care research: toward an integrative framework of quantitative and qualitative criteria].

    PubMed

    Miyata, Hiroaki; Kai, Ichiro

    2006-05-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confused, and the clutter of terms and arguments has rendered the concepts obscure and unrecognizable. It is therefore very important to reconsider evaluation criteria regarding rigor in social science. As Lincoln & Guba have already compared the quantitative paradigms (validity, reliability, neutrality, generalizability) with the qualitative ones (credibility, dependability, confirmability, transferability), we discuss the use of evaluation criteria from a pragmatic perspective. Validity/credibility concerns the observational framework, while reliability/dependability refers to the range of stability in observations, neutrality/confirmability reflects influences between observers and subjects, and generalizability/transferability captures an epistemological difference in how findings are applied. Qualitative studies, however, do not always choose the qualitative paradigms. If we can assume stability to some extent, it is better to use the quantitative paradigm (reliability). Moreover, as a quantitative study cannot always guarantee a perfect observational framework with stability in all phases of observation, it is useful to use qualitative paradigms to enhance the rigor of the study.

  5. Quantitative light-induced fluorescence technology for quantitative evaluation of tooth wear

    NASA Astrophysics Data System (ADS)

    Kim, Sang-Kyeom; Lee, Hyung-Suk; Park, Seok-Woo; Lee, Eun-Song; de Josselin de Jong, Elbert; Jung, Hoi-In; Kim, Baek-Il

    2017-12-01

    Various technologies to objectively determine enamel thickness or dentin exposure have been suggested, but most have clinical limitations. This study was conducted to confirm the potential of quantitative light-induced fluorescence (QLF), using the autofluorescence intensity of the occlusal surfaces of worn teeth according to enamel grinding depth in vitro. Sixteen permanent premolars were used. Each tooth was gradationally ground down at the occlusal surface in the apical direction. QLF-digital and swept-source optical coherence tomography images were acquired at each grinding depth (in steps of 100 μm). All QLF images were converted to 8-bit grayscale images to calculate the fluorescence intensity. The maximum brightness (MB) values of the same sound regions in the grayscale images were calculated before grinding and at each step of the grinding process, and the change in MB (ΔMB) was derived. Finally, 13 samples were evaluated. ΔMB increased with grinding depth, showing a strong correlation (r = 0.994, P < 0.001). In conclusion, the fluorescence intensity of the teeth and the grinding depth were strongly correlated in the QLF images. Therefore, QLF technology may be a useful noninvasive tool to monitor the progression of tooth wear and conveniently estimate enamel thickness.
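
    The reported depth-versus-brightness relationship is a simple Pearson correlation between grinding depth and the change in maximum brightness; a sketch with synthetic values (not the study's measurements):

```python
import numpy as np

# Synthetic illustration: grinding depths in 100 μm steps and made-up
# changes in maximum brightness (ΔMB) that rise roughly linearly with
# depth plus small noise.  Values are not the study's data.
depth = np.arange(0, 800, 100)                               # μm
delta_mb = 0.05 * depth + np.array([0, 2, -1, 3, 1, -2, 2, 0])
r = np.corrcoef(depth, delta_mb)[0, 1]                       # Pearson r
print(f"r = {r:.3f}")
```

    A correlation this strong is what makes the fluorescence intensity usable as a proxy for remaining enamel thickness.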

  6. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature `correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
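
    A GLCM-derived "correlation" feature of the kind used here can be sketched as follows, for a single horizontal pixel offset on a quantized 2-D image (a minimal illustration, not the study's MDCT pipeline):

```python
import numpy as np

def glcm_correlation(img, levels=8):
    """GLCM 'correlation' texture feature for the horizontal neighbor
    offset (0, 1), using a symmetric, normalized co-occurrence matrix."""
    img = np.asarray(img, dtype=float)
    # Quantize intensities into `levels` gray levels.
    q = np.floor(levels * (img - img.min()) / (np.ptp(img) + 1e-12))
    q = np.clip(q.astype(int), 0, levels - 1)
    # Count co-occurrences of horizontally adjacent gray-level pairs.
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1.0)
    glcm += glcm.T                      # symmetrize
    p = glcm / glcm.sum()               # normalize to probabilities
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt((((i - mu_i) ** 2) * p).sum())
    sd_j = np.sqrt((((j - mu_j) ** 2) * p).sum())
    return float((((i - mu_i) * (j - mu_j)) * p).sum()
                 / (sd_i * sd_j + 1e-12))

# Smooth structure yields high correlation; pure noise yields ~0.
smooth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
noisy = np.random.default_rng(1).random((32, 32))
print(glcm_correlation(smooth), glcm_correlation(noisy))
```

    In the study, this feature was computed within the trabecular ROI and fed, alongside mean BMD, into the GRBF network to predict failure load.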

  7. Evaluation of HDL-modulating interventions for cardiovascular risk reduction using a systems pharmacology approach.

    PubMed

    Gadkar, Kapil; Lu, James; Sahasranaman, Srikumar; Davis, John; Mazer, Norman A; Ramanujan, Saroja

    2016-01-01

    The recent failures of cholesteryl ester transport protein inhibitor drugs to decrease CVD risk, despite raising HDL cholesterol (HDL-C) levels, suggest that pharmacologic increases in HDL-C may not always reflect elevations in reverse cholesterol transport (RCT), the process by which HDL is believed to exert its beneficial effects. HDL-modulating therapies can affect HDL properties beyond total HDL-C, including particle numbers, size, and composition, and may contribute differently to RCT and CVD risk. The lack of validated easily measurable pharmacodynamic markers to link drug effects to RCT, and ultimately to CVD risk, complicates target and compound selection and evaluation. In this work, we use a systems pharmacology model to contextualize the roles of different HDL targets in cholesterol metabolism and provide quantitative links between HDL-related measurements and the associated changes in RCT rate to support target and compound evaluation in drug development. By quantifying the amount of cholesterol removed from the periphery over the short-term, our simulations show the potential for infused HDL to treat acute CVD. For the primary prevention of CVD, our analysis suggests that the induction of ApoA-I synthesis may be a more viable approach, due to the long-term increase in RCT rate. Copyright © 2016 by the American Society for Biochemistry and Molecular Biology, Inc.

  8. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue in preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases such as chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.

  9. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  10. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    PubMed

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life.
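
    The agreement statistics above are Spearman rank correlations; a minimal pure-Python sketch of rs between a clinical rating and an instrument reading follows. The paired observations are hypothetical, not study data.

```python
def rank(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of the tied positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rs = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical pairs: clinical postural-tremor score vs. tremor-pen intensity
etrs = [0, 1, 1, 2, 2, 3, 3, 4]
pen = [0.1, 0.3, 0.2, 0.5, 0.9, 1.1, 0.8, 1.6]
rs = spearman(etrs, pen)
```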

  11. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution applying FRAM in the environmental context, and moreover considers a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
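
    The semi-quantitative extension can be pictured as follows: each FRAM function's output variability is sampled from a discrete distribution, and Monte Carlo trials estimate how often variability resonates into a critical downstream output. The functions, scores and probabilities below are illustrative assumptions, not the sinter-plant model.

```python
import random

# Output variability score: 1 = acceptable, 2 = degraded, 3 = unacceptable,
# with assumed probabilities for three upstream functions of a sinter plant.
UPSTREAM = {
    "sinter_feed_control": [(1, 0.70), (2, 0.25), (3, 0.05)],
    "emission_monitoring": [(1, 0.80), (2, 0.15), (3, 0.05)],
    "filter_maintenance":  [(1, 0.60), (2, 0.30), (3, 0.10)],
}

def sample(dist, rng):
    """Draw a variability score from a discrete (score, probability) list."""
    r, acc = rng.random(), 0.0
    for score, p in dist:
        acc += p
        if r < acc:
            return score
    return dist[-1][0]

def p_critical(trials=20000, seed=1):
    """Monte Carlo estimate of the chance the downstream 'environmental
    release' function becomes critical: any unacceptable upstream output,
    or all upstream outputs simultaneously degraded (assumed rule)."""
    rng = random.Random(seed)
    critical = 0
    for _ in range(trials):
        scores = [sample(d, rng) for d in UPSTREAM.values()]
        if max(scores) == 3 or min(scores) >= 2:
            critical += 1
    return critical / trials

estimate = p_critical()
```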

  12. Quantitative evaluation of orbital hybridization in carbon nanotubes under radial deformation using π-orbital axis vector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohnishi, Masato, E-mail: masato.ohnishi@rift.mech.tohoku.ac.jp; Suzuki, Ken; Miura, Hideo, E-mail: hmiura@rift.mech.tohoku.ac.jp

    2015-04-15

    When a radial strain is applied to a carbon nanotube (CNT), the increase in local curvature induces orbital hybridization. The effect of the curvature-induced orbital hybridization on the electronic properties of CNTs, however, has not been evaluated quantitatively. In this study, the strength of orbital hybridization in CNTs under homogeneous radial strain was evaluated quantitatively. Our analyses revealed the detailed procedure of the change in electronic structure of CNTs. In addition, the dihedral angle, the angle between π-orbital axis vectors of adjacent atoms, was found to effectively predict the strength of local orbital hybridization in deformed CNTs.
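
    The π-orbital axis vector of an sp²-bonded carbon atom points roughly along the local surface normal, so a radially deformed cross-section can be sketched as an ellipse and the angle between neighboring normals computed directly: flattening sharpens the local curvature and widens that angle. The ellipse geometry and atomic spacing below are illustrative, not the paper's values.

```python
import math

def unit_normal(a, b, t):
    """Outward unit normal of the ellipse x = a*cos(t), y = b*sin(t)."""
    nx, ny = math.cos(t) / a, math.sin(t) / b
    n = math.hypot(nx, ny)
    return (nx / n, ny / n)

def normal_angle_deg(a, b, t1, t2):
    """Angle between the normals ('pi-orbital axis vectors') at t1 and t2."""
    n1, n2 = unit_normal(a, b, t1), unit_normal(a, b, t2)
    dot = max(-1.0, min(1.0, n1[0] * n2[0] + n1[1] * n2[1]))
    return math.degrees(math.acos(dot))

dt = 0.1  # parameter spacing between adjacent atoms (illustrative)
undeformed = normal_angle_deg(1.0, 1.0, 0.0, dt)  # circular cross-section
flattened = normal_angle_deg(1.5, 0.5, 0.0, dt)   # high-curvature end of ellipse
```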

  13. Feasibility of visual aids for risk evaluation by hospitalized patients with coronary artery disease: results from face-to-face interviews.

    PubMed

    Magliano, Carlos Alberto da Silva; Monteiro, Andrea Liborio; Tura, Bernardo Rangel; Oliveira, Claudia Silvia Rocha; Rebelo, Amanda Rebeca de Oliveira; Pereira, Claudia Cristina de Aguiar

    2018-01-01

    Communicating information about risk and probability to patients is considered a difficult task. In this study, we aim to evaluate the use of visual aids representing perioperative mortality and long-term survival in the communication process for patients diagnosed with coronary artery disease at the National Institute of Cardiology, a Brazilian public hospital specializing in cardiology. One-on-one interviews were conducted between August 1 and November 20, 2017. Patients were asked to imagine that their doctor was seeking their input in the decision regarding which treatment represented the best option for them. Patients were required to choose between alternatives by considering only the different benefits and risks shown in each scenario, described as the proportion of patients who had died during the perioperative period and within 5 years. Each participant evaluated the same eight scenarios. We evaluated their answers in a qualitative and quantitative analysis. The main findings were that all patients verbally expressed concern about perioperative mortality and that 25% did not express concern about long-term mortality. Twelve percent considered the probabilities irrelevant on the grounds that their prognosis would depend on "God's will." Ten percent of the patients disregarded the reported likelihood of perioperative mortality, deciding to focus solely on the "chance of being cured." In the quantitative analysis, the vast majority of respondents chose the "correct" alternatives, meaning that they made consistent and rational choices. The use of visual aids to present risk attributes appeared feasible in our sample. The impact of heuristics and religious beliefs on shared health decision making needs to be explored better in future studies.

  14. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
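
    The detection-rate criterion can be sketched as a tolerance-based pairing of reported localizations with ground-truth emitter positions. Coordinates and the tolerance radius below are invented for illustration; the actual benchmark uses its own matching rules.

```python
def match(truth, found, tol):
    """Greedy nearest-pair matching within `tol`; returns true positives."""
    pairs = sorted(
        (((tx - fx) ** 2 + (ty - fy) ** 2) ** 0.5, i, j)
        for i, (tx, ty) in enumerate(truth)
        for j, (fx, fy) in enumerate(found)
    )
    used_t, used_f, tp = set(), set(), 0
    for d, i, j in pairs:
        if d > tol:
            break                      # pairs are sorted, so we can stop
        if i not in used_t and j not in used_f:
            used_t.add(i); used_f.add(j); tp += 1
    return tp

# Hypothetical emitter positions and reported localizations, in nm
truth = [(100, 100), (150, 120), (300, 310), (400, 420)]
found = [(104, 98), (149, 125), (305, 300), (600, 600)]
tp = match(truth, found, tol=20)
recall = tp / len(truth)               # detection rate
precision = tp / len(found)
jaccard = tp / (len(truth) + len(found) - tp)
```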

  15. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  16. Risk evaluation of available phosphorus loss in agricultural land based on remote sensing and GIS

    NASA Astrophysics Data System (ADS)

    Ding, Xiaodong; Zhou, Bin; Xu, Junfeng; Liu, Ting; Xie, Bin

    2010-09-01

    A surplus of phosphorus leads to water eutrophication. The huge input of fertilizers in agricultural activities enriches nutrients in soil. The superfluous nutrient moves easily to riparian water via rainfall and surface runoff, leads to eutrophication of riparian wetlands and downstream water, and consequently affects the ecological balance. It is therefore important to investigate the risk of phosphorus loss in agricultural land, to identify high-concentration areas and to guide the management of nutrient loss. This study was implemented mainly in an area of agricultural use in southern Western Australia, where three years of preliminary water-quality monitoring showed that the concentrations of different forms of phosphorus in water had far exceeded the standard. Because of the large-scale surface runoff caused by occasional storms in Western Australia, soil erosion was selected as the main driving factor for the loss of phosphorus. Remote sensing and ground truth data were used to reflect the seasonal changes of plants. The spatial distribution of available phosphorus was then predicted and combined with an evaluation matrix to evaluate the risk of phosphorus loss. The evaluation was based on quantitative rather than qualitative data to achieve better precision. It can provide decision support for monitoring the water quality of rivers and riparian wetlands.
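
    The matrix-based rating step can be sketched as crossing an erosion class (the chosen driving factor) with an available-phosphorus class. The class breaks and matrix entries below are illustrative assumptions, not the study's calibration.

```python
EROSION_BREAKS = [5.0, 20.0]   # t/ha/yr -> low / moderate / high (assumed)
P_BREAKS = [15.0, 40.0]        # mg/kg available P -> low / moderate / high (assumed)

RISK_MATRIX = [                # rows: erosion class, cols: phosphorus class
    ["low",      "low",      "moderate"],
    ["low",      "moderate", "high"],
    ["moderate", "high",     "high"],
]

def classify(value, breaks):
    """Index of the class a value falls into, given ascending break points."""
    for k, b in enumerate(breaks):
        if value < b:
            return k
    return len(breaks)

def phosphorus_loss_risk(erosion, available_p):
    return RISK_MATRIX[classify(erosion, EROSION_BREAKS)][classify(available_p, P_BREAKS)]

# Example parcels: (soil erosion t/ha/yr, available P mg/kg)
ratings = [phosphorus_loss_risk(e, p) for e, p in [(2.0, 10.0), (12.0, 50.0), (30.0, 45.0)]]
```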

  17. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
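
    One concrete dose-response step argued for above is benchmark-dose estimation: given a fitted model for extra risk, solve for the dose producing a target extra risk (here 10%). The Hill-type model and its parameters are illustrative, not a regulatory fit.

```python
def extra_risk(dose, emax=0.8, ed50=25.0, n=1.5):
    """Hypothetical Hill model for extra risk over background."""
    return emax * dose ** n / (ed50 ** n + dose ** n)

def bmd(target=0.10, lo=0.0, hi=1000.0, tol=1e-8):
    """Bisection on the monotone dose-response curve: the dose whose
    extra risk equals `target` (the benchmark response)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if extra_risk(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

bmd10 = bmd(0.10)   # benchmark dose at 10% extra risk
```

    In regulatory practice the point of departure is usually the lower confidence bound on this dose (the BMDL), which requires the fit's uncertainty and is omitted from this sketch.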

  18. A risk evaluation model and its application in online retailing trustfulness

    NASA Astrophysics Data System (ADS)

    Ye, Ruyi; Xu, Yingcheng

    2017-08-01

    Building a general model for risk evaluation in advance can improve the convenience, consistency and comparability of results when risk evaluations are repeated in the same area and for a similar purpose. One of the most convenient and common forms of risk evaluation model is an index system comprising several indices, corresponding weights and a crediting method. This article proposes a method for building a risk evaluation index system that guarantees a proportional relationship between the resulting credit and the expected risk loss, and provides an application example in online retailing.
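
    The proportionality property described above can be sketched directly: if each index weight is its share of the total worst-case loss, the weighted credit stays proportional to the expected risk loss by construction. Index names, weights and loss values are illustrative for an online-retailing setting.

```python
# Per-index expected loss (currency units) at the worst observed level (assumed).
MAX_LOSS = {"delivery_failure": 400.0, "payment_dispute": 250.0, "data_breach": 1000.0}

TOTAL = sum(MAX_LOSS.values())
WEIGHTS = {k: v / TOTAL for k, v in MAX_LOSS.items()}   # each index's loss share

def expected_loss(levels):
    """levels: per-index risk level in [0, 1]; expected loss in currency units."""
    return sum(MAX_LOSS[k] * levels[k] for k in MAX_LOSS)

def credit(levels):
    """Weighted index score in [0, 1]; proportional to expected loss."""
    return sum(WEIGHTS[k] * levels[k] for k in MAX_LOSS)

levels = {"delivery_failure": 0.2, "payment_dispute": 0.5, "data_breach": 0.1}
loss, score = expected_loss(levels), credit(levels)
# By construction: loss == TOTAL * score
```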

  19. Analytical insight into "breathing" crack-induced acoustic nonlinearity with an application to quantitative evaluation of contact cracks.

    PubMed

    Wang, Kai; Liu, Menglong; Su, Zhongqing; Yuan, Shenfang; Fan, Zheng

    2018-08-01

    To characterize fatigue cracks, in the undersized stage in particular, preferably in a quantitative and precise manner, a two-dimensional (2D) analytical model is developed for interpreting the modulation mechanism of a "breathing" crack on guided ultrasonic waves (GUWs). In conjunction with a modal decomposition method and a variational principle-based algorithm, the model is capable of analytically depicting the propagating and evanescent waves induced owing to the interaction of probing GUWs with a "breathing" crack, and further extracting linear and nonlinear wave features (e.g., reflection, transmission, mode conversion and contact acoustic nonlinearity (CAN)). With the model, a quantitative correlation between CAN embodied in acquired GUWs and crack parameters (e.g., location and severity) is obtained, whereby a set of damage indices is proposed via which the severity of the crack can be evaluated quantitatively. The evaluation, in principle, does not entail a benchmarking process against baseline signals. As validation, the results obtained from the analytical model are compared with those from finite element simulation, showing good consistency. This has demonstrated accuracy of the developed analytical model in interpreting contact crack-induced CAN, and spotlighted its application to quantitative evaluation of fatigue damage. Copyright © 2018 Elsevier B.V. All rights reserved.
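
    The appearance of CAN can be illustrated with a much simpler surrogate than the paper's modal-decomposition model: a bilinear "breathing" contact that transmits tension and compression differently generates higher harmonics, and a damage index can be taken as the second-harmonic to fundamental amplitude ratio. The bilinear model and index definition are illustrative assumptions.

```python
import math

def dft_amp(x, k):
    """Amplitude of the k-th harmonic of a length-N sampled signal."""
    n = len(x)
    re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
    im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
    return 2 * math.hypot(re, im) / n

def breathing_response(alpha, n=512):
    """Pure tone through a bilinear 'breathing' contact: tension passes
    unchanged, compression is scaled by (1 + alpha)."""
    out = []
    for i in range(n):
        u = math.sin(2 * math.pi * i / n)
        out.append(u if u >= 0 else (1 + alpha) * u)
    return out

def damage_index(alpha):
    y = breathing_response(alpha)
    return dft_amp(y, 2) / dft_amp(y, 1)   # second harmonic / fundamental

pristine = damage_index(0.0)   # linear contact -> no second harmonic
cracked = damage_index(0.4)    # nonlinearity generates a second harmonic
```

    The index grows monotonically with the contact nonlinearity, which is the behavior a CAN-based severity estimate relies on.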

  20. Integrating quantitative and qualitative evaluation methods to compare two teacher inservice training programs

    NASA Astrophysics Data System (ADS)

    Lawrenz, Frances; McCreath, Heather

    Qualitative and quantitative evaluation procedures were used to compare two physical-science teacher inservice training programs. The two programs followed the master teacher training model espoused by NSF but used different types of master teachers and types of activities. The two evaluation procedures produced different results and together they provided a much clearer picture of the strengths and weaknesses of the two programs. Using only one approach or the other would have substantially altered the conclusions.

  1. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy’s Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impacts on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining different resource levels from 0 to 5. We also developed a risk rating scale from nondiscernible to very high. Field assessment is the critical step to determine resource levels or to determine whether current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy sites for the valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
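
    The two scales described above (resource levels 0 to 5, risk ratings from nondiscernible to very high) can be combined in a simple lookup rule; the combination rule below is an illustrative assumption, not the published methodology.

```python
RATINGS = ["nondiscernible", "low", "moderate", "high", "very high"]

def risk_rating(resource_level, disturbance_fraction):
    """resource_level: 0-5; disturbance_fraction: 0.0-1.0 of habitat affected.
    Assumed rule: scale the product of resource value and disturbance
    onto the five-step rating scale."""
    if resource_level == 0 or disturbance_fraction == 0.0:
        return RATINGS[0]
    score = resource_level / 5 * disturbance_fraction   # normalized 0..1
    return RATINGS[min(4, 1 + int(score * 4))]

high = risk_rating(5, 0.9)   # valuable resource, heavily disturbed
low = risk_rating(1, 0.1)    # low-value resource, lightly disturbed
```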

  2. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current research topic worldwide. In this study the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 was used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC-based method, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  3. Quantitative structure-activity relationships for predicting potential ecological hazard of organic chemicals for use in regulatory risk assessments.

    PubMed

    Comber, Mike H I; Walker, John D; Watts, Chris; Hermens, Joop

    2003-08-01

    The use of quantitative structure-activity relationships (QSARs) for deriving the predicted no-effect concentration of discrete organic chemicals for the purposes of conducting a regulatory risk assessment in Europe and the United States is described. In the United States, under the Toxic Substances Control Act (TSCA), the TSCA Interagency Testing Committee and the U.S. Environmental Protection Agency (U.S. EPA) use SARs to estimate the hazards of existing and new chemicals. Within the Existing Substances Regulation in Europe, QSARs may be used for data evaluation, test strategy indications, and the identification and filling of data gaps. To illustrate where and when QSARs may be useful and when their use is more problematic, an example, methyl tertiary-butyl ether (MTBE), is given and the predicted and experimental data are compared. Improvements needed for new QSARs and tools for developing and using QSARs are discussed.
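
    A narcosis-type QSAR of the kind discussed can be sketched as a linear relation between log Kow and acute toxicity, with a predicted no-effect concentration (PNEC) derived via an assessment factor. The regression coefficients and assessment factor below are illustrative, not a regulatory model; MTBE's log Kow (~0.94) and molar mass (~88.15 g/mol) are approximate literature values.

```python
def log_lc50(log_kow, a=-0.85, b=1.72):
    """Hypothetical linear QSAR: log10 LC50 [mmol/L] = a*logKow + b."""
    return a * log_kow + b

def pnec_mg_per_l(log_kow, mol_weight, assessment_factor=1000.0):
    """PNEC from the QSAR-estimated LC50, via an assessment factor."""
    lc50_mg_per_l = (10 ** log_lc50(log_kow)) * mol_weight
    return lc50_mg_per_l / assessment_factor

# MTBE, the abstract's example compound
pnec = pnec_mg_per_l(0.94, 88.15)
```

    The abstract's caution applies here: for polar, reactive or specifically acting chemicals, a generic relation like this one can be badly wrong, which is why predicted and experimental values are compared.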

  4. Quantitative Evaluation Method of Each Generation Margin for Power System Planning

    NASA Astrophysics Data System (ADS)

    Su, Su; Tanaka, Kazuyuki

    As power system deregulation advances, competition among power companies intensifies, and they seek more efficient system planning using existing facilities. An efficient system planning method has therefore been expected. This paper proposes a quantitative evaluation method for the (N-1) generation margin considering the overload and voltage stability restrictions. For the generation margin related to the overload, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. Regarding voltage stability, this paper proposes an efficient method to search for the stability limit. The IEEE30 model system, composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to the overload under the (N-1) condition and quantify its value.
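
    Stripped of the power-flow details, the (N-1) generation-margin idea can be sketched as a worst-case check over single-generator outages. The real method also screens line overloads and a voltage stability limit, which are omitted here; capacities and load are illustrative.

```python
CAPACITY = {"G1": 120.0, "G2": 80.0, "G3": 60.0, "G4": 100.0}   # MW (assumed)
LOAD = 230.0                                                    # MW (assumed)

def n_minus_1_margin(capacity, load):
    """For each single-generator outage, remaining capacity minus load;
    the system margin is the worst contingency."""
    total = sum(capacity.values())
    margins = {g: (total - c) - load for g, c in capacity.items()}
    worst = min(margins, key=margins.get)
    return margins, worst

margins, worst = n_minus_1_margin(CAPACITY, LOAD)
```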

  5. Evaluation of the "Respect Not Risk" firearm safety lesson for 3rd-graders.

    PubMed

    Liller, Karen D; Perrin, Karen; Nearns, Jodi; Pesce, Karen; Crane, Nancy B; Gonzalez, Robin R

    2003-12-01

    The purpose of this study was to evaluate the MORE HEALTH "Respect Not Risk" Firearm Safety Lesson for 3rd-graders in Pinellas County, Florida. Six schools representative of various socioeconomic levels were selected as the test sites. Qualitative and quantitative data were collected. A total of 433 matched pretests/posttests were used to determine the effectiveness of the class in increasing student knowledge about firearm safety. The results revealed a significant increase in the mean scores on the posttest compared with the pretest. Qualitative findings showed the lesson was positively received by both students and teachers, and 65% of responding students reported discussing the lesson with family members. School nurses are encouraged to take a leading role in promoting firearm injury prevention to students.
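
    The pretest/posttest comparison above is a matched-pairs design; a paired t statistic on per-student gains can be sketched as follows, with hypothetical scores.

```python
import math

def paired_t(pre, post):
    """t statistic for matched pretest/posttest score gains."""
    gains = [b - a for a, b in zip(pre, post)]
    n = len(gains)
    mean = sum(gains) / n
    var = sum((g - mean) ** 2 for g in gains) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

pre = [5, 6, 4, 7, 5, 6, 5, 4]     # hypothetical pretest scores
post = [8, 7, 6, 9, 7, 8, 6, 6]    # hypothetical posttest scores
t = paired_t(pre, post)            # large positive t -> scores increased
```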

  6. Analyzing the impacts of global trade and investment on non-communicable diseases and risk factors: a critical review of methodological approaches used in quantitative analyses.

    PubMed

    Cowling, Krycia; Thow, Anne Marie; Pollack Porter, Keshia

    2018-05-24

    A key mechanism through which globalization has impacted health is the liberalization of trade and investment, yet relatively few studies to date have used quantitative methods to investigate the impacts of global trade and investment policies on non-communicable diseases and risk factors. Recent reviews of this literature have found heterogeneity in results and a range of quality across studies, which may be in part attributable to a lack of conceptual clarity and methodological inconsistencies. This study is a critical review of methodological approaches used in the quantitative literature on global trade and investment and diet, tobacco, alcohol, and related health outcomes, with the objective of developing recommendations and providing resources to guide future robust, policy relevant research. A review of reviews, expert review, and reference tracing were employed to identify relevant studies, which were evaluated using a novel quality assessment tool designed for this research. Eight review articles and 34 quantitative studies were identified for inclusion. Important ways to improve this literature were identified and discussed: clearly defining exposures of interest and not conflating trade and investment; exploring mechanisms of broader relationships; increasing the use of individual-level data; ensuring consensus and consistency in key confounding variables; utilizing more sector-specific versus economy-wide trade and investment indicators; testing and adequately adjusting for autocorrelation and endogeneity when using longitudinal data; and presenting results from alternative statistical models and sensitivity analyses. To guide the development of future analyses, recommendations for international data sources for selected trade and investment indicators, as well as key gaps in the literature, are presented. 
More methodologically rigorous and consistent approaches in future quantitative studies on the impacts of global trade and investment policies on non
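
    One of the recommendations above, testing for autocorrelation in longitudinal data, is often screened with the Durbin-Watson statistic on regression residuals (values near 2 suggest no first-order autocorrelation); a minimal sketch:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic: ~2 means no first-order autocorrelation,
    below 2 positive autocorrelation, above 2 negative autocorrelation."""
    num = sum((residuals[i] - residuals[i - 1]) ** 2
              for i in range(1, len(residuals)))
    den = sum(r * r for r in residuals)
    return num / den

dw_negative = durbin_watson([1, -1, 1, -1])         # alternating residuals
dw_positive = durbin_watson([1, 1, 1, -1, -1, -1])  # residuals in runs
```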

  7. Communicating quantitative risks and benefits in promotional prescription drug labeling or print advertising.

    PubMed

    West, Suzanne L; Squiers, Linda B; McCormack, Lauren; Southwell, Brian G; Brouwer, Emily S; Ashok, Mahima; Lux, Linda; Boudewyns, Vanessa; O'Donoghue, Amie; Sullivan, Helen W

    2013-05-01

    Under the Food, Drug, and Cosmetic Act, all promotional materials for prescription drugs must strike a fair balance in presentation of risks and benefits. How to best present this information is not clear. We sought to determine if the presentation of quantitative risk and benefit information in drug advertising and labeling influences consumers', patients', and clinicians' information processing, knowledge, and behavior by assessing available empirical evidence. We used PubMed for a literature search, limiting to articles published in English from 1990 forward. Two reviewers independently reviewed the titles and abstracts for inclusion, after which we reviewed the full texts to determine if they communicated risk/benefit information either: (i) numerically (e.g., percent) versus non-numerically (e.g., using text such as "increased risk") or (ii) numerically using different formats (e.g., "25% of patients", "one in four patients", or use of pictographs). We abstracted information from included articles into standardized evidence tables. The research team identified a total of 674 relevant publications, of which 52 met our inclusion criteria. Of these, 37 focused on drugs. Presenting numeric information appears to improve understanding of risks and benefits relative to non-numeric presentation; presenting both numeric and non-numeric information when possible may be best practice. No single specific format or graphical approach emerged as consistently superior. Numeracy and health literacy also deserve more empirical attention as moderators. Copyright © 2013 John Wiley & Sons, Ltd.
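
    One of the formats studied, the pictograph or icon array (e.g., "one in four patients"), can be sketched as a text grid of affected versus unaffected patients out of 100:

```python
def icon_array(affected, total=100, per_row=10, hit="X", miss="."):
    """Text icon array: `affected` of `total` patients marked, row by row."""
    cells = [hit] * affected + [miss] * (total - affected)
    return "\n".join("".join(cells[i:i + per_row])
                     for i in range(0, total, per_row))

art = icon_array(25)   # "one in four patients" rendered as 25 out of 100
```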

  8. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    PubMed

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in recent 10 years in China about quantitative analysis of syndrome differentiation for diabetes mellitus (DM) in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, such as clinical trial designs, diagnosis criteria for DM, standards of syndrome differentiation of DM, case inclusive and exclusive criteria, sample size and estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports concerning quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  9. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680
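
    The stochastic individual-based dose model can be sketched as repeated draws of prey intake, prey PAH concentration and assimilation efficiency, with the worst simulated dose compared against a chronic toxicity reference value (TRV). All distributions and parameter values below are illustrative assumptions, not the study's inputs.

```python
import random

TRV = 50.0          # chronic TRV, ug PAH / kg body weight / day (assumed)
BODY_WEIGHT = 0.6   # kg, assumed seaduck body weight

def simulate_doses(n=20000, seed=7):
    """Draw assimilated daily PAH doses (ug/kg bw/day) for n simulated ducks."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        intake = max(0.0, rng.gauss(0.12, 0.02))   # kg prey eaten per day
        conc = rng.lognormvariate(-7.0, 1.0)       # ug PAH per g prey
        assimilation = rng.uniform(0.5, 0.9)       # fraction of PAH absorbed
        doses.append(intake * 1000 * conc * assimilation / BODY_WEIGHT)
    return doses

doses = simulate_doses()
max_dose = max(doses)
margin = TRV / max_dose   # how far the maximum-exposed duck sits below the TRV
```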

  10. Prospective evaluation of risk of vertebral fractures using quantitative ultrasound measurements and bone mineral density in a population-based sample of postmenopausal women: results of the Basel Osteoporosis Study.

    PubMed

    Hollaender, R; Hartl, F; Krieg, M-A; Tyndall, A; Geuckel, C; Buitrago-Tellez, C; Manghani, M; Kraenzlin, M; Theiler, R; Hans, D

    2009-03-01

    Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with similar standardised risk ratios to dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude as for DXA measurements.
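
    The reported AUCs have a direct rank interpretation: for a measurement where lower values mean higher risk, the AUC is the probability that a random fracture case scores below a random control (the Mann-Whitney formulation). The Stiffness Index values below are hypothetical.

```python
def auc_lower_is_riskier(cases, controls):
    """Fraction of (case, control) pairs with case < control; ties count half."""
    wins = 0.0
    for c in cases:
        for k in controls:
            if c < k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

cases = [62, 70, 75, 80, 88]          # SI in women with incident fracture
controls = [78, 85, 90, 95, 99, 104]  # SI in women without fracture
auc = auc_lower_is_riskier(cases, controls)
```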

  11. Quantitative microbial risk assessment to estimate the health risk from exposure to noroviruses in polluted surface water in South Africa.

    PubMed

    Van Abel, Nicole; Mans, Janet; Taylor, Maureen B

    2017-10-01

    This study assessed the risks posed by noroviruses (NoVs) in surface water used for drinking, domestic, and recreational purposes in South Africa (SA), using a quantitative microbial risk assessment (QMRA) methodology that took a probabilistic approach coupling an exposure assessment with four dose-response models to account for uncertainty. Water samples from three rivers were found to be contaminated with NoV GI (80-1,900 gc/L) and GII (420-9,760 gc/L), leading to risk estimates that were lower for GI than GII. The volume of water consumed and the probabilities of infection were lower for domestic (2.91 × 10⁻⁸ to 5.19 × 10⁻¹) than drinking water exposures (1.04 × 10⁻⁵ to 7.24 × 10⁻¹). The annual probabilities of illness varied depending on the type of recreational water exposure, with boating (3.91 × 10⁻⁶ to 5.43 × 10⁻¹) and swimming (6.20 × 10⁻⁶ to 6.42 × 10⁻¹) being slightly greater than playing next to/in the river (5.30 × 10⁻⁷ to 5.48 × 10⁻¹). The QMRA was sensitive to the choice of dose-response model. The risk of NoV infection or illness from contaminated surface water is extremely high in SA, especially for lower socioeconomic individuals, but is similar to reported risks from limited international studies.
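
    The per-event and annual probabilities above come from coupling an exposure estimate with dose-response models. A minimal sketch of one common choice, the exponential dose-response model, with purely hypothetical parameter values (the study coupled four models, some more complex than this):

```python
import math

def p_infection_exponential(dose_gc, r=0.0020):
    """Exponential dose-response: P(inf) = 1 - exp(-r * dose).
    r is a hypothetical single-hit infectivity parameter."""
    return 1.0 - math.exp(-r * dose_gc)

def annual_risk(per_event_risk, events_per_year):
    """Annual probability of at least one infection,
    assuming independent exposure events."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

p = p_infection_exponential(dose_gc=100)   # e.g. 0.1 L at 1,000 gc/L
print(p, annual_risk(p, events_per_year=365))
```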

  12. Evaluation of End-User Satisfaction Among Employees Participating in a Web-based Health Risk Assessment With Tailored Feedback

    PubMed Central

    Colkesen, Ersen B; Niessen, Maurice AJ; Kraaijenhagen, Roderik A; Essink-Bot, Marie-Louise; Peek, Niels

    2012-01-01

    Background Web technology is increasingly being used to provide individuals with health risk assessments (HRAs) with tailored feedback. End-user satisfaction is an important determinant of the potential impact of HRAs, as this influences program attrition and adherence to behavioral advice. Objective The aim of this study was to evaluate end-user satisfaction with a web-based HRA with tailored feedback applied in worksite settings, using mixed (quantitative and qualitative) methods. Methods Employees of seven companies in the Netherlands participated in a commercial, web-based HRA with tailored feedback. The HRA consisted of four components: 1) a health and lifestyle assessment questionnaire, 2) a biometric evaluation, 3) a laboratory evaluation, and 4) tailored feedback consisting of a personal health risk profile and lifestyle behavior advice communicated through a web portal. HRA respondents received an evaluation questionnaire after six weeks. Satisfaction with different parts of the HRA was measured on 5-point Likert scales. A free-text field provided the opportunity to make additional comments. Results In total, 2289 employees participated in the HRA program, of whom 637 (27.8%) completed the evaluation questionnaire. Quantitative analysis showed that 85.6% of the respondents evaluated the overall HRA positively. The free-text field was filled in by 29.7% of the respondents (189 out of 637), who made 315 separate remarks. Qualitative evaluation of these data showed that these respondents made critical remarks. Respondents felt restricted by the answer categories of the health and lifestyle assessment questionnaire, which resulted in the feeling that the corresponding feedback could be inadequate. Some respondents perceived the personal risk profile as unnecessarily alarming or suggested providing more explanations, reference values, and a justification of the behavioral advice given. 
Respondents also requested the opportunity to discuss the feedback with a

  13. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA), i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular as quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored, are just now beginning to be explored (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  14. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as actigraphs enable long-term recording of a patient's movement during activities, and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders, including motor and nonmotor symptoms of Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess further neurological or psychiatric disorders using actigraphy records. PMID:25214709

  15. Evaluation of the Prostate Cancer Prevention Trial Risk Calculator in a High-Risk Screening Population

    PubMed Central

    Kaplan, David J.; Boorjian, Stephen A.; Ruth, Karen; Egleston, Brian L.; Chen, David Y.T.; Viterbo, Rosalia; Uzzo, Robert G.; Buyyounouski, Mark K.; Raysor, Susan; Giri, Veda N.

    2009-01-01

    Introduction Clinical factors in addition to PSA have been evaluated to improve risk assessment for prostate cancer. The Prostate Cancer Prevention Trial (PCPT) risk calculator provides an assessment of prostate cancer risk based on age, PSA, race, prior biopsy, and family history. This study evaluated the risk calculator in a screening cohort of young, racially diverse, high-risk men with a low baseline PSA enrolled in the Prostate Cancer Risk Assessment Program (PRAP). Patients and Methods Eligibility criteria for PRAP include men aged 35-69 who are African-American, have a family history of prostate cancer, or have a known BRCA1/2 mutation. PCPT risk scores were determined for PRAP participants, and were compared to observed prostate cancer rates. Results 624 participants were evaluated, including 382 (61.2%) African-American men and 375 (60%) men with a family history of prostate cancer. Median age was 49.0 years (range 34.0-69.0), and median PSA was 0.9 (range 0.1-27.2). PCPT risk score correlated with prostate cancer diagnosis, as the median baseline risk score in patients diagnosed with prostate cancer was 31.3%, versus 14.2% in patients not diagnosed with prostate cancer (p<0.0001). The PCPT calculator similarly stratified the risk of diagnosis of Gleason score ≥7 disease, as the median risk score was 36.2% in patients diagnosed with Gleason ≥7 prostate cancer versus 15.2% in all other participants (p<0.0001). Conclusion PCPT risk calculator score was found to stratify prostate cancer risk in a cohort of young, primarily African-American men with a low baseline PSA. These results support further evaluation of this predictive tool for prostate cancer risk assessment in high-risk men. PMID:19709072

  16. Comparative analysis of quantitative efficiency evaluation methods for transportation networks.

    PubMed

    He, Yuxin; Qin, Jin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on an introduction to, and mathematical analysis of, three quantitative evaluation methods for transportation network efficiency, this paper compares the information they capture, including network structure, traffic demand, travel choice behavior, and other factors that affect network efficiency, and discusses the applicability of each method. Analysis of different example networks shows that the Q-H method reflects well the influence of network structure, traffic demand, and user route choice behavior on transportation network efficiency. In addition, the network efficiency measured by this method and Braess's paradox explain each other, indicating a better evaluation of the real operating condition of a transportation network. Analysis of the efficiency calculated by the Q-H method also shows that a given transportation network has a specific appropriate demand. Meanwhile, under fixed demand, one can identify both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure that maximizes transportation network efficiency.
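
    For context, one widely used structural efficiency measure (Latora-Marchiori global efficiency) can be computed from shortest paths alone. A sketch follows; note that the Q-H method discussed above additionally folds in demand and route choice, which this purely structural measure ignores:

```python
import math
from itertools import product

def global_efficiency(dist_matrix):
    """Latora-Marchiori global efficiency: the average of 1/d_ij over
    all ordered node pairs, where d_ij is the shortest-path distance."""
    n = len(dist_matrix)
    d = [row[:] for row in dist_matrix]
    for k, i, j in product(range(n), repeat=3):     # Floyd-Warshall
        d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    return sum(1.0 / d[i][j] for i, j in pairs) / (n * (n - 1))

INF = math.inf
links = [[0, 1, INF],     # illustrative 3-node network,
         [1, 0, 2],       # link travel times as weights
         [INF, 2, 0]]
print(global_efficiency(links))
```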

  17. Quantitative risk assessment of human salmonellosis in Canadian broiler chicken breast from retail to consumption.

    PubMed

    Smadi, Hanan; Sargeant, Jan M

    2013-02-01

    The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent the risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail and food hygiene practices in private kitchens, such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat, along with inadequate cooking, contributed most significantly to the risk of human salmonellosis. The outcome of this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
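
    The retail-to-table change in Salmonella concentration is driven by growth and inactivation equations of the kind mentioned above. A minimal sketch in log10 units, with illustrative parameter values rather than the model's Canadian inputs:

```python
import math

def growth_log10(n0_log10, mu_per_h, hours):
    """Exponential growth expressed in log10 CFU (simplified primary
    model; no lag phase or stationary-phase cap)."""
    return n0_log10 + mu_per_h * hours / math.log(10)

def cook_log10(n_log10, cook_min, d_value_min):
    """Log-linear thermal inactivation: each D-value minute at the
    cooking temperature removes one log10 of the population."""
    return n_log10 - cook_min / d_value_min

# Hypothetical retail-to-table chain: storage growth, then cooking
n = growth_log10(2.0, mu_per_h=0.05, hours=24)    # temperature abuse
n = cook_log10(n, cook_min=3.0, d_value_min=0.5)  # adequate cooking
print(n)
```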

  18. IWGT report on quantitative approaches to genotoxicity risk assessment II. Use of point-of-departure (PoD) metrics in defining acceptable exposure limits and assessing human risk

    EPA Science Inventory

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the ne...

  19. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.
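
    A common way to operationalize this kind of likelihood-and-consequence criticality assessment is a risk matrix used to rank a portfolio. A hypothetical sketch (the scales and portfolio entries are illustrative, not RAIT's):

```python
def risk_score(likelihood, consequence):
    """Classic 5x5 risk-matrix score for portfolio ranking:
    both inputs on a 1 (lowest) to 5 (highest) ordinal scale."""
    assert 1 <= likelihood <= 5 and 1 <= consequence <= 5
    return likelihood * consequence

# Hypothetical portfolio: risk name -> (likelihood, consequence)
portfolio = {"thermal": (4, 3), "comms": (2, 5), "software": (3, 2)}
ranked = sorted(portfolio,
                key=lambda k: risk_score(*portfolio[k]), reverse=True)
print(ranked)  # highest-criticality risks first
```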

  20. Quantitative Risk Assessment of Antimicrobial-Resistant Foodborne Infections in Humans Due to Recombinant Bovine Somatotropin Usage in Dairy Cows.

    PubMed

    Singer, Randall S; Ruegg, Pamela L; Bauman, Dale E

    2017-07-01

    Recombinant bovine somatotropin (rbST) is a production-enhancing technology that allows the dairy industry to produce milk more efficiently. Concern has been raised that cows supplemented with rbST are at an increased risk of developing clinical mastitis, which would potentially increase the use of antimicrobial agents and increase human illnesses associated with antimicrobial-resistant bacterial pathogens delivered through the dairy beef supply. The purpose of this study was to conduct a quantitative risk assessment to estimate the potential increased risk of human infection with antimicrobial-resistant bacteria and subsequent adverse health outcomes as a result of rbST usage in dairy cattle. The quantitative risk assessment included the following steps: (i) release of antimicrobial-resistant organisms from the farm, (ii) exposure of humans via consumption of contaminated beef products, and (iii) consequence of the antimicrobial-resistant infection. The model focused on ceftiofur (parenteral and intramammary) and oxytetracycline (parenteral) treatment of clinical mastitis in dairy cattle and tracked the bacteria Campylobacter spp., Salmonella enterica subsp. enterica, and Escherichia coli in the gastrointestinal tract of the cow. Parameter estimates were developed to be maximum risk to overestimate the risk to humans. The excess number of cows in the U.S. dairy herd that were predicted to carry resistant bacteria at slaughter due to rbST administration was negligible. The total number of excess human illnesses caused by resistant bacteria due to rbST administration was also predicted to be negligible with all risks considerably less than one event per 1 billion people at risk per year for all bacteria. 
The results indicate a high probability that the use of rbST according to label instructions presents a negligible risk for increasing the number of human illnesses and subsequent adverse outcomes associated with antimicrobial-resistant Campylobacter, Salmonella, or

  1. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    PubMed

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to provide scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry makes it possible to compare the efficiency of different risk reduction measures, and more precisely of different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; schematically, this includes three important phases. Firstly, the model structure has to be defined, i.e. the connections between the different operational processing steps. An important step in the food industry is thermal processing, which leads to microbial inactivation. Growth of heat-treated surviving microorganisms and/or post-process contamination during the storage phase is also important to take into account. Secondly, mathematical equations are determined to estimate the change in microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting experts. Finally, the model outputs are obtained by simulation procedures; they then have to be interpreted and communicated to targeted stakeholders. In this latter phase, tools such as what-if scenarios provide essential added value. These different MRA phases are illustrated through two examples covering important issues in industry: the first covers process optimization in a food safety context, the second shelf-life determination in a food quality context. Although both contexts require the same methodology, they do not have the same endpoint: up to human health in the foie gras case study, illustrating a safety application, and up to the food portion in the
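
    The simulation phase described above can be sketched as a Monte Carlo over a chained process model, with a what-if scenario obtained by changing an operational setting. All distributions below are illustrative placeholders, not elicited industrial data:

```python
import random

def simulate_final_load(heat_low=4.0, heat_high=6.0,
                        n_iter=10_000, seed=1):
    """Monte Carlo over a simplified process chain (log10 CFU/g):
    initial contamination + thermal inactivation + storage growth.
    Returns the mean final load over all iterations."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_iter):
        initial = rng.gauss(1.0, 0.5)                     # raw material
        inactivation = -rng.uniform(heat_low, heat_high)  # heat step
        growth = rng.uniform(0.0, 2.0)                    # storage phase
        totals.append(initial + inactivation + growth)
    return sum(totals) / len(totals)

baseline = simulate_final_load()
what_if = simulate_final_load(heat_low=5.0, heat_high=7.0)  # harsher heat
print(baseline, what_if)
```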

  2. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparsity of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
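
    The composite evaluation y = Σ_i w_i φ_i(x_i) can be sketched directly. The normalization functions below are simple placeholders; the paper estimates them from distributions of idealized and deviated movement, and derives the weights with a modified RankSVM:

```python
def composite_score(x, weights, normalizers):
    """Evaluation y = sum_i w_i * phi_i(x_i): each kinematic attribute
    is mapped to [0, 1] by its normalization function phi_i, then the
    normalized values are combined with therapist-derived weights."""
    return sum(w * phi(xi) for w, phi, xi in zip(weights, normalizers, x))

# Placeholder normalizers for two hypothetical attributes:
# reach speed (m/s) and a jerkiness index (lower is smoother)
phi_speed = lambda v: max(0.0, min(1.0, v / 1.2))
phi_smooth = lambda j: max(0.0, min(1.0, 1.0 - j))

print(composite_score([0.9, 0.2], [0.6, 0.4], [phi_speed, phi_smooth]))
```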

  3. Evaluating STORM skills training for managing people at risk of suicide.

    PubMed

    Gask, Linda; Dixon, Clare; Morriss, Richard; Appleby, Louis; Green, Gillian

    2006-06-01

    This paper reports a study evaluating the Skills Training On Risk Management (STORM) training initiative in three mental health services in the North-West of England, UK. Training for health workers has been widely advocated as a key route to suicide prevention. However, reports of evaluations are scarce in the literature. In previous research, we have demonstrated that the STORM intervention results in acquisition of new skills and can be disseminated in a community setting. The training was delivered during a 6-month period in 2002 by three mental health nurses who were seconded part-time to the project. The quantitative evaluation, which assessed change in attitudes, confidence, acquisition of skills and satisfaction, used a pretest/post-test design, with participants acting as their own controls. Qualitative interviews were conducted with a purposive sample of 16 participants to explore the impact on clinical practice, and with the three trainers at the end of the study. Data from 458 staff members were collected during a 6-month period. Positive changes in attitudes and confidence were shown, but previous evidence of skill acquisition was not replicated. Qualitative interviews revealed important insights into changes in clinical practice, particularly for less experienced or unqualified nursing staff, but also concerns about the lack of an educational culture to foster and support such interventions in practice within the organizations. STORM training for the assessment and management of suicide risk is both feasible and acceptable in mental health trusts. However, we remain uncertain of its longer-term impact, given the lack of engagement of senior staff in the enterprise and the absence of linked supervision and support from the organizational management to reinforce skill acquisition and development. We consider that regular supervision that links STORM training to actual clinical experience would be the ideal.

  4. Quantitative oxygen concentration imaging in toluene atmospheres using Dual Imaging with Modeling Evaluation

    NASA Astrophysics Data System (ADS)

    Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim

    2013-01-01

    Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
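
    The Stern-Volmer relation underpinning this measurement, τ₀/τ = 1 + K_SV·[O₂], can be inverted to recover oxygen concentration from a measured lifetime. A sketch with placeholder values (not the updated slope reported here):

```python
def oxygen_concentration(tau, tau0, k_sv):
    """Invert the Stern-Volmer relation tau0/tau = 1 + K_SV * [O2]
    to recover oxygen concentration from a measured lifetime.
    tau0 is the unquenched lifetime; K_SV is the Stern-Volmer slope."""
    return (tau0 / tau - 1.0) / k_sv

# Hypothetical values in arbitrary consistent units
print(oxygen_concentration(tau=10.0, tau0=35.0, k_sv=0.5))
```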

  5. Quantitative oxygen concentration imaging in toluene atmospheres using Dual Imaging with Modeling Evaluation

    NASA Astrophysics Data System (ADS)

    Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim

    2012-12-01

    Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.

  6. Interactive graphics for expressing health risks: development and qualitative evaluation.

    PubMed

    Ancker, Jessica S; Chan, Connie; Kukafka, Rita

    2009-01-01

    Recent findings suggest that interactive game-like graphics might be useful in communicating probabilities. We developed a prototype for a risk communication module, focusing on eliciting users' preferences for different interactive graphics and assessing usability and user interpretations. Feedback from five focus groups was used to design the graphics. The final version displayed a matrix of square buttons; clicking on any button allowed the user to see whether the stick figure underneath was affected by the health outcome. When participants used this interaction to learn about a risk, they expressed more emotional responses, both positive and negative, than when viewing any static graphic or numerical description of a risk. Their responses included relief about small risks and concern about large risks. The groups also commented on static graphics: arranging the figures affected by disease randomly throughout a group of figures made it more difficult to judge the proportion affected but often was described as more realistic. Interactive graphics appear to have potential for expressing risk magnitude as well as the feeling of risk. This affective impact could be useful in increasing perceived threat of high risks, calming fears about low risks, or comparing risks. Quantitative studies are planned to assess the effect on perceived risks and estimated risk magnitudes.

  7. Credit Risk Evaluation of Power Market Players with Random Forest

    NASA Astrophysics Data System (ADS)

    Umezawa, Yasushi; Mori, Hiroyuki

    A new method is proposed for credit risk evaluation in a power market. Credit risk evaluation measures the bankruptcy risk of a company. Power system liberalization has created a new environment that puts emphasis on profit maximization and risk minimization. Electricity transactions carry a high probability of risk between companies, so power market players are concerned with risk minimization. As a management strategy, a risk index is needed to evaluate the worth of a business partner. This paper proposes a new method for evaluating credit risk with Random Forest (RF), which performs ensemble learning over decision trees. RF is an efficient data mining technique for clustering data and extracting relationships between input and output data. In addition, a method of generating pseudo-measurements is proposed to improve the performance of RF. The proposed method is successfully applied to real financial data of energy utilities in the power market. A comparison is made between the proposed and conventional methods.
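
    The bagging-plus-feature-randomness idea behind RF can be illustrated with a toy ensemble of decision stumps. This is a didactic sketch with hypothetical financial features, not the paper's method or its pseudo-measurement extension:

```python
import random

def train_forest(X, y, n_trees=25, seed=0):
    """Toy 'random forest' of depth-1 trees (stumps): each stump sees a
    bootstrap sample and one randomly chosen feature, mimicking RF's
    bagging and feature randomness. Illustrative sketch only."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]    # bootstrap sample
        f = rng.randrange(d)                          # random feature
        vals = [X[i][f] for i in idx]
        thr = (min(vals) + max(vals)) / 2             # midpoint split
        majority = lambda s: max(set(s), key=s.count) if s else 0
        left = majority([y[i] for i in idx if X[i][f] <= thr])
        right = majority([y[i] for i in idx if X[i][f] > thr])
        forest.append((f, thr, left, right))
    return forest

def predict_risk(forest, x):
    """Fraction of stumps voting class 1 ('high credit risk')."""
    votes = [l if x[f] <= thr else r for f, thr, l, r in forest]
    return sum(votes) / len(votes)

# Hypothetical features: [debt ratio, interest coverage]
X = [[0.90, 1.1], [0.80, 1.3], [0.30, 4.0],
     [0.20, 5.2], [0.85, 1.0], [0.25, 4.5]]
y = [1, 1, 0, 0, 1, 0]   # 1 = later went bankrupt
forest = train_forest(X, y)
print(predict_risk(forest, [0.88, 1.2]), predict_risk(forest, [0.22, 5.0]))
```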

  8. A Quantitative Risk-Benefit Analysis of Prophylactic Surgery Prior to Extended-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Carroll, Danielle; Reyes, David; Kerstman, Eric; Walton, Marlei; Antonsen, Erik

    2017-01-01

    INTRODUCTION: Among otherwise healthy astronauts undertaking deep space missions, the risks for acute appendicitis (AA) and cholecystitis (AC) are not zero. If these conditions were to occur during spaceflight they may require surgery for definitive care. The proposed study quantifies and compares the risks of developing de novo AA and AC in-flight to the surgical risks of prophylactic laparoscopic appendectomy (LA) and cholecystectomy (LC) using NASA's Integrated Medical Model (IMM). METHODS: The IMM is a Monte Carlo simulation that forecasts medical events during spaceflight missions and estimates the impact of these medical events on crew health. In this study, four Design Reference Missions (DRMs) were created to assess the probability of an astronaut developing in-flight small-bowel obstruction (SBO) following prophylactic 1) LA, 2) LC, 3) LA and LC, or 4) neither surgery (SR# S-20160407-351). Model inputs were drawn from a large, population-based 2011 Swedish study that examined the incidence and risks of post-operative SBO over a 5-year follow-up period. The study group included 1,152 patients who underwent LA, and 16,371 who underwent LC. RESULTS: Preliminary results indicate that prophylactic LA may yield higher mission risks than the control DRM. Complete analyses are pending and will be subsequently available. DISCUSSION: The risks versus benefits of prophylactic surgery in astronauts to decrease the probability of acute surgical events during spaceflight have only been qualitatively examined in prior studies. Within the assumptions and limitations of the IMM, this work provides the first quantitative guidance that has previously been lacking to this important question for future deep space exploration missions.
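
    The underlying comparison, whether the in-flight event risk exceeds the added post-operative risk, can be sketched analytically under a constant-hazard assumption. All rates below are placeholders, not IMM inputs or the Swedish cohort's estimates:

```python
import math

def mission_probability(annual_incidence, mission_years):
    """P(at least one event during the mission) under a constant-hazard
    (exponential) model: 1 - exp(-lambda * T)."""
    return 1.0 - math.exp(-annual_incidence * mission_years)

# Hypothetical illustration: 3-year deep-space mission
p_appendicitis = mission_probability(annual_incidence=0.001,
                                     mission_years=3)
p_postop_sbo = 0.005   # hypothetical cumulative post-LA SBO risk
print(p_appendicitis, p_postop_sbo, p_postop_sbo > p_appendicitis)
```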

  9. Risk Evaluation in the Pre-Phase A Conceptual Design of Spacecraft

    NASA Technical Reports Server (NTRS)

    Fabisinski, Leo L., III; Maples, Charlotte Dauphne

    2010-01-01

    Typically, the most important decisions in the design of a spacecraft are made in the earliest stages of its conceptual design: the Pre-Phase A stages. It is in these stages that the greatest number of design alternatives is considered, and the greatest number of alternatives is rejected. The focus of Pre-Phase A conceptual development is on the evaluation and comparison of whole concepts and the larger-scale systems comprising those concepts. This comparison typically uses general Figures of Merit (FOMs) to quantify the comparative benefits of designs and alternative design features. Along with mass, performance, and cost, risk should be one of the major FOMs in evaluating design decisions during the conceptual design phases. However, risk is often given inadequate consideration in conceptual design practice. The reasons frequently given for this lack of attention to risk include: inadequate mission definition, lack of rigorous design requirements in early concept phases, lack of fidelity in risk assessment methods, and under-evaluation of risk as a viable FOM for design evaluation. In this paper, the role of risk evaluation in early conceptual design is discussed. The various requirements of a viable risk evaluation tool at the Pre-Phase A level are considered in light of the needs of a typical spacecraft design study. A technique for risk identification and evaluation is presented. The application of the risk identification and evaluation approach to the conceptual design process is discussed. Finally, a computational tool for risk profiling is presented and applied to assess the risk for an existing Pre-Phase A proposal. The resulting profile is compared to the risks identified for the proposal by other means.

  10. Polymer on Top: Current Limits and Future Perspectives of Quantitatively Evaluating Surface Grafting.

    PubMed

    Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher

    2018-03-07

    Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction into the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided via carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical access guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for determination of grafting density, in particular, on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
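
    The dry-thickness approach assessed here rests on the standard relation σ = h·ρ·N_A/M_n (dry film thickness h, bulk density ρ, number-average molar mass M_n). A sketch with typical polystyrene-like values (illustrative, not taken from the paper):

```python
def grafting_density(dry_thickness_nm, density_g_cm3, mn_g_mol):
    """Grafting density from dry-film thickness:
    sigma = h * rho * N_A / M_n, converted to chains/nm^2
    (using 1 cm^3 = 1e21 nm^3)."""
    N_A = 6.02214076e23   # Avogadro's number, 1/mol
    return dry_thickness_nm * density_g_cm3 * N_A / mn_g_mol * 1e-21

# Example: 10 nm dry PS film, rho ~ 1.05 g/cm^3, Mn = 50 kg/mol
print(grafting_density(10.0, 1.05, 50_000))   # chains per nm^2
```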

  11. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide-receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate, because reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background added, was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was almost a factor of three lower than that of Tc-99m.

  12. Quantitative Experimental Determination of Primer-Dimer Formation Risk by Free-Solution Conjugate Electrophoresis

    PubMed Central

    Desmarais, Samantha M.; Leitner, Thomas; Barron, Annelise E.

    2012-01-01

    DNA barcodes are short, unique ssDNA primers that “mark” individual biomolecules. To gain a better understanding of the biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis (FSCE) approach are useful as quantitatively precise input data to parameterize computational models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates was designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced the electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer-barcodes with fully or partially complementary sequences were annealed and then separated by free-solution conjugate CE at different temperatures to assess effects on primer-dimer formation. When fewer than 30 out of 30 basepairs were bonded, dimerization was inversely correlated with temperature. Dimerization occurred when more than 15 consecutive basepairs formed, yet non-consecutive basepairs did not create stable dimers even when 20 out of 30 possible basepairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments to establish a new mobility shift assay for detection of primer-dimer formation. PMID:22331820

  13. Quantitative Acoustic Model for Adhesion Evaluation of Pmma/silicon Film Structures

    NASA Astrophysics Data System (ADS)

    Ju, H. S.; Tittmann, B. R.

    2010-02-01

    A poly(methyl methacrylate) (PMMA) film on a silicon substrate is a main structure for photolithography in semiconductor manufacturing processes. This paper presents the potential of scanning acoustic microscopy (SAM) for nondestructive evaluation of the PMMA/Si film structure, whose adhesion failure is commonly encountered during fabrication and post-fabrication processes. A physical model employing a partial discontinuity in displacement is developed for rigorous quantitative evaluation of interfacial weakness. The model is implemented in the matrix method for surface acoustic wave (SAW) propagation in anisotropic media. Our results show that variations in SAW velocity and reflectance are predicted to be sensitive to the adhesion condition. Experimental results from the v(z) technique and SAW velocity reconstruction verify the prediction.

  14. Quantitative approach for the risk assessment of African swine fever and Classical swine fever introduction into the United States through legal imports of pigs and swine products.

    PubMed

    Herrera-Ibatá, Diana María; Martínez-López, Beatriz; Quijada, Darla; Burton, Kenneth; Mur, Lina

    2017-01-01

    US livestock safety strongly depends on the capacity to prevent the introduction of Transboundary Animal Diseases (TADs). Therefore, accurate and updated information on the location and origin of potential TAD risks is essential, so that preventive measures such as market restrictions can be put in place. The objective of the present study was to evaluate the current risk of African swine fever (ASF) and Classical swine fever (CSF) introduction into the US through legal importations of live pigs and swine products, using a quantitative approach that could later be applied to other risks. Four quantitative stochastic risk assessment models were developed to estimate the monthly probabilities of ASF and CSF release into the US, and the exposure of susceptible populations (domestic and feral swine) to these introductions, at state level. The results suggest a low annual probability of either ASF or CSF introduction into the US by any of the analyzed pathways (5.5 × 10(-3)), with the probability of introduction through legal imports of live pigs (1.8 × 10(-3) for ASF, and 2.5 × 10(-3) for CSF) higher than the risk from legally imported swine products (8.90 × 10(-4) for ASF, and 1.56 × 10(-3) for CSF). This could be due to the low probability of exposure associated with this type of commodity (products). The risk of feral pigs accessing swine products discarded in landfills was slightly higher than the potential exposure of domestic pigs through swill feeding. The identification of the months at highest risk, the origin of the highest-risk imports, and the US states most vulnerable to those introductions (Iowa, Minnesota and Wisconsin for live swine; California, Florida and Texas for swine products) is valuable information that would help design prevention, risk-mitigation and early-detection strategies to minimize the catastrophic consequences of potential ASF/CSF introductions into the US.

  15. Quantitative approach for the risk assessment of African swine fever and Classical swine fever introduction into the United States through legal imports of pigs and swine products

    PubMed Central

    Herrera-Ibatá, Diana María; Martínez-López, Beatriz; Quijada, Darla; Burton, Kenneth

    2017-01-01

    US livestock safety strongly depends on the capacity to prevent the introduction of Transboundary Animal Diseases (TADs). Therefore, accurate and updated information on the location and origin of potential TAD risks is essential, so that preventive measures such as market restrictions can be put in place. The objective of the present study was to evaluate the current risk of African swine fever (ASF) and Classical swine fever (CSF) introduction into the US through legal importations of live pigs and swine products, using a quantitative approach that could later be applied to other risks. Four quantitative stochastic risk assessment models were developed to estimate the monthly probabilities of ASF and CSF release into the US, and the exposure of susceptible populations (domestic and feral swine) to these introductions, at state level. The results suggest a low annual probability of either ASF or CSF introduction into the US by any of the analyzed pathways (5.5 × 10(-3)), with the probability of introduction through legal imports of live pigs (1.8 × 10(-3) for ASF, and 2.5 × 10(-3) for CSF) higher than the risk from legally imported swine products (8.90 × 10(-4) for ASF, and 1.56 × 10(-3) for CSF). This could be due to the low probability of exposure associated with this type of commodity (products). The risk of feral pigs accessing swine products discarded in landfills was slightly higher than the potential exposure of domestic pigs through swill feeding. The identification of the months at highest risk, the origin of the highest-risk imports, and the US states most vulnerable to those introductions (Iowa, Minnesota and Wisconsin for live swine; California, Florida and Texas for swine products) is valuable information that would help design prevention, risk-mitigation and early-detection strategies to minimize the catastrophic consequences of potential ASF/CSF introductions into the US. PMID:28797058
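
    The monthly release probabilities such models estimate combine into annual figures like those quoted above only under an assumption of independence between months. A minimal sketch of that aggregation (the monthly values are placeholders, not the study's estimates):

```python
def annual_from_monthly(monthly_probs):
    """Annual release probability from monthly release probabilities,
    assuming months are independent: P_year = 1 - prod(1 - p_m)."""
    no_release = 1.0
    for p_m in monthly_probs:
        no_release *= (1.0 - p_m)
    return 1.0 - no_release


# Twelve hypothetical equal monthly probabilities; for small p the result
# is close to the simple sum 12 * 1.5e-4.
p_year = annual_from_monthly([1.5e-4] * 12)
```

    For the very small probabilities in this risk range, the annual probability is essentially the sum of the monthly ones, which is why month-level heterogeneity (the "months at highest risk") matters for targeting surveillance.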

  16. Quantitative Risk - Phase 1

    DTIC Science & Technology

    2013-09-03

    [Only report documentation page fragments survived extraction; the recoverable figure titles are "Connecting technical risk and types of complexity" and "Complexity evolution throughout the systems acquisition lifecycle".]

  17. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  18. Assessment Tools for the Evaluation of Risk

    EPA Science Inventory

    ASTER (Assessment Tools for the Evaluation of Risk) was developed by the U.S. EPA Mid-Continent Ecology Division, Duluth, MN to assist regulators in performing ecological risk assessments. ASTER is an integration of the ECOTOXicology Database (ECOTOX;

  19. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and to loading and unloading facilities. The steps of the method are discussed, beginning with data collection. For accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation, allowing for the two-fold possibility of a tanker colliding or grounding at or near the berth, or while navigating to or from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. For consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatalities, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  19. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends in research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included assisting its members in building a robust academic home for Clinical and Translational Science and in attracting other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  1. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has significant referential meaning for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends in research funded by the CTSA program covered both clinical and basic medical research fields. The academic benefits of the CTSA program included assisting its members in building a robust academic home for Clinical and Translational Science and in attracting other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  2. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need to culture them. These technologies usually return short (100-300 base-pair) DNA reads, which are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their ability to assign quantitative phylogenetic information to the data, i.e., to describe the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of their complexity, it is usually difficult to judge the resistance of a metagenomic tool to this genome length bias. Therefore, we have made a simple benchmark for evaluating "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to each of the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two tools were under
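
    The benchmark construction described above (equal copies of each genome, randomly sheared into ~150 bp reads and mixed) can be sketched as follows; the toy genome sequences, copy number, and read-length spread are assumptions for illustration, not the authors' data:

```python
import random


def make_benchmark(genomes, copies=10, mean_read_len=150, seed=1):
    """Shear the same number of copies of each genome into short reads
    and shuffle them together, mimicking the paper's simple benchmark."""
    random.seed(seed)
    reads = []
    for name, seq in genomes.items():
        for _ in range(copies):
            pos = 0
            while pos < len(seq):
                # Read length jitters around the mean; the floor keeps it positive.
                step = max(50, int(random.gauss(mean_read_len, 20)))
                reads.append((name, seq[pos:pos + step]))
                pos += step
    random.shuffle(reads)
    return reads


# Toy genomes of different lengths (2 kb, 4 kb, 8 kb); real genomes differ.
genomes = {"shortA": "ACGT" * 500, "midB": "ACGT" * 1000, "longC": "ACGT" * 2000}
reads = make_benchmark(genomes)
```

    Because longer genomes yield proportionally more reads, raw read counts over-represent long-genome taxa; an ideal quantitative tool would still report the three taxa in equal proportion, which is exactly what the genome length bias test checks.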

  3. A quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sex among HIV-seropositive men.

    PubMed

    Gerbi, Gemechu B; Habtemariam, Tsegaye; Tameru, Berhanu; Nganwa, David; Robnett, Vinaida

    2012-01-01

    The objective of this study is to conduct a quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sexual practices among HIV-seropositive men. A knowledgebase was developed by reviewing different published sources. The data were collected from several sources, including the Centers for Disease Control and Prevention, selected journals, and reports. The risk pathway scenario tree was developed based on a comprehensive review of the published literature. The variables are organized into nine major parameter categories. Monte Carlo simulations for the quantitative risk assessment of HIV/AIDS transmission were executed with the software @Risk 4.0 (Palisade Corporation). Results show that the likelihood of unprotected sex due to having less knowledge about HIV/AIDS and a negative attitude toward condom use and safer sex ranged from 1.24 × 10(-5) to 8.47 × 10(-4), with a mean and standard deviation of 1.83 × 10(-4) and 8.63 × 10(-5), respectively. The likelihood of unprotected sex due to greater anger-hostility, anxiety, less satisfaction with aspects of life, and greater depressive symptoms ranged from 2.76 × 10(-9) to 5.34 × 10(-7), with a mean and standard deviation of 5.23 × 10(-8) and 3.58 × 10(-8), respectively. The findings suggest that HIV/AIDS research and intervention programs must focus on behavior, and on the broader setting within which individual risky behaviors occur.
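
    The scenario-tree computation the authors ran in @Risk (sampling each node probability from a distribution and multiplying along the pathway) can be sketched generically; the three-node pathway and uniform parameter ranges below are hypothetical stand-ins for the paper's nine parameter categories:

```python
import random


def simulate_pathway_risk(n_iter=100_000, seed=42):
    """Toy Monte Carlo over a three-node risk pathway: each node probability
    is drawn from a uniform range and the pathway probability is the product.
    The ranges are hypothetical, not the study's parameter estimates."""
    random.seed(seed)
    samples = []
    for _ in range(n_iter):
        p_low_knowledge = random.uniform(0.05, 0.20)
        p_negative_attitude = random.uniform(0.02, 0.10)
        p_unprotected_sex = random.uniform(0.01, 0.05)
        samples.append(p_low_knowledge * p_negative_attitude * p_unprotected_sex)
    return samples


samples = simulate_pathway_risk()
mean_risk = sum(samples) / len(samples)
```

    Repeated sampling yields a distribution of pathway probabilities, from which the mean, standard deviation and range quoted in abstracts like this one are read off directly.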

  4. Quantitative evaluation of optically induced disorientation.

    DOT National Transportation Integrated Search

    1970-01-01

    The purpose of this study was to establish quantitatively and systematically the association between the speed of movement of an optical environment and the extent of disorientation experienced by an individual viewing this environment. The degree of...

  5. A TEM quantitative evaluation of strengthening in an Mg-RE alloy reinforced with SiC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabibbo, Marcello, E-mail: m.cabibbo@univpm.it; Spigarelli, Stefano

    2011-10-15

    Magnesium alloys containing rare earth elements are known to have high specific strength and good creep and corrosion resistance up to 523 K. The addition of SiC ceramic particles strengthens the metal matrix composite, resulting in better wear and creep resistance while maintaining good machinability. The role of the reinforcement particles in enhancing strength can be quantitatively evaluated using transmission electron microscopy (TEM). This paper presents a quantitative evaluation of the different strengthening contributions, determined through TEM inspections, in an SiC Mg-RE composite alloy containing yttrium, neodymium, gadolinium and dysprosium. Compression tests at temperatures ranging between 290 and 573 K were carried out. The microstructure strengthening mechanism was studied for all the compression conditions. Strengthening was compared to the mechanical results, and the way the different contributions were combined is also discussed and justified. Research Highlights: TEM evaluation of yield strengthening terms in an Mg-RE SiC alloy; the evaluation was extended to different compression temperature conditions; linear and quadratic summation were proposed and validated; Hall-Petch was found to be the most prominent strengthening contribution.

  6. A quantitative assessment of the risk for highly pathogenic avian influenza introduction into Spain via legal trade of live poultry.

    PubMed

    Sánchez-Vizcaíno, Fernando; Perez, Andrés; Lainez, Manuel; Sánchez-Vizcaíno, José Manuel

    2010-05-01

    Highly pathogenic avian influenza (HPAI) is considered one of the most important diseases of poultry. During the last 9 years, HPAI epidemics have been reported in Asia, the Americas, Africa, and in 18 countries of the European Union (EU). For that reason, it is possible that the risk for HPAI virus (HPAIV) introduction into Spain may have recently increased. Because of the EU free-trade policy and because legal trade of live poultry was considered an important route for HPAI spread in certain regions of the world, there are fears that Spain may become HPAIV-infected as a consequence of the legal introduction of live poultry. However, no quantitative assessment of the risk for HPAIV introduction into Spain or into any other EU member state via the trade of poultry has been published in the peer-reviewed literature. This article presents the results of the first quantitative assessment of the risk for HPAIV introduction into a free country via legal trade of live poultry, along with estimates of the geographical variation of the risk and of the relative contribution of exporting countries and susceptible poultry species to the risk. The annual mean risk for HPAI introduction into Spain was estimated to be as low as 1.36 x 10(-3), suggesting that under prevailing conditions, introduction of HPAIV into Spain through the trade of live poultry is unlikely to occur. Moreover, these results support the hypothesis that legal trade of live poultry does not impose a significant risk for the spread of HPAI into EU member states.

  7. Occurrence and quantitative microbial risk assessment of Cryptosporidium and Giardia in soil and air samples.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Mondaca-Fernández, Iram; Balderas-Cortés, José de Jesús; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2014-09-01

    Cryptosporidium oocysts and Giardia cysts can be transmitted by the fecal-oral route and may cause gastrointestinal parasitic zoonoses. These zoonoses are common in rural zones because the parasites are harbored in fecally contaminated soil. This study assessed the risk of illness (giardiasis and cryptosporidiosis) from inhaling and/or ingesting soil and/or airborne dust in Potam, Mexico. To assess the risk of infection, Quantitative Microbial Risk Assessment (QMRA) was employed, with the following steps: (1) hazard identification, (2) hazard exposure, (3) dose-response, and (4) risk characterization. Cryptosporidium oocysts and Giardia cysts were observed in 52% and 57%, respectively, of total soil samples (n=21), and in 60% and 80%, respectively, of air samples (n=12). The calculated annual risks were higher than 9.9 × 10(-1) for both parasites in both types of sample. Soil and air inhalation and/or ingestion are thus important vehicles for these parasites. To our knowledge, the results obtained in the present study represent the first QMRAs for cryptosporidiosis and giardiasis due to soil and air inhalation/ingestion in Mexico. In addition, this is the first evidence of the microbial air quality around these parasites in rural zones. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
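
    Steps (3) and (4) of a QMRA of this kind are typically computed from a dose-response model plus an exposure frequency; a generic sketch using the common exponential dose-response form, with an illustrative infectivity parameter r and daily dose rather than the study's fitted values:

```python
import math


def daily_infection_risk(dose, r=0.004):
    """Exponential dose-response model: P = 1 - exp(-r * dose).
    r is a hypothetical infectivity parameter, not the study's value."""
    return 1.0 - math.exp(-r * dose)


def annual_risk(p_daily, days=365):
    """Annual risk from a daily risk, assuming independent daily exposures."""
    return 1.0 - (1.0 - p_daily) ** days


p_day = daily_infection_risk(dose=5.0)   # hypothetical 5 (oo)cysts per day
p_year = annual_risk(p_day)
```

    Even a modest daily dose drives the annual risk toward 1 under daily exposure, which is consistent with the >9.9 × 10(-1) annual risks reported above.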

  8. Comparative analysis of quantitative efficiency evaluation methods for transportation networks

    PubMed Central

    He, Yuxin; Hong, Jian

    2017-01-01

    An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on an introduction to, and mathematical analysis of, three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, indicating a better evaluation of the real operating condition of a transportation network. The analysis of network efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure contributing to the largest value of transportation network efficiency can be identified. PMID:28399165

  9. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles associated with groups of indicators. In contrast to the dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed quantitative measures of PA governance effectiveness, using a set of 65 statements related to governance principles drawn from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that the statements load onto 10 factors that demonstrate high psychometric validity, as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem-based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate the governance of protected areas from a community-level perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
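
    The Cronbach's alpha reliability used to judge the extracted factors has a simple closed form; a minimal sketch (the toy responses are invented for illustration):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items answered by n respondents.

    items: list of k lists, each holding the n scores for one survey item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using sample variances (n-1 denominator).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))


# Five hypothetical respondents answering three related governance statements;
# the items move together, so alpha comes out high.
responses = [
    [1, 2, 3, 4, 5],
    [2, 2, 3, 5, 5],
    [1, 3, 3, 4, 4],
]
alpha = cronbach_alpha(responses)
```

    High alpha for a factor's statements indicates they measure a common underlying construct, which is the sense in which the ten factors above are called reliable.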

  10. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and
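
    The net reclassification improvement (NRI) figures quoted above summarize how often a new model moves predicted risks in the right direction for events and non-events. A category-free sketch of the statistic, on invented data:

```python
def net_reclassification_improvement(old_risk, new_risk, events):
    """Category-free NRI: (net upward risk movement among events)
    + (net downward risk movement among non-events)."""
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, is_event in zip(old_risk, new_risk, events):
        if is_event:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne


# A model that raises risk for every event and lowers it for every
# non-event attains the maximum NRI of 2.0.
nri = net_reclassification_improvement(
    old_risk=[0.2, 0.2, 0.2, 0.2],
    new_risk=[0.3, 0.3, 0.1, 0.1],
    events=[True, True, False, False],
)
```

    Against that maximum of 2.0, the study's 0.043 vs. 0.003 values are small in absolute terms but indicate that dichotomized quantitative ischemic burden reclassifies patients more usefully than visual assessment.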

  11. The Biomarker-Surrogacy Evaluation Schema: a review of the biomarker-surrogate literature and a proposal for a criterion-based, quantitative, multidimensional hierarchical levels of evidence schema for evaluating the status of biomarkers as surrogate endpoints.

    PubMed

    Lassere, Marissa N

    2008-06-01

    There are clear advantages to using biomarkers and surrogate endpoints, but concerns about clinical and statistical validity, and a lack of systematic methods to evaluate these aspects, hinder their efficient application. Section 2 is a systematic, historical review of the biomarker-surrogate endpoint literature, with special reference to the nomenclature, systems of classification and statistical methods developed for their evaluation. In Section 3, an explicit, criterion-based, quantitative, multidimensional hierarchical levels-of-evidence schema, the Biomarker-Surrogacy Evaluation Schema, is proposed to evaluate and coordinate the multiple dimensions (biological, epidemiological, statistical, clinical trial and risk-benefit evidence) of biomarker-clinical endpoint relationships. The schema systematically evaluates and ranks the surrogacy status of biomarkers and surrogate endpoints using defined levels of evidence. It incorporates three independent domains: Study Design, Target Outcome and Statistical Evaluation. Each domain has items ranked from zero to five. An additional category, called Penalties, incorporates further considerations of biological plausibility, risk-benefit and generalizability. The total score (0-15) determines the level of evidence, with Level 1 the strongest and Level 5 the weakest. The term 'surrogate' is restricted to markers attaining Levels 1 or 2 only. The surrogacy status of markers can then be directly compared within and across different areas of medicine to guide individual, trial-based or drug-development decisions. This schema would facilitate the communication between clinical, researcher, regulatory, industry and consumer participants necessary for evaluating the biomarker-surrogate-clinical endpoint relationship in their different settings.
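
    The schema's arithmetic (three domains scored 0-5, penalties subtracted, total 0-15 mapped to Levels 1-5) can be sketched as follows; the band cut-offs used for the mapping here are illustrative assumptions, not the published thresholds:

```python
def surrogacy_level(study_design, target_outcome, statistical_eval, penalties=0):
    """Score three domains (each 0-5), subtract penalties, and map the
    total (0-15) to a level of evidence (1 strongest .. 5 weakest).
    The band cut-offs below are illustrative, not the published mapping."""
    for score in (study_design, target_outcome, statistical_eval):
        if not 0 <= score <= 5:
            raise ValueError("each domain is scored from zero to five")
    total = max(0, study_design + target_outcome + statistical_eval - penalties)
    for threshold, level in [(13, 1), (10, 2), (7, 3), (4, 4)]:
        if total >= threshold:
            return total, level
    return total, 5
```

    Under such a scheme, only markers whose total lands in the top bands (Levels 1-2) would earn the term 'surrogate', as the schema requires.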

  12. In vivo quantitative evaluation of tooth color with hand-held colorimeter and custom template.

    PubMed

    Shimada, Kazuki; Kakehashi, Yoshiyuki; Matsumura, Hideo; Tanoue, Naomi

    2004-04-01

    This article presents a technique for quantitatively evaluating the color of teeth, as well as color changes in restorations and on tooth surfaces. Using a custom template made of a thermoplastic polymer and a dental colorimeter, tooth surface color can be recorded periodically at the same intraoral location.

  13. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim: Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods: A comprehensive literature search was performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results: A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about the benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of the benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions: The benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  14. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  15. Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: a systematic map.

    PubMed

    Daigneault, Pierre-Marc

    2014-08-01

    Stakeholder participation and evaluation use have attracted a lot of attention from practitioners, theorists and researchers. A common hypothesis is that participation is positively associated with evaluation use. Whereas the number of empirical studies conducted on this topic is impressive, quantitative research has held a minority position within this scientific production. This study mobilizes systematic review methods to 'map' the empirical literature that has quantitatively studied participation and use. The goal is to take stock and assess the strength of evidence of this literature (but not to synthesize the findings) and, based on this assessment, to provide directions for future research. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first of a kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
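    The scenario-based assembly of risk described in this record can be illustrated with a small sketch: each scenario contributes its frequency to every dose level it reaches, and summing over scenarios yields a frequency-of-exceedance curve. The scenario list and all numbers below are invented for illustration; they are not taken from the West Valley assessment.

```python
# Hypothetical scenario triplets: (annual frequency, resulting dose in mSv).
# Likelihood and consequence values are illustrative placeholders only.
scenarios = [
    (1e-2, 0.1),   # frequent natural process, small release
    (1e-4, 5.0),   # infrequent disruptive event, moderate dose
    (1e-6, 50.0),  # rare disruptive event, large dose
]

def exceedance_frequency(scenarios, dose_level):
    """Annual frequency of receiving a dose >= dose_level,
    summed over all contributing scenarios."""
    return sum(freq for freq, dose in scenarios if dose >= dose_level)

for level in (0.1, 5.0, 50.0):
    print(f"dose >= {level} mSv: {exceedance_frequency(scenarios, level):.3e}/yr")
```

    Ranking the individual terms of the sum at a given dose level is what identifies the dominant contributing scenarios for risk management.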

  17. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens.

    PubMed

    Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B

    2003-05-25

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30 times by introducing a 2 log reduction of the number of Campylobacter on the chicken carcasses. To obtain a similar reduction of the incidence, the flock prevalence should be reduced approximately 30 times or the kitchen hygiene improved approximately 30 times. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will only have a minor influence on the risk. Finally, the
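    The final dose-response step of such a model can be sketched with the standard approximate Beta-Poisson form; the parameter values below are illustrative placeholders, not the parameters fitted in this assessment.

```python
def beta_poisson(dose, alpha=0.145, beta=7.59):
    """Approximate Beta-Poisson dose-response model: probability of
    illness for an ingested dose (CFU). alpha and beta here are
    illustrative placeholder parameters, not the paper's fit."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# A 2-log reduction of Campylobacter on the carcass translates into a
# 100-fold lower ingested dose and a markedly lower per-meal risk:
print(beta_poisson(1000.0))  # risk at the original dose
print(beta_poisson(10.0))    # risk after a 2-log reduction
```

    Because the curve is strongly sublinear, a 100-fold dose reduction produces a much smaller, though still substantial, reduction in per-meal risk.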

  18. Methods for quantitative and qualitative evaluation of vaginal microflora during menstruation.

    PubMed Central

    Onderdonk, A B; Zamarchi, G R; Walsh, J A; Mellor, R D; Muñoz, A; Kass, E H

    1986-01-01

    The quantitative and qualitative changes in the bacterial flora of the vagina during menstruation have received inadequate study. Similarly, the effect of vaginal tampons on the microbial flora as well as the relationship between the microbial flora of the vagina and that of the tampon has not been adequately evaluated. The purposes of the present study were (i) to develop quantitative methods for studying the vaginal flora and the flora of tampons obtained during menstruation and (ii) to determine whether there were differences between the microflora of the tampon and that of the vaginal vault. Tampon and swab samples were obtained at various times from eight young healthy volunteers for 8 to 10 menstrual cycles. Samples consisted of swabs from women wearing menstrual pads compared with swab and tampon samples taken at various times during the menstrual cycle. Samples were analyzed for total facultative and anaerobic bacterial counts, and the six dominant bacterial species in each culture were identified. Statistical evaluation of the results indicates that total bacterial counts decreased during menstruation and that swab and tampon samples yielded similar total counts per unit weight of sample. The numbers of bacteria in tampons tended to be lower than in swabs taken at the same time. Overall, during menstruation, the concentrations of lactobacilli declined, but otherwise there was little difference among the species found during menstruation compared with those found in intermenstrual samples. Cotton tampons had little discernible effect on the microbial flora. PMID:3954346

  19. Model-based approach for quantitative estimates of skin, heart, and lung toxicity risk for left-side photon and proton irradiation after breast-conserving surgery.

    PubMed

    Tommasino, Francesco; Durante, Marco; D'Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Farace, Paolo; Palma, Giuseppe; Schwarz, Marco; Cella, Laura; Pacelli, Roberto

    2017-05-01

    Proton beam therapy represents a promising modality for left-side breast cancer (BC) treatment, but concerns have been raised about skin toxicity and poor cosmesis. The aim of this study is to apply a skin normal tissue complication probability (NTCP) model for intensity modulated proton therapy (IMPT) optimization in left-side BC. Ten left-side BC patients undergoing photon irradiation after breast-conserving surgery were randomly selected from our clinical database. Intensity modulated photon (IMRT) and IMPT plans were calculated with iso-tumor-coverage criteria and according to RTOG 1005 guidelines. Proton plans were computed with and without skin optimization. Published NTCP models were employed to estimate the risk of different toxicity endpoints for skin, lung, heart and its substructures. Acute skin NTCP evaluation suggests a lower toxicity level with IMPT compared to IMRT when the skin is included in the proton optimization strategy (0.1% versus 1.7%, p < 0.001). Dosimetric results show that, with the same level of tumor coverage, IMPT attains significant heart and lung dose sparing compared with IMRT. By NTCP model-based analysis, an overall reduction in the cardiopulmonary toxicity risk prediction can be observed for all IMPT compared to IMRT plans: the relative risk reduction from protons varies between 0.1 and 0.7 depending on the considered toxicity endpoint. Our analysis suggests that IMPT might be safely applied without increasing the risk of severe acute radiation-induced skin toxicity. The quantitative risk estimates also support the potential clinical benefits of IMPT for left-side BC irradiation due to lower risk of cardiac and pulmonary morbidity. The applied approach might be relevant in the long term for the setup of cost-effectiveness evaluation strategies based on NTCP predictions.
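    As an illustration of the machinery behind such estimates, Lyman-type NTCP models map a dose summary such as the equivalent uniform dose (EUD) to a complication probability through a probit function. This is a generic sketch with placeholder parameters, not the specific published models applied in the study.

```python
import math

def lyman_ntcp(eud_gy, td50_gy, m):
    """Lyman-type NTCP: complication probability from the normalized
    distance of the EUD to the 50% tolerance dose TD50; m sets the
    slope. td50_gy and m are organ/endpoint-specific fit parameters."""
    t = (eud_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Placeholder parameters for a hypothetical endpoint: lowering the EUD
# (e.g., via proton dose sparing) reduces the predicted NTCP.
print(lyman_ntcp(eud_gy=20.0, td50_gy=50.0, m=0.3))
print(lyman_ntcp(eud_gy=35.0, td50_gy=50.0, m=0.3))
```

    Comparing NTCP values between competing plans, rather than dose metrics alone, is what yields the relative risk reductions reported above.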

  20. Specialist antenatal clinics for women at high risk of preterm birth: a systematic review of qualitative and quantitative research.

    PubMed

    Malouf, Reem; Redshaw, Maggie

    2017-02-02

    Preterm birth (PTB) is the leading cause of perinatal morbidity and mortality. Women with previous prenatal loss are at higher risk of preterm birth. A specialist antenatal clinic is considered one approach to improve maternity and pregnancy outcomes. A systematic review was conducted of quantitative, qualitative and mixed method studies of women at high risk of PTB. The review's primary outcomes were to report on the specialist antenatal clinics' effect in preventing or reducing preterm birth, perinatal mortality and morbidity, and on women's perceptions and experiences of a specialist clinic, whether or not compared with standard antenatal care. Other secondary maternal, infant and economic outcomes were also determined. A comprehensive search strategy was carried out in English within electronic databases as far back as 1980. The reviewers selected studies, assessed the quality, and extracted data independently. Results were summarized and tabulated. Eleven studies fully met the review inclusion criteria; ten were quantitative design studies and only one was a qualitative design study. No mixed method design study was included in the review. All were published after 1989; seven were conducted in the USA and four in the UK. Results from five good to low quality randomised controlled trials (RCTs), all conducted before 1990, did not demonstrate the efficacy of the clinic in reducing preterm birth, whereas results from more recent low quality cohort studies showed some positive neonatal outcomes. Themes from one good quality qualitative study reflected the emotional and psychological need to reduce the anxiety and stress of women referred to such a clinic. Women expressed their negative emotional responses at being labelled as high risk and positive responses to being assessed and treated in the clinic. Women also reported that their partners were struggling to cope emotionally. Findings from this review were mixed. Evidence from cohort studies

  1. Comparison of recreational health risks associated with surfing and swimming in dry weather and post-storm conditions at Southern California beaches using quantitative microbial risk assessment (QMRA).

    PubMed

    Tseng, Linda Y; Jiang, Sunny C

    2012-05-01

    Southern California is an increasingly urbanized hotspot for surfing; it is therefore of great interest to assess the human illness risks associated with this popular ocean recreational water sport from exposure to coastal waters contaminated with fecal bacteria. Quantitative microbial risk assessments were applied to eight popular Southern California beaches using readily available enterococcus and fecal coliform data and dose-response models to compare health risks associated with surfing during dry weather and storm conditions. The results showed that the gastrointestinal illness risk from surfing after storm events was elevated, with the probability of exceeding the US EPA health risk guideline up to 28% of the time. The surfing risk was also elevated in comparison with swimming at the same beach due to ingestion of a greater volume of water. The study suggests that refinement of the dose-response model, improved monitoring practices and better surveillance of surfer behavior will improve the risk estimation. Copyright © 2012 Elsevier Ltd. All rights reserved.
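    The exposure step that drives the surfer/swimmer difference is simple arithmetic: at a given indicator concentration, the ingested dose scales linearly with the volume of water swallowed. The concentration and volumes below are assumed values for illustration, not the study's inputs.

```python
def ingested_dose(conc_per_100ml, volume_ml):
    """QMRA exposure step: number of indicator organisms ingested,
    from the concentration (per 100 mL) and the volume swallowed."""
    return conc_per_100ml * volume_ml / 100.0

# Same water quality; only the assumed ingested volume differs.
dose_swimmer = ingested_dose(conc_per_100ml=35.0, volume_ml=16.0)
dose_surfer = ingested_dose(conc_per_100ml=35.0, volume_ml=50.0)
print(dose_surfer / dose_swimmer)  # dose ratio is set purely by volume
```

    Feeding the larger surfer dose through any monotone dose-response model then yields the elevated per-event risk reported above.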

  2. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributed to a reactive process or due to clonal disorder. This distinction is important as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, complications of thrombocytosis with a special focus on thrombotic risk as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  4. Evaluation of risk communication in a mammography patient decision aid.

    PubMed

    Klein, Krystal A; Watson, Lindsey; Ash, Joan S; Eden, Karen B

    2016-07-01

    We characterized patients' comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest-posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Participants' positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Evaluation of risk communication in a mammography patient decision aid

    PubMed Central

    Klein, Krystal A.; Watson, Lindsey; Ash, Joan S.; Eden, Karen B.

    2016-01-01

    Objectives: We characterized patients’ comprehension, memory, and impressions of risk communication messages in a patient decision aid (PtDA), Mammopad, and clarified perceived importance of numeric risk information in medical decision making. Methods: Participants were 75 women in their forties with average risk factors for breast cancer. We used mixed methods, comprising a risk estimation problem administered within a pretest–posttest design, and semi-structured qualitative interviews with a subsample of 21 women. Results: Participants’ positive predictive value estimates of screening mammography improved after using Mammopad. Although risk information was only briefly memorable, through content analysis, we identified themes describing why participants value quantitative risk information, and obstacles to understanding. We describe ways the most complicated graphic was incompletely comprehended. Conclusions: Comprehension of risk information following Mammopad use could be improved. Patients valued receiving numeric statistical information, particularly in pictograph format. Obstacles to understanding risk information, including potential for confusion between statistics, should be identified and mitigated in PtDA design. Practice implications: Using simple pictographs accompanied by text, PtDAs may enhance a shared decision-making discussion. PtDA designers and providers should be aware of benefits and limitations of graphical risk presentations. Incorporating comprehension checks could help identify and correct misapprehensions of graphically presented statistics. PMID:26965020

  6. La Conchita Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Kropp, A.; Johnson, L.; Magnusen, W.; Hitchcock, C. S.

    2009-12-01

    Following the disastrous landslide in La Conchita in 2005 that resulted in ten deaths, the State of California selected our team to prepare a risk assessment for a committee of key stakeholders. The stakeholders represented the State of California, Ventura County, members of the La Conchita community, the railroad, and the upslope ranch owner (where the slide originated); a group with widely varying views and interests. Our team was charged with characterizing the major hazards, developing a series of mitigation concepts, evaluating the benefits and costs of mitigation, and gathering stakeholder input throughout the process. Two unique elements of the study were the methodologies utilized for the consequence assessment and for the decision-making framework. La Conchita is exposed to multiple slope hazards, each with differing geographical distributions, as well as depth and velocity characteristics. Three consequence matrices were developed so that the potential financial losses, structural vulnerabilities, and human safety exposure could be evaluated. The matrices utilized semi-quantitative loss evaluations (both financial and life safety) based on a generalized understanding of likely vulnerability and hazard characteristics. The model provided a quantitative estimate of cumulative losses over a 50-year period, including losses of life based on FEMA evaluation criteria. Conceptual mitigation options and loss estimates were developed to provide a range of risk management solutions that were feasible from a cost-benefit standpoint. A decision tree approach was adopted to focus on fundamental risk management questions rather than on specific outcomes since the committee did not have a consensus view on the preferred solution. These questions included: 1. Over what time period can risks be tolerated before implementation of decisions? 2. Whose responsibility is it to identify a workable risk management solution? 3. Who will own the project? The decision tree

  7. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  8. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978

  9. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of central nervous system, and now are being imported, with promising results for peripheral nerve and plexus evaluation. DWI and DTI allow performing a qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several derived parameters from DWI and DTI studies such as apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including main nerve entrapment syndromes in both peripheral nerves and brachial or lumbar plexus. PMID:28932698

  10. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  11. [The concept of risk and its estimation].

    PubMed

    Zocchetti, C; Della Foglia, M; Colombi, A

    1996-01-01

    The concept of risk, in relation to human health, is a topic of primary interest for occupational health professionals. New legislation recently established in Italy (626/94), in accordance with European Community directives in the field of Preventive Medicine, has called attention to this topic, and in particular to risk assessment and evaluation. Motivated by this context and by the impression that the concept of risk is frequently misunderstood, the present paper has two aims: the identification of the different meanings of the term "risk" in the new Italian legislation and the critical discussion of some commonly used definitions; and the proposal of a general definition, with the specification of a mathematical expression for quantitative risk estimation. The term risk (and risk estimation, assessment, or evaluation) has mainly referred to three different contexts: hazard identification, exposure assessment, and adverse health effects occurrence. Unfortunately, there are contexts in the legislation in which it is difficult to identify the true meaning of the term. This might cause equivocal interpretations and erroneous applications of the law because hazard evaluation, exposure assessment, and adverse health effects identification are completely different topics that require integrated but distinct approaches to risk management. As far as a quantitative definition of risk is concerned, we suggest an algorithm which connects the three basic risk elements (hazard, exposure, adverse health effects) by means of their probabilities of occurrence: the probability of being exposed (to a definite dose) given that a specific hazard is present (Pr(e|p)), and the probability of occurrence of an adverse health effect as a consequence of that exposure (Pr(d|e)). Using these quantitative components, risk can be defined as a sequence of measurable events that starts with hazard identification and terminates with disease occurrence; therefore, the
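    The algorithm proposed in this record chains conditional probabilities along the hazard, exposure, effect sequence; a minimal sketch, with numbers invented purely for illustration:

```python
def quantitative_risk(p_exposure_given_hazard, p_effect_given_exposure):
    """Risk = Pr(exposure | hazard present) * Pr(adverse effect | exposure),
    following the record's decomposition into hazard, exposure and effect."""
    return p_exposure_given_hazard * p_effect_given_exposure

# Hypothetical workplace: a hazard is present, exposure to an effective
# dose occurs with probability 0.30, and disease follows that exposure
# with probability 0.05.
print(quantitative_risk(0.30, 0.05))
```

    The decomposition makes clear that hazard control, exposure control and medical surveillance act on different factors of the same product.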

  12. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software, and the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.

  13. Presenting risk information to people with diabetes: evaluating effects and preferences for different formats by a web-based randomised controlled trial.

    PubMed

    Edwards, Adrian; Thomas, Richard; Williams, Rhys; Ellner, Andrew L; Brown, Polly; Elwyn, Glyn

    2006-11-01

    Web-based patient information is widespread, and information on the benefits and risks of treatments is often difficult to understand. We therefore evaluated different risk presentation formats (numerical, graphical and others) addressing the pros and cons of tight control versus usual treatment approaches for diabetes. Design: randomised controlled trial. Setting: online, with publicity disseminated via Diabetes UK. Participants: people with diabetes or their carers. The control group received information based on British Medical Journal 'Best Treatments'. Four intervention groups received enhanced information resources: (1) detailed numerical information (absolute/relative risk, numbers-needed-to-treat); (2) 'anchoring' to familiar risks or descriptions; (3) graphical formats (bar charts, thermometer scales, crowd figures); (4) a combination of 1-3. Outcomes: decision conflict scale (DCS, a measure of uncertainty); satisfaction with information; free-text responses for qualitative content analysis. Seven hundred and ten people visited the website and were randomised. Five hundred and eight completed the questionnaire for quantitative data. Mean DCS scores ranged from 2.12 to 2.24 across the five randomisation groups, indicating neither clear delay nor vacillation about decisions (usually DCS>2.5) nor a clear tendency to make decisions (usually DCS<2.0). There were no statistically significant effects of the interventions on DCS or satisfaction with information. Two hundred and fifty-six participants provided responses for qualitative analysis: most found graphical representations helpful, specifically bar chart formats; many found other graphic formats (thermometer style, crowd figures/smiley faces) and 'anchoring' information unhelpful, and reported information overload. Many negative experiences with healthcare indicate a challenging context for effective information provision and decision support. Online evaluation of different risk representation formats was feasible. There was a lack of intervention effects on

  14. Assessment of three risk evaluation systems for patients aged ≥70 in East China: performance of SinoSCORE, EuroSCORE II and the STS risk evaluation system.

    PubMed

    Shan, Lingtong; Ge, Wen; Pu, Yiwei; Cheng, Hong; Cang, Zhengqiang; Zhang, Xing; Li, Qifan; Xu, Anyang; Wang, Qi; Gu, Chang; Zhang, Yangyang

    2018-01-01

    To assess and compare the predictive ability of three risk evaluation systems (SinoSCORE, EuroSCORE II and the STS risk evaluation system) in patients aged ≥70 who underwent coronary artery bypass grafting (CABG) in East China. The three risk evaluation systems were applied to 1,946 consecutive patients who underwent isolated CABG from January 2004 to September 2016 in two hospitals. Patients were divided into two subsets according to age: an elderly group (age ≥70), with a younger group (age <70) used for comparison. The outcome of interest in this study was in-hospital mortality. The entire cohort and subsets of patients were analyzed. Calibration and discrimination, overall and in subsets, were assessed by the Hosmer-Lemeshow test and the C statistic, respectively. Institutional overall mortality was 2.52%. The expected mortality rates of SinoSCORE, EuroSCORE II and the STS risk evaluation system were 0.78 (0.64)%, 1.43 (1.14)% and 0.78 (0.77)%, respectively. SinoSCORE achieved the best discrimination (area under the receiver operating characteristic curve (AUC) = 0.829), followed by the STS risk evaluation system (AUC = 0.790) and EuroSCORE II (AUC = 0.769) in the entire cohort. In the elderly group, the observed mortality rate was 4.82%, versus 1.38% in the younger group. SinoSCORE (AUC = 0.829) also achieved the best discrimination in the elderly group, followed by the STS risk evaluation system (AUC = 0.730) and EuroSCORE II (AUC = 0.640), while all three risk evaluation systems performed well in the younger group. SinoSCORE, EuroSCORE II and the STS risk evaluation system all achieved positive calibrations in the entire cohort and subsets. The performance of the three risk evaluation systems was not ideal in the entire cohort. In the elderly group, SinoSCORE appeared to achieve better predictive efficiency than EuroSCORE II and the STS risk evaluation system.
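The C statistic (AUC) used to compare these three scores can be computed nonparametrically as the probability that a randomly chosen patient who died received a higher predicted risk than a randomly chosen survivor (ties counted half). A minimal sketch with hypothetical predicted risks, not data from the study:

```python
def c_statistic(scores_events, scores_nonevents):
    """Nonparametric AUC: P(score_event > score_nonevent) + 0.5 * P(tie)."""
    wins = ties = 0
    for e in scores_events:
        for n in scores_nonevents:
            if e > n:
                wins += 1
            elif e == n:
                ties += 1
    total = len(scores_events) * len(scores_nonevents)
    return (wins + 0.5 * ties) / total

# Hypothetical predicted in-hospital mortality risks (not from the study):
died     = [0.12, 0.30, 0.08, 0.22]
survived = [0.02, 0.05, 0.08, 0.01, 0.15]
print(c_statistic(died, survived))
```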

  15. Detection of hemoplasma infection of goats by use of a quantitative polymerase chain reaction assay and risk factor analysis for infection.

    PubMed

    Johnson, Kathy A; do Nascimento, Naíla C; Bauer, Amy E; Weng, Hsin-Yi; Hammac, G Kenitra; Messick, Joanne B

    2016-08-01

    OBJECTIVE To develop and validate a real-time quantitative PCR (qPCR) assay for the detection and quantification of Mycoplasma ovis in goats and investigate the prevalence of and risk factors for hemoplasma infection of goats located in Indiana. ANIMALS 362 adult female goats on 61 farms. PROCEDURES Primers were designed for amplification of a fragment of the dnaK gene of M ovis by use of a qPCR assay. Blood samples were collected into EDTA-containing tubes for use in total DNA extraction, blood film evaluation, and determination of PCV. Limit of detection, intra-assay variability, interassay variability, and specificity of the assay were determined. RESULTS Reaction efficiency of the qPCR assay was 94.45% (R², 0.99; slope, -3.4623), and the assay consistently detected as few as 10 copies of plasmid/reaction. Prevalence of infection in goats on the basis of results for the qPCR assay was 18.0% (95% confidence interval, 14% to 22%), with infected goats ranging from 1 to 14 years old, and 61% (95% confidence interval, 47% to 73%) of the farms had at least 1 infected goat. Bacterial load in goats infected with M ovis ranged from 1.05 × 10³ target copies/mL of blood to 1.85 × 10⁵ target copies/mL of blood; however, no bacteria were observed on blood films. Production use of a goat was the only risk factor significantly associated with hemoplasma infection. CONCLUSIONS AND CLINICAL RELEVANCE The qPCR assay was more sensitive for detecting hemoplasma infection than was evaluation of a blood film, and production use of a goat was a risk factor for infection.
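The reported reaction efficiency follows from the standard-curve slope via the standard relation E = 10^(-1/slope) - 1, where a slope of about -3.32 corresponds to 100% efficiency (perfect doubling per cycle). A quick check against the abstract's figures:

```python
# Recover qPCR amplification efficiency from the standard-curve slope:
# E = 10**(-1/slope) - 1; a slope of -3.3219 corresponds to 100% efficiency.
def qpcr_efficiency(slope: float) -> float:
    return 10 ** (-1.0 / slope) - 1.0

slope = -3.4623  # slope reported in the abstract
print(f"efficiency = {qpcr_efficiency(slope):.2%}")  # ~94.5%, matching the reported value within rounding
```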

  16. Evaluation of reference genes for quantitative RT-PCR in Lolium temulentum under abiotic stress

    USDA-ARS?s Scientific Manuscript database

    Lolium temulentum is a valuable model grass species for the study of stress in forage and turf grasses. Gene expression analysis by quantitative real time RT-PCR relies on the use of proper internal standards. The aim of this study was to identify and evaluate reference genes for use in real-time q...

  17. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  18. A clustering approach to segmenting users of internet-based risk calculators.

    PubMed

    Harle, C A; Downs, J S; Padman, R

    2011-01-01

    Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who is more likely to accept objective risk estimates. Our aim was to identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were much more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing and improvement of risk communication tools on the Internet.
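The segmentation described here rests on standard k-means clustering: alternately assign each respondent to the nearest cluster center, then recompute each center as the mean of its cluster. A minimal stdlib-only sketch on hypothetical two-variable data (perceived risk versus an objective estimate); the study itself evaluated many candidate variable sets:

```python
# Minimal k-means sketch (stdlib only) for segmenting respondents on two
# variables, e.g. perceived risk vs. an objective risk estimate.  Data hypothetical.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # update step: recompute centers as cluster means (keep old if empty)
        centers = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two well-separated hypothetical segments:
points = [(0.9, 0.8), (0.8, 0.9), (0.85, 0.95),
          (0.2, 0.1), (0.1, 0.2), (0.15, 0.25)]
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))
```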

  19. Epidemiological survey of quantitative ultrasound in risk assessment of falls in middle-aged and elderly people.

    PubMed

    Ou, Ling-Chun; Sun, Zih-Jie; Chang, Yin-Fan; Chang, Chin-Sung; Chao, Ting-Hsing; Kuo, Po-Hsiu; Lin, Ruey-Mo; Wu, Chih-Hsing

    2013-01-01

    The risk assessment of falls is important but still unsatisfactory and time-consuming. Our objective was to assess quantitative ultrasound (QUS) for the risk assessment of falls. The study was designed as an epidemiological cross-sectional community survey conducted from March 2009 to February 2010 at a medical center. Participants were a systematic sample of 1,200 community-dwelling people (male/female = 524/676) aged 40 years and over in Yunlin County, Mid-Taiwan. Structured questionnaires covering socioeconomic status, living status, smoking and drinking habits, exercise and medical history were completed. Quantitative ultrasound was measured at the non-dominant distal radial area (QUS-R) and the left calcaneal area (QUS-C). The overall prevalence of falls was 19.8%. In men, the factors independently associated with falls were age (OR: 1.04; 95% CI: 1.01~1.06), fracture history (OR: 1.89; 95% CI: 1.12~3.19), osteoarthritis history (OR: 3.66; 95% CI: 1.15~11.64) and speed of sound (OR: 0.99; 95% CI: 0.99~1.00; p<0.05) by QUS-R. In women, the independently associated factors were current drinking (OR: 3.54; 95% CI: 1.35~9.31) and broadband ultrasound attenuation (OR: 0.98; 95% CI: 0.97~0.99; p<0.01) by QUS-C. Cutoffs of -2.5 < T-score < -1 derived using QUS-R (OR: 2.85; 95% CI: 1.64~4.96; p<0.01) in men, or T-score ≤ -2.5 derived using QUS-C (OR: 2.72; 95% CI: 1.42~5.21; p<0.01) in women, showed an independent association with falls. The lowest T-score derived using either QUS-R or QUS-C was also an independent factor for falls in both men (OR: 2.13; 95% CI: 1.03~4.43; p<0.05) and women (OR: 2.36; 95% CI: 1.13~4.91; p<0.05). Quantitative ultrasound, measured at either the radial or the calcaneal area, is a convenient tool for assessing the risk of falls in middle-aged and elderly people.

  20. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms that are transmitted by the fecal-oral route, are zoonotic, and are prevalent in all socioeconomic segments, particularly in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis for Potam dwellers consuming drinking water from communal wells. To achieve this goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by the information collection rule method, (b) exposure assessment for healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate mean annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish an educational and treatment program to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico.
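The exponential dose-response model used in step (d) gives a per-exposure infection probability P(d) = 1 - exp(-r·d), which is then annualized by assuming independent daily exposures. The sketch below is illustrative only: the r value and daily dose are hypothetical placeholders, not the study's fitted parameters.

```python
import math

def exponential_dose_response(r: float, dose: float) -> float:
    """Per-exposure infection probability under the exponential QMRA model."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily: float, exposures_per_year: int = 365) -> float:
    """Annualize a per-exposure risk assuming independent daily exposures."""
    return 1.0 - (1.0 - p_daily) ** exposures_per_year

# Illustrative values only (r and dose are hypothetical, not from the study):
p_day = exponential_dose_response(r=0.004, dose=2.0)  # e.g. 2 oocysts ingested/day
print(round(annual_risk(p_day), 2))
```

Even a small daily infection probability compounds to a large annual risk, which is consistent with the near-unity annual risks the study reports for contaminated wells.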

  1. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communicating climate model output: language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative and misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  2. RISK MANAGEMENT EVALUATION FOR CONCENTRATED ANIMAL FEEDING OPERATIONS

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) developed a Risk Management Evaluation (RME) to provide information needed to help plan future research in the Laboratory dealing with the environmental impact of concentrated animal feeding operations (CAFOs). Agriculture...

  3. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural disaster impacts is increasing as climate and ecological change spreads. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to quantitative analysis of human behavior and risk perception. Within the framework of risk analysis models, risk perception issues were described. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As was shown, risk perception essentially influences the damage function. Based on prospect theory and on decision-making under uncertainty, cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), which is a system of a risk perception function (rp) and a function of education and long-term experience (c): α(t) → (c - rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
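The abstract's superposition of an event distribution and a damage function, with awareness discounting effective damage, can be sketched numerically. The functional forms below are hypothetical illustrations chosen for clarity, not the authors' model.

```python
# Sketch of the risk superposition P ≈ Σ_{x,y} f(x,y) p(x,y), with the damage
# function modulated by an awareness term α(t) = c(t) - rp(t).
# All functional forms below are hypothetical illustrations.
def total_risk(f, p, grid):
    return sum(f(x, y) * p(x, y) for x, y in grid)

grid = [(x, y) for x in range(3) for y in range(3)]
f = lambda x, y: 1.0 / len(grid)                 # uniform event distribution
alpha = 0.6 - 0.2                                # awareness: education minus risk perception
p = lambda x, y: (1.0 + x + y) * (1.0 - alpha)   # awareness reduces effective damage
print(round(total_risk(f, p, grid), 2))
```

Raising the awareness term α lowers the effective damage at every grid cell, which is the qualitative point the abstract makes about risk perception shaping the damage function.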

  4. Risk evaluation of highway engineering project based on the fuzzy-AHP

    NASA Astrophysics Data System (ADS)

    Yang, Qian; Wei, Yajun

    2011-10-01

    Engineering projects are social activities that integrate technology, economy, management and organization. There are uncertainties in every aspect of an engineering project, and risk management urgently needs to be strengthened. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper builds an index system for highway project risk evaluation. In addition, based on the principles of fuzzy mathematics, the analytic hierarchy process (AHP) was applied, yielding a combined fuzzy-AHP model for the comprehensive risk evaluation of expressway concessionary projects. The validity and practicability of the risk evaluation were verified by applying the model to an actual expressway concessionary project.
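The AHP step at the heart of such a fuzzy-AHP model derives criterion weights from a pairwise comparison matrix; a common approximation normalizes the geometric means of the rows, with consistency checked via the estimated principal eigenvalue λ_max. A hedged sketch with a hypothetical 3×3 matrix, not the paper's actual index system:

```python
# AHP criterion weights via the row geometric-mean approximation,
# with a consistency-ratio check.  The 3x3 pairwise matrix is hypothetical.
import math

def ahp_weights(matrix):
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

def consistency_ratio(matrix, weights, ri=0.58):  # random index RI = 0.58 for n = 3
    n = len(matrix)
    # lambda_max estimated by averaging (A w)_i / w_i over rows
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri

pairwise = [[1, 3, 5],
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])
print(round(consistency_ratio(pairwise, w), 3))
```

A consistency ratio below 0.1 is conventionally taken as acceptable; the fuzzy part of the method would then aggregate fuzzy expert scores under these weights.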

  5. Quantitative Evaluation of Musical Scale Tunings

    ERIC Educational Resources Information Center

    Hall, Donald E.

    1974-01-01

    The acoustical and mathematical basis of the problem of tuning the twelve-tone chromatic scale is reviewed. A quantitative measurement showing how well any tuning succeeds in providing just intonation for any specific piece of music is explained and applied to musical examples using a simple computer program. (DT)

  6. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts are described as aids to increasing the understanding of mode of action: Associative Events and Modulating Factors. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, an understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  7. Prediction of Emergent Heart Failure Death by Semi-Quantitative Triage Risk Stratification

    PubMed Central

    Van Spall, Harriette G. C.; Atzema, Clare; Schull, Michael J.; Newton, Gary E.; Mak, Susanna; Chong, Alice; Tu, Jack V.; Stukel, Thérèse A.; Lee, Douglas S.

    2011-01-01

    Objectives Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients. Methods We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes. Results Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001). Conclusions A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients. PMID:21853068

  8. Evaluation of Quantitative Literacy Series: Exploring Data and Exploring Probability. Program Report 87-5.

    ERIC Educational Resources Information Center

    Day, Roger P.; And Others

    A quasi-experimental design with two experimental groups and one control group was used to evaluate the use of two books in the Quantitative Literacy Series, "Exploring Data" and "Exploring Probability." Group X teachers were those who had attended a workshop on the use of the materials and were using the materials during the…

  9. Qualitative and quantitative evaluation of some vocal function parameters following fitting of a prosthesis.

    PubMed

    Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G

    2001-12-01

    The insertion of a prosthesis and reconstruction with pectoralis major myocutaneous flaps for patients undergoing total pharyngolaryngectomy is a now universally accepted technique; however, the literature on the subject is sparse. Our study considered 10 patients who underwent total pharyngolaryngectomy and reconstruction with pectoralis major myocutaneous flaps and were fitted with vocal function prostheses, and a control group of 50 subjects treated with total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. Differences in the quantitative measurement of voice intensity levels and in the harmonics-to-noise ratio were not statistically significant (p > 0.05) between the two study groups at either high- or low-volume speech. In contrast, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners: on the basis of these parameters, the control group had statistically better voices.

  10. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, little is known about how well these quantitative interaction measurements agree across screening approaches, which hinders their integrated use toward improving the coverage and quality of genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements, or their customized scores, could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how detections from the different screening approaches can be combined to suggest novel positive and negative interactions that are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from complementary screening approaches.

  11. Quantitative ecological risk assessment of inhabitants exposed to polycyclic aromatic hydrocarbons in terrestrial soils of King George Island, Antarctica

    NASA Astrophysics Data System (ADS)

    Pongpiachan, S.; Hattayanone, M.; Pinyakong, O.; Viyakarn, V.; Chavanich, S. A.; Bo, C.; Khumsup, C.; Kittikoon, I.; Hirunyatrakul, P.

    2017-03-01

    This study conducted a quantitative ecological risk assessment of human exposure to polycyclic aromatic hydrocarbons (PAHs) in terrestrial soils of King George Island, Antarctica. Overall, the average PAH concentrations detected in King George Terrestrial Soils (KGS) were appreciably lower than those of World Marine Sediments (WMS) and World Terrestrial Soils (WTS), highlighting the fact that Antarctica is one of the most pristine continents in the world. The total concentrations of twelve probably carcinogenic PAHs (ΣPAHs: the sum of Phe, An, Fluo, Pyr, B[a]A, Chry, B[b]F, B[k]F, B[a]P, Ind, D[a,h]A and B[g,h,i]P) were 3.21 ± 1.62 ng g⁻¹, 5,749 ± 4,576 ng g⁻¹, and 257,496 ± 291,268 ng g⁻¹ for KGS, WMS and WTS, respectively. Although KGS has extremely low ΣPAHs in comparison with the others, the percentage contribution of Phe is exceedingly high, at 50%. Assuming that incidental ingestion and dermal contact are the two major exposure pathways responsible for adverse human health effects, the cancer and non-cancer risks from environmental exposure to PAHs were carefully evaluated based on the "Role of the Baseline Risk Assessment in Superfund Remedy Selection Decisions" memorandum provided by the US EPA. The logarithms of the cancer risk levels of PAH contents in KGS varied from -11.1 to -7.18 with an average of -7.96 ± 7.73, which is 1,790 times and 80,176 times lower than those of WMS and WTS, respectively. All cancer risk levels of PAH concentrations observed in KGS are significantly (p < 0.001) lower than those of WMS and WTS. Despite the Comandante Ferraz Antarctic Station fire of February 25, 2012, both the cancer and non-cancer risks of environmental exposure to PAHs were found to be at an "acceptable level".

  12. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.
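One way to operationalize a doubling-time metric of this kind, by analogy with PCR efficiency, is to regress time-to-threshold against log₂ of input copies across a dilution series: each halving of template delays the signal by roughly one doubling time. The sketch below uses hypothetical data, and the paper's exact IDT definition may differ.

```python
# Estimate an isothermal doubling time (IDT) from a qLAMP dilution series:
# each halving of input copies delays time-to-threshold by ~one doubling time,
# so IDT is the negative slope of Tt versus log2(copies).  Data hypothetical.
import math

def idt_from_dilution_series(copies, tt_minutes):
    logs = [math.log2(c) for c in copies]
    n = len(logs)
    mean_x = sum(logs) / n
    mean_y = sum(tt_minutes) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(logs, tt_minutes))
             / sum((x - mean_x) ** 2 for x in logs))
    return -slope  # minutes per doubling of template

copies = [1e5, 1e4, 1e3, 1e2]      # hypothetical input copies per reaction
tt     = [8.0, 11.3, 14.6, 17.9]   # hypothetical times to threshold (minutes)
print(round(idt_from_dilution_series(copies, tt), 2))
```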

  13. [Guidance of FDA risk evaluation and mitigation strategy and enlightenment to drug risk management of post-marketing Chinese medicine].

    PubMed

    Li, Yuanyuan; Xie, Yanming

    2011-10-01

    The FDA risk evaluation and mitigation strategy (REMS) aims to manage the known or potential serious risks of drugs or biological products. The content of a REMS is analyzed using the example of the Onsolis REMS, named FOCOS. China can draw on this experience to establish a scientific evaluation mechanism, strengthen awareness of drug risk, promote rational drug use, organically combine pre-marketing and post-marketing evaluation of traditional Chinese medicine, and promote risk management evaluation throughout drug development and improvement.

  14. Diffusion tensor imaging with quantitative evaluation and fiber tractography of lumbar nerve roots in sciatica.

    PubMed

    Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang

    2015-04-01

    To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, to visualize nerve roots by tractography, and to compare the diagnostic efficacy of conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for the L5-S1 lumbar nerve roots were calculated at three levels from the DTI images. Tractography was performed on the L3-S1 nerve roots. ROC analysis was performed for FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at the distal nerves. DTI can quantitatively evaluate compressed nerve roots, and diffusion tensor tractography (DTT) enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica.

  15. Processes for Risk Evaluation and Chemical Prioritization for Risk Evaluation under the Amended Toxic Substances Control Act; Notice of Public Meetings and Opportunities for Public Comment

    EPA Pesticide Factsheets

    This notice provides information for two public meetings to obtain input into the Agency’s development of processes for risk evaluation and chemical prioritization for risk evaluation under amended TSCA.

  16. Evaluation of a constipation risk assessment scale.

    PubMed

    Zernike, W; Henderson, A

    1999-06-01

    This project was undertaken in order to evaluate the utility of a constipation risk assessment scale and the accompanying bowel management protocol. The risk assessment scale was primarily introduced to teach and guide staff in managing constipation when caring for patients. The intention of the project was to reduce the incidence of constipation in patients during their admission to hospital.

  17. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for the differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were calculated to evaluate the diagnostic performance of both qualitative and quantitative analyses for the differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis obtained the best diagnostic performance in this study (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 compared with SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time showed superior diagnostic performance in the differentiation of breast lesions compared with conventional quantitative analysis.
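
    The AUROC figures above estimate the probability that a malignant lesion scores higher than a benign one; a minimal sketch of that computation (the Mann-Whitney form, with invented scores and labels) is:

```python
def auroc(scores, labels):
    """AUROC via the Mann-Whitney U statistic: the probability that a randomly
    chosen positive scores higher than a randomly chosen negative (ties count 0.5)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Invented SWS-like scores: perfectly ranked positives give AUROC 1.0
print(auroc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0
```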

  18. A Quantitative ADME-based Tool for Exploring Human ...

    EPA Pesticide Factsheets

    Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It

  19. Quantitative Microbial Risk Assessment Tutorial: HSPF Setup, Application, and Calibration of Flows and Microbial Fate and Transport on an Example Watershed

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) infrastructure that automates the manual process of characterizing transport of pathogens and microorganisms, from the source of release to a point of exposure, has been developed by loosely configuring a set of modules and process-...

  20. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  1. Quantitative evaluation of learning and memory trace in studies of mnemotropic effects of immunotropic drugs.

    PubMed

    Kiseleva, N M; Novoseletskaya, A V; Voevodina, Ye B; Kozlov, I G; Inozemtsev, A N

    2012-12-01

    Apart from restoration of disordered immunological parameters, tactivin and derinat exhibit a pronounced effect on the higher integrative functions of the brain. Experiments on Wistar rats have shown that these drugs accelerated conditioning of food and defense responses. New methods for quantitative evaluation of memory trace consolidation are proposed.

  2. Skin sensitization quantitative risk assessment for occupational exposure of hairdressers to hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Blömeke, Brunhilde; Gaspari, Anthony A; Schnuch, Axel; Fuchs, Anne; Schlotmann, Kordula; Krasteva, Maya; Kimber, Ian

    2018-06-01

    Occupational exposure of hairdressers to hair dyes has been associated with the development of allergic contact dermatitis (ACD) involving the hands. p-Phenylenediamine (PPD) and toluene-2,5-diamine (PTD) have been implicated as important occupational contact allergens. To conduct a quantitative risk assessment for the induction of contact sensitization to hair dyes in hairdressers, available data from hand rinsing studies following typical occupational exposure conditions to PPD, PTD and resorcinol were assessed. By accounting for wet work, uneven exposure and inter-individual variability for professionals, daily hand exposure concentrations were derived. Secondly, daily hand exposure was compared with the sensitization induction potency of the individual hair dye, defined as the No Expected Sensitization Induction Level (NESIL). For PPD and PTD, hairdresser hand exposure levels were 2.7- and 5.9-fold below the individual NESIL. In contrast, hand exposure to resorcinol was 50-fold below the NESIL. Correspondingly, the risk assessment for PPD and PTD indicates that contact sensitization may occur when skin protection and skin care are not rigorously applied. We conclude that awareness of health risks associated with occupational exposure to hair dyes, and of the importance of adequate protective measures, should be emphasized more fully during hairdresser education and training. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    PubMed

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the estimated gestational age by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
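
    The reported R = 0.88 is a correlation coefficient; a minimal sketch of the Pearson form, on invented (texture feature, gestational age) pairs, is:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented, perfectly linear data: estimated vs. true gestational age (weeks)
print(pearson_r([20.0, 28.0, 36.0], [1.0, 2.0, 3.0]))  # ≈ 1.0
```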

  4. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
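
    The concentration measure described above is commonly the Herfindahl-Hirschman index over country production shares; a sketch combining it with a political-risk rating, using invented shares and ratings, is:

```python
def hhi(shares):
    """Herfindahl-Hirschman index: sum of squared market shares
    (fractions summing to 1). Values near 1 mean supply is concentrated
    in a single producer."""
    return sum(s * s for s in shares)

def supply_risk_score(shares, risk_ratings):
    """Share-weighted political risk, one rating (0-1 scale) per country."""
    return sum(s * r for s, r in zip(shares, risk_ratings))

# Hypothetical mineral: one dominant producer with a high risk rating
shares = [0.80, 0.10, 0.05, 0.05]
ratings = [0.7, 0.2, 0.1, 0.1]
print(hhi(shares))                       # ≈ 0.655 (highly concentrated)
print(supply_risk_score(shares, ratings))  # ≈ 0.59
```

Comparing these two numbers across commodities gives the kind of cross-mineral ranking of supply risk the abstract describes.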

  5. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real world credit applications from the Australian credit approval datasets. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in the automatic processing of credit applications.
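
    A minimal sketch of such a back-propagation network, trained on an invented two-feature approve/reject toy dataset (not the Australian credit data, and not the paper's architecture), might look like:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Invented toy data: two scaled features -> approve (1) / reject (0)
data = [([0.9, 0.1], 1), ([0.8, 0.2], 1), ([0.2, 0.9], 0), ([0.1, 0.8], 0)]

n_hidden = 3
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(n_hidden)]
b1 = [0.0] * n_hidden
w2 = [random.uniform(-1, 1) for _ in range(n_hidden)]
b2 = 0.0
lr = 0.5

for _ in range(2000):
    for x, y in data:
        # Forward pass through one hidden layer
        h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in zip(w1, b1)]
        out = sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)
        # Backpropagate squared-error gradients
        d_out = (out - y) * out * (1.0 - out)
        b2 -= lr * d_out
        for j in range(n_hidden):
            d_h = d_out * w2[j] * h[j] * (1.0 - h[j])
            w2[j] -= lr * d_out * h[j]
            b1[j] -= lr * d_h
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]

def predict(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in zip(w1, b1)]
    return sigmoid(sum(w * hi for w, hi in zip(w2, h)) + b2)
```

A score above 0.5 would map to "approve" and below 0.5 to "reject", mirroring the binary decision the system makes.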

  6. Quantitative Microbial Risk Assessment Models for Consumption of Raw Vegetables Irrigated with Reclaimed Water

    PubMed Central

    Hamilton, Andrew J.; Stagnitti, Frank; Premier, Robert; Boland, Anne-Maree; Hale, Glenn

    2006-01-01

    Quantitative microbial risk assessment models for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead irrigated with nondisinfected secondary treated reclaimed water were constructed. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since last irrigation event. The mean annual risk of infection was always less for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10−3 to 10−1 when reclaimed-water irrigation ceased 1 day before harvest and from 10−9 to 10−3 when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of ≤10−4, i.e., one infection or less per 10,000 people per year, providing that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when this withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10−4 standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered to be worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food preparation
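
    The die-off and annualization steps underlying such models can be sketched as follows; the rate constant and per-serving risk are invented, and the actual QMRA additionally uses a dose-response model that is not shown here:

```python
import math

def concentration_after_decay(c0, k_per_day, days):
    """First-order viral die-off on the crop: c(t) = c0 * exp(-k * t)."""
    return c0 * math.exp(-k_per_day * days)

def annual_risk(p_per_exposure, exposures_per_year=365):
    """Annual infection risk from independent exposure events:
    1 - (1 - p)^n."""
    return 1.0 - (1.0 - p_per_exposure) ** exposures_per_year

# A tiny per-serving risk still accumulates over a year of daily servings
print(annual_risk(1e-6))  # ≈ 3.65e-4, above the 1e-4 benchmark
```

This illustrates why the withholding period matters so much in the models: extra days of decay shrink the per-serving risk before it is compounded across the year.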

  7. A combined pulmonary-radiology workshop for visual evaluation of COPD: study design, chest CT findings and concordance with quantitative evaluation.

    PubMed

    Barr, R Graham; Berkowitz, Eugene A; Bigazzi, Francesca; Bode, Frederick; Bon, Jessica; Bowler, Russell P; Chiles, Caroline; Crapo, James D; Criner, Gerard J; Curtis, Jeffrey L; Dass, Chandra; Dirksen, Asger; Dransfield, Mark T; Edula, Goutham; Erikkson, Leif; Friedlander, Adam; Galperin-Aizenberg, Maya; Gefter, Warren B; Gierada, David S; Grenier, Philippe A; Goldin, Jonathan; Han, MeiLan K; Hanania, Nicola A; Hansel, Nadia N; Jacobson, Francine L; Kauczor, Hans-Ulrich; Kinnula, Vuokko L; Lipson, David A; Lynch, David A; MacNee, William; Make, Barry J; Mamary, A James; Mann, Howard; Marchetti, Nathaniel; Mascalchi, Mario; McLennan, Geoffrey; Murphy, James R; Naidich, David; Nath, Hrudaya; Newell, John D; Pistolesi, Massimo; Regan, Elizabeth A; Reilly, John J; Sandhaus, Robert; Schroeder, Joyce D; Sciurba, Frank; Shaker, Saher; Sharafkhaneh, Amir; Silverman, Edwin K; Steiner, Robert M; Strange, Charlton; Sverzellati, Nicola; Tashjian, Joseph H; van Beek, Edwin J R; Washington, Lacey; Washko, George R; Westney, Gloria; Wood, Susan A; Woodruff, Prescott G

    2012-04-01

    The purposes of this study were: to describe chest CT findings in normal non-smoking controls and cigarette smokers with and without COPD; to compare the prevalence of CT abnormalities with severity of COPD; and to evaluate concordance between visual and quantitative chest CT (QCT) scoring. Volumetric inspiratory and expiratory CT scans of 294 subjects, including normal non-smokers, smokers without COPD, and smokers with GOLD Stage I-IV COPD, were scored at a multi-reader workshop using a standardized worksheet. There were 58 observers (33 pulmonologists, 25 radiologists); each scan was scored by 9-11 observers. Interobserver agreement was calculated using kappa statistic. Median score of visual observations was compared with QCT measurements. Interobserver agreement was moderate for the presence or absence of emphysema and for the presence of panlobular emphysema; fair for the presence of centrilobular, paraseptal, and bullous emphysema subtypes and for the presence of bronchial wall thickening; and poor for gas trapping, centrilobular nodularity, mosaic attenuation, and bronchial dilation. Agreement was similar for radiologists and pulmonologists. The prevalence on CT readings of most abnormalities (e.g. emphysema, bronchial wall thickening, mosaic attenuation, expiratory gas trapping) increased significantly with greater COPD severity, while the prevalence of centrilobular nodularity decreased. Concordances between visual scoring and quantitative scoring of emphysema, gas trapping and airway wall thickening were 75%, 87% and 65%, respectively. Despite substantial inter-observer variation, visual assessment of chest CT scans in cigarette smokers provides information regarding lung disease severity; visual scoring may be complementary to quantitative evaluation.
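
    Interobserver agreement via kappa can be sketched in its two-rater (Cohen) form; the workshop's 9-11 readers per scan would call for a multi-rater variant such as Fleiss' kappa, and the labels below are invented:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels on the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n  # observed agreement
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1.0 - pe)

# Invented emphysema present/absent calls from two readers
reader1 = [1, 1, 0, 0, 1, 0]
reader2 = [1, 1, 0, 1, 1, 0]
print(cohens_kappa(reader1, reader2))
```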

  8. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    USDA-ARS?s Scientific Manuscript database

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  9. Risk assessment techniques with applicability in marine engineering

    NASA Astrophysics Data System (ADS)

    Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.

    2015-11-01

    Nowadays risk management is a carefully planned process. The task of risk management is woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since risk must first be analyzed and evaluated before it can be managed. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; i.e., risk assessment methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider fault tree analysis (FTA) as a risk assessment technique. The objectives are: understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and weigh the advantages and disadvantages of FTA. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the Top event. The steps of this analysis are: examination of the system from top to bottom, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the Top event. Results: The study yields critical areas, fault tree logic diagrams, and the probability of the Top event. These results can be used for risk assessment analyses.
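
    The Top-event computation described above can be sketched for independent basic events; the gate structure and probabilities below are invented for illustration, not taken from the paper:

```python
def or_gate(probs):
    """OR gate: P = 1 - prod(1 - p_i), assuming independent input events."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(probs):
    """AND gate: P = prod(p_i), assuming independent input events."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical tree: Top = (pump fails OR valve sticks) AND backup fails
p_top = and_gate([or_gate([0.01, 0.02]), 0.05])
print(p_top)  # ≈ 0.00149
```

Evaluating the tree bottom-up like this, gate by gate, is exactly the "mathematical tools" step: each subtree's probability feeds the gate above it until the Top event is reached.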

  10. Quantitative Risk - Phase 1

    DTIC Science & Technology

    2013-05-29

    (Deshmukh, 2010), p. 128]. 1. RISK MANAGEMENT IS MANY... final report of a previous SERC research topic, valuing flexibility (RT-18), is dispositive (Deshmukh... Hall. Beer, S. (1979). The heart of the enterprise. New York: Wiley. Deshmukh,

  11. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in production of pork meat and products across MSs. For the case study MSs the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10⁴ CFU/g). Therefore, it is concluded that interventions should be focused on either decreasing the level of Salmonella in the feces of infected pigs, the introduction of a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or a control step to reduce the level of Salmonella on the carcass post-evisceration. © 2016 Society for Risk Analysis.

  12. Interactive graphics to demonstrate health risks: formative development and qualitative evaluation

    PubMed Central

    Ancker, Jessica S.; Chan, Connie; Kukafka, Rita

    2015-01-01

    Background Recent findings suggest that interactive game-like graphics might be useful in communicating probabilities. We developed a prototype for a risk communication module, focusing on eliciting users’ preferences for different interactive graphics and assessing usability and user interpretations. Methods Focus groups and iterative design methods. Results Feedback from five focus groups was used to design the graphics. The final version displayed a matrix of square buttons; clicking on any button allowed the user to see whether the stick figure underneath was affected by the health outcome. When participants used this interaction to learn about a risk, they expressed more emotional responses, both positive and negative, than when viewing any static graphic or numerical description of a risk. Their responses included relief about small risks and concern about large risks. The groups also commented on static graphics: arranging the figures affected by disease randomly throughout a group of figures made it more difficult to judge the proportion affected but was described as more realistic. Conclusions Interactive graphics appear to have potential for expressing risk magnitude as well as the affective feeling of risk. Quantitative studies are planned to assess the effect on perceived risks and estimated risk magnitudes. PMID:19657926

  13. Development and Justification of a Risk Evaluation Matrix To Guide Chemical Testing Necessary To Select and Qualify Plastic Components Used in Production Systems for Pharmaceutical Products.

    PubMed

    Jenke, Dennis

    2015-01-01

    An accelerating trend in the pharmaceutical industry is the use of plastic components in systems used to produce an active pharmaceutical ingredient or a finished drug product. If the active pharmaceutical ingredient, the finished drug product, or any solution used to generate them (for example, a process stream such as media, buffers, eluents, and the like) is contacted by a plastic component at any time during the production process, substances leached from the component may accumulate in the active pharmaceutical ingredient or finished drug product, affecting its safety and/or efficacy. In this article the author develops and justifies a semi-quantitative risk evaluation matrix that is used to determine the amount and rigor of component testing necessary and appropriate to establish that the component is chemically suitable for its intended use. By considering key properties of the component, the contact medium, the contact conditions, and the active pharmaceutical ingredient's or finished drug product's clinical conditions of use, use of the risk evaluation matrix produces a risk score whose magnitude reflects the accumulated risk that the component will interact with the contact solution to such an extent that component-related extractables will accumulate in the active pharmaceutical ingredient or finished drug product as leachables at levels sufficiently high to adversely affect user safety. The magnitude of the risk score establishes the amount and rigor of the testing that is required to select and qualify the component, and such testing is broadly grouped into three categories: baseline assessment, general testing, and full testing (extractables profiling). Production suites used to generate pharmaceuticals can include plastic components. It is possible that substances in the components could leach into manufacturing solutions and accumulate in the pharmaceutical product. In this article the author develops and justifies a semi-quantitative risk

  14. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber having 0.7 mm outer diameter, and an irradiation fiber which consists of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected by an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
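
    The spectral angle mapper (SAM) step can be sketched as the angle between a measured pixel spectrum and a reference spectrum; the three-band reflectances below (at 1150, 1200, and 1300 nm) are invented for illustration:

```python
import math

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference.
    Small angles mean the spectral shapes match, independent of brightness."""
    dot = sum(x * y for x, y in zip(pixel, reference))
    n_pix = math.sqrt(sum(x * x for x in pixel))
    n_ref = math.sqrt(sum(x * x for x in reference))
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    return math.acos(max(-1.0, min(1.0, dot / (n_pix * n_ref))))

# Invented lipid reference; a dimmer pixel with the same shape matches it
lipid_ref = [0.40, 0.15, 0.35]
print(spectral_angle([0.80, 0.30, 0.70], lipid_ref))  # scaled copy → ≈ 0
```

Because the angle ignores overall intensity, SAM classification is robust to uneven illumination inside the vessel, which suits an angioscope with a single irradiation fiber.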

  15. Quantitative image quality evaluation of MR images using perceptual difference models

    PubMed Central

    Miao, Jun; Huo, Donglai; Wilson, David L.

    2008-01-01

    The authors are using a perceptual difference model (Case-PDM) to quantitatively evaluate image quality of the thousands of test images which can be created when optimizing fast magnetic resonance (MR) imaging strategies and reconstruction techniques. In this validation study, they compared human evaluation of MR images from multiple organs and from multiple image reconstruction algorithms to Case-PDM and similar models. The authors found that Case-PDM compared very favorably to human observers in double-stimulus continuous-quality scale and functional measurement theory studies over a large range of image quality. The Case-PDM threshold for nonperceptible differences in a 2-alternative forced choice study varied with the type of image under study, but was ≈1.1 for diffuse image effects, providing a rule of thumb. Ordering the image quality evaluation models, we found overall Case-PDM ≈ IDM (Sarnoff Corporation) ≈ SSIM [Wang et al. IEEE Trans. Image Process. 13, 600–612 (2004)] > mean squared error ≈ NR [Wang et al. (2004) (unpublished)] > DCTune (NASA) > IQM (MITRE Corporation). The authors conclude that Case-PDM is very useful in MR image evaluation but that one should probably restrict studies to similar images and similar processing, normally not a limitation in image reconstruction studies. PMID:18649487
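
    Of the compared measures, mean squared error is the simplest baseline; a minimal sketch, with PSNR added as its common decibel form (an assumption for illustration, not a metric the study reports):

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means more similar images."""
    return 10.0 * math.log10(peak * peak / mse(a, b))

# Invented 4-pixel rows from a reference and a reconstructed image
print(mse([10.0, 20.0, 30.0, 40.0], [12.0, 18.0, 30.0, 44.0]))
```

Unlike Case-PDM or SSIM, MSE treats every pixel error identically regardless of visibility, which is why it ranks below the perceptual models in the ordering above.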

  16. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  17. Clinical evaluation of a new pressure ulcer risk assessment instrument, the Pressure Ulcer Risk Primary or Secondary Evaluation Tool (PURPOSE T).

    PubMed

    Coleman, Susanne; Smith, Isabelle L; McGinnis, Elizabeth; Keen, Justin; Muir, Delia; Wilson, Lyn; Stubbs, Nikki; Dealey, Carol; Brown, Sarah; Nelson, E Andrea; Nixon, Jane

    2018-02-01

    To test the psychometric properties and clinical usability of a new Pressure Ulcer Risk Assessment Instrument, including inter-rater and test-retest reliability, convergent validity and data completeness. Methodological and practical limitations associated with traditional Pressure Ulcer Risk Assessment Instruments prompted a programme of work to develop a new instrument, as part of the National Institute for Health Research funded Pressure UlceR Programme Of reSEarch (RP-PG-0407-10056). Observational field test. For this clinical evaluation, 230 patients were purposefully sampled across four broad levels of pressure ulcer risk, with representation from four secondary care and four community NHS Trusts in England. Blinded and simultaneous paired (ward/community nurse and expert nurse) PURPOSE-T assessments were undertaken. Follow-up retest was undertaken by the expert nurse. Field notes of PURPOSE-T use were collected. Data were collected October 2012-January 2013. The clinical evaluation demonstrated "very good" (kappa) inter-rater and test-retest agreement for the PURPOSE-T assessment decision overall. The percentage agreement for "problem/no problem" was over 75% for the main risk factors. Convergent validity demonstrated moderate to high associations with other measures of similar constructs. The PURPOSE-T evaluation facilitated the initial validation and clinical usability assessment of the instrument and demonstrated that PURPOSE-T is suitable for use in clinical practice. Further study is needed to evaluate the impact of using the instrument on care processes and outcomes. © 2017 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.

  18. Blood color is influenced by inflammation and independently predicts survival in hemodialysis patients: quantitative evaluation of blood color.

    PubMed

    Shibata, Masanori; Nagai, Kojiro; Doi, Toshio; Tawada, Hideo; Taniguchi, Shinkichi

    2012-11-01

    The blood color of dialysis patients can be observed routinely. Darkened blood color is often seen in critically ill patients, generally because of decreased oxygen saturation, but little is known about other factors responsible for the color intensity. In addition, quantitative blood color examination had not previously been performed, so the predictive power of blood color had never been evaluated. The aim of this study was to evaluate whether darkened blood color reflects medical problems and is associated with a survival disadvantage. The study design was a prospective cohort. One hundred sixty-seven patients were enrolled. Quantification of blood color was done using a reflected light colorimeter. Demographic and clinical data were collected to identify factors related to blood color. Patients were followed for 2 years to analyze risk factors for survival. Regression analysis showed that C-reactive protein and white blood cell count were negatively correlated with blood color. In addition, blood color was positively correlated with mean corpuscular hemoglobin concentration and serum sodium concentration, as well as blood oxygen saturation. During follow-up, 34 (20.4%) patients died. Cox regression analysis revealed that darkened blood color was an independent significant risk factor for mortality in hemodialysis patients, as were low albumin and low Kt/V. These results suggest that inflammation independently affects blood color and that quantification of blood color is useful for estimating prognosis in patients undergoing hemodialysis. It is possible that early detection of blood color worsening can improve patients' survival. © 2012, Copyright the Authors. Artificial Organs © 2012, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  19. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging

    PubMed Central

    Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-01-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification for detecting and evaluating mild-to-moderate anatomical abnormalities has been emphasized, because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development in the pediatric population. To interpret the values from these MR modalities, a “growth percentile chart,” which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured structures. To avoid inter- and intra-reader variability in anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, is introduced.

  20. A quantitative approach to evaluating caring in nursing simulation.

    PubMed

    Eggenberger, Terry L; Keller, Kathryn B; Chase, Susan K; Payne, Linda

    2012-01-01

    This study was designed to test a quantitative method of measuring caring in the simulated environment. Since competency in caring is central to nursing practice, ways of including caring concepts in designing scenarios and in evaluation of performance need to be developed. Coates' Caring Efficacy scales were adapted for simulation and named the Caring Efficacy Scale-Simulation Student Version (CES-SSV) and Caring Efficacy Scale-Simulation Faculty Version (CES-SFV). A correlational study was designed to compare student self-ratings with faculty ratings on caring efficacy during an adult acute simulation experience with traditional and accelerated baccalaureate students in a nursing program grounded in caring theory. Student self-ratings were significantly correlated with objective ratings (r = 0.345, 0.356). Both the CES-SSV and the CES-SFV were found to have excellent internal consistency and significantly correlated interrater reliability. They were useful in measuring caring in the simulated learning environment.

  1. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The formation of cognitive schemes for plant anatomy concepts relies on processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task in which students analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy and a questionnaire were used to evaluate the course. Quantitative literacy was assessed with a test using the rubric from the Association of American Colleges and Universities, and complex thinking in plant anatomy with a test designed according to Marzano. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students was better than that of biology students.

  2. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    PubMed

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were estimated separately from the respiratory-gated-only and cardiac-gated-only images. In Method 3, the effects of RM on CM estimation were modeled by applying an image-based RM correction to the cardiac gated images before CM estimation, while the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two further noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images, and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case, while Method 1 performed the worst for noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSE lower by up to 35% than that of Method 1 for noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging.
    Comparison of the performance of the four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
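The RMSE figure of merit used to compare the four methods can be sketched generically for dense motion vector fields; the array shape below is an illustrative assumption, not taken from the paper:

```python
import numpy as np

def mvf_rmse(est, truth):
    """Root mean square error between an estimated and a ground-truth motion
    vector field, each of shape (nx, ny, nz, 3): per-voxel squared vector
    error, averaged over voxels, then square-rooted."""
    return float(np.sqrt(np.mean(np.sum((est - truth) ** 2, axis=-1))))

rng = np.random.default_rng(0)
truth = rng.normal(size=(8, 8, 8, 3))            # hypothetical ground-truth MVF
est = truth + 0.1 * rng.normal(size=truth.shape)  # small estimation error
print(mvf_rmse(est, truth))
```

A 35% lower RMSE, as reported for Methods 3 and 4 versus Method 1, corresponds directly to a 35% smaller value of this quantity on the same ground truth.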

  3. Risk assessment and remedial policy evaluation using predictive modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linkov, L.; Schell, W.R.

    1996-06-01

    As a result of nuclear industry operation and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the radionuclide fate in forest compartments after deposition, as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. Risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of a fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in environmental remediation of radionuclides and toxic metals, as well as in dose reconstruction and risk assessment.
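The kind of compartment transfer that FORESTPATH predicts can be illustrated with a generic first-order model; the compartment names and rate constants below are hypothetical placeholders, not the model's actual structure or parameterization:

```python
# Generic first-order compartment model for radionuclide redistribution
# after deposition (illustrative only; not FORESTPATH itself).
def step(inventory, rates, dt=0.1):
    """Advance compartment inventories by one explicit-Euler time step.
    rates[(src, dst)] is a first-order transfer coefficient (1/yr)."""
    flux = {pair: k * inventory[pair[0]] * dt for pair, k in rates.items()}
    out = dict(inventory)
    for (src, dst), f in flux.items():
        out[src] -= f
        out[dst] += f
    return out

inventory = {"canopy": 100.0, "organic_layer": 0.0, "mineral_soil": 0.0}
rates = {("canopy", "organic_layer"): 1.0,
         ("organic_layer", "mineral_soil"): 0.05}
for _ in range(100):  # 10 years at dt = 0.1 yr
    inventory = step(inventory, rates)
```

Because transfers are first-order, intervening early (e.g. removing canopy material before it migrates to the organic layer) removes far more activity than the same action taken later, which is why time of intervention matters for remediation efficiency.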

  4. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to identify the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  5. Proposal of a risk-factor-based analytical approach for integrating occupational health and safety into project risk evaluation.

    PubMed

    Badri, Adel; Nadeau, Sylvie; Gbodossou, André

    2012-09-01

    Excluding occupational health and safety (OHS) from project management is no longer acceptable. Numerous industrial accidents have exposed the ineffectiveness of conventional risk evaluation methods, as well as negligence of risk factors having a major impact on the health and safety of workers and nearby residents. The lack of reliable and complete evaluations from the beginning of a project generates bad decisions that could end up threatening the very existence of an organization. This article supports a systematic approach to the evaluation of OHS risks and proposes a new procedure based on the number of risk factors identified and their relative significance. A new concept called risk factor concentration, along with weighting of risk factor categories as contributors to undesirable events, is used in the analytic hierarchy process multi-criteria comparison model with Expert Choice(©) software. A case study is used to illustrate the various steps of the risk evaluation approach and the quick and simple integration of OHS at an early stage of a project. The approach allows continual reassessment of criteria over the course of the project or when new data are acquired. It was thus possible to differentiate the OHS risks from the risk of a drop in quality in the case of the factory expansion project. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Evaluation of patients with painful total hip arthroplasty using combined single photon emission tomography and conventional computerized tomography (SPECT/CT) - a comparison of semi-quantitative versus 3D volumetric quantitative measurements.

    PubMed

    Barthassat, Emilienne; Afifi, Faik; Konala, Praveen; Rasch, Helmut; Hirschmann, Michael T

    2017-05-08

    The primary purpose of our study was to evaluate the inter- and intra-observer reliability of a standardized SPECT/CT algorithm for evaluating patients with painful primary total hip arthroplasty (THA). The secondary purpose was a comparison of semi-quantitative and 3D volumetric quantification methods for assessment of bone tracer uptake (BTU) in those patients. A novel SPECT/CT localization scheme consisting of 14 femoral and 4 acetabular regions on standardized axial and coronal slices was introduced and evaluated in terms of inter- and intra-observer reliability in 37 consecutive patients with hip pain after THA. BTU for each anatomical region was assessed semi-quantitatively using a color-coded Likert-type scale (0-10) and volumetrically quantified using validated software. Two observers interpreted the SPECT/CT findings in all patients twice, in random order, with a six-week interval between interpretations. Semi-quantitative and quantitative measurements were compared in terms of reliability, and the values were correlated using Pearson's correlation. A factorial cluster analysis of BTU was performed to identify clinically relevant regions that should be grouped and analysed together. The localization scheme showed high inter- and intra-observer reliabilities for all femoral and acetabular regions independent of the measurement method used (semi-quantitative versus 3D volumetric quantitative measurements). A high to moderate correlation between the two measurement methods was shown for the distal femur, the proximal femur and the acetabular cup. The factorial cluster analysis showed that the anatomical regions might be summarized into three distinct anatomical regions: the proximal femur, the distal femur and the acetabular cup region. The SPECT/CT algorithm for assessment of patients with pain after THA is highly reliable independent of the measurement method used. Three clinically relevant anatomical regions (proximal femoral

  7. Evaluating the risks of clinical research: direct comparative analysis.

    PubMed

    Rid, Annette; Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S; Wendler, David

    2014-09-01

    Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. This study employed conceptual and normative analysis together with an illustrative example. Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks.

  8. A quantitative evaluation of cell migration by the phagokinetic track motility assay.

    PubMed

    Nogalski, Maciej T; Chan, Gary C T; Stevenson, Emily V; Collins-McMillen, Donna K; Yurochko, Andrew D

    2012-12-04

    Cellular motility is an important biological process for both unicellular and multicellular organisms. It is essential for the movement of unicellular organisms towards a source of nutrients or away from unsuitable conditions, and in multicellular organisms for tissue development, immune surveillance and wound healing, to mention just a few roles(1,2,3). Deregulation of this process can lead to serious neurological, cardiovascular and immunological diseases, as well as exacerbated tumor formation and spread(4,5). Molecularly, actin polymerization and receptor recycling have been shown to play important roles in creating the cellular extensions (lamellipodia) that drive the forward movement of the cell(6,7,8). However, many biological questions about cell migration remain unanswered. The central role of cellular motility in human health and disease underlines the importance of understanding the specific mechanisms involved and makes accurate methods for evaluating cell motility particularly important. Microscopes are usually used to visualize the movement of cells. However, cells move rather slowly, making the quantitative measurement of cell migration a resource-consuming process requiring expensive cameras and software to create quantitative time-lapsed movies of motile cells. Therefore, a quantitative measurement of cell migration that is cost-effective, non-laborious, and uses common laboratory equipment would address a great need for many researchers. The phagokinetic track motility assay utilizes the ability of a moving cell to clear gold particles from its path, creating a measurable track on a colloidal gold-coated glass coverslip(9,10). With the use of freely available software, multiple tracks can be evaluated for each treatment to meet statistical requirements.
    The assay can be utilized to assess the motility of many cell types, such as cancer cells(11,12), fibroblasts(9), neutrophils(13), skeletal muscle cells(14

  9. Joint and separate evaluation of risk reduction: impact on sensitivity to risk reduction magnitude in the context of 4 different risk information formats.

    PubMed

    Gyrd-Hansen, Dorte; Halvorsen, Peder; Nexøe, Jørgen; Nielsen, Jesper; Støvring, Henrik; Kristiansen, Ivar

    2011-01-01

    When people make choices, they may have multiple options presented simultaneously or, alternatively, have options presented 1 at a time. It has been shown that if decision makers have little experience with or difficulties in understanding certain attributes, these attributes will have greater impact in joint evaluations than in separate evaluations. The authors investigated the impact of separate versus joint evaluations in a health care context in which laypeople were presented with the possibility of participating in risk-reducing drug therapies. In a randomized study comprising 895 subjects aged 40 to 59 y in Odense, Denmark, subjects were randomized to receive information in terms of absolute risk reduction (ARR), relative risk reduction (RRR), number needed to treat (NNT), or prolongation of life (POL), all with respect to heart attack, and they were asked whether they would be willing to receive a specified treatment. Respondents were randomly allocated to valuing the interventions separately (either great effect or small effect) or jointly (small effect and large effect). Joint evaluation reduced the propensity to accept the intervention that offered the smallest effect. Respondents were more sensitive to scale when faced with a joint evaluation for information formats ARR, RRR, and POL but not for NNT. Evaluability bias appeared to be most pronounced for POL and ARR. Risk information appears to be prone to evaluability bias. This suggests that numeric information on health gains is difficult to evaluate in isolation. Consequently, such information may bear too little weight in separate evaluations of risk-reducing interventions.
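The four information formats compared in the trial are arithmetically related descriptions of the same treatment effect. A minimal sketch, with hypothetical heart-attack risks (POL is simply carried through as life-years gained, since it comes from a separate survival calculation):

```python
def risk_formats(baseline_risk, treated_risk, life_years_gained):
    """Express one risk reduction in the four formats used in the trial."""
    arr = baseline_risk - treated_risk   # absolute risk reduction
    rrr = arr / baseline_risk            # relative risk reduction
    nnt = 1.0 / arr                      # number needed to treat
    return {"ARR": arr, "RRR": rrr, "NNT": nnt, "POL": life_years_gained}

# Hypothetical: 10% untreated vs. 8% treated risk of heart attack
fmt = risk_formats(0.10, 0.08, life_years_gained=0.3)
print(fmt)  # ARR ≈ 0.02, RRR ≈ 0.2, NNT ≈ 50
```

The same effect thus reads as "2 percentage points", "20% relative reduction", or "treat 50 people to prevent one event", which is why sensitivity to effect magnitude can differ across formats.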

  10. Quantitative Risk - Phases 1 & 2

    DTIC Science & Technology

    2013-11-12

    (Abstract not recoverable from extraction; surviving fragments are report-documentation-page fields and figure titles, including "Connecting technical risk and types of complexity" and "Complexity evolution throughout the systems acquisition lifecycle".)

  11. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  12. Adding an alcohol-related risk score to an existing categorical risk classification for older adults: sensitivity to group differences.

    PubMed

    Wilson, Sandra R; Fink, Arlene; Verghese, Shinu; Beck, John C; Nguyen, Khue; Lavori, Philip

    2007-03-01

    To evaluate a new alcohol-related risk score for research use. Using data from a previously reported trial of a screening and education system for older adults (Computerized Alcohol-Related Problems Survey), secondary analyses were conducted comparing the ability of two different measures of risk to detect post-intervention group differences: the original categorical outcome measure and a new, finely grained quantitative risk score based on the same research-based risk factors. Three primary care group practices in southern California. Six hundred sixty-five patients aged 65 and older. A previously calculated three-level categorical classification of alcohol-related risk and a newly developed quantitative risk score. Mean post-intervention risk scores differed between the three experimental conditions: usual care, patient report, and combined report (P<.001). The difference between the combined report and usual care was significant (P<.001) and directly proportional to baseline risk. The three-level risk classification failed to detect approximately 57.3% of the intervention effect detected by the risk score. The risk score was also sufficiently sensitive to detect the intervention effect within the subset of hypertensive patients (n=112; P=.001). As an outcome measure in intervention trials, the finely grained risk score is more sensitive than the three-level risk classification. The additional clinical value of the risk score relative to the categorical measure needs to be determined.
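Why a finely grained score detects effects that a categorical classification misses can be seen in a small sketch: collapsing the score discards any change that does not cross a category boundary. The cut-points and labels below are hypothetical, not the actual scoring rules of the survey:

```python
def categorize(score, cutpoints=(4, 8)):
    """Collapse a quantitative risk score into a three-level classification."""
    low, high = cutpoints
    if score < low:
        return "low"
    return "moderate" if score < high else "high"

# Two hypothetical patients each improve by 3 points after the intervention,
# but only the second crosses a category boundary:
before = [7.5, 5.0]
after = [4.5, 2.0]
score_change = [b - a for b, a in zip(before, after)]                 # [3.0, 3.0]
cat_change = [categorize(b) != categorize(a) for b, a in zip(before, after)]
print(score_change, cat_change)  # the categorical measure sees only one change
```

Averaged over a trial arm, such within-category improvements are exactly the portion of the intervention effect the coarser measure cannot reveal.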

  13. Risk evaluation mitigation strategies: the evolution of risk management policy.

    PubMed

    Hollingsworth, Kristen; Toscani, Michael

    2013-04-01

    The United States Food and Drug Administration (FDA) has the primary regulatory responsibility to ensure that medications are safe and effective both prior to drug approval and while the medication is being actively marketed by manufacturers. The responsibility for safe medications prior to marketing was signed into law in 1938 under the Federal Food, Drug, and Cosmetic Act; however, a significant risk management evolution has taken place since 1938. Additional federal rules, entitled the Food and Drug Administration Amendments Act, were established in 2007 and extended the government's oversight through the addition of a Risk Evaluation and Mitigation Strategy (REMS) for certain drugs. REMS is a mandated strategy to manage a known or potentially serious risk associated with a medication or biological product. Reasons for this extension of oversight were driven primarily by the FDA's movement to ensure that patients and providers are better informed of drug therapies and their specific benefits and risks prior to initiation. This article provides an historical perspective of the evolution of medication risk management policy and includes a review of REMS programs, an assessment of the positive and negative aspects of REMS, and provides suggestions for planning and measuring outcomes. In particular, this publication presents an overview of the evolution of the REMS program and its implications.

  14. Quantitative evaluation of malignant gliomas damage induced by photoactivation of IR700 dye

    NASA Astrophysics Data System (ADS)

    Sakuma, Morito; Kita, Sayaka; Higuchi, Hideo

    2016-01-01

    The processes involved in malignant gliomas damage were quantitatively evaluated by microscopy. The near-infrared fluorescent dye IR700 that is conjugated to an anti-CD133 antibody (IR700-CD133) specifically targets malignant gliomas (U87MG) and stem cells (BT142) and is endocytosed into the cells. The gliomas are then photodamaged by the release of reactive oxygen species (ROS) and the heat induced by illumination of IR700 by a red laser, and the motility of the vesicles within these cells is altered as a result of cellular damage. To investigate these changes in motility, we developed a new method that measures fluctuations in the intensity of phase-contrast images obtained from small areas within cells. The intensity fluctuation in U87MG cells gradually decreased as cell damage progressed, whereas the fluctuation in BT142 cells increased. The endocytosed IR700 dye was co-localized in acidic organelles such as endosomes and lysosomes. The pH in U87MG cells, as monitored by a pH indicator, was decreased and then gradually increased by the illumination of IR700, while the pH in BT142 cells increased monotonically. In these experiments, the processes of cell damage were quantitatively evaluated according to the motility of vesicles and changes in pH.

  15. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges.

    PubMed

    Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology through the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator, and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for evaluating the current performance status of participants as well as the evolution of their technical competency over time.

  16. Evaluating Attitudes, Skill, and Performance in a Learning-Enhanced Quantitative Methods Course: A Structural Modeling Approach.

    ERIC Educational Resources Information Center

    Harlow, Lisa L.; Burkholder, Gary J.; Morrow, Jennifer A.

    2002-01-01

    Used a structural modeling approach to evaluate relations among attitudes, initial skills, and performance in a Quantitative Methods course that involved students in active learning. Results largely confirmed hypotheses offering support for educational reform efforts that propose actively involving students in the learning process, especially in…

  17. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based on reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, both causes and effects, are represented in the framework as nodes using a Bayesian network analysis approach, thus transferring the risk analysis results from failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce its influence and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform, and quality assurance actions were further defined to reduce the risk and improve product quality.
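The node-importance idea in the record above can be illustrated with a toy reliability calculation. This is a hedged sketch only, not the paper's Bayesian network: it assumes independent risk nodes in series, and the node names and failure probabilities are invented.

```python
# Toy node-criticality ranking in the spirit of the approach above.
# Assumption: independent nodes in series, so system reliability is the
# product of node reliabilities; a node's (Birnbaum) importance is the
# change in system reliability between that node certainly working and
# certainly failing.
def system_reliability(node_fail_probs):
    r = 1.0
    for p in node_fail_probs.values():
        r *= (1.0 - p)
    return r

def birnbaum_importance(node_fail_probs, node):
    working = dict(node_fail_probs, **{node: 0.0})
    failed = dict(node_fail_probs, **{node: 1.0})
    return system_reliability(working) - system_reliability(failed)

# Invented failure probabilities for three hypothetical process nodes.
nodes = {"raw_material": 0.02, "sterilization": 0.05, "filling": 0.01}
ranking = sorted(nodes, key=lambda n: birnbaum_importance(nodes, n),
                 reverse=True)
```

Ranking the nodes by importance identifies where a risk-minimization plan would pay off most, which is the role the network analysis plays in the record above.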

  18. Quantitative risk-benefit analysis of fish consumption for women of child-bearing age in Hong Kong.

    PubMed

    Chen, M Y Y; Wong, W W K; Chung, S W C; Tran, C H; Chan, B T P; Ho, Y Y; Xiao, Y

    2014-01-01

    Maternal fish consumption is associated with both risks from methylmercury (MeHg) and beneficial effects from omega-3 fatty acids on the developing foetal brain. This paper assessed the dietary exposure to MeHg of women of child-bearing age (20-49 years) in Hong Kong and conducted a risk-benefit analysis in terms of effects on children's intelligence quotient (IQ), based on local data and the quantitative method derived by the FAO/WHO expert consultation. Results showed that average and high consumers consumed 450 and 1500 g of fish (including seafood) per week, respectively. About 11% of women of child-bearing age had a dietary exposure to MeHg exceeding the PTWI of 1.6 µg kg(-1) bw; in pregnant women, such MeHg intake may pose health risks to the developing foetus. For average consumers, eating any of the 19 types of most commonly consumed fish and seafood during pregnancy would result in a gain of 0.79-5.7 IQ points by their children. For high consumers, eating only tuna during pregnancy would cause a reduction of 2.3 IQ points in their children. The results indicated that for pregnant women the benefits outweigh the risks associated with eating fish, provided they consume different varieties of fish in moderation.
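The FAO/WHO-style net-IQ calculation described above can be sketched as a simple benefit-minus-harm computation. All coefficients and intakes below are invented placeholders, not the FAO/WHO values.

```python
# Hedged sketch of a net-IQ risk-benefit calculation: the IQ benefit of
# fish omega-3 intake and the IQ harm of MeHg intake are each scaled per
# unit of weekly fish consumption and subtracted. All numbers invented.
def net_iq_change(fish_g_per_week, dha_mg_per_g, mehg_ug_per_g,
                  iq_per_mg_dha, iq_loss_per_ug_mehg):
    benefit = fish_g_per_week * dha_mg_per_g * iq_per_mg_dha
    harm = fish_g_per_week * mehg_ug_per_g * iq_loss_per_ug_mehg
    return benefit - harm

# Invented example: 450 g/week of a fish with 10 mg DHA/g and
# 0.1 ug MeHg/g, with illustrative dose-response slopes.
example = net_iq_change(450, 10.0, 0.1, 0.001, 0.02)
```

A positive result means the benefit outweighs the risk for that fish type and intake level, which is the comparison the record above performs per species.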

  19. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  20. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF RISK MANAGEMENT TECHNIQUES

    EPA Science Inventory

    Often, the performance of risk management techniques is evaluated by measuring the concentrations of the chemicals of concern before and after risk management efforts. However, using bioassays together with chemical data provides a more robust understanding of the effectiveness of risk man...

  1. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  3. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  4. Meta-analytic evaluation of the association between head injury and risk of amyotrophic lateral sclerosis.

    PubMed

    Watanabe, Yukari; Watanabe, Takamitsu

    2017-10-01

    Head injury is considered a potential risk factor for amyotrophic lateral sclerosis (ALS). However, several recent studies have suggested that head injury is not a cause but a consequence of latent ALS. We aimed to evaluate this possibility of reverse causation with meta-analyses that consider the time lags between the incidence of head injuries and the occurrence of ALS. We searched Medline and Web of Science for case-control, cross-sectional, or cohort studies that quantitatively investigated the head-injury-related risk of ALS and were published up to 1 December 2016. After selecting appropriate publications based on the PRISMA statement, we performed random-effects meta-analyses to calculate odds ratios (ORs) and 95% confidence intervals (CIs). Sixteen of 825 studies fulfilled the eligibility criteria. The association between head injuries and ALS was statistically significant when the meta-analysis included all 16 studies (OR 1.45, 95% CI 1.21-1.74). However, in the meta-analyses that considered the time lags between the experience of head injuries and the diagnosis of ALS, the association was weaker (OR 1.21, 95% CI 1.01-1.46, time lag ≥ 1 year) or not significant (e.g. OR 1.16, 95% CI 0.84-1.59, time lag ≥ 3 years). Although these results do not rule out an association between head injuries and ALS, the current study suggests that the head-injury-related risk of ALS may have been somewhat overestimated. For a more accurate evaluation, it will be necessary to conduct more epidemiological studies that consider the time lags between the occurrence of head injuries and the diagnosis of ALS.
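A random-effects pooling of odds ratios of the kind reported above can be sketched with the DerSimonian-Laird estimator. The study data below are invented; inputs are per-study ORs with 95% confidence intervals.

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects meta-analysis of
# odds ratios. Each study is (OR, ci_low, ci_high); the per-study
# standard error is recovered from the 95% CI on the log scale.
def pooled_or(studies):
    logs = [math.log(o) for o, lo, hi in studies]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
           for o, lo, hi in studies]
    w = [1.0 / se**2 for se in ses]                 # fixed-effect weights
    mean_fe = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - mean_fe)**2 for wi, yi in zip(w, logs))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)   # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]
    mean_re = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    return (math.exp(mean_re),
            math.exp(mean_re - 1.96 * se_re),
            math.exp(mean_re + 1.96 * se_re))

# Invented three-study example.
pooled, ci_low, ci_high = pooled_or(
    [(1.45, 1.21, 1.74), (1.21, 1.01, 1.46), (1.16, 0.84, 1.59)])
```

When all studies agree, the between-study variance collapses to zero and the pooled OR equals the common study OR.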

  5. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produced the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generated different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make quantitative predictions directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
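The weighted quantitative score described above (summing a drug's side effects with weights) can be sketched in a few lines; the side-effect names and weights here are invented.

```python
# Invented severity weights per side effect (the paper uses empirical,
# randomly varied weights; these values are placeholders).
WEIGHTS = {"nausea": 1.0, "rash": 2.0, "hepatotoxicity": 5.0}

def quantitative_score(profile, weights=WEIGHTS):
    # profile maps side-effect name -> 1 (present) / 0 (absent);
    # the score is the weighted sum over side effects present.
    return sum(w for effect, w in weights.items() if profile.get(effect, 0))

drug_a = {"nausea": 1, "rash": 1}
drug_b = {"hepatotoxicity": 1}
```

Comparing scores (here drug_b outranks drug_a) is exactly the risk comparison the quantitative score enables in the record above.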

  6. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  7. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  8. Risk compensation behaviours in construction workers' activities.

    PubMed

    Feng, Yingbin; Wu, Peng

    2015-01-01

    The purpose of this study was to test whether construction workers have a tendency to engage in risk compensation behaviours, and to identify the demographic variables that may influence the extent to which construction workers show risk compensation behaviours. Both quantitative (survey) and qualitative (interview) approaches were used in this study. A questionnaire survey was conducted with all the construction workers on three building construction sites of a leading construction company in Australia. Semi-structured interviews were then conducted to validate the findings of the quantitative research. The findings indicate that workers tend to show risk compensation behaviours in the construction environment. Workers with more working experience, higher education, or no previous work injuries have a higher tendency to show risk compensation in their activities than others. The implication is that contractors need to assess the potential influence of workers' risk compensation behaviours when evaluating the effect of risk control measures. It is recommended that, after new safety control measures are implemented on a construction site, supervisors pay particular attention to behavioural changes among workers who have more experience, higher education, or no previous work injuries.

  9. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasi-experimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  10. The role of risk perception in making flood risk management more effective

    NASA Astrophysics Data System (ADS)

    Buchecker, M.; Salvini, G.; Di Baldassarre, G.; Semenzin, E.; Maidl, E.; Marcomini, A.

    2013-11-01

    Over the last few decades, Europe has suffered a number of severe flood events and, as a result, there has been growing interest in probing alternative approaches to managing flood risk via prevention measures. A literature review reveals that, although risk evaluation has in recent decades been recognized as a key element of risk management, and risk assessment methodologies (including risk analysis and evaluation) have been improved by including social, economic, cultural, historical and political conditions, these theoretical schemes are not yet applied in practice. One main reason for this shortcoming is that the risk perception literature is mainly universal and theoretical in nature and cannot provide the detail necessary to implement a comprehensive risk evaluation. This paper therefore aims to explore a procedure that allows the inclusion of stakeholders' perceptions of prevention measures in risk assessment. It proposes to adopt methods of risk communication (both one-way and two-way) in risk assessment, with the final aim of making flood risk management more effective. The proposed procedure not only focuses on the effect of discursive risk communication on risk perception, and on achieving a shared assessment of the prevention alternatives, but also considers the effects of the communication process on perceived uncertainties, accepted risk levels, and trust in the managing institutions. The effectiveness of this combined procedure has been studied and illustrated using the example of the participatory flood prevention assessment process on the Sihl River in Zurich, Switzerland. The main findings of the case study suggest that the proposed procedure performed well, but that it needs some adaptation to be applicable in different contexts and to allow a (semi-)quantitative estimation of risk perception to be used as an indicator of adaptive capacity.

  11. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state of the art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as the Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index at the system level as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant of, or basis for, either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  12. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.

  13. [Study on the quantitative evaluation on the degree of TCM basic syndromes often encountered in patients with primary liver cancer].

    PubMed

    Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng

    2007-07-01

    To establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC), medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using an expert symposium method, and 100-mm scaling was applied in combination with symptom-degree scoring to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, an additive model and an additive-multiplicative model, were established using the analytic hierarchy process (AHP) as the mathematical tool to estimate the weights of the criteria for evaluating basic syndromes at various layers by specialists. The two models were then verified in clinical practice, and the outcomes were compared with fuzzy evaluations made by specialists. Verification on 459 case-times of PLC showed that the coincidence rate between the specialists' outcomes and those of the additive model was 84.53%, versus 62.75% for the additive-multiplicative model; the difference between the two was statistically significant (P<0.01). The additive model is therefore the principal model suitable for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.

  14. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    PubMed

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

    Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and
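The Bland-Altman agreement analysis used above to compare quantitative CT results at different mAs settings can be sketched as follows; the paired measurements are invented.

```python
import statistics

# Hedged sketch of Bland-Altman analysis: bias is the mean of the paired
# differences, and the 95% limits of agreement are bias +/- 1.96 SD of
# the differences.
def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired lung-tissue percentages at two tube current settings.
bias, loa_low, loa_high = bland_altman([10.0, 12.0, 11.0, 13.0],
                                       [9.0, 12.0, 12.0, 12.0])
```

A small bias with narrow limits of agreement is what supports the record's conclusion that low-dose scans agree with the reference setting.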

  16. Evaluation of polygenic risk scores for predicting breast and prostate cancer risk.

    PubMed

    Machiela, Mitchell J; Chen, Chia-Yen; Chen, Constance; Chanock, Stephen J; Hunter, David J; Kraft, Peter

    2011-09-01

    Recently, polygenic risk scores (PRS) have been shown to be associated with certain complex diseases. The approach has been based on the contribution of counting multiple alleles associated with disease across independent loci, without requiring compelling evidence that every locus had already achieved definitive genome-wide statistical significance. Whether PRS assist in the prediction of risk of common cancers is unknown. We built PRS from lists of genetic markers prioritized by their association with breast cancer (BCa) or prostate cancer (PCa) in a training data set and evaluated whether these scores could improve current genetic prediction of these specific cancers in independent test samples. We used genome-wide association data on 1,145 BCa cases and 1,142 controls from the Nurses' Health Study and 1,164 PCa cases and 1,113 controls from the Prostate Lung Colorectal and Ovarian Cancer Screening Trial. Ten-fold cross validation was used to build and evaluate PRS with 10 to 60,000 independent single nucleotide polymorphisms (SNPs). For both BCa and PCa, the models that included only published risk alleles maximized the cross-validation estimate of the area under the ROC curve (0.53 for breast and 0.57 for prostate). We found no significant evidence that PRS using common variants improved risk prediction for BCa and PCa over replicated SNP scores. © 2011 Wiley-Liss, Inc.
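A polygenic risk score of the kind evaluated above is a weighted count of risk alleles; below is a minimal sketch with invented SNP names and per-allele log odds ratios.

```python
# Hedged sketch of a polygenic risk score: sum over SNPs of the
# risk-allele count (0, 1, or 2) times the per-allele log OR estimated
# in a training set. SNP identifiers and weights are invented.
def polygenic_risk_score(genotype, per_allele_log_or):
    return sum(genotype.get(snp, 0) * beta
               for snp, beta in per_allele_log_or.items())

weights = {"rs_a": 0.1, "rs_b": 0.2}        # invented training weights
case = polygenic_risk_score({"rs_a": 2, "rs_b": 1}, weights)
control = polygenic_risk_score({"rs_a": 0, "rs_b": 1}, weights)
```

Ranking individuals by such scores and measuring the resulting discrimination (e.g. ROC AUC) is how the record above evaluates predictive value.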

  17. Development and evaluation of a risk communication curriculum for medical students.

    PubMed

    Han, Paul K J; Joekes, Katherine; Elwyn, Glyn; Mazor, Kathleen M; Thomson, Richard; Sedgwick, Philip; Ibison, Judith; Wong, John B

    2014-01-01

    To develop, pilot, and evaluate a curriculum for teaching clinical risk communication skills to medical students. A new experience-based curriculum, "Risk Talk," was developed and piloted over a 1-year period among students at Tufts University School of Medicine. An experimental study of 2nd-year students exposed vs. unexposed to the curriculum was conducted to evaluate the curriculum's efficacy. Primary outcome measures were students' objective (observed) and subjective (self-reported) risk communication competence; the former was assessed using an objective structured clinical examination (OSCE) employing new measures. Twenty-eight 2nd-year students completed the curriculum and exhibited significantly greater (p<.001) objective and subjective risk communication competence than a convenience sample of 24 unexposed students. The new observational measures of objective competence in risk communication showed promising evidence of reliability and validity. The curriculum was resource-intensive. The new experience-based clinical risk communication curriculum was efficacious, although resource-intensive. More work is needed to improve the feasibility of curriculum delivery and the measurement of competence in clinical risk communication. Risk communication is an important advanced communication skill, and the Risk Talk curriculum provides a model educational intervention and new assessment tools to guide future efforts to teach and evaluate this skill. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Quantitative risk assessment of E. coli in street-vended cassava-based delicacies in the Philippines

    NASA Astrophysics Data System (ADS)

    Mesias, I. C. P.

    2018-01-01

    In the Philippines, rootcrop-based food products are gaining popularity in street food trade. However, a number of street-vended food products in the country are reported to be contaminated with E. coli, posing a possible risk to consumers. In this study, a quantitative risk assessment of E. coli in street-vended cassava-based delicacies was carried out. The assessment started with the prevalence and concentration of E. coli at post-production in packages of the cassava-based delicacies. The ComBase growth predictor was used to trace the microbial population of E. coli at each step of the food chain, and the @Risk software package, version 6 (Palisade, USA) was used to run the simulations. Scenarios in the post-production-to-consumption pathway were simulated, and the effect was assessed in relation to exposure to the defined infective dose. In the worst-case scenario, minimum and most likely concentrations of 6.3 and 7.8 log CFU of E. coli per serving, respectively, were observed. The simulation revealed that lowering the temperature in the chain considerably decreased the E. coli concentration prior to consumption and subsequently decreased the percentage of exposures reaching the infective dose. Exposure to the infective dose, however, increased with longer lag time from post-production to consumption.
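The @Risk-style simulation above can be approximated with a plain Monte Carlo sketch: per-serving log CFU is drawn from a triangular distribution (minimum, most likely, maximum) and compared with an assumed infective dose. All parameter values here are invented.

```python
import random

# Hedged Monte Carlo sketch of exposure assessment: sample per-serving
# log CFU from a triangular distribution and report the fraction of
# servings at or above an assumed infective dose.
def fraction_exceeding(n, low, mode, high, infective_log_cfu, seed=1):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.triangular(low, high, mode) >= infective_log_cfu)
    return hits / n

# Invented worst-case-like scenario: min 6.3, most likely 7.8, max 9.0
# log CFU per serving, against an assumed infective dose of 8.0 log CFU.
p_exceed = fraction_exceeding(10000, 6.3, 7.8, 9.0, 8.0)
```

Rerunning the function with lower distribution parameters mimics the record's finding that cooling the chain reduces the fraction of servings reaching the infective dose.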

  19. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    PubMed

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    The use of process simulators can contribute to quantitative risk assessment (QRA) by minimizing the expert time and the large volume of data required, and is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation and QRA to investigate the acceptability of a new methanol production plant technology in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and societal risks. Compared with existing tolerance levels, the risks were considered tolerable under nominal operating conditions of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Quantitative evaluation of palatal bone thickness for the placement of orthodontic miniscrews in adults with different facial types

    PubMed Central

    Wang, Yunji; Qiu, Ye; Liu, Henglang; He, Jinlong; Fan, Xiaoping

    2017-01-01

    Objectives: To quantitatively evaluate palatal bone thickness in adults with different facial types using cone beam computed tomography (CBCT). Methods: The CBCT volumetric data of 123 adults (mean age, 26.8 years) collected between August 2014 and August 2016 were retrospectively studied. The subjects were divided into a low-angle group (39 subjects), a normal-angle group (48 subjects) and a high-angle group (36 subjects) based on facial types assigned by cephalometric radiography. The thickness of the palatal bone was assessed at designated points. A repeated-measures analysis of variance (rm-ANOVA) was used to test the relationship between facial type and palatal bone thickness. Results: Compared with the low-angle group, the high-angle group had significantly thinner palatal bones (p<0.05), except in the anterior-midline, anterior-medial and middle-midline areas. Conclusion: The safest zone for the placement of microimplants is the anterior part of the paramedian palate. Clinicians should pay special attention to the probability of thinner bone plates and the risk of perforation in high-angle patients. PMID:28917071

  1. Synthesis strategy: building a culturally sensitive mid-range theory of risk perception using literary, quantitative, and qualitative methods.

    PubMed

    Siaki, Leilani A; Loescher, Lois J; Trego, Lori L

    2013-03-01

    This article presents a discussion of development of a mid-range theory of risk perception. Unhealthy behaviours contribute to the development of health inequalities worldwide. The link between perceived risk and successful health behaviour change is inconclusive, particularly in vulnerable populations. This may be attributed to inattention to culture. The synthesis strategy of theory building guided the process using three methods: (1) a systematic review of literature published between 2000-2011 targeting perceived risk in vulnerable populations; (2) qualitative and (3) quantitative data from a study of Samoan Pacific Islanders at high risk of cardiovascular disease and diabetes. Main concepts of this theory include risk attention, appraisal processes, cognition, and affect. Overarching these concepts is health-world view: cultural ways of knowing, beliefs, values, images, and ideas. This theory proposes the following: (1) risk attention varies based on knowledge of the health risk in the context of health-world views; (2) risk appraisals are influenced by affect, health-world views, cultural customs, and protocols that intersect with the health risk; (3) strength of cultural beliefs, values, and images (cultural identity) mediate risk attention and risk appraisal influencing the likelihood that persons will engage in health-promoting behaviours that may contradict cultural customs/protocols. Interventions guided by a culturally sensitive mid-range theory may improve behaviour-related health inequalities in vulnerable populations. The synthesis strategy is an intensive process for developing a culturally sensitive mid-range theory. Testing of the theory will ascertain its usefulness for reducing health inequalities in vulnerable groups. © 2012 Blackwell Publishing Ltd.

  2. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  3. Role of imaging in evaluation of sudden cardiac death risk in hypertrophic cardiomyopathy.

    PubMed

    Geske, Jeffrey B; Ommen, Steve R

    2015-09-01

    Hypertrophic cardiomyopathy (HCM) is the most common heritable cardiomyopathy and is associated with sudden cardiac death (SCD) - an uncommon but devastating clinical outcome. This review is designed to assess the role of imaging in established risk factor assessment and its role in emerging SCD risk stratification. Recent publications have highlighted the crucial role of imaging in HCM SCD risk stratification. Left ventricular hypertrophy assessment remains the key imaging determinant of risk. Data continue to emerge on the role of systolic dysfunction, apical aneurysms, left atrial enlargement and left ventricular outflow tract obstruction as markers of risk. Quantitative assessment of delayed myocardial enhancement and T1 mapping on cardiac MRI continue to evolve. Recent multicenter trials have allowed multivariate SCD risk assessment in large HCM cohorts. Given aggregate risk with presence of multiple risk factors, a single parameter should not be used in isolation to determine implantable cardiac defibrillator candidacy. Use of all available imaging data, including cardiac magnetic resonance tissue characterization, allows a comprehensive approach to SCD stratification and implantable cardiac defibrillator decision-making.

  4. Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model

    NASA Astrophysics Data System (ADS)

    Niu, Wei; Wang, Xifu

    2018-01-01

    Railway transport is currently the most important mode of coal transportation in China. Although the country's railway coal transportation network has become increasingly well developed, capacity remains insufficient, with some lines close to saturation. In this paper, risk assessment theory and methods, the analytic hierarchy process, and a multi-level grey evaluation model are applied to the risk evaluation of China's railway coal transportation network. An example analysis of the Shanxi railway coal transportation network illustrates the approach and suggests ways to improve the network's internal structure and market competitiveness.

  5. Economic evaluation of HIV pre-exposure prophylaxis strategies: protocol for a methodological systematic review and quantitative synthesis.

    PubMed

    Thavorn, Kednapa; Kugathasan, Howsikan; Tan, Darrell H S; Moqueet, Nasheed; Baral, Stefan D; Skidmore, Becky; MacFadden, Derek; Simkin, Anna; Mishra, Sharmistha

    2018-03-15

    Pre-exposure prophylaxis (PrEP) with antiretrovirals is an efficacious and effective intervention to decrease the risk of HIV (human immunodeficiency virus) acquisition. Yet drug and delivery costs prohibit access in many jurisdictions. In the absence of guidelines for the synthesis of economic evaluations, we developed a protocol for a systematic review of economic evaluation studies for PrEP by drawing on best practices in systematic reviews and the conduct and reporting of economic evaluations. We aim to estimate the incremental cost per health outcome of PrEP compared with placebo, no PrEP, or other HIV prevention strategies; assess the methodological variability in, and quality of, economic evaluations of PrEP; estimate the incremental cost per health outcome of different PrEP implementation strategies; and quantify the potential sources of heterogeneity in outcomes. We will systematically search electronic databases (MEDLINE, Embase) and the gray literature. We will include economic evaluation studies that assess both costs and health outcomes of PrEP in HIV-uninfected individuals, without restricting language or year of publication. Two reviewers will independently screen studies using predefined inclusion criteria, extract data, and assess methodological quality using the Philips checklist, Second Panel on the Cost-effectiveness of Health and Medicines, and the International Society for Pharmacoeconomics and Outcomes Research recommendations. Outcomes of interest include incremental costs and outcomes in natural units or utilities, cost-effectiveness ratios, and net monetary benefit. We will perform descriptive and quantitative syntheses using sensitivity analyses of outcomes by population subgroups, HIV epidemic settings, study designs, baseline intervention contexts, key parameter inputs and assumptions, type of outcomes, economic perspectives, and willingness to pay values. Findings will guide future economic evaluation of PrEP strategies in terms of

  6. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features were chosen based on their ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
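    The sensitivity and specificity figures quoted above are plain confusion-matrix ratios of algorithm calls against the histopathology gold standard; a minimal sketch of that computation follows (with toy labels, not the study's data).

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity and specificity against a gold standard.
    Labels: 1 = neoplastic, 0 = non-neoplastic."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data standing in for histopathology labels vs. classifier predictions.
gold = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec = sensitivity_specificity(gold, pred)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```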

  7. Geographic and demographic variabilities of quantitative parameters in stress myocardial computed tomography perfusion.

    PubMed

    Park, Jinoh; Kim, Hyun-Sook; Hwang, Hye Jeon; Yang, Dong Hyun; Koo, Hyun Jung; Kang, Joon-Won; Kim, Young-Hak

    2017-09-01

    To evaluate the geographic and demographic variabilities of the quantitative parameters of computed tomography perfusion (CTP) of the left ventricular (LV) myocardium in patients with normal coronary artery on computed tomography angiography (CTA). From a multicenter CTP registry of stress and static computed tomography, we retrospectively recruited 113 patients (mean age, 60 years; 57 men) without perfusion defect on visual assessment and minimal (< 20% of diameter stenosis) or no coronary artery disease on CTA. Using semiautomatic analysis software, quantitative parameters of the LV myocardium, including the myocardial attenuation in stress and rest phases, transmural perfusion ratio (TPR), and myocardial perfusion reserve index (MPRI), were evaluated in 16 myocardial segments. In the lateral wall of the LV myocardium, all quantitative parameters except for MPRI were significantly higher compared with those in the other walls. The MPRI showed consistent values in all myocardial walls (anterior to lateral wall: range, 25% to 27%; p = 0.401). At the basal level of the myocardium, all quantitative parameters were significantly lower than those at the mid- and apical levels. Compared with men, women had significantly higher values of myocardial attenuation and TPR. Age, body mass index, and Framingham risk score were significantly associated with the difference in myocardial attenuation. Geographic and demographic variabilities of quantitative parameters in stress myocardial CTP exist in healthy subjects without significant coronary artery disease. This information may be helpful when assessing myocardial perfusion defects in CTP.

  8. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight, to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future, with the objective of mitigating the most significant human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  9. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  10. [Health risk assessment of coke oven PAHs emissions].

    PubMed

    Bo, Xin; Wang, Gang; Wen, Rou; Zhao, Chun-Li; Wu, Tie; Li, Shi-Bei

    2014-07-01

    Polycyclic aromatic hydrocarbons (PAHs) produced by coke ovens are strongly toxic and carcinogenic. Taking a typical coke oven of an iron and steel enterprise as a case study, the dispersion and migration of 13 kinds of PAHs emitted from the coke oven were analyzed using the AERMOD dispersion model. The carcinogenic and non-carcinogenic risks at the receptors within the modeling domain were evaluated using BREEZE Risk Analyst, following the Human Health Risk Assessment Protocol for Hazardous Waste Combustion (HHRAP), and the health risks caused by PAH emissions from the coke oven were quantitatively evaluated. The results indicated that attention should be paid to the non-carcinogenic risk of naphthalene emission (the maximum value was 0.97). The carcinogenic risks of each single pollutant were all below 1.0E-06, while the maximum value of the total carcinogenic risk was 2.65E-06, which may have some influence on the health of local residents.
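    The screening logic described above (per-pollutant carcinogenic risks compared with 1.0E-06, then summed into a total risk) can be sketched as follows. The concentrations and inhalation unit risks below are invented placeholders, not values from the AERMOD/BREEZE analysis or from HHRAP.

```python
# Hypothetical modeled ambient concentrations (mg/m^3) and inhalation unit
# risks (per mg/m^3) for a few PAHs; all values are illustrative only.
pahs = {
    "benzo[a]pyrene":    {"conc": 8.0e-7, "unit_risk": 1.1e0},
    "naphthalene":       {"conc": 2.0e-5, "unit_risk": 3.4e-2},
    "benz[a]anthracene": {"conc": 5.0e-6, "unit_risk": 1.1e-1},
}

ACCEPTABLE_RISK = 1.0e-6  # common screening threshold for carcinogens

# Per-pollutant risk = concentration x unit risk; total = sum over pollutants.
individual = {name: v["conc"] * v["unit_risk"] for name, v in pahs.items()}
total_risk = sum(individual.values())

for name, risk in individual.items():
    print(f"{name:18s} risk = {risk:.2e}  (below threshold: {risk < ACCEPTABLE_RISK})")
print(f"total carcinogenic risk = {total_risk:.2e}")
```

    With these placeholder numbers the example mirrors the abstract's pattern: every single pollutant screens below 1.0E-06, yet the summed risk exceeds it.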

  11. Quantitative falls risk estimation through multi-sensor assessment of standing balance.

    PubMed

    Greene, Barry R; McGrath, Denise; Walsh, Lorcan; Doheny, Emer P; McKeown, David; Garattini, Chiara; Cunningham, Clodagh; Crosby, Lisa; Caulfield, Brian; Kenny, Rose A

    2012-12-01

    Falls are the most common cause of injury and hospitalization and one of the principal causes of death and disability in older adults worldwide. Measures of postural stability have been associated with the incidence of falls in older adults. The aim of this study was to develop a model that accurately classifies fallers and non-fallers using novel multi-sensor quantitative balance metrics that can be easily deployed into a home or clinic setting. We compared the classification accuracy of our model with an established method for falls risk assessment, the Berg balance scale. Data were acquired using two sensor modalities--a pressure sensitive platform sensor and a body-worn inertial sensor, mounted on the lower back--from 120 community dwelling older adults (65 with a history of falls, 55 without, mean age 73.7 ± 5.8 years, 63 female) while performing a number of standing balance tasks in a geriatric research clinic. Results obtained using a support vector machine yielded a mean classification accuracy of 71.52% (95% CI: 68.82-74.28) in classifying falls history, obtained using one model classifying all data points. Considering male and female participant data separately yielded classification accuracies of 72.80% (95% CI: 68.85-77.17) and 73.33% (95% CI: 69.88-76.81) respectively, leading to a mean classification accuracy of 73.07% in identifying participants with a history of falls. Results compare favourably to those obtained using the Berg balance scale (mean classification accuracy: 59.42% (95% CI: 56.96-61.88)). Results from the present study could lead to a robust method for assessing falls risk in both supervised and unsupervised environments.
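    The accuracies with 95% confidence intervals reported above follow the usual proportion-plus-interval pattern; a minimal sketch using a normal-approximation interval and made-up counts (not the study's results):

```python
import math

def accuracy_ci(correct, total, z=1.96):
    """Point accuracy with a 95% normal-approximation confidence interval,
    as commonly reported for falls-risk classifiers."""
    p = correct / total
    half = z * math.sqrt(p * (1 - p) / total)  # half-width of the interval
    return p, max(0.0, p - half), min(1.0, p + half)

# Toy counts standing in for classifier results on 120 participants.
acc, lo, hi = accuracy_ci(correct=86, total=120)
print(f"accuracy = {acc:.2%} (95% CI: {lo:.2%}-{hi:.2%})")
```

    For the small-to-moderate sample sizes typical of such studies, a Wilson score interval would be a defensible alternative to the normal approximation used here.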

  12. Quantitative evaluation of protocorm growth and fungal colonization in Bletilla striata (Orchidaceae) reveals less-productive symbiosis with a non-native symbiotic fungus.

    PubMed

    Yamamoto, Tatsuki; Miura, Chihiro; Fuji, Masako; Nagata, Shotaro; Otani, Yuria; Yagame, Takahiro; Yamato, Masahide; Kaminaka, Hironori

    2017-02-21

    In nature, orchid plants depend completely on symbiotic fungi for their nutrition at the germination and the subsequent seedling (protocorm) stages. However, only limited quantitative methods for evaluating the orchid-fungus interactions at the protocorm stage are currently available, which greatly constrains our understanding of the symbiosis. Here, we aimed to improve and integrate quantitative evaluations of the growth and fungal colonization in the protocorms of a terrestrial orchid, Bletilla striata, growing on a plate medium. We achieved both symbiotic and asymbiotic germinations for the terrestrial orchid B. striata. The protocorms produced by the two germination methods grew almost synchronously for the first three weeks. At week four, however, the length was significantly lower in the symbiotic protocorms. Interestingly, the dry weight of symbiotic protocorms did not significantly change during the growth period, which implies that there was only limited transfer of carbon compounds from the fungus to the protocorms in this relationship. Next, to evaluate the orchid-fungus interactions, we developed an ink-staining method to observe the hyphal coils in protocorms without preparing thin sections. Crushing the protocorm under the coverglass enables us to observe all hyphal coils in the protocorms with high resolution. For this observation, we established a criterion to categorize the stages of hyphal coils, depending on development and degradation. By counting the symbiotic cells within each stage, it was possible to quantitatively evaluate the orchid-fungus symbiosis. We describe a method for quantitative evaluation of orchid-fungus symbiosis by integrating the measurements of plant growth and fungal colonization. The current study revealed that although fungal colonization was observed in the symbiotic protocorms, the weight of the protocorm did not significantly increase, which is probably due to the incompatibility of the fungus in this symbiosis.

  13. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Campbell, Philip L

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  14. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging

    PubMed Central

    Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-01-01

    Background Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow’s disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the

  15. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  16. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  17. Objective and quantitative equilibriometric evaluation of individual locomotor behaviour in schizophrenia: Translational and clinical implications.

    PubMed

    Haralanov, Svetlozar; Haralanova, Evelina; Milushev, Emil; Shkodrova, Diana; Claussen, Claus-Frenz

    2018-04-17

    Psychiatry is the only medical specialty that lacks clinically applicable biomarkers for objective evaluation of the existing pathology at the single-patient level. On the basis of an original translational equilibriometric method for the evaluation of movement patterns, we have introduced into the everyday clinical practice of psychiatry an easy-to-perform computerized objective quantification of individual locomotor behaviour during execution of the Unterberger stepping test. Over the last 20 years, we have gradually collected a large database of more than 1000 schizophrenic patients, their relatives, and matched psychiatric, neurological, and healthy controls via cross-sectional and longitudinal investigations. Comparative analyses revealed transdiagnostic locomotor similarities among schizophrenic patients, high-risk schizotaxic individuals, and neurological patients with multiple sclerosis and cerebellar ataxia, thus suggesting common underlying brain mechanisms. In parallel, intradiagnostic dissimilarities were revealed, which make it possible to separate out subclinical locomotor subgroups within the diagnostic categories. Prototypical qualitative (dysmetric and ataxic) locomotor abnormalities in schizophrenic patients were differentiated from 2 atypical quantitative ones, manifested as either hypolocomotion or hyperlocomotion. Theoretical analyses suggested that these 3 subtypes of locomotor abnormalities could be conceived as objectively measurable biomarkers of 3 schizophrenic subgroups with dissimilar brain mechanisms, which require different treatment strategies. Analogies with the prominent role of locomotor measures in some well-known animal models of mental disorders advocate for promising objective translational research in the so far over-subjective field of psychiatry. Distinctions among prototypical, atypical, and diagnostic biomarkers, as well as between neuromotor and psychomotor locomotor abnormalities, are discussed. Conclusions are drawn about the

  18. Risk assessment in the North Caucasus ski resorts

    NASA Astrophysics Data System (ADS)

    Komarov, Anton Y.; Seliverstov, Yury G.; Glazovskaya, Tatyana G.; Turchaninova, Alla S.

    2016-10-01

    Avalanches pose a significant problem in most mountain regions of Russia. The constant growth of economic activity, and therefore increased avalanche hazard, in the North Caucasus region creates demand for large-scale avalanche risk assessment methods. Such methods are needed to determine appropriate avalanche protection measures as well as for economic assessments. The requirement for natural hazard risk assessments is established by the Federal Law of the Russian Federation (Federal Law 21.12.1994 N 68-FZ, 2016); however, the Russian guidelines (SNIP 11-02-96, 2013; SNIP 22-02-2003, 2012) are not clear on how avalanche risk assessments should be calculated. We address these problems by presenting a new avalanche risk assessment approach, using the example of developing but poorly researched ski resort areas. The suggested method includes formulas to calculate collective and individual avalanche risk. The results of the risk analysis are quantitative data that can be used to determine levels of avalanche risk (appropriate, acceptable and inappropriate) and to suggest measures that decrease individual risk to an acceptable level or better. The analysis makes it possible to compare quantitative risk data obtained from different regions, analyze them and evaluate the economic feasibility of protection measures.
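
    The abstract mentions formulas for collective and individual avalanche risk without reproducing them. The sketch below shows the generic form such formulas usually take (event probability x exposure x vulnerability); the function names, parameter values, and the acceptability threshold are illustrative assumptions, not the formulas from this paper.

```python
# Hedged sketch of a generic individual/collective avalanche risk calculation.
# All numbers are illustrative, not values from the study.

def individual_risk(p_event, p_spatial, p_temporal, vulnerability):
    """Annual probability of death for one exposed person:
    avalanche probability x probability of being in the path (space, time)
    x probability of death given burial."""
    return p_event * p_spatial * p_temporal * vulnerability

def collective_risk(persons):
    """Expected annual fatalities: sum of individual risks over exposed persons."""
    return sum(individual_risk(**p) for p in persons)

skiers = [
    {"p_event": 0.05, "p_spatial": 1.0, "p_temporal": 0.02, "vulnerability": 0.3},
    {"p_event": 0.05, "p_spatial": 0.5, "p_temporal": 0.10, "vulnerability": 0.3},
]
ir = individual_risk(**skiers[0])  # 3.0e-4 per year
cr = collective_risk(skiers)

# A cutoff such as 1e-4/year is often used to separate acceptable from
# inappropriate individual risk (the threshold value here is illustrative).
acceptable = ir < 1e-4
```

    Risk levels ("appropriate, acceptable and inappropriate" in the abstract) then follow from comparing `ir` against the chosen cutoffs.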

  19. Bone strength measured by peripheral quantitative computed tomography and the risk of nonvertebral fractures: the osteoporotic fractures in men (MrOS) study.

    PubMed

    Sheu, Yahtyng; Zmuda, Joseph M; Boudreau, Robert M; Petit, Moira A; Ensrud, Kristine E; Bauer, Douglas C; Gordon, Christopher L; Orwoll, Eric S; Cauley, Jane A

    2011-01-01

    Many fractures occur in individuals without osteoporosis defined by areal bone mineral density (aBMD). Inclusion of other aspects of skeletal strength may be useful in identifying at-risk subjects. We used surrogate measures of bone strength at the radius and tibia measured by peripheral quantitative computed tomography (pQCT) to evaluate their relationships with nonvertebral fracture risk. Femoral neck (FN) aBMD, measured by dual-energy X-ray absorptiometry (DXA), also was included. The study population consisted of 1143 white men aged 69+ years with pQCT measures at the radius and tibia from the Minneapolis and Pittsburgh centers of the Osteoporotic Fractures in Men (MrOS) study. Principal-components analysis and Cox proportional-hazards modeling were used to identify 21 of 58 pQCT variables with a major contribution to nonvertebral incident fractures. After a mean 2.9 years of follow-up, 39 fractures occurred. Men without incident fractures had significantly greater bone mineral content, cross-sectional area, and indices of bone strength than those with fractures by pQCT. Every SD decrease in 18 of the 21 pQCT parameters was significantly associated with increased fracture risk (hazard ratios ranged from 1.4 to 2.2) independent of age, study site, body mass index (BMI), and FN aBMD. Using the area under the receiver operating characteristic curve (AUC), the combination of FN aBMD and each of three radius strength parameters individually increased fracture prediction over FN aBMD alone (AUC increased from 0.73 to 0.80). Peripheral bone strength measures are associated with fracture risk and may improve our ability to identify older men at high risk of fracture. © 2011 American Society for Bone and Mineral Research.

  20. Occupational health and safety: Designing and building with MACBETH a value risk-matrix for evaluating health and safety risks

    NASA Astrophysics Data System (ADS)

    Lopes, D. F.; Oliveira, M. D.; Costa, C. A. Bana e.

    2015-05-01

    Risk matrices (RMs) are commonly used to evaluate health and safety risks. Nonetheless, they violate some theoretical principles that compromise their feasibility and use. This study describes how multiple criteria decision analysis methods have been used to improve the design and the deployment of RMs to evaluate health and safety risks at the Occupational Health and Safety Unit (OHSU) of the Regional Health Administration of Lisbon and Tagus Valley. ‘Value risk-matrices’ (VRMs) are built with the MACBETH approach in four modelling steps: a) structuring risk impacts, involving the construction of descriptors of impact that link risk events with health impacts and are informed by scientific evidence; b) generating a value measurement scale of risk impacts, by applying the MACBETH-Choquet procedure; c) building a system for eliciting subjective probabilities that makes use of a numerical probability scale that was constructed with MACBETH qualitative judgments on likelihood; d) and defining a classification colouring scheme for the VRM. A VRM built with OHSU members was implemented in a decision support system which will be used by OHSU members to evaluate health and safety risks and to identify risk mitigation actions.

  1. Risk management study for the retired Hanford Site facilities: Qualitative risk evaluation for the retired Hanford Site facilities. Volume 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, G.A.; Shultz, M.V.; Taylor, W.E.

    1993-09-01

    This document provides a risk evaluation of the 100 and 200 Area retired, surplus facilities on the Hanford Site. Also included are the related data that were compiled by the risk evaluation team during investigations performed on the facilities. Results are the product of a major effort performed in fiscal year 1993 to produce qualitative information that characterizes certain risks associated with these facilities. The retired facilities investigated for this evaluation are located in the 100 and 200 Areas of the 1,450-km{sup 2} (570-mi{sup 2}) Hanford Site. The Hanford Site is a semiarid tract of land in southeastern Washington State. The nearest population center is Richland, Washington, (population 32,000) 30 km (20 mi) southeast of the 200 Area. During walkdown investigations of these facilities, data on real and potential hazards that threatened human health or safety or created potential environmental release issues were identified by the risk evaluation team. Using these findings, the team categorized the identified hazards by facility and evaluated the risk associated with each hazard. The factors contributing to each risk, and the consequence and likelihood of harm associated with each hazard, also are included in this evaluation.

  2. Quantitative risk assessment of CO2 transport by pipelines--a review of uncertainties and their impacts.

    PubMed

    Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André

    2010-05-15

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainty that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO2. A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on the assumptions made. In existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO2 pipelines. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  3. Benefit-risk Evaluation for Diagnostics: A Framework (BED-FRAME).

    PubMed

    Evans, Scott R; Pennello, Gene; Pantoja-Galicia, Norberto; Jiang, Hongyu; Hujer, Andrea M; Hujer, Kristine M; Manca, Claudia; Hill, Carol; Jacobs, Michael R; Chen, Liang; Patel, Robin; Kreiswirth, Barry N; Bonomo, Robert A

    2016-09-15

    The medical community needs systematic and pragmatic approaches for evaluating the benefit-risk trade-offs of diagnostics that assist in medical decision making. Benefit-Risk Evaluation of Diagnostics: A Framework (BED-FRAME) is a strategy for pragmatic evaluation of diagnostics designed to supplement traditional approaches. BED-FRAME evaluates diagnostic yield and addresses 2 key issues: (1) that diagnostic yield depends on prevalence, and (2) that different diagnostic errors carry different clinical consequences. As such, evaluating and comparing diagnostics depends on prevalence and the relative importance of potential errors. BED-FRAME provides a tool for communicating the expected clinical impact of diagnostic application and the expected trade-offs of diagnostic alternatives. BED-FRAME is a useful fundamental supplement to the standard analysis of diagnostic studies that will aid in clinical decision making. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  4. Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, R.N.; Cooper, M.D.

    1989-09-01

    This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.

  5. Quantitative risk estimation for large for gestational age using the area under the 100-g oral glucose tolerance test curve.

    PubMed

    Kim, Sollip; Min, Won-Ki; Chun, Sail; Lee, Woochang; Chung, Hee-Jung; Lee, Pil Ryang; Kim, Ahm

    2009-01-01

    We devised a complementary quantitative method for gestational diabetes (GDM) that uses the area under the curve (AUC) of the results of the oral glucose tolerance test (OGTT), and evaluated its efficacy in predicting neonates that would be large for gestational age (LGA). The study subjects were 648 pregnant women. The AUC-OGTT (concentration x time) was calculated from the 100-g OGTT results. The incidence of LGA within each range of the AUC-OGTT was estimated and odds ratios were analyzed using multiple logistic regression analysis. The incidence of LGA increased with the AUC-OGTT value and was 0% for AUC <300, 7.8% for 300-400, 14.9% for 400-500, 20.8% for 500-600, and 45.5% for ≥600. The odds ratio of LGA increased by approximately two-fold with each increase of 100 in the AUC-OGTT. The results indicated that the AUC-OGTT can be used to quantify the risk of LGA in GDM. The AUC-OGTT could complement a diagnosis of GDM made using conventional diagnostic criteria.
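
    The AUC-OGTT statistic (concentration x time) described above can be computed with the trapezoidal rule over the OGTT sampling points. A minimal sketch; the sampling times and glucose values below are illustrative, not data from the study.

```python
# Trapezoidal AUC of an OGTT glucose curve (sketch; example values are invented).
def auc_ogtt(times_h, glucose_mg_dl):
    """Area under the glucose-time curve in (mg/dL) x h."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (glucose_mg_dl[i] + glucose_mg_dl[i - 1]) / 2.0
    return auc

# Hypothetical 100-g OGTT sampled at 0, 1, 2 and 3 hours:
auc = auc_ogtt([0, 1, 2, 3], [95, 180, 155, 140])  # 452.5
```

    In the study's terms, an AUC of 452.5 falls in the 400-500 band, which carried a 14.9% incidence of LGA.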

  6. Global and local health burden trade-off through the hybridisation of quantitative microbial risk assessment and life cycle assessment to aid water management.

    PubMed

    Kobayashi, Yumi; Peters, Greg M; Ashbolt, Nicholas J; Heimersson, Sara; Svanström, Magdalena; Khan, Stuart J

    2015-08-01

    Life cycle assessment (LCA) and quantitative risk assessment (QRA) are commonly used to evaluate potential human health impacts associated with proposed or existing infrastructure and products. Each approach has a distinct objective and, consequently, their conclusions may be inconsistent or contradictory. It is proposed that the integration of elements of QRA and LCA may provide a more holistic approach to health impact assessment. Here we examine the possibility of merging LCA assessed human health impacts with quantitative microbial risk assessment (QMRA) for waterborne pathogen impacts, expressed with the common health metric, disability adjusted life years (DALYs). The example of a recent large-scale water recycling project in Sydney, Australia was used to identify and demonstrate the potential advantages and current limitations of this approach. A comparative analysis of two scenarios - with and without the development of this project - was undertaken for this purpose. LCA and QMRA were carried out independently for the two scenarios to compare human health impacts, as measured by DALYs lost per year. LCA results suggested that construction of the project would lead to an increased number of DALYs lost per year, while estimated disease burden resulting from microbial exposures indicated that it would result in the loss of fewer DALYs per year than the alternative scenario. By merging the results of the LCA and QMRA, we demonstrate the advantages in providing a more comprehensive assessment of human disease burden for the two scenarios, in particular, the importance of considering the results of both LCA and QRA in a comparative assessment of decision alternatives to avoid problem shifting. The application of DALYs as a common measure between the two approaches was found to be useful for this purpose. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    ERIC Educational Resources Information Center

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  8. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps which is extremely healthy at selection.

  9. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.

  10. Quantitative risk assessment for Escherichia coli O157:H7 in frozen ground beef patties consumed by young children in French households.

    PubMed

    Delignette-Muller, M L; Cornu, M

    2008-11-30

    A quantitative risk assessment for Escherichia coli O157:H7 in frozen ground beef patties consumed by children under 10 years of age in French households was conducted by a national study group investigating an outbreak that occurred in France in 2005. Our exposure assessment model incorporates results from French surveys on consumption frequency of ground beef patties, serving size and consumption preference, microbial destruction experiments, and microbial counts on patties sampled from the industrial batch responsible for the outbreak. Two different exposure models were proposed, for children under the age of 5 and for children between 5 and 10 years. For each of these two age groups, a single-hit dose-response model was proposed to describe the probability of hemolytic uremic syndrome (HUS) as a function of the ingested dose. For each group, the single parameter of this model was estimated by Bayesian inference, using the results of the exposure assessment and the epidemiological data collected during the outbreak. Results show that children under 5 years of age are roughly 5 times more susceptible to the pathogen than children over 5 years. The exposure and dose-response models were used in a scenario analysis in order to validate the use of the model and to propose appropriate guidelines to prevent new outbreaks. The impact of cooking preference was evaluated, showing that only well-done cooking notably reduces the HUS risk, without eliminating it. For each age group, a relation between the mean individual HUS risk per serving and the contamination level in a ground beef batch was proposed, as a tool to help French risk managers.
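
    A single-parameter single-hit dose-response model of the kind referred to above commonly takes the exponential form P(illness | dose) = 1 - exp(-r x dose). The sketch below uses that standard form; the r values are illustrative placeholders, not the Bayesian estimates from the study.

```python
import math

# Exponential single-hit dose-response model (sketch; parameters are invented).
def p_hus(dose_cfu, r):
    """Probability of HUS after ingesting `dose_cfu` organisms,
    where r is the per-organism probability of causing illness."""
    return 1.0 - math.exp(-r * dose_cfu)

r_over5 = 1e-6           # illustrative per-organism hit probability, ages 5-10
r_under5 = 5 * r_over5   # abstract: ~5x higher susceptibility under age 5

# At low doses 1 - exp(-r*d) ~= r*d, so the risk ratio approaches the
# ratio of the r parameters (~5 here).
risk_ratio = p_hus(100, r_under5) / p_hus(100, r_over5)
```

    The same function, fed with the exposure model's dose distribution, yields the mean per-serving HUS risk as a function of batch contamination level.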

  11. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior.

    PubMed

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-01-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.

  12. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
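
    The core DPDs operation described in the two records above, subtracting the antecedent from the subsequent quantitative phase image so that only mass redistribution remains, can be sketched as follows. The frames are assumed to be already calibrated to dry mass in picograms per pixel; the instrument-specific phase-to-mass calibration is omitted.

```python
import numpy as np

# Dynamic phase difference (sketch): only pixels whose dry mass changed
# between two time-lapse frames survive the subtraction.
def dynamic_phase_difference(antecedent_pg, subsequent_pg):
    return subsequent_pg - antecedent_pg

def mass_gain_loss(dpd):
    """Total dry mass gained and lost (pg) between the two frames."""
    gain = dpd[dpd > 0].sum()
    loss = -dpd[dpd < 0].sum()
    return gain, loss

# Two tiny hypothetical calibrated frames (pg per pixel):
frame0 = np.array([[1.0, 2.0], [3.0, 4.0]])
frame1 = np.array([[1.5, 1.5], [3.0, 4.5]])
dpd = dynamic_phase_difference(frame0, frame1)
gain, loss = mass_gain_loss(dpd)  # gain = 1.0 pg, loss = 0.5 pg
```

    Color-coding the positive and negative parts of `dpd` gives the two-dimensional projection described in the abstract; plotting `gain` and `loss` over a time-lapse series gives the time dependence in picograms.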

  13. Testing decision rules for categorizing species' extinction risk to help develop quantitative listing criteria for the U.S. Endangered Species Act.

    PubMed

    Regan, Tracey J; Taylor, Barbara L; Thompson, Grant G; Cochrane, Jean Fitts; Ralls, Katherine; Runge, Michael C; Merrick, Richard

    2013-08-01

    Lack of guidance for interpreting the definitions of endangered and threatened in the U.S. Endangered Species Act (ESA) has resulted in case-by-case decision making, leaving the process vulnerable to being considered arbitrary or capricious. Adopting quantitative decision rules would remedy this but requires the agency to specify the relative urgency concerning extinction events over time, cutoff risk values corresponding to different levels of protection, and the importance given to different types of listing errors. We tested the performance of 3 sets of decision rules that use alternative functions for weighting the relative urgency of future extinction events: a threshold rule set, which uses a decision rule of x% probability of extinction over y years; a concave rule set, where the relative importance of future extinction events declines exponentially over time; and a shoulder rule set that uses a sigmoid shape function, where relative importance declines slowly at first and then more rapidly. We obtained decision cutoffs by interviewing several biologists and then emulated the listing process with simulations that covered a range of extinction risks typical of ESA listing decisions. We evaluated performance of the decision rules under different data quantities and qualities on the basis of the relative importance of misclassification errors. Although there was little difference between the performance of alternative decision rules for correct listings, the distribution of misclassifications differed depending on the function used. Misclassifications for the threshold and concave listing criteria resulted in more overprotection errors, particularly as uncertainty increased, whereas errors for the shoulder listing criteria were more symmetrical. We developed and tested the framework for quantitative decision rules for listing species under the U.S. ESA. If policy values can be agreed on, use of this framework would improve the implementation of the ESA by

  14. Evaluating the impact of climate change on landslide occurrence, hazard, and risk: from global to regional scale.

    NASA Astrophysics Data System (ADS)

    Gariano, Stefano Luigi; Guzzetti, Fausto

    2017-04-01

    Where global warming is expected to increase the frequency and intensity of severe rainfall events, a primary trigger of the shallow, rapid-moving landslides that cause many landslide fatalities, an increase in the number of people exposed to landslide risk is to be expected. Furthermore, we defined a group of objective and reproducible methods for the quantitative evaluation of past and future (expected) variations in landslide occurrence and distribution, and in the impact and risk to the population, as a result of changes in climatic and environmental factors (particularly land use changes), at regional scale. The methods were tested in a southern Italian region, but they can easily be applied in other physiographic and climatic regions where adequate information is available.

  15. Multi Criteria Evaluation Module for RiskChanges Spatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; van Westen, Cees; Bakker, Wim

    2015-04-01

    The Multi-Criteria Evaluation (MCE) module is one of the five modules of the RiskChanges spatial decision support system. The RiskChanges web-based platform analyzes changes in hydro-meteorological risk and provides tools for selecting the best risk reduction alternative. It is developed under the CHANGES framework (changes-itn.eu) and the INCREO project (increo-fp7.eu). The MCE tool helps decision makers and spatial planners to evaluate, sort and rank decision alternatives. Users can choose among different indicators that are defined within the system using the risk and cost-benefit analysis results, and they can also add their own indicators. The system then standardizes and prioritizes the indicators, and the best decision alternative is selected using the weighted sum model (WSM). This work facilitates the use of MCE for analyzing risk that changes over time under different scenarios and future years, bringing group decision making into practice and comparing the results in numeric and graphical views within the system. We believe that this study helps decision makers achieve the best solution by expressing their preferences for strategies under future scenarios. Keywords: Multi-Criteria Evaluation, Spatial Decision Support System, Weighted Sum Model, Natural Hazard Risk Management
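
    The standardize-then-weighted-sum step described above can be sketched as follows. The alternative names, indicator values and weights are illustrative, and min-max scaling is used as one common standardization choice; the system's actual standardization method is not specified in the abstract.

```python
# Weighted sum model (WSM) over standardized indicators (sketch; data invented).
def standardize(values, benefit=True):
    """Min-max scale a column of indicator values to [0, 1].
    For cost-type indicators (benefit=False), lower raw values score higher."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if benefit else [1.0 - s for s in scaled]

def wsm_rank(alternatives, weights, benefit_flags):
    """alternatives: {name: [indicator values]}; returns (name, score) pairs,
    best first."""
    names = list(alternatives)
    columns = list(zip(*alternatives.values()))          # one column per indicator
    std_cols = [standardize(c, b) for c, b in zip(columns, benefit_flags)]
    scores = {
        name: sum(w * col[i] for w, col in zip(weights, std_cols))
        for i, name in enumerate(names)
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical risk reduction alternatives with two indicators:
# indicator 1 = risk reduction (benefit), indicator 2 = cost (cost-type).
alts = {"dike": [0.8, 120], "relocation": [0.95, 300], "no action": [0.2, 0]}
ranking = wsm_rank(alts, weights=[0.6, 0.4], benefit_flags=[True, False])
best = ranking[0][0]
```

    Running the same ranking for each future scenario and year then supports the comparison of changing risk that the abstract describes.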

  16. Evaluating Potential Health Risks in Relocatable Classrooms.

    ERIC Educational Resources Information Center

    Katchen, Mark; LaPierre, Adrienne; Charlin, Cary; Brucker, Barry; Ferguson, Paul

    2001-01-01

    Only limited data exist describing potential exposures to chemical and biological agents when using portable classrooms or outlining how to assess and reduce associated health risks. Evaluating indoor air quality involves examining ventilating rates, volatile organic compounds, and microbiologicals. Open communication among key stakeholders is…

  17. Molecular sensitivity threshold of wet mount and an immunochromatographic assay evaluated by quantitative real-time PCR for diagnosis of Trichomonas vaginalis infection in a low-risk population of childbearing women.

    PubMed

    Leli, Christian; Castronari, Roberto; Levorato, Lucia; Luciano, Eugenio; Pistoni, Eleonora; Perito, Stefano; Bozza, Silvia; Mencacci, Antonella

    2016-06-01

    Vaginal trichomoniasis is a sexually transmitted infection caused by Trichomonas vaginalis, a flagellated protozoan. Diagnosis of T. vaginalis infection is mainly performed by wet mount microscopy, with a sensitivity ranging from 38% to 82% compared to culture, still considered the gold standard. Commercial immunochromatographic tests for monoclonal-antibody-based detection have been introduced as alternative methods for diagnosis of T. vaginalis infection and have been reported in some studies to be more sensitive than wet mount. Real-time PCR methods have recently been developed, with optimal sensitivity and specificity. The aim of this study was to evaluate whether there is a molecular sensitivity threshold for both the wet mount and immunochromatographic assays. To this aim, a total of 1487 low-risk childbearing women (median age 32 years, interquartile range 27-37) were included in the study, and underwent vaginal swab for T. vaginalis detection by means of a quantitative real-time PCR assay, wet mount and an immunochromatographic test. Upon comparing the results, prevalence values observed were 1.3% for real-time PCR, 0.5% for microscopic examination, and 0.8% for the immunochromatographic test. Compared to real-time PCR, wet mount sensitivity was 40% (95% confidence interval 19.1% to 63.9%) and specificity was 100% (95% CI 99.7% to 100%). The sensitivity and specificity of the immunochromatographic assay were 57.9% (95% CI 33.5% to 79.8%) and 99.9% (95% CI 99.6% to 100%), respectively. Evaluation of the wet mount results and those of immunochromatographic assay detection in relation to the number of T. vaginalis DNA copies detected in vaginal samples showed that the lower identification threshold for both wet mount (chi-square 6.1; P = 0.016) and the immunochromatographic assay (chi-square 10.7; P = 0.002) was ≥100 copies of T. vaginalis DNA/5 mcl of eluted DNA.

  18. The Use of Mouse Models of Breast Cancer and Quantitative Image Analysis to Evaluate Hormone Receptor Antigenicity after Microwave-assisted Formalin Fixation

    PubMed Central

    Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.

    2014-01-01

    Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322

  19. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and purgative biopotency and blood activating biopotency. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
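
    The correlation step described above (content of a component versus a measured biopotency across batches) can be sketched as follows. The study used SPSS; here a stdlib Pearson correlation on synthetic batch data stands in for it, and the numbers are invented for illustration:

```python
# Pearson correlation between a chemical marker's content and a bioassay
# response across batches. Synthetic data, not the study's measurements.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic batches: purgative potency rising with glycoside content.
content = [1.0, 1.2, 1.5, 1.8, 2.0, 2.3]
potency = [10., 12., 16., 17., 21., 22.]
print(round(pearson(content, potency), 2))  # strongly positive, near 1
```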

  20. Quantitation of aortic and mitral regurgitation in the pediatric population: evaluation by radionuclide angiocardiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurwitz, R.A.; Treves, S.; Freed, M.

    The ability to quantitate aortic (AR) or mitral regurgitation (MR), or both, by radionuclide angiocardiography was evaluated in children and young adults at rest and during isometric exercise. Regurgitation was estimated by determining the ratio of left ventricular stroke volume to right ventricular stroke volume obtained during equilibrium ventriculography. The radionuclide measurement was compared with results of cineangiography, with good correlation between both studies in 47 of 48 patients. Radionuclide stroke volume ratio was used to classify severity: the group with equivocal regurgitation differed from the group with mild regurgitation (p < 0.02); patients with mild regurgitation differed from those with moderate regurgitation (p < 0.001); and those with moderate regurgitation differed from those with severe regurgitation (p < 0.01). The stroke volume ratio was responsive to isometric exercise, remaining constant or increasing in 16 of 18 patients. After surgery to correct regurgitation, the stroke volume ratio significantly decreased from preoperative measurements in all 7 patients evaluated. Results from the present study demonstrate that a stroke volume ratio greater than 2.0 is compatible with moderately severe regurgitation and that a ratio greater than 3.0 suggests the presence of severe regurgitation. Thus, radionuclide angiocardiography should be useful for noninvasive quantitation of AR or MR, or both, helping define the course of young patients with left-side valvular regurgitation.
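
    The ratio thresholds reported above (>2.0 moderately severe, >3.0 severe) can be sketched as a simple classifier; the category labels for ratios at or below 2.0 are ours:

```python
# Classify regurgitation severity from the left-to-right ventricular
# stroke volume ratio, using the thresholds reported in the study above.

def classify_regurgitation(sv_ratio):
    if sv_ratio > 3.0:
        return "severe"
    if sv_ratio > 2.0:
        return "moderately severe"
    return "mild to moderate"   # illustrative label for lower ratios

print(classify_regurgitation(3.4))  # severe
print(classify_regurgitation(2.5))  # moderately severe
```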

  1. Variability in PAH-DNA adduct measurements in peripheral mononuclear cells: implications for quantitative cancer risk assessment.

    PubMed

    Dickey, C; Santella, R M; Hattis, D; Tang, D; Hsu, Y; Cooper, T; Young, T L; Perera, F P

    1997-10-01

    Biomarkers such as DNA adducts have significant potential to improve quantitative risk assessment by characterizing individual differences in metabolism of genotoxins and DNA repair and accounting for some of the factors that could affect interindividual variation in cancer risk. Inherent uncertainty in laboratory measurements and within-person variability of DNA adduct levels over time are putatively unrelated to cancer risk and should be subtracted from observed variation to better estimate interindividual variability of response to carcinogen exposure. A total of 41 volunteers, both smokers and nonsmokers, were asked to provide a peripheral blood sample every 3 weeks for several months in order to specifically assess intraindividual variability of polycyclic aromatic hydrocarbon (PAH)-DNA adduct levels. The intraindividual variance in PAH-DNA adduct levels, together with measurement uncertainty (laboratory variability and unaccounted for differences in exposure), constituted roughly 30% of the overall variance. An estimated 70% of the total variance was contributed by interindividual variability and is probably representative of the true biologic variability of response to carcinogenic exposure in lymphocytes. The estimated interindividual variability in DNA damage after subtracting intraindividual variability and measurement uncertainty was 24-fold. Inter-individual variance was higher (52-fold) in persons who constitutively lack the Glutathione S-Transferase M1 (GSTM1) gene which is important in the detoxification pathway of PAH. Risk assessment models that do not consider the variability of susceptibility to DNA damage following carcinogen exposure may underestimate risks to the general population, especially for those people who are most vulnerable.
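
    The variance decomposition described above (subtracting within-person variance, i.e. intraindividual variability plus measurement uncertainty, from total variance to estimate the interindividual component) can be sketched as follows; the numbers are illustrative, not the study's data:

```python
# Estimate the between-person share of total variance by subtracting
# within-person sources (intraindividual variability + measurement
# uncertainty), as in the adduct study above. Illustrative numbers.

def interindividual_share(total_var, within_var):
    """Fraction of total variance attributable to between-person differences."""
    return (total_var - within_var) / total_var

# If within-person sources make up ~30% of the total variance, ~70%
# remains as interindividual variability, matching the study's estimate.
print(interindividual_share(10, 3))  # 0.7
```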

  2. Manned Versus Unmanned Risk and Complexity Considerations for Future Midsized X-Planes

    NASA Technical Reports Server (NTRS)

    Lechniak, Jason A.; Melton, John E.

    2017-01-01

    The objective of this work was to identify and estimate complexity and risks associated with the development and testing of new low-cost medium-scale X-plane aircraft primarily focused on air transport operations. The piloting modes evaluated for this task were manned, remotely piloted, and unmanned flight research programs. This analysis was conducted early in the data collection period for X-plane concept vehicles, before preliminary designs were complete. Over 50 different aircraft and system topics were used to evaluate the three piloting control modes. Expert group evaluations from a diverse set of pilots, engineers, and other experts at Aeronautics Research Mission Directorate centers within the National Aeronautics and Space Administration provided qualitative reasoning on the many issues surrounding the decisions regarding piloting modes. The group evaluations were numerically rated to evaluate each topic quantitatively and were used to provide independent criteria for vehicle complexity and risk. An Edwards Air Force Base instruction document was identified as a source consistent with the effects found in our qualitative and quantitative data. The study showed that a manned aircraft was the best choice to align with test activities for transport aircraft flight research from a low-complexity and low-risk perspective. The study concluded that a manned aircraft option would minimize the risk and complexity to improve flight-test efficiency and bound the cost of the flight-test portion of the program. Several key findings and discriminators between the three modes are discussed in detail.
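
    The rating roll-up described above (numeric expert scores per topic, aggregated per piloting mode) can be sketched as below. The topics, modes, and scores are invented placeholders, not the study's data:

```python
# Aggregate expert risk/complexity ratings per piloting mode across topics.
# 1 = low risk/complexity, 5 = high. All values are illustrative.

scores = {
    "control latency": {"manned": 1, "remote": 3, "unmanned": 4},
    "crew escape":     {"manned": 4, "remote": 1, "unmanned": 1},
    "range safety":    {"manned": 2, "remote": 3, "unmanned": 5},
    "test efficiency": {"manned": 1, "remote": 2, "unmanned": 3},
}

def mode_total(mode):
    return sum(topic[mode] for topic in scores.values())

best = min(("manned", "remote", "unmanned"), key=mode_total)
print(best, mode_total(best))  # manned 8
```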

  3. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting pixel (HPC) detectors. This approach is based on the Photon Transfer Curve (PTC), corresponding to the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors in flat-fielding techniques. The analytical expression of the signal-to-noise ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The quantitative evaluation of the FPN, described by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is shown to identify the setting that yields the best image quality from a commercial or an R&D detector.
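
    The photon-transfer idea above can be sketched numerically: for a photon-counting detector the temporal noise is Poisson (variance = mean), so any excess spatial variance in a flat-field image is attributed to fixed-pattern noise, and PRNU is its standard deviation relative to the mean. A minimal simulation with invented parameters (this is not the paper's fit procedure):

```python
# Estimate PRNU from one synthetic flat-field frame by subtracting the
# Poisson (shot-noise) variance from the total spatial variance.

import random
random.seed(0)

mean_counts = 10000.0
prnu_true = 0.01   # simulated 1% pixel-to-pixel sensitivity dispersion

pixels = []
for _ in range(20000):
    gain = random.gauss(1.0, prnu_true)           # fixed-pattern component
    signal = mean_counts * gain
    pixels.append(random.gauss(signal, signal ** 0.5))  # Poisson ~ Gaussian here

mean = sum(pixels) / len(pixels)
var = sum((p - mean) ** 2 for p in pixels) / (len(pixels) - 1)
fpn_var = max(var - mean, 0.0)    # photon counting: shot-noise variance = mean
prnu_est = fpn_var ** 0.5 / mean
print(round(prnu_est, 3))          # close to the simulated 1% PRNU
```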

  4. Quantitative evaluation of 3D dosimetry for stereotactic volumetric‐modulated arc delivery using COMPASS

    PubMed Central

    Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2014-01-01

    The purpose of this study was to evaluate quantitatively the patient‐specific 3D dosimetry tool COMPASS with 2D array MatriXX detector for stereotactic volumetric‐modulated arc delivery. Twenty‐five patients CT images and RT structures from different sites (brain, head & neck, thorax, abdomen, and spine) were taken from CyberKnife Multiplan planning system for this study. All these patients underwent radical stereotactic treatment in CyberKnife. For each patient, linac based volumetric‐modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using Elekta Beam Modulator MLC. Dose prescription was in the range of 5–20 Gy per fraction. Target prescription and critical organ constraints were tried to match the delivered treatment plans. Each plan quality was analyzed using conformity index (CI), conformity number (CN), gradient Index (GI), target coverage (TC), and dose to 95% of volume (D95). Monaco Monte Carlo (MC)‐calculated treatment plan delivery accuracy was quantitatively evaluated with COMPASS‐calculated (CCA) dose and COMPASS indirectly measured (CME) dose based on dose‐volume histogram metrics. In order to ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using MultiCube phantom. Routine quality assurance of absolute point dose verification was performed to check the overall delivery accuracy. Quantitative analyses of dose delivery verification were compared with pass and fail criteria of 3 mm and 3% distance to agreement and dose differences. Gamma passing rate was compared with 2D fluence verification from MatriXX with MultiCube. Comparison of COMPASS reconstructed dose from measured fluence and COMPASS computed dose has shown a very good agreement with TPS calculated dose. Each plan was evaluated based on dose volume parameters for target volumes such as dose at 95% of volume (D95) and average dose. For critical organs dose at 20% of

  5. [Integral quantitative evaluation of working conditions in the construction industry].

    PubMed

    Guseĭnov, A A

    1993-01-01

    The current method of evaluating environmental quality (based on MACs and MALs) does not allow a complete and objective assessment of working conditions in the construction industry, owing to the many confounding factors involved. A solution to this complicated problem, which requires analysis of the many correlated elements of the system "human - work conditions - environment", may be aided by a social norm of morbidity that is independent of the industrial and natural environment. A complete integral assessment makes it possible to see the situation as a whole and to reveal the points of risk.

  6. A multisite assessment of the quantitative capabilities of the Xpert MTB/RIF assay.

    PubMed

    Blakemore, Robert; Nabeta, Pamela; Davidow, Amy L; Vadwai, Viral; Tahirli, Rasim; Munsamy, Vanisha; Nicol, Mark; Jones, Martin; Persing, David H; Hillemann, Doris; Ruesch-Gerdes, Sabine; Leisegang, Felicity; Zamudio, Carlos; Rodrigues, Camilla; Boehme, Catharina C; Perkins, Mark D; Alland, David

    2011-11-01

    The Xpert MTB/RIF is an automated molecular test for Mycobacterium tuberculosis that estimates bacterial burden by measuring the threshold-cycle (Ct) of its M. tuberculosis-specific real-time polymerase chain reaction. Bacterial burden is an important biomarker for disease severity, infection control risk, and response to therapy. The objective was to evaluate bacterial load quantitation by Xpert MTB/RIF compared with conventional quantitative methods. Xpert MTB/RIF results were compared with smear microscopy, semiquantitative solid culture, and time-to-detection in liquid culture for 741 patients and 2,008 samples tested in a multisite clinical trial. An internal control real-time polymerase chain reaction was evaluated for its ability to identify inaccurate quantitative Xpert MTB/RIF results. Assays with an internal control Ct greater than 34 were likely to be inaccurately quantitated; this represented 15% of M. tuberculosis-positive tests. Excluding these, decreasing M. tuberculosis Ct was associated with increasing smear microscopy grade for smears of concentrated sputum pellets (r(s) = -0.77) and directly from sputum (r(s) = -0.71). A Ct cutoff of approximately 27.7 best predicted smear-positive status. The association between M. tuberculosis Ct and time-to-detection in liquid culture (r(s) = 0.68) and semiquantitative colony counts (r(s) = -0.56) was weaker than smear. Tests of paired same-patient sputum showed that high-viscosity sputum samples contained 32-fold more M. tuberculosis than nonviscous samples. Comparisons between the grade of the acid-fast bacilli smear and Xpert MTB/RIF quantitative data across study sites enabled us to identify a site outlier in microscopy. Xpert MTB/RIF quantitation offers a new, standardized approach to measuring bacterial burden in the sputum of patients with tuberculosis.
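
    The decision rules reported above (Ct cutoff of ~27.7 for smear-positive status, and internal-control Ct > 34 flagging unreliable quantitation) can be sketched as a small predicate; the function name is ours:

```python
# Predict smear-positive status from the Xpert MTB/RIF M. tuberculosis Ct
# (lower Ct = higher bacterial burden), using the thresholds from the study.

SMEAR_POSITIVE_CT_CUTOFF = 27.7
INTERNAL_CONTROL_CT_LIMIT = 34.0   # above this, quantitation is unreliable

def predict_smear_positive(mtb_ct, internal_control_ct):
    if internal_control_ct > INTERNAL_CONTROL_CT_LIMIT:
        return None                # quantitation likely inaccurate; no call
    return mtb_ct <= SMEAR_POSITIVE_CT_CUTOFF

print(predict_smear_positive(22.0, 30.0))  # True
print(predict_smear_positive(31.5, 30.0))  # False
print(predict_smear_positive(25.0, 35.0))  # None
```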

  7. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate analysis is identifying the types of investment risk involved and preventing each of them effectively. As the financial crisis swept the world, the real estate industry also faced enormous risks, and how to evaluate real estate investment risks correctly and effectively has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application, providing investors with an effective means of assessing real estate investment risk and with guidance on risk factors and forecasts.

  8. Quantitative risk assessment of WSSV transmission through partial harvesting and transport practices for shrimp aquaculture in Mexico.

    PubMed

    Sanchez-Zazueta, Edgar; Martínez-Cordero, Francisco Javier; Chávez-Sánchez, María Cristina; Montoya-Rodríguez, Leobardo

    2017-10-01

    This quantitative risk assessment provided an analytical framework to estimate white spot syndrome virus (WSSV) transmission risks in the following different scenarios: (1) partial harvest from rearing ponds and (2) post-harvest transportation, assuming that the introduction of contaminated water with viral particles into shrimp culture ponds is the main source of viral transmission risk. Probabilities of infecting shrimp with waterborne WSSV were obtained by approaching the functional form that best fits (likelihood ratio test) published data on the dose-response relationship for WSSV orally inoculated through water into shrimp. Expert opinion defined the ranges for the following uncertain factors: (1) the concentrations of WSSV in the water spilled from the vehicles transporting the infected shrimp, (2) the total volume of these spills, and (3) the dilution into culture ponds. Multiple scenarios were analysed, starting with a viral load (VL) of 1×10^2 mL^-1 in the contaminated water spilled that reached the culture pond, whose probability of infection of an individual shrimp (P_i) was negligible (1.7×10^-7). Increasing the VL to 1×10^4.5 mL^-1 and 1×10^7 mL^-1 yielded results in the very low (P_i = 5.3×10^-5) and high risk (P_i = 1.6×10^-2) categories, respectively. Furthermore, different pond stocking density (SD) scenarios (20 and 30 post-larvae [PL]/m^2) were evaluated, and the probability of infection of at least one out of the total number of shrimp exposed (P_N) was derived; for the scenarios with a low VL (1×10^2 mL^-1), the P_N remained at a negligible risk level (P_N, 2.4×10^-7 to 1.8×10^-6). For most of the scenarios with the moderate VL (1×10^4.5 mL^-1), the P_N scaled up to a low risk category (P_N, 1.1×10^-4 to 5.6×10^-4), whereas for the scenarios with a high VL (1×10^7 mL^-1), the risk levels were high (P_N, 2.3×10^-2 to 3.5×10^-2) or very high (P_N, 1.1×10^-1 to 1.6×10^-1) depending on the volume of contaminated water
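
    The scale-up from per-shrimp infection probability (P_i) to the probability that at least one of N exposed shrimp is infected (P_N) follows the standard independence formula P_N = 1 - (1 - P_i)^N, which can be sketched directly (the pond size here is illustrative, not a scenario from the study):

```python
# Probability that at least one of n independently exposed shrimp is
# infected, given per-individual infection probability p_i.

def prob_at_least_one(p_i, n):
    return 1.0 - (1.0 - p_i) ** n

# Even a negligible per-shrimp risk scales with the number exposed:
p_i = 1.7e-7
print(prob_at_least_one(p_i, 10))   # ~1.7e-6, same order as the low-VL P_N range
```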

  9. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF EDC RISK MANAGEMENT METHODS

    EPA Science Inventory

    In Superfund risk management research, the performance of risk management techniques is typically evaluated by measuring "the concentrations of the chemicals of concern before and after risk management efforts. However, using bioassays and chemical data provides a more robust und...

  10. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

    Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
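
    The quantitative check above (movement extent as a percentage of limb size, with a repeat decision at the suggested 25% delineation) can be sketched as below; the input values are illustrative:

```python
# %Move: movement artifact as a percentage of limb size, with the 25%
# repeat-scan threshold suggested by the study above.

def percent_move(movement_extent, limb_size):
    return 100.0 * movement_extent / limb_size

def needs_repeat(movement_extent, limb_size, threshold=25.0):
    return percent_move(movement_extent, limb_size) >= threshold

print(needs_repeat(10, 100))  # False: 10% movement, scan viable
print(needs_repeat(30, 100))  # True: above the 25% delineation
```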

  11. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  12. Risk evaluation and mitigation strategies: a focus on belatacept.

    PubMed

    Sam, Teena; Gabardi, Steven; Tichy, Eric M

    2013-03-01

    To review the elements and components of the risk evaluation and mitigation strategies (REMS) for the costimulation blocker belatacept and associated implications for health care providers working with transplant recipients. The MEDLINE and EMBASE databases (January 1990 to March 2012) were searched by using risk evaluation and mitigation strategies, REMS, belatacept, and organ transplant as search terms (individual organs were also searched). Retrieved articles were supplemented with analysis of information obtained from the Federal Register, the Food and Drug Administration, and the manufacturer of belatacept. REMS are risk-management strategies implemented to ensure that a product's benefits outweigh its known safety risks. Although belatacept offers a novel strategy in maintenance immunosuppression and was associated with superior renal function compared with cyclosporine in phase 2 and 3 trials, belatacept is also associated with increased risk of posttransplant lymphoproliferative disorder and central nervous system infections. The Food and Drug Administration required development of a REMS program as part of belatacept's approval process to ensure safe and appropriate use of the medication and optimization of its risk-benefit profile. Elements of the belatacept REMS include a medication guide that must be dispensed with each infusion and a communication plan. In the management of a complex population of patients, it is essential that those who care for transplant recipients, and patients, recognize the implications of potential and known risks of belatacept. The REMS program aims to facilitate careful selection and education of patients and vigilant monitoring.

  13. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We take evaluation results based on financial statements, derived from logit and probit approaches, as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We surprisingly find that the opinions extracted from both posts and commentaries surpass opinions of analysts in terms of credit risk prediction. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD.

  15. Two criteria for evaluating risk prediction models

    PubMed Central

    Pfeiffer, R.M.; Gail, M.H.

    2010-01-01

    SUMMARY We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed PCF(q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow-up, PNF(p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF(q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF(p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of those two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data. PMID:21155746
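
    The two criteria defined above can be computed empirically from a list of (risk score, case indicator) pairs: PCF(q) is the fraction of cases among the top-q fraction of the population ranked by risk, and PNF(p) is the smallest top fraction needed to cover a fraction p of the cases. A minimal sketch on synthetic data (not the paper's influence-function inference):

```python
# Empirical PCF(q) and PNF(p) from (risk score, case indicator) pairs.

def pcf(data, q):
    ranked = sorted(data, key=lambda rc: -rc[0])
    top = ranked[:max(1, round(q * len(ranked)))]
    total_cases = sum(c for _, c in ranked)
    return sum(c for _, c in top) / total_cases

def pnf(data, p):
    ranked = sorted(data, key=lambda rc: -rc[0])
    total_cases = sum(c for _, c in ranked)
    covered = 0
    for i, (_, c) in enumerate(ranked, start=1):
        covered += c
        if covered >= p * total_cases:
            return i / len(ranked)
    return 1.0

# Synthetic: 10 people, 4 cases concentrated among the highest risk scores.
data = [(0.9, 1), (0.8, 1), (0.7, 1), (0.6, 0), (0.5, 1),
        (0.4, 0), (0.3, 0), (0.2, 0), (0.1, 0), (0.05, 0)]
print(pcf(data, 0.2))   # 0.5: the top 20% of the population holds half the cases
print(pnf(data, 0.75))  # 0.3: follow the top 30% to cover 75% of the cases
```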

  16. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
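
    The kind of query the paper targets (key-partitioned, incremental quantitative summaries with constant per-item cost) can be illustrated in plain Python; this is only a sketch of the idea, not the StreamQRE language or its combinators:

```python
# A running per-key average over a data stream, maintained incrementally
# in O(1) time per item and O(#keys) memory.

from collections import defaultdict

class RunningAvgByKey:
    def __init__(self):
        self.sums = defaultdict(float)
        self.counts = defaultdict(int)

    def push(self, key, value):          # constant-time per-item processing
        self.sums[key] += value
        self.counts[key] += 1

    def query(self, key):
        return self.sums[key] / self.counts[key]

q = RunningAvgByKey()
for key, value in [("sensorA", 10), ("sensorB", 4), ("sensorA", 20)]:
    q.push(key, value)
print(q.query("sensorA"))  # 15.0
```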

  17. The influence of graphic format on breast cancer risk communication.

    PubMed

    Schapira, Marilyn M; Nattinger, Ann B; McAuliffe, Timothy L

    2006-09-01

    Graphic displays can enhance quantitative risk communication. However, empiric data regarding the effect of graphic format on risk perception are lacking. We evaluate the effect of graphic format elements on perceptions of risk magnitude and perceived truth of data. Preferences for format also were assessed. Participants (254 female primary care patients) viewed a series of hypothetical risk communications regarding the lifetime risk of breast cancer. Identical numeric risk information was presented using different graphic formats. Risk was perceived to be of lower magnitude when communicated with a bar graph as compared with a pictorial display (p < 0.0001), or with consecutively versus randomly highlighted symbols in a pictorial display (p = 0.0001). Data were perceived to be more true when presented with random versus consecutive highlights in a pictorial display (p < 0.01). A pictorial display was preferred to a bar graph format for the presentation of breast cancer risk estimates alone (p = 0.001). When considering breast cancer risk in comparison to heart disease, stroke, and osteoporosis, however, bar graphs were preferred to pictorial displays (p < 0.001). In conclusion, the elements of graphic format used to convey quantitative risk information affect key domains of risk perception. One must be cognizant of these effects when designing risk communication strategies.

  18. Perception of risks from electromagnetic fields: A psychometric evaluation of a risk-communication approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacGregor, D.G.; Slovic, P.; Morgan, M.G.

    Potential health risks from exposure to power-frequency electromagnetic fields (EMF) have become an issue of significant public concern. This study evaluates a brochure designed to communicate EMF health risks from a scientific perspective. The study utilized a pretest-posttest design in which respondents judged various sources of EMF (and other) health and safety risks, both before and after reading the brochure. Respondents assessed risks on dimensions similar to those utilized in previous studies of risk perception. In addition, detailed ratings were made that probed respondents' beliefs about the possible causal effects of EMF exposure. The findings suggest that naive beliefs about the potential of EMF exposure to cause harm were highly influenced by specific content elements of the brochure. The implications for risk-communication approaches based on communicating scientific uncertainty are discussed. 19 refs., 1 fig., 11 tabs.

  19. Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature.

    PubMed

    Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier

    2016-02-01

    Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When the results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and second study, respectively. Our results confirm that overlooking the time-temperature dependency may lead to an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens.
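The modification described above, weighting simulated storage scenarios by a logistic probability of consumption rather than treating every sampled time-temperature pair as consumed, can be sketched with a small Monte Carlo loop. The coefficients and distribution ranges below are illustrative placeholders, not the fitted values from the study:

```python
import math
import random

def p_consumed(hours, temp_c, b0=8.0, b_time=-0.05, b_temp=-0.3):
    """Logistic model for the probability that a raw-milk serving is
    actually consumed after `hours` of storage at `temp_c` degC.
    Coefficients are hypothetical, chosen only so that long, warm
    storage drives the probability toward zero."""
    z = b0 + b_time * hours + b_temp * temp_c
    return 1.0 / (1.0 + math.exp(-z))

def consumed_fraction(n=10_000, seed=1):
    """Monte Carlo sketch: fraction of sampled storage scenarios that
    survive the consumption filter. The independent-distributions model
    in the original publications effectively keeps all of them."""
    rng = random.Random(seed)
    kept = 0
    for _ in range(n):
        hours = rng.uniform(0, 120)   # storage duration (h), illustrative
        temp = rng.uniform(2, 25)     # storage temperature (degC), illustrative
        if rng.random() < p_consumed(hours, temp):
            kept += 1
    return kept / n

print(f"fraction of servings actually consumed: {consumed_fraction():.2f}")
```

Scenarios that are filtered out are precisely the extreme long/warm storage combinations, which is why the modified model computes fewer extreme doses and a lower expected number of cases.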

  20. Quantitative evaluation of toothbrush and arm-joint motion during tooth brushing.

    PubMed

    Inada, Emi; Saitoh, Issei; Yu, Yong; Tomiyama, Daisuke; Murakami, Daisuke; Takemoto, Yoshihiko; Morizono, Ken; Iwasaki, Tomonori; Iwase, Yoko; Yamasaki, Youichi

    2015-07-01

    It is very difficult for dental professionals to objectively assess the tooth brushing skill of patients, because no established index exists for assessing patients' brushing motion. The purpose of this study was to quantitatively evaluate toothbrush and arm-joint motion during tooth brushing. Tooth brushing motion, performed by dental hygienists for 15 s, was captured using a motion-capture system that continuously calculates the three-dimensional coordinates of an object's motion relative to the floor. The dental hygienists brushed the buccal and palatal sides of their right and left upper molars. The frequencies and power spectra of toothbrush motion and of the joint angles of the shoulder, elbow, and wrist were calculated and analyzed statistically. The frequency of toothbrush motion was higher on the left side (both buccal and palatal areas) than on the right side. There were no significant differences among joint-angle frequencies within each brushing area. The inter- and intra-individual variations of the power spectrum of the elbow flexion angle during brushing were smaller than those of any of the other angles. This study quantitatively confirmed that dental hygienists have individually distinctive rhythms during tooth brushing. All arm joints moved synchronously during brushing, and tooth brushing motion was controlled by the coordinated movement of the joints. The elbow generated an individual's characteristic frequency through a stabilizing movement; the shoulder and wrist controlled the hand motion, while the elbow generated the cyclic rhythm during tooth brushing.
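The frequency analysis described above, extracting the dominant frequency of a motion signal from its power spectrum, can be sketched with a naive discrete Fourier transform. The study does not specify its spectral method; this is only a generic illustration on a synthetic signal:

```python
import math

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest-power DFT bin of a
    real-valued signal sampled at `fs` Hz, skipping the DC component.
    A naive O(n^2) DFT, adequate for short motion-capture traces."""
    n = len(signal)
    best_k, best_power = 0, 0.0
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * fs / n

# A synthetic 4 Hz "brushing stroke" sampled at 100 Hz for 1 s:
fs = 100
signal = [math.sin(2 * math.pi * 4 * t / fs) for t in range(fs)]
print(dominant_frequency(signal, fs))  # 4.0
```

Applied per joint angle, comparing such dominant frequencies across the shoulder, elbow, and wrist is one way to quantify the synchrony and individual rhythm the study reports.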