Science.gov

Sample records for quantitative risk evaluation

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
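
    The benchmark dose (BMD) approach mentioned above has a simple core: choose a benchmark response (BMR), fit a dose-response model, and solve for the dose producing that extra risk. A minimal Python sketch follows, assuming an exponential dose-response model; the model form, slope, and 10% BMR are illustrative choices, not values from any paper in the Special Issue.

    ```python
    import math

    # Assumed exponential dose-response model (illustrative only):
    #   P(d) = p0 + (1 - p0) * (1 - exp(-b * d))
    # Extra risk over background: (P(d) - p0) / (1 - p0) = 1 - exp(-b * d)

    def benchmark_dose(bmr: float, b: float) -> float:
        """Dose at which extra risk equals the benchmark response (BMR):
        solve 1 - exp(-b * d) = bmr  =>  d = -ln(1 - bmr) / b."""
        return -math.log(1.0 - bmr) / b

    b = 0.02                          # hypothetical model slope per unit dose
    bmd10 = benchmark_dose(0.10, b)   # BMD at a 10% benchmark response
    print(f"BMD10 = {bmd10:.1f} dose units")
    ```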

  2. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  3. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention.

    PubMed

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2010-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 x 10(-8) (95th percentile: 3.20 x 10(-7)). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or changes in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
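
    To make the structure of such a stochastic import-risk model concrete, here is a minimal Monte Carlo sketch in Python. All parameter values (import volume, prevalence, detection probability) are hypothetical placeholders, not the estimates from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_iter = 100_000      # simulated years
    n_imports = 10_000    # imported dogs/cats per year (hypothetical)
    prevalence = 1e-6     # rabies prevalence in source populations (hypothetical)
    p_detect = 0.95       # prob. quarantine/waiting period catches a case (hypothetical)

    # Infected animals among each year's imports, then the subset that
    # independently evades detection at the border.
    infected = rng.binomial(n_imports, prevalence, size=n_iter)
    undetected = rng.binomial(infected, 1.0 - p_detect)

    risk = np.mean(undetected > 0)  # P(at least one rabid animal enters in a year)
    print(f"Annual introduction risk ~ {risk:.1e}")
    ```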

  4. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  5. A quantitative approach for integrating multiple lines of evidence for the evaluation of environmental health risks

    PubMed Central

    Schleier III, Jerome J.; Marshall, Lucy A.; Davis, Ryan S.

    2015-01-01

    Decision analysis often considers multiple lines of evidence during the decision making process. Researchers and government agencies have advocated for quantitative weight-of-evidence approaches in which multiple lines of evidence can be considered when estimating risk. Therefore, we utilized Bayesian Markov Chain Monte Carlo to integrate several human-health risk assessment, biomonitoring, and epidemiology studies that have been conducted for two common insecticides (malathion and permethrin) used for adult mosquito management, to generate an overall estimate of the risk quotient (RQ). The utility of Bayesian inference for risk management is that the estimated risk represents a probability distribution from which the probability of exceeding a threshold can be estimated. The mean RQ after all studies were incorporated was 0.4386 (variance 0.0163) for malathion and 0.3281 (variance 0.0083) for permethrin. After taking into account all of the evidence available on the risks of ULV insecticides, the probability that malathion or permethrin would exceed a level of concern was less than 0.0001. Bayesian estimates can substantially improve decisions by allowing decision makers to estimate the probability that a risk will exceed a level of concern by considering seemingly disparate lines of evidence. PMID:25648367
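
    A minimal sketch of the pooling idea, assuming study-level estimates are combined with a precision-weighted normal approximation (the paper itself used Markov Chain Monte Carlo, and the study values below are invented):

    ```python
    import math

    # Hypothetical study-level RQ estimates as (mean, variance) pairs.
    studies = [(0.52, 0.04), (0.38, 0.02), (0.45, 0.05)]

    precision = sum(1.0 / v for _, v in studies)
    pooled_mean = sum(m / v for m, v in studies) / precision
    pooled_var = 1.0 / precision

    # Probability that the pooled RQ exceeds a level of concern (RQ = 1),
    # using the normal CDF expressed via math.erf.
    z = (1.0 - pooled_mean) / math.sqrt(pooled_var)
    p_exceed = 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
    print(f"Pooled RQ = {pooled_mean:.3f}, P(RQ > 1) = {p_exceed:.1e}")
    ```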

  6. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... HUMAN SERVICES Food and Drug Administration Use of Influenza Disease Models To Quantitatively Evaluate... public workshop entitled: ``Use of Influenza Disease Models to Quantitatively Evaluate the Benefits and... hypothetical influenza vaccine, and to seek from a range of experts, feedback on the current version of...

  7. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest-level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  8. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    PubMed

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience to make subjective decisions on risk assessment, without systematic decision making. Indeed, understanding of the underlying principles of risk assessment is significant. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions towards it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of works, their recognition of the occurrence of different types of accidents, and the inter-relationships of these perceptions with safety performance in terms of accident rates. In the second stage, the deficiencies of the current general practice for risk assessment are first identified. Based on the findings from Stage 1 and 3-year average historical accident data from 15 large-scale construction projects, a risk evaluation model is developed quantitatively, prioritizing the risk levels of different trades of works that lead to different types of site accidents due to various accident causes. With the suggested systematic accident recording techniques, this model can be implemented in the construction industry at both the project level and the organizational level. The model (Q(2)REM) not only acts as a useful supplementary guideline of risk assessment for construction safety professionals, but also assists them in pinpointing the potential risks on site for construction workers under the respective trades of works through safety training and education. It, in turn, raises their awareness of safety risk. As the Q(2)REM can clearly show the potential accident causes leading to

  9. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications. PMID:24043593

  10. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost:benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost:benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162

  11. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change or other causes. Immediate measures, such as strengthening of drainage capacity, are needed to minimize the damage caused by more-frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of drainage analysis with a design rainfall (e.g. the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even if the total amount of rainfall is equal, and the flood risk may differ among rainfall patterns. Therefore, it is important to assume various patterns of heavy rainfall for flood risk assessment. On the other hand, rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method, called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal patterns. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale by using the diurnal rainfall pattern generator. In addition, we estimated flood damage with a focus on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for the drainage analysis were generated. Here, 3-day rainfall amounts for nine different return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount by using the diurnal rainfall pattern generator. Finally, all data

  12. Development of quantitative risk acceptance criteria

    SciTech Connect

    Griesmeyer, J. M.; Okrent, D.

    1981-01-01

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  13. Rat- and human-based risk estimates of lung cancer from occupational exposure to poorly-soluble particles: A quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Kuempel, E. D.; Smith, R. J.; Dankovic, D. A.; Stayner, L. T.

    2009-02-01

    In risk assessment there is a need for quantitative evaluation of the capability of animal models to predict disease risks in humans. In this paper, we compare the rat- and human-based excess risk estimates for lung cancer from working lifetime exposures to inhaled poorly-soluble particles. The particles evaluated include those for which long-term dose-response data are available in both species, i.e., coal dust, carbon black, titanium dioxide, silica, and diesel exhaust particulate. The excess risk estimates derived from the rat data were generally lower than those derived from the human studies, and none of the rat- and human-based risk estimates were significantly different (all p-values>0.05). Residual uncertainty in whether the rat-based risk estimates would over- or under-predict the true excess risks of lung cancer from inhaled poorly-soluble particles in humans is due in part to the low power of the available human studies, limited particle size exposure data for humans, and ambiguity about the best animal models and extrapolation methods.

  14. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
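
    For readers unfamiliar with the 5 x 5 risk matrix cited above, a minimal sketch follows; the scoring rule and band boundaries are illustrative conventions, not a published standard.

    ```python
    def risk_level(probability: int, consequence: int) -> str:
        """probability and consequence are ordinal scores, 1 (lowest) to 5
        (highest); their product is banded into low/medium/high
        (the banding itself is an illustrative convention)."""
        score = probability * consequence
        if score >= 15:
            return "high"
        if score >= 5:
            return "medium"
        return "low"

    print(risk_level(4, 5))  # -> "high"
    print(risk_level(2, 2))  # -> "low"
    ```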

  15. Screening Risk Evaluation methodology

    SciTech Connect

    Hopper, K.M.

    1994-06-01

    The Screening Risk Evaluation (SRE) Guidance document is a set of guidelines provided for the uniform implementation of SREs performed on D&D facilities. These guidelines are designed specifically for the completion of the second (semi-quantitative screening) phase of the D&D Risk-Based Process. The SRE Guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated in the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the risk to human health and the environment from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the risk of contaminant exposure to workers, occupants, and visitors in D&D facilities. The Future Release Index (FRI) calculates the risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risk to human health due to factors other than contaminants. The Criticality index is treated as a modifying factor for the entire SRE, because criticality issues are strictly regulated under DOE. Screening risk results will be tabulated in matrix form, and Total Risk will be calculated via a weighted equation to produce a score on which to base early-action recommendations. Other recommendations from the screening risk scores will be made based either on individual index scores or on reweighted Total Risk calculations. All recommendations based on the SRE will be based on a combination of screening risk scores, decision drivers, and other considerations, determined on a project-by-project basis. The SRE is the first and most important step in the overall D&D project-level decision-making process.
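
    A minimal sketch of the roll-up described above, with hypothetical index weights and a criticality treatment assumed purely for illustration (the SRE Guidance defines the actual weighted equation):

    ```python
    def total_risk(cri, wei, fri, phi, criticality_factor=1.0,
                   weights=(0.3, 0.3, 0.2, 0.2)):
        """Weighted combination of the four index scores, with criticality
        applied as a modifying factor; the weights are hypothetical."""
        base = (weights[0] * cri + weights[1] * wei +
                weights[2] * fri + weights[3] * phi)
        return base * criticality_factor

    print(total_risk(cri=7, wei=5, fri=3, phi=2, criticality_factor=1.2))
    ```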

  16. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  17. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696

  18. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in

  19. [Quantitative evaluation of health risk associated with occupational inhalation exposure to vinyl chloride at production plants in Poland].

    PubMed

    Szymczak, W

    1997-01-01

    Vinyl chloride is classified by the IARC in group 1 (human carcinogens). In Poland, occupational exposure to vinyl chloride is found among workers employed in many branches of industry, among others in the industry of vinyl chloride synthesis and polymerization as well as in the plastics, footwear, rubber, pharmaceutical and metallurgical industries. Observed concentrations range from non-determinable levels to 90 mg/m3, against a MAC value of 5 mg/m3. Neoplasm of the liver is the major carcinogenic effect of vinyl chloride; hence, the health assessment focused on this critical risk. Four different linear dose-response models, developed by several authors and based on the results of different epidemiological studies, were used to characterise the extent of cancer risk as a function of vinyl chloride concentration. The estimated risk related to a forty-year employment under exposure equal to the MAC value (5 mg/m3) fell within the range from 2.9 x 10(-4) to 2.6 x 10(-3). As these figures show, the risk did not exceed the acceptable level (10(-3)). PMID:9273438
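
    The reported range follows from linear scaling of model-specific unit risks to the exposure level. In the sketch below, the four slopes are hypothetical values chosen only so that the resulting range reproduces the abstract's 2.9 x 10(-4) to 2.6 x 10(-3); only the MAC exposure level (5 mg/m3) is taken from the text.

    ```python
    # Hypothetical unit risks (excess lifetime cancer risk per mg/m3 of
    # forty-year occupational exposure), one per linear model.
    unit_risks = [5.8e-5, 1.2e-4, 3.1e-4, 5.2e-4]
    mac = 5.0  # mg/m3, from the abstract

    risks = [ur * mac for ur in unit_risks]
    print(f"Estimated risk range: {min(risks):.1e} to {max(risks):.1e}")
    # -> 2.9e-04 to 2.6e-03, matching the reported range
    ```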

  1. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best-estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
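
    To illustrate the complementary cumulative distribution function (CCDF) idea in a few lines, here is a toy Monte Carlo sketch; the input distributions and the simplified, undiscounted LCOE formula are assumptions for illustration and bear no relation to GT-Mod's actual models.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 50_000

    capital_cost = rng.lognormal(np.log(120e6), 0.25, size=n)  # $ (hypothetical)
    annual_om = rng.normal(4e6, 0.5e6, size=n)                 # $/yr (hypothetical)
    energy = rng.normal(30 * 8760 * 0.9, 5e3, size=n)          # MWh/yr at 30 MWe

    lifetime = 30  # years
    lcoe = (capital_cost / lifetime + annual_om) / energy      # $/MWh, undiscounted

    # Sorting LCOE against 1 - ECDF traces the full CCDF; two exceedance
    # probabilities are also read off directly below.
    x = np.sort(lcoe)
    ccdf = 1.0 - np.arange(1, n + 1) / n

    best = np.median(lcoe)  # stand-in for the 'best-guess' estimate
    print(f"P(LCOE > best estimate) = {np.mean(lcoe > best):.2f}")
    print(f"P(LCOE > 1.5 x best)    = {np.mean(lcoe > 1.5 * best):.3f}")
    ```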

  2. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL subjects without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject from the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.

  3. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. PMID:26456933
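
    The EN 1276 pass criterion reduces to a log10 ratio of viable counts before and after exposure; a minimal sketch (the counts are invented):

    ```python
    import math

    def log10_reduction(n0: float, n: float) -> float:
        """Log10 reduction in viable count after antiseptic exposure."""
        return math.log10(n0 / n)

    # Illustrative counts (CFU/mL); EN 1276 requires >= 5 log10 within 5 min.
    n0, n = 1.5e8, 900.0
    lr = log10_reduction(n0, n)
    print(f"{lr:.1f} log10 reduction -> {'pass' if lr >= 5 else 'fail'}")
    ```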

  4. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that one should strive to make QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
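
    As a concrete instance of the Bayesian updating advocated above, the following sketch revises a Poisson failure rate with a conjugate gamma prior; the prior parameters and observed data are hypothetical.

    ```python
    # Gamma(alpha, beta) prior on a Poisson failure rate (events/year);
    # observing k events in t years yields the posterior
    # Gamma(alpha + k, beta + t).
    alpha, beta = 0.5, 10.0   # weakly informative prior (hypothetical)
    k, t = 2, 80.0            # observed: 2 failures in 80 component-years

    post_alpha, post_beta = alpha + k, beta + t
    post_mean = post_alpha / post_beta
    print(f"Posterior mean failure rate = {post_mean:.3f} per year")
    ```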

  5. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.

  6. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374

  7. Validation of biological markers for quantitative risk assessment.

    PubMed Central

    Schulte, P; Mazzuckelli, L F

    1991-01-01

    The evaluation of biological markers is recognized as necessary to the future of toxicology, epidemiology, and quantitative risk assessment. For biological markers to become widely accepted, their validity must be ascertained. This paper explores the range of considerations that compose the concept of validity as it applies to the evaluation of biological markers. Three broad categories of validity (measurement, internal study, and external) are discussed in the context of evaluating data for use in quantitative risk assessment. Particular attention is given to the importance of measurement validity in the consideration of whether to use biological markers in epidemiologic studies. The concepts developed in this presentation are applied to examples derived from the occupational environment. In the first example, measurement of bromine release as a marker of ethylene dibromide toxicity is shown to be of limited use in constructing an accurate quantitative assessment of the risk of developing cancer as a result of long-term, low-level exposure. This example is compared to data obtained from studies of ethylene oxide, in which hemoglobin alkylation is shown to be a valid marker of both exposure and effect. PMID:2050067

  8. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

    In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for safety and reliability of their systems. The FMEAs were reviewed to identify critical items using a set of qualitative criteria. Hazards and critical items judged the worst by a qualitative method were to be either eliminated by a design change or controlled by the addition of a safeguard. However, limitations of space, weight, technical feasibility, and cost frequently left critical items and hazards that could not be eliminated or controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown, because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: 1. combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; 2. use of Bayesian analysis of system and component failure data; 3. evaluation and presentation of uncertainties in the risk estimates.

  9. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
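
    The core quantitative step, crossing a fragility curve with a hazard curve, can be illustrated numerically in a few lines. The lognormal fragility and power-law hazard parameters below are hypothetical, not the empirical curves or PSHA results used in the paper.

    ```python
    import math

    def fragility(pga, median=0.6, beta=0.5):
        """P(failure | peak ground acceleration), lognormal in PGA (g)."""
        return 0.5 * (1 + math.erf(math.log(pga / median) / (beta * math.sqrt(2))))

    def hazard(pga, k0=1e-3, k=2.5):
        """Annual frequency of exceeding a given PGA (power-law hazard curve)."""
        return k0 * pga ** (-k)

    pgas = [0.05 * i for i in range(1, 60)]  # 0.05 g .. 2.95 g grid
    freq = 0.0
    for lo, hi in zip(pgas, pgas[1:]):
        d_haz = hazard(lo) - hazard(hi)      # annual freq. of PGA in this bin
        freq += fragility(0.5 * (lo + hi)) * d_haz

    print(f"Annual failure frequency ~ {freq:.2e} per year")
    ```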

  10. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and for the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue to be addressed by risk analysts.

  11. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  12. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative/semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, and provides better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and the nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case: a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with heights between 400-450 m and widths between 7-10 m. These slices threaten the settlements around Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. In the past, several rock slides were recorded from Mountain Ramnefjell between 1905 and 1950. Among them

  13. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can cause the HRSG power plant to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation was done with the aim of reducing risk by optimizing the risk assessment activities.
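
    A minimal sketch of a semi-quantitative API 581-style ranking of the kind reported above: a probability-of-failure category (1-5) and a consequence category (A-E) combine into a matrix cell such as "4C". The numeric banding rule below is an illustrative stand-in for the matrix the standard actually defines.

    ```python
    def risk_category(pof: int, cof: str) -> str:
        """pof: probability-of-failure category 1-5; cof: consequence 'A'-'E'."""
        score = pof + (ord(cof) - ord("A") + 1)   # illustrative banding rule
        if score >= 9:
            band = "high"
        elif score >= 7:
            band = "medium-high"
        elif score >= 5:
            band = "medium"
        else:
            band = "low"
        return f"{pof}{cof}: {band}"

    print(risk_category(4, "C"))  # superheater/evaporator cell -> medium-high
    print(risk_category(3, "C"))  # economizer cell -> medium
    ```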

  15. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 x 10(-4) per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. PMID:19945145
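
    The QMRA chain described above (source concentration, treatment removal, ingested dose, dose-response, annual risk) can be sketched as a small Monte Carlo; every parameter value below is a hypothetical placeholder, not an estimate from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000

    raw = rng.lognormal(np.log(0.1), 1.0, size=n)   # oocysts/L in raw water
    log_removal = rng.normal(3.5, 0.5, size=n)      # treatment performance
    consumption = 1.0                               # L unboiled tap water/day
    r = 0.004                                       # exponential dose-response param.

    dose = raw * 10.0 ** (-log_removal) * consumption   # oocysts ingested/day
    p_day = 1.0 - np.exp(-r * dose)                     # daily infection risk
    p_year = 1.0 - (1.0 - p_day) ** 365                 # annual infection risk

    print(f"Median annual risk = {np.median(p_year):.1e}")
    print(f"P(annual risk > 1e-4) = {np.mean(p_year > 1e-4):.3f}")
    ```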

  16. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a semi-qualitative regime, to the more traditional quantitative methods. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.

  17. [Quantitative DNA evaluation of the high carcinogenic risk of human papilloma viruses and human herpes viruses in males with fertility disorders].

    PubMed

    Evdokimov, V V; Naumenko, V A; Tulenev, Yu A; Kurilo, L F; Kovalyk, V P; Sorokina, T M; Lebedeva, A L; Gomberg, M A; Kushch, A A

    2016-01-01

    Infertility is a pressing medical and social problem. In 50% of couples it is associated with the male factor, and in more than 50% of cases the etiology of the infertility remains insufficiently understood. The goal of this work was to study the prevalence and to perform quantitative analysis of the human herpes viruses (HHV) and high carcinogenic risk papilloma viruses (HR HPV) in males with infertility, as well as to assess the impact of these infections on sperm parameters. Ejaculate samples obtained from 196 males were divided into 3 groups. Group 1 included men with infertility of unknown etiology (n = 112); group 2, patients whose female partners had a history of spontaneous abortion (n = 63); group 3 (control), healthy men (n = 21). HHV and HR HPV DNA in the ejaculates was detected in a total of 42/196 (21.4%) males: in 31 and 11 patients in groups 1 and 2, respectively (p > 0.05), and in none of the healthy males. HHV were detected in 24/42 and HR HPV in 18/42 males (p > 0.05), without significant difference between the groups. Among HR HPV genotypes, those of clade A9 were more frequent in ejaculate (14/18, p = 0.04). Comparative analysis of the sperm parameters showed that in the ejaculates of the infected patients sperm motility, as well as the number of morphologically normal cells, was significantly reduced compared with the healthy men. The quantification of the viral DNA revealed that in 31% of the male ejaculates the viral load was high: > 3 log10 per 100,000 cells. Conclusion: the detection of HHV and HR HPV in the ejaculate is associated with male infertility. Quantification of the viral DNA in the ejaculate is a useful indicator for monitoring viral infections in infertility and for the decision to start therapy. PMID:27451497

  18. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
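
    As a toy example of one of the risk calculation engines mentioned above (the attack tree), the sketch below combines event probabilities through AND/OR gates and converts a mitigation into an expected-loss reduction; the tree structure and all numbers are invented.

    ```python
    def p_or(*ps):
        """P(at least one child event succeeds), children independent."""
        out = 1.0
        for p in ps:
            out *= 1.0 - p
        return 1.0 - out

    def p_and(*ps):
        """P(all child events succeed), children independent."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    # Attack succeeds if (phishing OR VPN exploit) AND pivot to control network.
    baseline  = p_and(p_or(0.30, 0.10), 0.40)
    mitigated = p_and(p_or(0.15, 0.10), 0.40)   # training halves phishing success
    loss = 5e6                                  # consequence in dollars (invented)

    print(f"Risk reduction = ${(baseline - mitigated) * loss:,.0f} expected loss")
    ```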

  19. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  20. A Quantitative Framework to Evaluate Proarrhythmic Risk in a First-in-Human Study to Support Waiver of a Thorough QT Study.

    PubMed

    Nelson, C H; Wang, L; Fang, L; Weng, W; Cheng, F; Hepner, M; Lin, J; Garnett, C; Ramanathan, S

    2015-12-01

    The effects of GS-4997 (an apoptosis signal-regulating kinase 1 inhibitor) on cardiac repolarization were evaluated using a systematic modeling approach in a first-in-human (FIH) study. High-quality, intensive, time-matched 12-lead electrocardiograms (ECGs) were obtained in this placebo-controlled, single- and multiple-ascending-dose study in healthy subjects. Model development entailed linearity and hysteresis assessments; GS-4997/metabolite concentration vs. baseline-adjusted QTcF (ΔQTcF) relationships were determined using linear mixed-effects models. Bootstrapping was used to obtain 90% confidence intervals (CIs) of the predicted placebo-corrected ΔQTcF (ΔΔQTcF). The upper bound of the 90% CI for predicted ΔΔQTcF was <10 msec at therapeutic and supratherapeutic GS-4997/metabolite levels, indicating the absence of a QT prolongation effect. Model performance/suitability was assessed using sensitivity/specificity analyses and diagnostic evaluations. This comprehensive methodology, supported by clinical pharmacology characteristics, was deemed adequate by the US Food and Drug Administration and European Medicines Agency to assess the proarrhythmic risk of GS-4997/metabolite, resulting in a successful waiver of a dedicated thorough QT (TQT) study. PMID:26259519
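
    The bootstrap step described above can be sketched as follows. This simplified version resamples (concentration, ΔQTcF) pairs and refits an ordinary least-squares line rather than the paper's linear mixed-effects model, and all data are simulated.

      import numpy as np

      rng = np.random.default_rng(0)
      conc = rng.uniform(0, 100, 60)                   # hypothetical plasma levels
      ddqtcf = 0.02 * conc + rng.normal(0, 4, 60)      # hypothetical delta-QTcF (msec)

      c_ref = 100.0                                    # supratherapeutic concentration
      preds = []
      for _ in range(2000):
          i = rng.integers(0, len(conc), len(conc))    # resample pairs with replacement
          slope, intercept = np.polyfit(conc[i], ddqtcf[i], 1)
          preds.append(slope * c_ref + intercept)

      lo, hi = np.percentile(preds, [5, 95])           # two-sided 90% CI
      print(f"90% CI at c_ref: ({lo:.2f}, {hi:.2f}) msec; effect excluded if upper bound < 10")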

  1. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time they were written into the standards documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published methodology for estimating the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. This study therefore collects actual flight path data from air traffic control radars and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful in estimating safe and efficient standard dimensions of the OLS and in assessing the risk that objects pose to aircraft operations around airports. In order to assess the existing standards and demonstrate the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  2. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
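
    The abstract does not name the concentration measure; the Herfindahl-Hirschman Index (HHI) is the standard such tool, so the sketch below applies it at the country level with illustrative supply shares and an invented political-risk weighting standing in for the OECD-style measures mentioned above.

      def hhi(shares_percent):
          # Sum of squared shares (in percent); 10,000 = single-source monopoly.
          return sum(s ** 2 for s in shares_percent.values())

      ree_supply = {"China": 95.0, "USA": 2.0, "Australia": 2.0, "Others": 1.0}
      print(f"REE supply HHI: {hhi(ree_supply):.0f}")   # ~9000: extremely concentrated

      # One way to fold in political risk: weight each share by a 0-1 country
      # risk rating (invented values).
      political_risk = {"China": 0.6, "USA": 0.1, "Australia": 0.1, "Others": 0.5}
      composite = sum(ree_supply[c] / 100 * political_risk[c] for c in ree_supply)
      print(f"supply-weighted political risk: {composite:.2f}")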

  3. Significance of quantitative enzyme-linked immunosorbent assay (ELISA) results in evaluation of three ELISAs and Western blot tests for detection of antibodies to human immunodeficiency virus in a high-risk population.

    PubMed Central

    Nishanian, P; Taylor, J M; Korns, E; Detels, R; Saah, A; Fahey, J L

    1987-01-01

    The characteristics of primary (first) tests with three enzyme-linked immunosorbent assay (ELISA) kits for human immunodeficiency virus (HIV) antibody were determined. The three ELISAs were performed on 3,229, 3,130, and 685 specimens from high-risk individuals using the Litton (LT; Litton Bionetics Laboratory Products, Charleston, S.C.), Dupont (DP; E. I. du Pont de Nemours & Co., Inc., Wilmington, Del.), and Genetic Systems (GS; Genetic Systems, Seattle, Wash.) kits, respectively. Evaluation was based on the distribution of quantitative test results (such as optical densities), a comparison with Western blot (WB) results, reproducibility of the tests, and identification of seroconverters. The performances of the GS and DP kits were good by all four criteria and exceeded that of the LT kit. Primary ELISA-negative results were not always confirmed by repeat ELISA and WB testing. The largest percentage of these unconfirmed negative test results came from samples with quantitative results in the fifth percentile nearest the cutoff. Thus, supplementary testing was indicated for samples with test results in this borderline negative range. Similarly, borderline positive primary ELISA results that were quantitatively nearest (fifth percentile) the cutoff value were more likely to be antibody negative on supplementary testing than samples with high antibody values. In this study, results of repeated tests by GS ELISA showed the least change from first-test results. DP ELISA showed more unconfirmed primary positive test results, and LT ELISA showed more unconfirmed primary negative test results. Designation of a specimen with a single ELISA quantitative level near the cutoff value as positive or negative should be viewed with skepticism. A higher-than-normal proportion of specimens with high negative optical densities by GS ELISA (fifth percentile nearest the cutoff) that were also negative by WB was found to be from individuals in the process of seroconversion. PMID

  4. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    PubMed

    Liu, Yao; Li, Jianying; Liu, Xiaoyong; Liu, Xudong; Khawar, Waqaar; Zhang, Xinyan; Wang, Fan; Chen, Xiaoxin; Sun, Zheng

    2015-01-01

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). Test outcome is reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), and "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage the "atypical" patients in order to diagnose OSCC early and prevent OSCC. In this study, we collected exfoliative cytology data, histopathology data, and clinical data of normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, the support vector machine (SVM) was found to be optimal, with a high sensitivity (median>0.98) and specificity (median>0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve the cost-effectiveness of clinical follow-up of OLK patients, and help design clinical chemoprevention trials for high-risk populations. PMID:25978541
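
    A minimal sketch of the modeling step, with random placeholder features in place of the EdTAR-reconstructed cytology data: an SVM's predicted probability serves as a 0-1 risk index analogous to the OCRI described above.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 10))                           # placeholder cytology features
      y = (X[:, 0] + rng.normal(0, 0.5, 200) > 0).astype(int)  # 1 = high-risk outcome

      print("CV accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())

      model = SVC(probability=True).fit(X, y)                  # Platt scaling for probabilities
      ocri = model.predict_proba(X[:5])[:, 1]                  # 0-1 risk index per patient
      print("example risk-index values:", ocri.round(2))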

  5. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  6. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  7. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  8. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has an impact on everyday life, from the cars we drive and the planes we fly to the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost-effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  9. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  10. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  11. DIETARY RISK EVALUATION SYSTEM: DRES

    EPA Science Inventory

    The Dietary Risk Evaluation System (DRES) estimates exposure to pesticides in the diet by combining information concerning residues on raw agricultural commodities with information on consumption of those commodities. It then compares the estimated exposure level to a toxicologi...

  12. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  14. Molecular sensitivity threshold of wet mount and an immunochromatographic assay evaluated by quantitative real-time PCR for diagnosis of Trichomonas vaginalis infection in a low-risk population of childbearing women.

    PubMed

    Leli, Christian; Castronari, Roberto; Levorato, Lucia; Luciano, Eugenio; Pistoni, Eleonora; Perito, Stefano; Bozza, Silvia; Mencacci, Antonella

    2016-06-01

    Vaginal trichomoniasis is a sexually transmitted infection caused by Trichomonas vaginalis, a flagellated protozoan. Diagnosis of T. vaginalis infection is mainly performed by wet mount microscopy, with a sensitivity ranging from 38% to 82% compared to culture, which is still considered the gold standard. Commercial immunochromatographic tests for monoclonal-antibody-based detection have been introduced as alternative methods for diagnosis of T. vaginalis infection and have been reported in some studies to be more sensitive than wet mount. Real-time PCR methods have recently been developed, with optimal sensitivity and specificity. The aim of this study was to evaluate whether there is a molecular sensitivity threshold for both the wet mount and immunochromatographic assays. To this end, a total of 1487 low-risk childbearing women (median age 32 years, interquartile range 27-37) were included in the study and underwent vaginal swabbing for T. vaginalis detection by means of a quantitative real-time PCR assay, wet mount and an immunochromatographic test. Comparing the results, the observed prevalence values were 1.3% for real-time PCR, 0.5% for microscopic examination, and 0.8% for the immunochromatographic test. Compared to real-time PCR, wet mount sensitivity was 40% (95% confidence interval 19.1% to 63.9%) and specificity was 100% (95% CI 99.7% to 100%). The sensitivity and specificity of the immunochromatographic assay were 57.9% (95% CI 33.5% to 79.8%) and 99.9% (95% CI 99.6% to 100%), respectively. Evaluation of the wet mount and immunochromatographic assay results in relation to the number of T. vaginalis DNA copies detected in vaginal samples showed that the lower identification threshold for both the wet mount (chi-square 6.1; P = 0.016) and the immunochromatographic assay (chi-square 10.7; P = 0.002) was ≥100 copies of T. vaginalis DNA/5 μl of eluted DNA. PMID:27367320

  15. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the equipment vulnerability models and the assumptions regarding the reference damage states of the process equipment. The analysis of case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than one order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes. PMID:18657068

  16. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid in evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level. PMID:26202064
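
    The quantitative idea (probability of failure for a given severity, estimated by Monte Carlo) can be sketched as follows. The response model, distributions, and specification limit are invented for illustration; the paper's Bayesian treatment of experimental data is not reproduced.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      force = rng.normal(15.0, 1.0, n)        # compression force, kN (hypothetical)
      binder = rng.normal(0.030, 0.003, n)    # binder fraction, w/w (hypothetical)

      # Invented response model: % dissolved at 30 min as a function of inputs.
      dissolution = (95 - 1.2 * (force - 15)
                     + 400 * (binder - 0.03)
                     + rng.normal(0, 1.5, n))

      p_fail = np.mean(dissolution < 85.0)    # specification: at least 85% dissolved
      print(f"P(failure) = {p_fail:.4f}")     # compare against an acceptable risk level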

  17. Production Risk Evaluation Program (PREP) - summary

    SciTech Connect

    Kjeldgaard, E.A.; Saloio, J.H.; Vannoni, M.G.

    1997-03-01

    Nuclear weapons have been produced in the US since the early 1950s by a network of contractor-operated Department of Energy (DOE) facilities collectively known as the Nuclear Weapon Complex (NWC). Recognizing that the failure of an essential process might stop weapon production for a substantial period of time, the DOE Albuquerque Operations office initiated the Production Risk Evaluation Program (PREP) at Sandia National Laboratories (SNL) to assess quantitatively the potential for serious disruptions in the NWC weapon production process. PREP was conducted from 1984-89. This document is an unclassified summary of the effort.

  18. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential for guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the rather unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device purely from its associated foucaultgrams. PMID:27505659
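
    The Fourier-based integration of measured gradients can be illustrated with the Frankot-Chellappa method, a common choice for this step (the paper's exact algorithm may differ). The round-trip test below uses a synthetic, nearly periodic surface.

      import numpy as np

      def integrate_gradients(gx, gy):
          # Least-squares surface from gradients via FFT (Frankot-Chellappa).
          ny, nx = gx.shape
          wx, wy = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx),
                               2 * np.pi * np.fft.fftfreq(ny))
          denom = wx**2 + wy**2
          denom[0, 0] = 1.0                      # avoid 0/0 at the DC term
          z_hat = (-1j * wx * np.fft.fft2(gx) - 1j * wy * np.fft.fft2(gy)) / denom
          z_hat[0, 0] = 0.0                      # mean height (piston) is arbitrary
          return np.real(np.fft.ifft2(z_hat))

      y, x = np.mgrid[-1:1:128j, -1:1:128j]
      z = np.cos(np.pi * x) * np.cos(np.pi * y)  # synthetic test surface
      gy, gx = np.gradient(z)                    # derivatives in units per sample
      z_rec = integrate_gradients(gx, gy)
      print("correlation with truth:", np.corrcoef(z.ravel(), z_rec.ravel())[0, 1])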

  19. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrating the effectiveness of fire separation methods.

  20. A Method for Quantitatively Evaluating a University Library Collection

    ERIC Educational Resources Information Center

    Golden, Barbara

    1974-01-01

    The acquisitions department of the University of Nebraska at Omaha library conducted a quantitative evaluation of the library's book collection in relation to the course offerings of the university. (Author/LS)

  1. Quantitative Evaluation of Management Courses: Part 1

    ERIC Educational Resources Information Center

    Cunningham, Cyril

    1973-01-01

    The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)

  2. A Program to Evaluate Quantitative Analysis Unknowns

    ERIC Educational Resources Information Center

    Potter, Larry; Brown, Bruce

    1978-01-01

    Reports on a computer batch program that will not only perform routine grading using several grading algorithms, but will also calculate various statistical measures by which the class performance can be evaluated and cumulative data collected. (Author/CP)

  3. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power, for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays, for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  4. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety. PMID:20055976
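
    The discrete-event idea, that storage times emerge from ordering mechanisms and queues rather than being sampled independently, can be shown with a toy FIFO shelf model. The order policy and demand figures below are invented, not the article's lettuce chain.

      import random
      from collections import deque

      random.seed(3)
      shelf = deque()                            # arrival day of each unit, FIFO
      storage_times = []

      for day in range(365):
          if day % 3 == 0:                       # ordering mechanism: restock every 3 days
              shelf.extend([day] * 30)           # a batch of 30 units arrives
          demand = random.randint(5, 15)         # daily consumer purchases
          for _ in range(min(demand, len(shelf))):
              storage_times.append(day - shelf.popleft())

      storage_times.sort()
      p95 = storage_times[int(0.95 * len(storage_times))]
      mean = sum(storage_times) / len(storage_times)
      # The tail (p95), not the mean, drives the pathogen-growth tail in a QMRA.
      print(f"mean storage {mean:.1f} days, 95th percentile {p95} days")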

  5. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  7. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  8. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  9. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was identified and used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64 × 10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24 × 10⁻⁹; maximum: 7.97 × 10⁻⁶) than for natural cheese (mean: 7.84 × 10⁻¹⁰; maximum: 2.32 × 10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese. PMID:26162789
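
    With the conventional negative exponent restored, the dose-response model quoted above is straightforward to apply. The serving size and contamination level below are illustrative, not values from the paper's @RISK model.

      import math

      R = 7.64e-8                              # dose-response parameter from the abstract

      def p_illness(dose_cells):
          # Exponential dose-response: P = 1 - exp(-R * N).
          return 1.0 - math.exp(-R * dose_cells)

      # Illustrative serving: 25 g of cheese contaminated at 10**3 cells/g
      # after growth during storage (values invented).
      print(f"P(illness | serving) = {p_illness(25 * 10**3):.2e}")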

  10. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

    Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defect would greatly aid in quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. PMID:24300215

  11. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

    Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

    The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that the Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to provide a probability function of estimated risk using random values of exposure factors and toxicity values in an exposure scenario. The Monte Carlo simulation involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples from the assigned joint distribution are taken and the corresponding outputs calculated. Methods of statistical inference are used to estimate, from the output sample, some parameters of the output distribution, such as percentiles and the expected value.
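
    The Monte Carlo procedure described here can be sketched generically: assign distributions to the exposure factors, sample them jointly, and read percentiles off the resulting risk distribution. The distributions and slope factor below are placeholders, not the OU-1 values.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000
      conc = rng.lognormal(mean=1.0, sigma=0.8, size=n)          # soil concentration, mg/kg
      intake = rng.lognormal(mean=np.log(50), sigma=0.4, size=n) # soil ingestion, mg/day

      # Fixed EPA-style intake factors: exposure frequency and duration over
      # body weight and averaging time (illustrative values).
      ef_ed_over_bw_at = (350 * 24) / (70.0 * 70 * 365)
      slope_factor = 1.5e-3                                      # (mg/kg-day)^-1, hypothetical

      risk = conc * intake * 1e-6 * ef_ed_over_bw_at * slope_factor
      print("mean risk:       ", risk.mean())
      print("95th percentile: ", np.percentile(risk, 95))        # compare with the point-estimate RME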

  12. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

    The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10⁻² per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10⁻² pppy when the wastewater contains ≤10⁶ Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10⁻² pppy when the wastewater contains ≤10⁵ E. coli per 100 ml; however, the wastewater quality should be ≤10⁴ E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10⁴ E. coli per 100 ml results in a rotavirus infection risk of approximately 10⁻² pppy; however, based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278
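
    One relationship implicit in the per-person-per-year reference level: n independent exposure events with per-event infection probability p yield an annual risk of 1 - (1 - p)^n. The sketch below uses invented values, not the paper's simulated doses.

      def annual_risk(p_event, n_events):
          # n independent exposures with per-event infection probability p_event.
          return 1.0 - (1.0 - p_event) ** n_events

      # e.g., a field worker exposed on 150 days per year (invented values):
      print(f"annual infection risk = {annual_risk(7e-5, 150):.2e}")  # about 1e-2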

  13. Evaluating microbiological risks of biosolids land application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The transmission of pathogens by land application of untreated human and animal wastes has been known for more than 100 years. The QMRA (quantitative microbial risk assessment) process involves four basic steps: pathogen identification, exposure assessment, dose-response and risk characterization. ...

  14. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  15. Quantitative and Public Perception of Landslide Risk in Badulla, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Gunasekera, R.; Bandara, R. M. S.; Mallawatantri, A.; Saito, K.

    2009-04-01

    Landslides are often triggered by intense precipitation and are exacerbated by increased urbanisation and human activity. There is a significant risk of large-scale landslides in Sri Lanka, and when they do occur they have the potential to cause devastation to property, lives and livelihoods. There are several high landslide risk areas in seven districts (Nuwara Eliya, Badulla, Ratnapura, Kegalle, Kandy, Matale and Kalutara) in Sri Lanka. These are also some of the poorest areas in the country, and consequently the recovery process after catastrophic landslides becomes more problematic. Landslide risk management is therefore an important concern in poverty reduction strategies. We focused on the district of Badulla, Sri Lanka to evaluate (a) the quantitative scientific analysis of landslide risk and (b) the qualitative public perception of landslides in the area. Combining high-resolution hazard and susceptibility data, we quantified the risk of landslides in the area. We also evaluated the public perception of landslides in the area using participatory GIS techniques. The evaluation of public perception of landslide risk was complemented by use of LandScan data. The methodology for the LandScan data is based on second-order administrative population data from the census: each 30 arc-second cell within the administrative units receives a probability coefficient based on slope, proximity to roads and land cover. Providing this information from these complementary methods to regional planners helps strengthen disaster risk reduction options and improve sustainable land use practices through enhanced public participation in decision making and governance processes.

  16. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway and dose-response relationships by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA). Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used for simulating the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA and the public health risk was estimated as probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at Kalvøya-small and Kalvøya-big beaches, supporting the advice of avoiding swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355

  17. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  18. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which alternative is best for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
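
    The paper itself defines the 'hierarchical weighted average'; the sketch below is only a guess at its general shape, rolling leaf-criterion scores up a weighted criteria tree, with an invented hierarchy, weights, and scores.

      def score(node, leaf_scores):
          # node: a leaf criterion name, or a list of (weight, subnode) pairs
          # whose weights sum to 1 at each level of the hierarchy.
          if isinstance(node, str):
              return leaf_scores[node]
          return sum(w * score(sub, leaf_scores) for w, sub in node)

      hierarchy = [
          (0.4, "cost"),
          (0.6, [(0.7, "throughput"), (0.3, "reliability")]),
      ]

      alternatives = {
          "A": {"cost": 8, "throughput": 6, "reliability": 9},
          "B": {"cost": 5, "throughput": 9, "reliability": 7},
      }
      for name, leaf_scores in alternatives.items():
          print(name, round(score(hierarchy, leaf_scores), 2))  # A: 7.34, B: 7.04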

  19. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  20. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify all the possible scenarios that may follow a seismic event, evaluate their credibility, and finally assess their expected consequences. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of earthquake-triggered scenarios to the individual and societal risk indexes. The application of the methodology to several case studies showed that scenarios initiated by seismic events may have a relevant influence on industrial risk, both by raising the overall expected frequency of single scenarios and by causing specific severe scenarios that simultaneously involve several plant units. PMID:17276591

  1. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolic processes. These electrophiles can react directly with DNA and are considered to be ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxicity tests have been described to identify, monitor and characterize human exposure to such agents. When the responses of such genotoxicity tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic mutagenic potency of electrophiles does not differ much for the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether the agent is genotoxic, but also to understand the mechanism of interaction of the agent with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA or mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer deaths in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  2. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index and the effective thickness of the coal seam. Making full use of logging data and laboratory analyses of coal cores, the logging evaluation methods for the five parameters are discussed in detail, and a comprehensive evaluation model of the CBM reservoir is established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that the region with a good reservoir has high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir effectively.

  3. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

    Currently no U.S. federal guideline is available for assessing risk of illness from sand at recreational sites. The objectives of this study were to compute a reference level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach a level of risk of illness equivalent to that set by the U.S. EPA for marine water exposure (1.9 × 10⁻²), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10⁶ CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures as well as in identifying acceptable levels of risk for skin infections associated with sand exposures. PMID:22296573
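
    The reference-level computation pairs a dose estimate with a dose-response model inside a Monte Carlo loop. The sketch below shows that structure for a single pathogen using an exponential dose-response model; the ingestion rate, dose-response parameter and variability assumptions are hypothetical placeholders, not the study's values.

        import math
        import random

        random.seed(1)
        N = 50_000                      # Monte Carlo iterations
        BENCHMARK = 1.9e-2              # U.S. EPA marine-water illness risk benchmark
        R_PARAM = 0.09                  # hypothetical exponential dose-response parameter
        INGESTION_G = 0.05              # hypothetical sand ingestion per beach visit (g)

        def mean_risk(conc_per_g):
            """Mean infection risk at a nominal pathogen concentration (organisms/g)."""
            total = 0.0
            for _ in range(N):
                # Lognormal variability around the nominal concentration (assumed spread).
                dose = conc_per_g * random.lognormvariate(0.0, 0.5) * INGESTION_G
                total += 1.0 - math.exp(-R_PARAM * dose)   # exponential dose-response
            return total / N

        # Scan candidate concentrations to locate the benchmark-equivalent level.
        for conc in (0.1, 1.0, 5.0, 10.0):
            print(f"{conc:5.1f} organisms/g -> mean risk {mean_risk(conc):.4f}")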

  4. Approach for evaluating inundation risks in urban drainage systems.

    PubMed

    Zhu, Zhihua; Chen, Zhihe; Chen, Xiaohong; He, Peiying

    2016-05-15

    Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers. Hence, inundation evaluation is becoming increasingly important around the world. This comprehensive assessment involves numerous indices in urban catchments, but the high-dimensional and non-linear relationship between the indices and the risk presents an enormous challenge for accurate evaluation. Therefore, an approach is hereby proposed to qualitatively and quantitatively evaluate inundation risks in urban drainage systems based on a storm water management model, the projection pursuit method, the ordinary kriging method and the K-means clustering method. This approach is tested using a residential district in Guangzhou, China. Seven evaluation indices were selected and twenty rainfall-runoff events were used to calibrate and validate the parameters of the rainfall-runoff model. The inundation risks in the study area drainage system were evaluated under different rainfall scenarios. The following conclusions are reached. (1) The proposed approach, without subjective factors, can identify the main driving factors, i.e., inundation duration, largest water flow and total flood amount in this study area. (2) The inundation risk of each manhole can be qualitatively analyzed and quantitatively calculated. There are 1, 8, 11, 14, 21, and 21 manholes at risk under the return periods of 1-year, 5-years, 10-years, 20-years, 50-years and 100-years, respectively. (3) The areas of levels III, IV and V increase with increasing rainfall return period based on analyzing the inundation risks for a variety of characteristics. (4) The relationships between rainfall intensity and inundation-affected areas are revealed by a logarithmic model. This study proposes a novel and successful approach to assessing risk in urban drainage systems and provides guidance for improving urban drainage systems and inundation preparedness. PMID:26897578
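
    One step of such a pipeline, grading manholes into risk levels from a composite risk score, can be sketched with a one-dimensional K-means. The scores below are hypothetical, and the paper's full chain (storm water management model, projection pursuit, ordinary kriging) is not reproduced.

        def kmeans_1d(values, k=3, iters=50):
            """Tiny 1-D K-means for grading composite risk scores into k levels."""
            centers = sorted(values)[:: max(1, len(values) // k)][:k]
            groups = [[] for _ in centers]
            for _ in range(iters):
                groups = [[] for _ in centers]
                for v in values:
                    i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
                    groups[i].append(v)
                centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
            return centers, groups

        # Hypothetical composite risk scores for twelve manholes.
        scores = [0.05, 0.08, 0.11, 0.32, 0.35, 0.41, 0.44, 0.71, 0.78, 0.81, 0.88, 0.93]
        centers, groups = kmeans_1d(scores, k=3)
        for level, (c, g) in enumerate(zip(centers, groups), start=1):
            print(f"risk level {level} (center {c:.2f}): {g}")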

  5. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict the fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers, and a mean of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that the fire fatality risk of CNG buses is roughly 2.5 times that of diesel buses, with bus passengers at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 × 10⁻⁵ fatalities/bus per year, with 5% and 95% uncertainty bounds of 9.1 × 10⁻⁶ and 4.0 × 10⁻⁵, respectively. The risk result was found to be affected most by the failure rates of pressure relief valves, CNG cylinders, and fuel piping. PMID:15876211
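
    The headline ratios follow directly from the reported fatality rates; a quick arithmetic check using the values in the abstract:

        cng_all, diesel_all = 0.23, 0.091    # fatalities per 100-million miles, all people
        cng_pax, diesel_pax = 0.16, 0.0007   # fatalities per 100-million miles, passengers

        print(f"all-people risk ratio: {cng_all / diesel_all:.1f}x")   # about 2.5x
        print(f"passenger risk ratio:  {cng_pax / diesel_pax:.0f}x")   # >2 orders of magnitude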

  6. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative evaluation method for skin barrier function using an Optical Coherence Microscopy (OCM) system based on the coherency of near-infrared light. Many skin problems, such as itching and irritation, are recognized to be caused by impairment of the skin barrier function, which protects against various external stimuli and loss of water. A common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition, but such judgements are subjective and influenced by the examiner's experience. Conventional microscopy can reveal the inner structure of the skin in detail, but such in vitro measurements require tissue sampling. What is needed instead is an objective, quantitative evaluation method that is non-invasive and non-destructive and that can track changes over time; in vivo measurements are therefore crucial for evaluating skin barrier function. In this study, we evaluate changes in the structure of the stratum corneum, which is important for skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue in a non-invasive and non-destructive manner. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system in this work in order to discuss how to improve it.

  7. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC's responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and limitations of the MSFC QRA approach and of QRA technology in general.

  8. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization assessment factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of a consumer exposure level (CEL). Based on these elements, an acceptable exposure level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured exposure level (MEL). A direct comparison is therefore possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement over a hazard-based classification of hair dye ingredients. PMID:23069142
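
    The QRA arithmetic described here is compact enough to sketch directly; the NESIL, SAFs and exposure values below are hypothetical placeholders, not the values for p-phenylenediamine or resorcinol.

        # Hypothetical sensitization QRA inputs (dose per unit area, ug/cm2).
        NESIL = 100.0                 # no-expected-sensitisation-induction-level
        SAFS = {"inter-individual": 10.0, "matrix": 1.0, "use pattern": 3.0}
        MEL = 2.5                     # measured exposure level under consumer use

        ael = NESIL
        for factor in SAFS.values():
            ael /= factor             # AEL = NESIL / (product of the SAFs)

        print(f"AEL = {ael:.2f} ug/cm2 vs. MEL = {MEL} ug/cm2")
        print("acceptable" if MEL <= ael else "risk management needed")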

  9. INTEGRATED QUANTITATIVE CANCER RISK ASSESSMENT OF INORGANIC ARSENIC

    EPA Science Inventory

    This paper attempts to make an integrated risk assessment of arsenic, using data on humans exposed to arsenic via inhalation and ingestion. The data useful for making an integrated analysis and the data gaps are discussed. Arsenic provides a rare opportunity to compare the cancer risk ...

  10. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  11. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  12. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  13. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  14. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  15. Risk assessment technique for evaluating research laboratories

    SciTech Connect

    Bolander, T.W.; Meale, B.M.; Eide, S.A.

    1992-01-01

    A technique has been developed to evaluate research laboratories according to risk, where risk is defined as the product of frequency and consequence. This technique was used to evaluate several laboratories at the Idaho National Engineering Laboratory under the direction of the Department of Energy, Idaho Field Office, to assist in the risk management of the Science and Technology Department laboratories. With this technique, laboratories can be compared according to risk, and management can use the results to make cost-effective decisions associated with the operation of the facility.
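
    With risk defined as the product of frequency and consequence, the ranking step reduces to a multiplication and a sort; a minimal sketch with hypothetical laboratories and scales:

        # Hypothetical laboratories: (name, event frequency per year, consequence score).
        labs = [("Lab A", 1e-2, 50.0), ("Lab B", 1e-4, 900.0), ("Lab C", 5e-3, 200.0)]

        # Risk = frequency x consequence; rank laboratories from highest to lowest.
        ranked = sorted(((name, f * c) for name, f, c in labs),
                        key=lambda pair: pair[1], reverse=True)
        for name, risk in ranked:
            print(f"{name}: risk = {risk:.3f}")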

  17. Factors Distinguishing between Achievers and At Risk Students: A Qualitative and Quantitative Synthesis

    ERIC Educational Resources Information Center

    Eiselen, R.; Geyser, H.

    2003-01-01

    The purpose of this article is to identify factors that distinguish between Achievers and At Risk Students in Accounting 1A, and to explore how qualitative and quantitative research methods complement each other. Differences between the two groups were explored from both a quantitative and a qualitative perspective, focusing on study habits,…

  18. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of the effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor into intervals by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we weight the effect of each arc by its contribution degree and sum all the effects. Through application to practical models, it is confirmed that there are no differences, at the 5% risk rate, between results obtained by fully quantitative relations and results obtained by the proposed method.

  19. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ w_i φ_i(x_i), and estimating the attribute normalization function φ_i(·) by integrating distributions of idealized and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated to the therapist's observations.
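
    The evaluation function y = Σ w_i φ_i(x_i) can be sketched directly. The attribute names, the Gaussian-shaped normalization, and the weights below are hypothetical stand-ins; the paper derives its weights from therapist pairwise comparisons via a modified RankSVM, which is not reproduced here.

        import math

        # Hypothetical normalization: map an attribute's deviation from the idealized
        # movement (a z-score) to [0, 1], where 1 means indistinguishable from ideal.
        def phi(z):
            return math.exp(-0.5 * z * z)

        # Hypothetical kinematic attributes (z-scores) and RankSVM-style weights.
        attributes = {"trajectory error": 1.2, "speed profile": 0.4, "joint synergy": 2.0}
        weights = {"trajectory error": 0.5, "speed profile": 0.2, "joint synergy": 0.3}

        # y = sum_i w_i * phi_i(x_i): a continuous, normalized movement-quality score.
        y = sum(weights[k] * phi(z) for k, z in attributes.items())
        print(f"movement quality score: {y:.3f}")   # value in [0, 1]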

  20. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; Cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several hours) quantitative precipitation forecasting based on meteorological radar data are intensely studied topics. The Korean Peninsula has a narrow land area and complex, mountainous topography, so its rainfall systems change frequently. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) are crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF systems of the Korea Meteorological Administration (KMA), which is the first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system agrees better with the observed rain rate than the fixed Z-R relation does, and additional bias correction of RAR yields slightly better results. A correlation coefficient of R² = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. RAR will thus be usable for hydrological applications such as the water budget. The VSRF (Very Short Range Forecast) shows better performance than MAPLE (McGill Algorithm for Precipitation nowcasting by Lagrangian Extrapolation) within 40 minutes, but MAPLE performs better after 40 minutes; for hourly forecasts, MAPLE performs better than VSRF. QPE and QPF are thought to be meaningful for nowcasting (1-2 hours), while forecasts beyond 3 hours from meteorological models are especially meaningful for applications such as water management.
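
    The fixed Z-R relation referred to here converts radar reflectivity to rain rate through a power law, Z = aR^b. A minimal sketch using the classical Marshall-Palmer coefficients (a = 200, b = 1.6), a common textbook default rather than KMA's operational values:

        def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
            """Invert Z = a * R**b for rain rate R (mm/h); input reflectivity in dBZ."""
            z_linear = 10.0 ** (dbz / 10.0)     # dBZ -> linear reflectivity (mm^6/m^3)
            return (z_linear / a) ** (1.0 / b)

        for dbz in (20, 30, 40, 50):
            print(f"{dbz} dBZ -> {rain_rate_from_dbz(dbz):6.2f} mm/h")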

  1. Cumulative Aggregate Risk Evaluation Software

    EPA Science Inventory

    CARES is a state-of-the-art software program designed to conduct complex exposure and risk assessments for pesticides, such as the assessments required under the 1996 Food Quality Protection Act (FQPA). CARES was originally developed under the auspices of CropLife America (CLA),...

  2. Risk effectiveness evaluation of surveillance testing

    SciTech Connect

    Martorell, S.; Kim, I.S.; Samanta, P.K.; Vesely, W.E.

    1992-07-20

    In nuclear power plants, surveillance tests are required to detect failures in standby safety system components as a means of assuring their availability in case of an accident. However, the performance of surveillance tests at power may have an adverse impact on safety, as evidenced by plant operating experience. The risk associated with a test includes two different aspects: (1) a positive aspect, the risk contribution detected by the test, which results from the detection of failures that occur between tests; and (2) a negative aspect, the risk contribution caused by the test, which includes failures and degradations caused by or related to the performance of the test. In terms of these two risk contributions, the risk effectiveness of a test can be defined simply: a test is risk effective if the risk contribution detected by the test is greater than the risk contribution caused by the test; otherwise it is risk ineffective. The methodology presentation will focus on two important kinds of negative test risk impacts, namely the risk impacts of test-caused transients and equipment wear-out. The evaluation results on the risk effectiveness of tests will be presented in the full paper, along with the risk assessment methodology and the insights from the sensitivity analysis. These constitute the core of NUREG/CR-5775.

  3. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our ability to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  4. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economics, and operations. In this keynote address, I briefly present three such advanced reactor designs in whose development Westinghouse is participating: the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that affect future requirements for quantitative nondestructive evaluation will be discussed. Features such as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of their impact on quantitative nondestructive evaluation (NDE) approaches.

  5. Assessment Tools for the Evaluation of Risk

    EPA Science Inventory

    ASTER (Assessment Tools for the Evaluation of Risk) was developed by the U.S. EPA Mid-Continent Ecology Division, Duluth, MN to assist regulators in performing ecological risk assessments. ASTER is an integration of the ECOTOXicology Database (ECOTOX; ...

  EVALUATION AND EFFECTIVE RISK COMMUNICATION WORKSHOP PROCEEDINGS

    EPA Science Inventory

    To explore a number of questions in the area of risk communications, the Interagency Task Force on Environmental Cancer and Heart and Lung Disease, held a Workshop on Evaluation and Effective Risk Communication which brought together experts from academia, government agencies, an...

  6. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  8. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue. PMID:14678210

  9. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    The amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for the quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, the fibroglandular tissues, and the enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number to segment the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of the pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were computed automatically. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, achieving accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
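
    The final thresholding step, segmenting enhanced tissue in the pre/post-contrast subtraction image from an estimated noise level, can be sketched as follows. The noise model (threshold at background mean plus three standard deviations) is an illustrative assumption, not necessarily the paper's statistical method.

        import numpy as np

        def segment_enhanced(pre, post, background_mask):
            """Threshold the pre/post subtraction image at background mean + 3 sigma."""
            subtraction = post.astype(float) - pre.astype(float)
            noise = subtraction[background_mask]
            threshold = noise.mean() + 3.0 * noise.std()
            return subtraction > threshold          # boolean mask of enhanced tissue

        # Toy example: one 6x6 "slice" with a simulated 2x2 enhancing region.
        rng = np.random.default_rng(0)
        pre = np.full((6, 6), 100.0)
        post = pre + rng.normal(0.0, 2.0, (6, 6))   # post-contrast scan with noise
        post[2:4, 2:4] += 40.0                      # simulated enhancement (BPE)

        background = np.zeros((6, 6), dtype=bool)
        background[0, :] = True                     # sample noise from the top row
        mask = segment_enhanced(pre, post, background)
        print(mask.sum(), "enhanced pixels")        # the 2x2 block, barring stray noise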

  10. Risk evaluation of medical and industrial radiation devices

    SciTech Connect

    Jones, E.D.; Cunningham, R.E.; Rathbun, P.A.

    1994-03-01

    In 1991, the NRC Division of Industrial and Medical Nuclear Safety began a program to evaluate the use of probabilistic risk assessment (PRA) in regulating medical devices. This program represents an initial step in an overall plan to evaluate the use of PRA in regulating the use of nuclear by-product materials. The NRC envisioned that the use of risk analysis techniques could assist staff in ensuring that the regulatory approach was standardized, understandable, and effective. Traditional methods of assessing risk in nuclear power plants may be inappropriate for assessing the use of by-product devices: the approaches used in assessing nuclear reactor risks are equipment-oriented, with secondary attention paid to the human component, for the most part after critical system failure events have been identified. This paper describes the risk methodology developed by Lawrence Livermore National Laboratory (LLNL), initially intended to assess risks associated with the use of the Gamma Knife, a gamma stereotactic radiosurgical device. For relatively new medical devices such as the Gamma Knife, the challenge is to perform a risk analysis with very little quantitative data but with an important human factor component. The method described below provides a basic approach for identifying the most likely risk contributors and evaluating their relative importance. The risk analysis approach developed for the Gamma Knife should be applicable to a broader class of devices in which human interaction with the device is a prominent factor; in this sense, the method could be a prototypical model of nuclear medical or industrial device risk analysis.

  11. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  12. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. The geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of the heavy metals. The potential ecological risk index (RI) demonstrated that the overall risk caused by heavy metals was lowered markedly, from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal posed no or low risk to the environment after liquefaction. In short, the pollution hazards of heavy metals in LR were markedly mitigated. PMID:21940164
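
    The geo-accumulation index used here is conventionally defined as Igeo = log2(Cn / (1.5 Bn)), where Cn is the measured concentration and Bn the geochemical background. A minimal sketch with hypothetical concentrations and background values follows.

        import math

        # Background values (Bn, mg/kg) are illustrative, not the paper's references.
        BACKGROUND = {"Pb": 20.0, "Zn": 70.0, "Cu": 30.0, "Cd": 0.2, "Cr": 60.0, "Ni": 40.0}
        measured = {"Pb": 55.0, "Zn": 210.0, "Cu": 95.0, "Cd": 1.1, "Cr": 80.0, "Ni": 45.0}

        for metal, cn in measured.items():
            igeo = math.log2(cn / (1.5 * BACKGROUND[metal]))   # Igeo = log2(Cn / 1.5*Bn)
            print(f"{metal}: Igeo = {igeo:5.2f}")  # <=0 unpolluted; 0-1 slight; >1 worse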

  13. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  14. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  15. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  16. Predicting pathogen risks to aid beach management: the real value of quantitative microbial risk assessment (QMRA).

    PubMed

    Ashbolt, Nicholas J; Schoen, Mary E; Soller, Jeffrey A; Roser, David J

    2010-09-01

    There has been an ongoing dilemma for agencies that set criteria for safe recreational waters in how to provide for a seasonal assessment of a beach site versus guidance for day-to-day management. Typically an overall 'safe' criterion level is derived from epidemiologic studies of sewage-impacted beaches. The decision criterion is based on a percentile value for a single sample or a moving median of a limited number (e.g. five per month) of routine samples, which are reported at least the day after recreator exposure has occurred. The focus of this paper is how to better undertake day-to-day recreational site monitoring and management. Internationally, good examples exist where predictive empirical regression models (based on rainfall, wind speed/direction, etc.) may provide an estimate of the target faecal indicator density for the day of exposure. However, at recreational swimming sites largely impacted by non-sewage sources of faecal indicators, there is concern that the indicator-illness associations derived from studies at sewage-impacted beaches may be inappropriate. Furthermore, some recent epidemiologic evidence supports the relationship to gastrointestinal (GI) illness with qPCR-derived measures of Bacteroidales/Bacteroides spp. as well as more traditional faecal indicators, but we understand less about the environmental fate of these molecular targets and their relationship to bather risk. Modelling pathogens and indicators within a quantitative microbial risk assessment framework is suggested as a way to explore the large diversity of scenarios for faecal contamination and hydrologic events, such as from waterfowl, agricultural animals, resuspended sediments and from the bathers themselves. Examples are provided that suggest that more site-specific targets derived by QMRA could provide insight, directly translatable to management actions. PMID:20638095

  17. The Neglected Side of the Coin: Quantitative Benefit-risk Analyses in Medical Imaging.

    PubMed

    Zanzonico, Pat B

    2016-03-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on actual mortality or morbidity data if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the "linear no-threshold" (LNT) dose-response model. PMID:26808890
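
    A back-of-envelope juxtaposition of the kind the paper advocates can use the commonly cited ICRP nominal risk coefficient of roughly 5% excess fatal cancer risk per sievert under LNT; the dose and benefit figures below are purely illustrative assumptions.

        LNT_RISK_PER_SV = 0.05           # ICRP nominal fatal-cancer risk (~5% per Sv)

        effective_dose_sv = 0.010        # hypothetical CT examination: 10 mSv
        radiogenic_risk = effective_dose_sv * LNT_RISK_PER_SV

        diagnostic_benefit = 1.0 / 500   # hypothetical: one life saved per 500 exams

        print(f"theoretical LNT mortality risk: {radiogenic_risk:.1e}")
        print(f"hypothetical mortality benefit: {diagnostic_benefit:.1e}")
        print(f"benefit-to-risk ratio: {diagnostic_benefit / radiogenic_risk:.1f}")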

  18. Noninvasive Risk Stratification of Lung Adenocarcinoma using Quantitative Computed Tomography

    PubMed Central

    Raghunath, Sushravya; Maldonado, Fabien; Rajagopalan, Srinivasan; Karwoski, Ronald A.; DePew, Zackary S.; Bartholmai, Brian J.; Peikert, Tobias; Robb, Richard A.

    2014-01-01

    Introduction: Lung cancer remains the leading cause of cancer-related deaths in the US and worldwide. Adenocarcinoma is the most common type of lung cancer and encompasses lesions with widely variable clinical outcomes. In the absence of noninvasive risk stratification, individualized patient management remains challenging. Consequently a subgroup of pulmonary nodules of the lung adenocarcinoma spectrum is likely treated more aggressively than necessary. Methods: Consecutive patients with surgically resected pulmonary nodules of the lung adenocarcinoma spectrum (lesion size ≤ 3 cm, 2006–2009) and available pre-surgical high-resolution computed tomography (HRCT) imaging were identified at Mayo Clinic Rochester. All cases were classified using an unbiased Computer-Aided Nodule Assessment and Risk Yield (CANARY) approach based on the quantification of pre-surgical HRCT characteristics. CANARY-based classification was independently correlated to post-surgical progression-free survival. Results: CANARY analysis of 264 consecutive patients identified three distinct subgroups. Independent comparisons of 5-year disease-free survival (DFS) between these subgroups demonstrated statistically significant differences: 100%, 72.7% and 51.4%, respectively (p = 0.0005). Conclusions: Non-invasive CANARY-based risk stratification identifies subgroups of patients with pulmonary nodules of the adenocarcinoma spectrum characterized by distinct clinical outcomes. This technique may ultimately improve the current expert-opinion-based approach to the management of these lesions by facilitating individualized patient management. PMID:25170645

  19. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  1. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252
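
    The Field Use Rating referred to here is conventionally computed as the EIQ times the fraction of active ingredient times the application rate; a minimal sketch with hypothetical herbicide entries (values are not from the paper's datasets):

        # Field Use Rating = EIQ x fraction of active ingredient x application rate.
        # All entries below are hypothetical, not taken from the paper's datasets.
        herbicides = [
            # (name, EIQ, fraction a.i., lb product/acre)
            ("herbicide A", 15.3, 0.41, 2.0),
            ("herbicide B", 20.1, 0.50, 0.5),
            ("herbicide C", 32.0, 0.25, 1.5),
        ]

        for name, eiq, ai, rate in herbicides:
            print(f"{name}: Field Use Rating = {eiq * ai * rate:.2f}")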

  2. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927

  3. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and the commercial software where the results were used to obtain corrected images. Corrected images based on the ML approach and the software were compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.

  4. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  5. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in the evaluation of chemical analogs. GAPs provide useful data for the development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of the component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  6. D & D screening risk evaluation guidance

    SciTech Connect

    Robers, S.K.; Golden, K.M.; Wollert, D.A.

    1995-09-01

    The Screening Risk Evaluation (SRE) guidance document is a set of guidelines provided for the uniform implementation of SREs performed on decontamination and decommissioning (D&D) facilities. Although this method has been developed for D&D facilities, it can be used for transition (EM-60) facilities as well. The SRE guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated from the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the current risk to human health and the environment, exterior to the building, from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the current risk to workers, occupants and visitors inside contaminated D&D facilities due to contaminant exposure. The Future Release Index (FRI) calculates the hypothetical risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risks to human health due to factors other than that of contaminants. Criticality is approached as a modifying factor to the entire SRE, due to the fact that criticality issues are strictly regulated under DOE. Screening risk results will be tabulated in matrix form, and Total Risk will be calculated (weighted equation) to produce a score on which to base early action recommendations. Other recommendations from the screening risk scores will be made based either on individual index scores or from reweighted Total Risk calculations. All recommendations based on the SRE will be made based on a combination of screening risk scores, decision drivers, and other considerations, as determined on a project-by-project basis.

  7. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079

  9. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  10. Quantitative relations between risk, return and firm size

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Stanley, H. E.

    2009-03-01

    We analyze, for a large set of stocks comprising four financial indices, the annual logarithmic growth rate R and the firm size, quantified by the market capitalization MC. For the Nasdaq Composite and the New York Stock Exchange Composite we find that the probability density functions of growth rates are Laplace in the broad central region, where the standard deviation σ(R), as a measure of risk, decreases with MC as a power law, σ(R) ~ (MC)^(-β). For both the Nasdaq Composite and the S&P 500, we find that the average growth rate ⟨R⟩ decreases faster than σ(R) with MC, implying that the return-to-risk ratio ⟨R⟩/σ(R) also decreases with MC. For the S&P 500, ⟨R⟩ and ⟨R⟩/σ(R) also follow power laws. For a 20-year time horizon, for the Nasdaq Composite we find that σ(R) vs. MC exhibits a functional form called a volatility smile, while for the NYSE Composite we find power-law stability between σ(R) and MC.
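
    A short sketch of how such a power-law exponent can be estimated, since a power law is a straight line in log-log space (Python; the synthetic data and the exponent 0.25 are illustrative assumptions, not index data):

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins: market caps and growth-rate standard deviations
        # obeying sigma(R) ~ MC^(-beta) with beta = 0.25 plus multiplicative noise.
        mc = 10 ** rng.uniform(6, 11, 500)
        sigma = mc ** -0.25 * np.exp(rng.normal(0.0, 0.1, 500))

        # Ordinary least squares on the logs recovers the exponent.
        slope, intercept = np.polyfit(np.log(mc), np.log(sigma), 1)
        print(f"estimated beta = {-slope:.3f}")   # close to the assumed 0.25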

  11. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator

    PubMed Central

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J.; Zhang, Li-Qun

    2013-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative, or quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at farther, stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke. PMID:21674395
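
    A minimal sketch of the linear velocity-dependence fit described above (Python; the velocity and catch-angle values are hypothetical, not MSE measurements):

        import numpy as np

        # Hypothetical measurements: stretch velocity (deg/s) and catch angle (deg).
        velocity = np.array([50.0, 90.0, 130.0, 170.0, 210.0, 250.0])
        catch_angle = np.array([12.1, 14.0, 16.2, 17.9, 20.3, 22.0])

        # The reported linear velocity dependence: catch_angle ~ a * velocity + b.
        a, b = np.polyfit(velocity, catch_angle, 1)
        r = np.corrcoef(velocity, catch_angle)[0, 1]
        print(f"slope = {a:.3f} deg per (deg/s), intercept = {b:.1f} deg, r = {r:.3f}")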

  12. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides initiating from cut slopes along the railway and road alignments were catalogued. The landslides were grouped into three magnitude classes based on landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slope, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from the traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to the local business, and loss of revenue to the railway department. The results indicate that the total loss, including both direct and indirect loss, from 1 to 50 years return period, varies from US$ 90,840 to US$
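
    A toy illustration of aggregating scenario losses over magnitude classes and return periods into an expected annual loss (Python; all rates, vulnerabilities and costs are invented, and the trapezoidal annualization is a common approximation, not necessarily the paper's exact procedure):

        import math

        return_periods = [1, 3, 5, 15, 25, 50]                # years
        vulnerability = {"M1": 0.05, "M2": 0.20, "M3": 0.60}  # damage fraction per class (invented)
        base_rate = {"M1": 1.2, "M2": 0.5, "M3": 0.1}         # landslides/km at T = 1 yr (invented)
        km, cost = 17.0, 25_000.0                             # cut-slope length, US$ per full loss

        def scenario_loss(T):
            # Direct loss (US$) if the T-year scenario occurs; counts grow with T.
            return sum(base_rate[m] * math.sqrt(T) * km * vulnerability[m] * cost
                       for m in vulnerability)

        # Expected annual loss: integrate loss over annual exceedance probability (1/T).
        aep = [1.0 / T for T in return_periods]
        loss = [scenario_loss(T) for T in return_periods]
        eal = sum(0.5 * (loss[i] + loss[i + 1]) * (aep[i] - aep[i + 1])
                  for i in range(len(return_periods) - 1))
        print(f"expected annual direct loss: US$ {eal:,.0f}")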

  13. Quantitative risk assessment of FMD virus transmission via water.

    PubMed

    Schijven, Jack; Rijs, Gerard B J; de Roda Husman, Ana Maria

    2005-02-01

    Foot-and-mouth disease (FMD) is a viral disease of domesticated and wild cloven-hoofed animals. FMD virus is known to spread by direct contact between infected and susceptible animals, by animal products such as meat and milk, by the airborne route, and by mechanical transfer on people, wild animals, birds, and vehicles. During the outbreak of 2001 in the Netherlands, milk from dairy cattle was illegally discharged into the sewerage system as a consequence of the transport prohibition. This may lead to discharges of contaminated biologically treated and raw sewage into surface water that is given to cattle to drink. The objective of the present study was to assess the probability of infecting dairy cows that were drinking FMD virus contaminated surface water due to illegal discharges of contaminated milk. The following data were therefore collected from the literature: FMD virus inactivation in aqueous environments, FMD virus concentrations in milk, dilution in sewage water, virus removal by sewage treatment, dilution in surface water, water consumption of cows, size of a herd in a meadow, and dose-response data for ingested FMD virus by cattle. In the case of 1.6 x 10(2) FMD virus per milliliter in milk and discharge of treated sewage in surface water, the probability of infecting a herd of cows was estimated to be 3.3 x 10(-7) to 8.5 x 10(-5), dependent on dilution in the receiving surface water. In the case of discharge of raw sewage, all probabilities of infection were 100 times higher. In the case of little dilution in small rivers, the high level of 8.5 x 10(-3) is reached. For 10(4) times higher FMD virus concentrations in milk, the probabilities of infecting a herd of cows are high in the case of discharge of treated sewage (3.3 x 10(-3) to 5.7 x 10(-1)) and very high in the case of discharge of raw sewage (0.28-1.0). It can be concluded that illegal and uncontrolled discharges of contaminated milk into the sewerage system may lead to high risks to other cattle farms at 6-50 km
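
    A compact sketch of this kind of exposure chain ending in a dose-response model (Python; every parameter value, and the exponential dose-response form, is an illustrative assumption rather than the study's calibration):

        import math

        virus_per_litre_milk = 1.6e2 * 1000    # 1.6 x 10^2 per mL, as in the abstract
        milk_litres = 5_000.0                  # illegally discharged volume (assumed)
        sewage_litres = 2.0e6                  # sewage the milk mixes into (assumed)
        treatment_log10_removal = 1.0          # biological treatment (assumed)
        river_dilution = 100.0                 # dilution in receiving surface water (assumed)
        litres_per_cow = 50.0                  # daily drinking water (assumed)
        herd_size = 60
        r = 1.0e-6                             # exponential dose-response parameter (assumed)

        conc = virus_per_litre_milk * milk_litres / sewage_litres
        conc *= 10 ** -treatment_log10_removal / river_dilution
        dose = conc * litres_per_cow
        p_cow = 1.0 - math.exp(-r * dose)             # exponential dose-response model
        p_herd = 1.0 - (1.0 - p_cow) ** herd_size     # at least one cow infected
        print(f"P(cow) = {p_cow:.2e}, P(herd) = {p_herd:.2e}")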

  14. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  15. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  16. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. Using multiple sources of water supply and introducing water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, and an increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites to manage water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk associated with water quality and evaluate the effectiveness of management strategies in mitigating the risk by quantifying implications for production and hence economic viability. PMID:21797262

  17. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K. )

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.
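
    A minimal sketch of the 'milligram aspirin equivalence' arithmetic (Python; the potency and dose values are invented placeholders, not Dewhirst's QSAR estimates):

        # Invented relative cyclooxygenase-inhibition potencies (aspirin = 1.0)
        # and doses for a hypothetical three-component alkylphenol mixture.
        relative_potency = {"phenol": 0.05, "o-cresol": 0.12, "2,6-xylenol": 0.30}
        dose_mg = {"phenol": 10.0, "o-cresol": 4.0, "2,6-xylenol": 1.5}

        aspirin_mg_equiv = sum(dose_mg[c] * relative_potency[c] for c in dose_mg)
        print(f"mixture potency: {aspirin_mg_equiv:.2f} mg aspirin equivalents")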

  18. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain detection (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher and pain threshold tests to heat stimulation (HPT) were significantly lower in the affected side, compared with the contralateral side, in AO patients; however, for MDT, MPT, PPT, CDT, and WDT, the results were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  19. Preoperative Evaluation: Estimation of Pulmonary Risk.

    PubMed

    Lakshminarasimhachar, Anand; Smetana, Gerald W

    2016-03-01

    Postoperative pulmonary complications (PPCs) are common after major non-thoracic surgery and associated with significant morbidity and high cost of care. A number of risk factors are strong predictors of PPCs. The overall goal of the preoperative pulmonary evaluation is to identify these potential, patient and procedure-related risks and optimize the health of the patients before surgery. A thorough clinical examination supported by appropriate laboratory tests will help guide the clinician to provide optimal perioperative care. PMID:26927740

  20. Abandoned metal mine stability risk evaluation.

    PubMed

    Bétournay, Marc C

    2009-10-01

    The abandoned mine legacy is critical in many countries around the world, where mine cave-ins and surface subsidence disruptions are perpetual risks that can affect the population, infrastructure, historical legacies, land use, and the environment. This article establishes abandoned metal mine failure risk evaluation approaches and quantification techniques based on the Canadian mining experience. These utilize clear geomechanics considerations such as failure mechanisms, which are dependent on well-defined rock mass parameters. Quantified risk is computed as the probability of failure (from probabilistic analysis of limit-equilibrium factors of safety or applicable numerical-modeling factor-of-safety quantifications) multiplied by a consequence impact value. Semi-quantified risk can be based on failure-case-study-based empirical data used in calculating probability of failure, and personal experience can provide qualified hazard and impact consequence assessments. The article provides outlines for land use and selection of remediation measures based on risk. PMID:19645755
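
    A small sketch of the quantified-risk recipe named above, with the probability of failure taken from a Monte Carlo factor-of-safety calculation (Python; the distributions and the consequence value are invented):

        import numpy as np

        rng = np.random.default_rng(42)

        # Monte Carlo factor of safety FS = resistance / load with lognormal
        # uncertainty in both; all parameters are illustrative placeholders.
        n = 100_000
        resistance = rng.lognormal(mean=np.log(1.8), sigma=0.25, size=n)
        load = rng.lognormal(mean=np.log(1.0), sigma=0.20, size=n)

        p_failure = np.mean(resistance / load < 1.0)   # probability that FS < 1
        consequence = 7.5                              # impact value, arbitrary scale
        print(f"P(FS < 1) = {p_failure:.4f}, quantified risk = {p_failure * consequence:.3f}")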

  1. A risk methodology to evaluate sensitivity of plant risk to human errors

    SciTech Connect

    Samanta, P.; Wong, S.; Higgins, J.; Haber, S.; Luckas, W.

    1988-01-01

    This paper presents an evaluation of the sensitivity of plant risk parameters, namely the core melt frequency and the accident sequence frequencies, to the human errors involved in various aspects of nuclear power plant operations. Results are provided using the Oconee-3 Probabilistic Risk Assessment model as an example application of the risk methodology described herein. Sensitivity analyses in probabilistic risk assessment (PRA) involve three areas: (1) a determination of the set of input parameters; in this case, various categories of human errors signifying aspects of plant operation, (2) the range over which the input parameters vary, and (3) an assessment of the sensitivity of the plant risk parameters to the input parameters which, in this case, consist of all postulated human errors, or categories of human errors. The methodology presents a categorization scheme where human errors are categorized in terms of types of activity, location, personnel involved, etc., to relate the significance of sensitivity of risk parameters to specific aspects of human performance in the nuclear plant. Ranges of variability for human errors have been developed considering the various known causes of uncertainty in human error probability estimates in PRAs. The sensitivity of the risk parameters is assessed using the event/fault tree methodology of the PRA. The results of the risk-based sensitivity evaluation using the Oconee-3 PRA as an example show the quantitative impact on the plant risk level due to variations in human error probabilities. The relative effects of various human error categories and human error sorts within the categories are also presented to identify and characterize significant human errors for effective risk management in nuclear power plant operational activities. 8 refs., 10 figs., 4 tabs.
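
    A toy sketch of this kind of sensitivity scan, using the rare-event cut-set approximation for core melt frequency (Python; the cut sets, event names and probabilities are invented, not Oconee-3 data):

        # Invented minimal cut sets: initiator frequency times basic-event probabilities.
        cut_sets = [
            {"init_loca": 1e-2, "hep_maintenance": 3e-3, "hw_pump": 1e-3},
            {"init_loca": 1e-2, "hep_procedure": 5e-3, "hw_valve": 2e-4},
            {"init_transient": 1e-3, "hep_maintenance": 3e-3, "hep_procedure": 5e-3},
        ]

        def core_melt_frequency(category=None, factor=1.0):
            # Rare-event approximation: CMF ~ sum over cut sets of the product of
            # their event probabilities, scaling every event in `category` by `factor`.
            total = 0.0
            for cs in cut_sets:
                product = 1.0
                for event, p in cs.items():
                    product *= p * factor if category and event.startswith(category) else p
                total += product
            return total

        base = core_melt_frequency()
        for factor in (0.1, 10.0):
            ratio = core_melt_frequency("hep", factor) / base
            print(f"all HEPs x {factor:>4}: CMF ratio = {ratio:.2f}")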

  2. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. PMID:25542704

  3. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, which are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability to assign quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some tools assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two packages were under
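
    A back-of-the-envelope version of the benchmark's central point (Python; the genome lengths are arbitrary stand-ins):

        # Equal copy numbers of three genomes of different lengths, shredded into
        # 150 bp reads: the read fractions are skewed toward the longest genome.
        genomes = {"taxonA": 2_000_000, "taxonB": 4_000_000, "taxonC": 8_000_000}
        copies, read_len = 3, 150

        reads = {t: copies * (length // read_len) for t, length in genomes.items()}
        total = sum(reads.values())
        for taxon, n in reads.items():
            print(f"{taxon}: {n / total:.1%} of reads")
        # ~14% / 29% / 57%: a tool that reports raw read fractions, uncorrected
        # for genome length, shows exactly this bias despite equal organism counts.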

  4. Quantitative evaluation of atherosclerotic plaque phantom by near-infrared multispectral imaging with three wavelengths

    NASA Astrophysics Data System (ADS)

    Nagao, Ryo; Ishii, Katsunori; Awazu, Kunio

    2014-03-01

    Atherosclerosis is a primary cause of critical ischemic disease. The risk of a critical event is related to the lipid content of unstable plaque. The near-infrared (NIR) range is effective for diagnosis of atherosclerotic plaque because of the absorption peaks of lipid in this band. NIR multispectral imaging (NIR-MSI) is suitable for the evaluation of plaque because it can quickly provide spectroscopic information and a spatial image with a simple measurement system. The purpose of this study was to evaluate the lipid concentrations in plaque phantoms quantitatively with a NIR-MSI system. A NIR-MSI system was constructed with a supercontinuum light source, a grating spectrometer and a MCT camera. Plaque phantoms with different concentrations of lipid were prepared by mixing bovine fat and a biological soft tissue model to mimic the different stages of unstable plaque. We evaluated the phantoms with the NIR-MSI system at three wavelengths in the band at 1200 nm. Multispectral images were processed by the spectral angle mapper method. As a result, the lipid areas of the phantoms were effectively highlighted using three wavelengths. In addition, the concentrations of the lipid areas were classified according to the similarity between measured spectra and a reference spectrum. These results suggest the possibility of image enhancement and quantitative evaluation of lipid in unstable plaque with NIR-MSI.
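
    A minimal sketch of the spectral angle mapper classification used above (Python; the three-band reflectance values are hypothetical):

        import numpy as np

        def spectral_angle(pixel, reference):
            # Spectral angle mapper: angle (radians) between two spectra.
            cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
            return np.arccos(np.clip(cos, -1.0, 1.0))

        # Hypothetical three-band reflectances around the 1200 nm lipid feature.
        reference_lipid = np.array([0.42, 0.18, 0.40])
        image = np.array([[[0.43, 0.17, 0.41], [0.33, 0.33, 0.34]],
                          [[0.40, 0.20, 0.39], [0.25, 0.45, 0.30]]])

        angles = np.apply_along_axis(spectral_angle, 2, image, reference_lipid)
        print(angles.round(3))
        print(angles < 0.1)   # True where the pixel is spectrally similar to lipid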

  5. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods.

    PubMed

    Joly, Lauren E; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24-0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31-0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  6. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  7. Risk evaluation mitigation strategies: the evolution of risk management policy.

    PubMed

    Hollingsworth, Kristen; Toscani, Michael

    2013-04-01

    The United States Food and Drug Administration (FDA) has the primary regulatory responsibility to ensure that medications are safe and effective both prior to drug approval and while the medication is being actively marketed by manufacturers. The responsibility for safe medications prior to marketing was signed into law in 1938 under the Federal Food, Drug, and Cosmetic Act; however, a significant risk management evolution has taken place since 1938. Additional federal rules, entitled the Food and Drug Administration Amendments Act, were established in 2007 and extended the government's oversight through the addition of a Risk Evaluation and Mitigation Strategy (REMS) for certain drugs. REMS is a mandated strategy to manage a known or potentially serious risk associated with a medication or biological product. Reasons for this extension of oversight were driven primarily by the FDA's movement to ensure that patients and providers are better informed of drug therapies and their specific benefits and risks prior to initiation. This article provides an historical perspective of the evolution of medication risk management policy and includes a review of REMS programs, an assessment of the positive and negative aspects of REMS, and provides suggestions for planning and measuring outcomes. In particular, this publication presents an overview of the evolution of the REMS program and its implications. PMID:23113627

  8. A quantitative evaluation of models for Aegean crustal deformation

    NASA Astrophysics Data System (ADS)

    Nyst, M.; Thatcher, W.

    2003-04-01

    Modeling studies of eastern Mediterranean tectonics show that Aegean deformation is mainly determined by WSW directed expulsion of Anatolia and SW directed extension due to roll-back of African lithosphere along the Hellenic trench. How motion is transferred across the Aegean remains a subject of debate. The two most widely used hypotheses for Aegean tectonics assert fundamentally different mechanisms. The first model describes deformation as a result of opposing rotations of two rigid microplates separated by a zone of extension. In the second model most motion is accommodated by shear on a series of dextral faults and extension on graben systems. These models make different quantitative predictions for the crustal deformation field that can be tested by a new, spatially dense GPS velocity data set. To convert the GPS data into crustal deformation parameters we use different methods to model complementary aspects of crustal deformation. We parameterize the main fault and plate boundary structures of both models and produce representations for the crustal deformation field that range from purely rigid rotations of microplates, via interacting, elastically deforming blocks separated by crustal faults to a continuous velocity gradient field. Critical evaluation of these models indicates strengths and limitations of each and suggests new measurements for further refining understanding of present-day Aegean tectonics.

  9. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical Cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merits. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of Cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for Cardiac CT systems. To simulate heart motion, a moving coronary type phantom synchronized with an ECG signal was used. Three different percentage plaques embedded in a 3 mm vessel phantom were imaged multiple times under motion free, 60 bpm, and 80 bpm heart rates. Static (motion free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
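
    A toy version of the task-based figure of merit (Python; the plaque percentages and the noise levels standing in for the two reconstructions are invented):

        import numpy as np

        rng = np.random.default_rng(7)

        # Three phantom plaque percentages, many repeated "readings" per condition.
        true_plaque = np.repeat([25.0, 50.0, 75.0], 200)

        def emse(noise_sd):
            # The observer's estimate is modeled as truth plus reconstruction-dependent noise.
            estimates = true_plaque + rng.normal(0.0, noise_sd, true_plaque.size)
            return np.mean((estimates - true_plaque) ** 2)

        print(f"EMSE with FBP-like noise: {emse(9.0):6.1f}")
        print(f"EMSE with SSF-like noise: {emse(4.0):6.1f}")   # lower is better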

  10. [Effects of calcinogenic plants--qualitative and quantitative evaluation].

    PubMed

    Mello, J R; Habermehl, G G

    1998-01-01

    Different research methods have demonstrated the presence of variable quantities of Vitamin D, as well as its metabolites, in calcinogenic plants. Most of the experiments indicated that the active component is most probably the metabolite 1,25 (OH)2D3 linked as a glycoside. This study evaluated the presence of substances with Vitamin D-like activity in the calcinogenic plants Solanum malacoxylon, Cestrum diurnum, Trisetum flavescens and Nierembergia veitchii by testing different extracts of these plants, applied orally to rachitic chicks, within the 'strontium-added alimentation' research model. After the oral administration of the extracts, the serum was analysed to determine the levels of calcium, phosphorus and alkaline phosphatase. The results obtained with chicks demonstrated the presence of substances with Vitamin D-like activity in all four plants. Solanum malacoxylon and Cestrum diurnum both contained highly active substances of hydrosoluble character, indicated by the significantly elevated levels of calcium and phosphorus combined with a reduced activity of alkaline phosphatase. This indicated the presence of 1,25 (OH)2D3 in both plants. The hydrosoluble character of the active substance in both plants is most probably explained by the metabolite 1,25 (OH)2D3 being bound as a glycoside at position O-25 of the molecule. Nierembergia veitchii and Trisetum flavescens contained only minor concentrations of hydrosoluble substances. The results for the four analysed plants were evaluated quantitatively as follows: Solanum malacoxylon, 82,800 IU of Vitamin D/kg; Cestrum diurnum, 63,200 IU of Vitamin D/kg; Nierembergia veitchii, 16,400 IU/kg; and Trisetum flavescens, 12,000 IU of Vitamin D/kg. All concentrations are calcinogenic. PMID:9499629

  11. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge; presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159

  12. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge; presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
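
    A minimal sketch of the entropy weight step referenced in both records (Python; the 4 x 3 decision matrix of bogie samples by risk indicators is illustrative only):

        import numpy as np

        # Entropy weight method: indicators whose values vary strongly across
        # samples carry more information and therefore receive larger weights.
        X = np.array([[0.62, 0.10, 3.1],
                      [0.55, 0.40, 2.2],
                      [0.71, 0.35, 2.9],
                      [0.30, 0.20, 1.8]])   # 4 samples x 3 indicators (invented)

        P = X / X.sum(axis=0)                 # column-normalize to proportions
        k = 1.0 / np.log(X.shape[0])
        with np.errstate(divide="ignore", invalid="ignore"):
            plogp = np.where(P > 0, P * np.log(P), 0.0)   # guard against log(0)
        entropy = -k * plogp.sum(axis=0)      # e_j in [0, 1]
        weights = (1 - entropy) / (1 - entropy).sum()
        print("indicator weights:", weights.round(3))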

  13. Evaluating Potential Health Risks in Relocatable Classrooms.

    ERIC Educational Resources Information Center

    Katchen, Mark; LaPierre, Adrienne; Charlin, Cary; Brucker, Barry; Ferguson, Paul

    2001-01-01

    Only limited data exist describing potential exposures to chemical and biological agents when using portable classrooms or outlining how to assess and reduce associated health risks. Evaluating indoor air quality involves examining ventilation rates, volatile organic compounds, and microbiologicals. Open communication among key stakeholders is…

  14. Evaluation of health risks for contaminated aquifers.

    PubMed Central

    Piver, W T; Jacobs, T L; Medina, M A

    1997-01-01

    This review focuses on progress in the development of transport models for heterogeneous contaminated aquifers, the use of predicted contaminant concentrations in groundwater for risk assessment for heterogeneous human populations, and the evaluation of aquifer remediation technologies. Major limitations and areas for continuing research for all methods presented in this review are identified. PMID:9114282

  15. Using quantitative risk information in decisions about statins: a qualitative study in a community setting

    PubMed Central

    Polak, Louisa; Green, Judith

    2015-01-01

    Background A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. Aim To understand the role of quantitative risk information in patients’ accounts of decisions about taking statins. Design and setting This was a qualitative study, with participants recruited and interviewed in community settings. Method Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Results Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as ‘necessary’ either to treat test results, or because of personalised, unequivocal advice from a doctor. Conclusion This study’s findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. PMID:25824187

  16. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  17. QUANTITATIVE ASSESSMENT OF CANCER RISK FROM EXPOSURE TO DIESEL ENGINE EMISSIONS

    EPA Science Inventory

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-huma...

  18. The value of quantitative patient preferences in regulatory benefit-risk assessment

    PubMed Central

    Egbrink, Mart oude; IJzerman, Maarten

    2014-01-01

    Quantitative patient preferences are a method to involve patients in regulatory benefit-risk assessment. Assuming preferences can be elicited, there might be multiple advantages to their use. Legal, methodological and procedural issues do, however, imply that preferences are currently at most part of the solution on how best to involve patients in regulatory decision making. Progress has recently been made on these issues.

  19. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  20. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    Title 17 (Commodity and Securities Exchanges), Vol. 2, revised as of 2013-04-01: (Item 305) Quantitative and qualitative disclosures about market risk. Section 229.305, Commodity and Securities Exchanges, Securities and Exchange Commission, Standard Instructions for Filing Forms under the Securities Act of 1933, the Securities Exchange Act of 1934 and...

  1. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ...The Food and Drug Administration (FDA) is announcing the availability of a draft report entitled ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review'' (literature review report). A literature review was conducted to address a requirement of the Patient Protection and Affordable Care Act (Affordable Care Act). FDA is publishing the literature review......

  2. Use of global sensitivity analysis in quantitative microbial risk assessment: application to the evaluation of a biological time temperature integrator as a quality and safety indicator for cold smoked salmon.

    PubMed

    Ellouze, M; Gauchi, J-P; Augustin, J-C

    2011-06-01

    The aim of this study was to apply a global sensitivity analysis (SA) method in model simplification and to evaluate (eO)®, a biological Time Temperature Integrator (TTI), as a quality and safety indicator for cold smoked salmon (CSS). Models were thus developed to predict the evolutions of Listeria monocytogenes and the indigenous food flora in CSS and to predict the TTI's endpoint. A global SA was then applied to the three models to identify the less important factors and simplify the models accordingly. Results showed that the subset of the most important factors of the three models was mainly composed of the durations and temperatures of two chill chain links, out of the control of the manufacturers: the domestic refrigerator and the retail/cabinet links. Then, the simplified versions of the three models were run with 10(4) time temperature profiles representing the variability associated with the microbial behavior, the TTI's evolution and the French chill chain characteristics. The results were used to assess the distributions of the microbial contaminations obtained at the TTI endpoint and at the end of the simulated profiles and proved that, in the case of poor storage conditions, use of the TTI could reduce the number of unacceptable foods by 50%. PMID:21511136
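
    A rough sketch of running a simplified growth model over Monte Carlo chill-chain profiles (Python; the square-root-type growth model and all parameter values are generic assumptions, not the paper's models):

        import math
        import random

        random.seed(3)

        B, T_MIN = 0.023, -2.9    # square-root-model parameters (1/sqrt(h), deg C), assumed

        def log10_growth(temp_c, hours):
            mu = max(0.0, B * (temp_c - T_MIN)) ** 2    # specific growth rate, ln units/h
            return mu * hours / math.log(10)            # convert to log10 CFU increase

        totals = []
        for _ in range(10_000):
            retail_t, retail_h = random.gauss(5.0, 2.0), random.uniform(24, 240)
            home_t, home_h = random.gauss(7.0, 2.5), random.uniform(12, 168)
            totals.append(log10_growth(retail_t, retail_h) + log10_growth(home_t, home_h))

        totals.sort()
        print(f"median: {totals[5000]:.2f} log10, 95th pct: {totals[9500]:.2f} log10")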

  3. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited.

  4. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits

  5. Quantitative comparison between crowd models for evacuation planning and evaluation

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.

    2014-02-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models or between models and real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find that the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found the social force model agrees best with this real data.
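
    A small sketch of quantitatively comparing models through distances between observable distributions, here with pairwise Kolmogorov-Smirnov statistics as a simple stand-in for the paper's DISTATIS compromise analysis (Python; the samples are synthetic draws, not simulation output):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        # Synthetic evacuation-time samples standing in for three models' outputs.
        samples = {
            "lattice_gas":  rng.normal(95.0, 12.0, 1000),
            "social_force": rng.normal(110.0, 18.0, 1000),
            "rvo2":         rng.normal(97.0, 11.0, 1000),
        }

        names = list(samples)
        D = np.zeros((len(names), len(names)))
        for i, a in enumerate(names):
            for j, b in enumerate(names):
                D[i, j] = stats.ks_2samp(samples[a], samples[b]).statistic

        print(names)
        print(D.round(3))   # small entries = similar distributions for this observable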

  6. Aging rat vestibular ganglion: I. Quantitative light microscopic evaluation.

    PubMed

    Alidina, A; Lyon, M J

    1990-01-01

    This study was undertaken to quantify age-related changes in the rat vestibular ganglion. Cell number, diameter, and proximal-distal distribution based on size were evaluated. Serial 5-microns plastic sections of the vestibular ganglion from 15 female Wistar rats were examined. Rats were divided into three age groups: young (Y, 3 to 5 months, n = 5), old (O, 24 to 26 months, n = 3), and very old (VO, 28 to 31 months, n = 7). Quantitative analysis indicated no significant differences (P less than .05) in the estimated number of ganglion cells (mean: Y = 1,690, O = 2,257, VO = 1,678), ganglion cell profile diameters (mean: Y = 22.5 microns, n = 2,886; O = 23.7 microns, n = 2,313; VO = 22.8 microns, n = 4,061), or proximal-distal localization (proximal: 22.3 microns, 24.4 microns, 22.7 microns; middle: 22.6 microns, 23.1 microns, 22.4 microns; distal: 23.3 microns, 23.4 microns, 23.7 microns; Y, O, and VO, respectively). When pooled, the old animals tended to have slightly larger cell profiles than the other groups. We noted a dramatic age-related increase of aging pigment within the ganglion cell profiles, making the old and very old animals easily distinguishable from the young. In most of the cell profiles, the aging pigment was more or less uniformly distributed throughout the cytoplasm. However, in some, aging pigment was accumulated at one pole of the cell profile. While no typical degenerating cellular profiles were found in any of the sections, several of the ganglion cell profiles from the old animals revealed dense cytoplasm, possibly indicating an early stage of degeneration. PMID:2382785

  7. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 degrees C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 degrees C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms, between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
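
    A toy demonstration of why ratioing two bands suppresses emissivity clutter: both band radiances share the same emissivity factor, which cancels in the ratio (Python; the radiance models are crude synthetic stand-ins):

        import numpy as np

        rng = np.random.default_rng(11)

        shape = (64, 64)
        temperature = np.full(shape, 300.0)
        temperature[20:30, 20:30] += 0.5             # 0.5 K warm patch over a flaw
        emissivity = 0.9 + 0.08 * rng.random(shape)  # surface-roughness clutter

        band5 = emissivity * temperature ** 4.5      # crude stand-ins for band radiances
        band10 = emissivity * temperature ** 2.0

        ratio = band5 / band10                       # the emissivity factor cancels here
        print("relative clutter, single band:", round(float(band5.std() / band5.mean()), 4))
        print("relative clutter, band ratio: ", round(float(ratio.std() / ratio.mean()), 6))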

  8. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the content of asbestos in rock matrices is a complex operation which is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and lower cost. The DIATI LAA internal methodology for the analysis in PCOM is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm and a subsequent microscopic analysis of a portion of each class. The PCOM is based on the optical properties of asbestos and of the liquids with known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. In fact, for airborne filters a binomial (Poisson) distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices, instead, cannot rely on any statistical distribution because the most important object of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting ratio between the weight of the fibrous component and that of the granular one. The error evaluation generally provided by public and private institutions varies between 50 and 150 percent, but there are, however, no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total content of asbestos, especially for values close to the legal limits. The error assessments must

  9. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
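
    As a concrete illustration of the texture step, the sketch below computes the GLCM 'correlation' feature for a BMD-valued ROI using scikit-image; the quantization level, offset distance, and angles are illustrative choices, not those of the study.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_correlation(roi, levels=32):
        """Mean GLCM 'correlation' over two offsets for a 2D BMD-valued ROI."""
        roi = np.asarray(roi, dtype=float)
        # Quantize intensities to 'levels' gray levels before building the matrix
        q = (roi - roi.min()) / (np.ptp(roi) + 1e-12) * (levels - 1)
        q = q.astype(np.uint8)
        glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                            levels=levels, symmetric=True, normed=True)
        return float(graycoprops(glcm, 'correlation').mean())

    # Toy "trabecular" patch: a smooth gradient plus noise
    rng = np.random.default_rng(1)
    patch = rng.normal(150.0, 30.0, (64, 64)) + np.indices((64, 64)).sum(0)
    print(glcm_correlation(patch))
    ```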

  10. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data on loads, system response, and consequences, using event tree modelling for the risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 % of the current value, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods so that urban flood risk analysis can be replicated at regional and national scales.
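
    The event tree logic behind an "expected annual affected population" figure can be sketched compactly. All branch probabilities and populations below are hypothetical placeholders, not values from the Oliva case study.

    ```python
    # Each entry: (annual probability of the flood load, probability the
    # warning system fails given the flood, exposed population).
    event_tree = [
        (0.01,  0.3,  5000),   # frequent, moderate flood
        (0.002, 0.3, 12000),   # rare, severe flood
    ]

    def expected_annual_affected(tree, protected_if_warned=0.6):
        """Sum over branches: load x system response x consequence."""
        total = 0.0
        for p_flood, p_warn_fail, population in tree:
            affected_warned = population * (1 - protected_if_warned)
            total += p_flood * (p_warn_fail * population
                                + (1 - p_warn_fail) * affected_warned)
        return total

    print(expected_annual_affected(event_tree))   # people affected per year
    ```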

  11. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We take evaluation results based on financial statements, using logit and probit approaches, as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. Surprisingly, we find that the opinions extracted from both posts and commentaries surpass those of analysts in terms of credit risk prediction. PMID:26739372
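
    A minimal sketch of the logit benchmark idea, using scikit-learn on synthetic data; the feature set, coefficients, and sample size are invented for illustration and stand in for the financial-statement ratios the paper uses.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    # Hypothetical financial-statement ratios (e.g. leverage, liquidity, profitability)
    X = rng.normal(size=(200, 3))
    # Synthetic default labels generated from an assumed linear score plus noise
    y = (X @ np.array([1.5, -1.0, -2.0]) + rng.normal(size=200) > 0).astype(int)

    benchmark = LogisticRegression().fit(X, y)        # the "logit" benchmark
    print(benchmark.predict_proba(X[:5])[:, 1])       # predicted default probabilities
    ```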

  12. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depths to a total of 10 separate classes at the regional scale. Debris flow vulnerability curves from the literature and one curve derived specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  13. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    The utility of longitudinal non-axisymmetric flexural modes in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems, such as a partial-loading angle beam, an axisymmetric comb transducer and an angle beam array, are studied. The Normal Mode Expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Numerical calculations show that an angular profile varies with frequency, mode and propagation distance. Since an angular profile determines the energy distribution of the guided waves, it has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used for discussing the characteristics of non-axisymmetric guided waves. Angular profiles have many applications in practical pipe testing. The procedure of building desired angular profiles, as well as angular profile tuning, is discussed. This angular profile tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, the performance of a transducer array is discussed in terms of guided wave mode control ability and excitation sensitivity. With time-delay inputs, a transducer array is greatly improved in its mode control ability and sensitivity. The algorithms for setting time delays are derived based on frequency, element spacing and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is
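
    One common linear delay law consistent with the abstract's statement that time delays follow from frequency, element spacing, and phase velocity is sketched below (frequency enters through the dispersion of the phase velocity). The function and parameter names are illustrative; this is a generic phased-array delay rule, not the dissertation's specific algorithm.

    ```python
    import math

    def element_delays(n_elements, spacing_m, phase_velocity_mps, steer_deg):
        """Time delays (s) for a uniformly spaced array, with the phase
        velocity taken at the operating frequency:
            delay_n = n * d * sin(theta) / c_p
        """
        theta = math.radians(steer_deg)
        return [n * spacing_m * math.sin(theta) / phase_velocity_mps
                for n in range(n_elements)]

    # 8 elements, 10 mm pitch, 3000 m/s phase velocity, 20 degree steering
    for n, dt in enumerate(element_delays(8, 0.010, 3000.0, 20.0)):
        print(f"element {n}: {dt * 1e6:.2f} us")
    ```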

  14. Quantitative evaluation of stiffness of commercial suture materials.

    PubMed

    Chu, C C; Kizil, Z

    1989-03-01

    The bending stiffness of 22 commercial suture materials of varying size, chemical structure and physical form was quantitatively evaluated using a stiffness tester (Taber V-5, model 150B, Teledyne). The commercial sutures were Chromic catgut; Dexon (polyglycolic acid); Vicryl (polyglactin 910); PDS (polydioxanone); Maxon (polyglycolide-trimethylene carbonate); Silk (coated with silicone); Mersilene (polyester fiber); Tycron (polyester fiber); Ethibond (polyethylene terephthalate coated with polybutylene); Nurolon (nylon 66); Surgilon (nylon 66 coated with silicone); Ethilon (coated nylon 66); Prolene (polypropylene); Dermalene (polyethylene); and Gore-tex (polytetrafluoroethylene). The set includes natural and synthetic, absorbable and nonabsorbable, and monofilament and multifilament sutures. All of these sutures were size 2-0, but Prolene sutures with sizes ranging from 1-0 to 9-0 were also tested to determine the effect of suture size on stiffness. The data showed a wide range of bending stiffness among the 22 commercial sutures. The most flexible 2-0 suture was Gore-tex, followed by Dexon, Silk, Surgilon, Vicryl (uncoated), Tycron, Nurolon, Mersilene, Ethibond, Maxon, PDS, Ethilon, Prolene, Chromic catgut, coated Vicryl, and lastly, Dermalene. The large porous volume inherent in the Gore-tex monofilament suture accounts for its having the lowest flexural stiffness. Sutures with a braided structure were generally more flexible than those with a monofilament structure, irrespective of the chemical constituents. Coated sutures had significantly higher stiffness than the corresponding uncoated ones; this is particularly true when polymers rather than wax were used as the coating material. The increase in stiffness is attributable to the loss of mobility under bending force in the fibers and yarns that make up the sutures. An increase in the size of the suture significantly increased the stiffness, and the magnitude of increase

  15. Quantitative Evaluation of Atherosclerotic Plaque Using Ultrasound Tissue Characterization.

    NASA Astrophysics Data System (ADS)

    Yigiter, Ersin

    Evaluation of therapeutic methods directed toward interrupting and/or delaying atherogenesis is impeded by the lack of a reliable, non-invasive means for monitoring progression or regression of disease. The ability to characterize the predominant component of plaque may be very valuable in the study of this disease's natural history. The earlier the lesion, the more likely lipid is to be the predominant component. Progression of plaque is usually by way of overgrowth of fibrous tissues around the fatty pool. Calcification is usually a feature of the older or complicated lesion. To explore the feasibility of using ultrasound to characterize plaque, we have conducted measurements of the acoustical properties of various atherosclerotic lesions found in freshly excised samples of human abdominal aorta. Our objective has been to determine whether or not the acoustical properties of plaque correlate with the type and/or chemical composition of plaque and, if so, to define a measurement scheme which could be performed in vivo and non-invasively. Our current database consists of individual tissue samples from some 200 different aortas. Since each aorta yields between 10 and 30 tissue samples for study, we have data on some 4,468 different lesions or samples. Measurements of the acoustical properties of plaque were found to correlate well with the chemical composition of plaque. In short, measurements of impedance and attenuation seem sufficient to classify plaque as to type and composition. Based on the in-vitro studies, the parameter of attenuation was selected as a means of classifying the plaque. For these measurements, an intravascular ultrasound scanner was modified according to our specifications. Signal processing algorithms were developed to analyze the complex ultrasound waveforms and estimate tissue properties such as attenuation. Various methods were tried to estimate the attenuation from the pulse-echo backscattered signal. Best results were obtained by

  16. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), the mathematical equations involved expose data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result is a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates. PMID:27357043
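
    A minimal sketch of the "fit a distribution, bootstrap the uncertainty" pattern described above, using SciPy on synthetic storage-time data; the choice of a lognormal and all numbers are illustrative rather than taken from the survey.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    storage_days = rng.lognormal(mean=1.0, sigma=0.5, size=300)  # hypothetical answers

    # Fit a candidate distribution (location fixed at zero)
    shape, loc, scale = stats.lognorm.fit(storage_days, floc=0)

    # Nonparametric bootstrap to express uncertainty in the fitted shape
    boot = [stats.lognorm.fit(rng.choice(storage_days, storage_days.size, replace=True),
                              floc=0)
            for _ in range(500)]
    shapes = np.array([b[0] for b in boot])
    print(f"sigma = {shape:.3f}, 95% CI "
          f"[{np.percentile(shapes, 2.5):.3f}, {np.percentile(shapes, 97.5):.3f}]")
    ```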

  17. QUANTITATIVE GENETIC ACTIVITY GRAPHICAL PROFILES FOR USE IN CHEMICAL EVALUATION

    EPA Science Inventory

    A graphic approach termed a Genetic Activity Profile (GAP) has been developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each...

  18. Quantitative, Notional, and Comprehensive Evaluations of Spontaneous Engaged Speech

    ERIC Educational Resources Information Center

    Molholt, Garry; Cabrera, Maria Jose; Kumar, V. K.; Thompsen, Philip

    2011-01-01

    This study provides specific evidence regarding the extent to which quantitative measures, common sense notional measures, and comprehensive measures adequately characterize spontaneous, although engaged, speech. As such, the study contributes to the growing body of literature describing the current limits of automatic systems for evaluating…

  19. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions to what is known regarding the 2006 E. coli O157:H7 spinach outbreak, in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 lag time models in leafy greens, and validation of the importance of cross-contamination during the washing process. PMID:21549039
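
    The growth-and-prevalence logic lends itself to a compact Monte Carlo sketch. The -1 log CFU/g starting level, 0.1% prevalence, and 1 log CFU/day growth rate come from the abstract; the abuse probability, abuse duration, and serving size are invented placeholders (the study itself used @RISK in Excel rather than Python).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000                               # simulated servings
    contaminated = rng.random(n) < 0.001      # 0.1% prevalence (abstract)
    level = np.full(n, -1.0)                  # log CFU/g in the field (abstract)
    abused = rng.random(n) < 0.5              # hypothetical fraction temperature-abused
    growth_days = rng.uniform(0, 3, n)        # hypothetical abuse duration (days)
    level += np.where(abused, 1.0 * growth_days, 0.0)  # up to 1 log CFU/day (abstract)

    serving_g = 85.0                          # hypothetical serving size
    cells = np.where(contaminated, 10.0 ** level * serving_g, 0.0)
    print("mean cells per contaminated serving:", cells[contaminated].mean())
    ```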

  20. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: II. Risk characterization.

    PubMed

    Pouillot, Régis; Goulet, Véronique; Delignette-Muller, Marie Laure; Mahé, Aurélie; Cornu, Marie

    2009-06-01

    A model for the assessment of exposure to Listeria monocytogenes from cold-smoked salmon consumption in France was presented in the first of this pair of articles (Pouillot et al., 2007, Risk Analysis, 27:683-700). In the present study, the exposure model output was combined with an internationally accepted hazard characterization model, adapted to the French situation, to assess the risk of invasive listeriosis from cold-smoked salmon consumption in France in a second-order Monte Carlo simulation framework. The annual number of cases of invasive listeriosis due to cold-smoked salmon consumption in France is estimated to be 307, with a very large credible interval ([10; 12,453]) reflecting data uncertainty. This uncertainty is mainly associated with the dose-response model. Despite the significant uncertainty associated with the predictions, this model provides a scientific basis for risk managers and food business operators to manage the risk linked to cold-smoked salmon contaminated with L. monocytogenes. Under the modeling assumptions, risk would be efficiently reduced through a decrease in the prevalence of L. monocytogenes or better control of the last steps of the cold chain (shorter and/or colder storage during the consumer step), whereas reduction of the initial contamination levels of the contaminated products and improvement in the first steps of the cold chain do not seem to be promising strategies. An attempt to apply the recent risk-based concept of the FSO (food safety objective) to this example underlines the ambiguity in practical implementation of risk management metrics and the need for further elaboration of these concepts. PMID:19220799

  1. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
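
    The ranking step can be illustrated with a classic FMEA-style risk priority number; the scoring scales and entries below are invented and much simpler than the QFMEA model's probability/severity/detectability/fatality ranking.

    ```python
    # Minimal risk-ranking sketch in the spirit of an FMEA: each risk gets
    # probability (P), severity (S), and detection-difficulty (D) scores,
    # and risks are prioritized by the product P * S * D. All entries are
    # illustrative, not from the QFMEA model itself.
    risks = {
        "wellbore leakage": {"P": 4, "S": 8, "D": 6},
        "caprock fracture": {"P": 2, "S": 9, "D": 8},
        "pipeline rupture": {"P": 3, "S": 7, "D": 3},
    }

    ranked = sorted(risks.items(),
                    key=lambda kv: kv[1]["P"] * kv[1]["S"] * kv[1]["D"],
                    reverse=True)
    for name, s in ranked:
        print(f"{name:20s} priority = {s['P'] * s['S'] * s['D']}")
    ```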

  2. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While some authorities have remarked qualitatively on crop attractiveness in the risk assessment process, direction on how to refine the process with quantitative metrics of attractiveness is absent. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients, in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
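
    A sketch of how a crop attractiveness factor (CAF) might scale the exposure term of a tier I risk quotient. The function, parameter names, and numbers are hypothetical; the paper proposes the concept rather than this exact formula.

    ```python
    def risk_quotient(exposure, toxicity_endpoint, caf=1.0):
        """Tier-I style quotient with the exposure term scaled by a
        nectar-sugar-derived crop attractiveness factor (CAF)."""
        return exposure * caf / toxicity_endpoint

    # CAF < 1 down-weights exposure for a less attractive crop;
    # caf=1.0 recovers the unadjusted quotient.
    print(risk_quotient(exposure=12.0, toxicity_endpoint=15.0, caf=0.4))
    ```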

  3. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs, and it is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic, and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10^-6 and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed the impact of different factors influencing the risk of acquiring trichinellosis to be assessed. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. PMID:26227185

  4. Evaluating risk propensity using an objective instrument.

    PubMed

    Sueiro Abad, Manuel J; Sánchez-Iglesias, Ivan; Moncayo de Tella, Alejandra

    2011-05-01

    Risk propensity is the stable tendency to choose options with a lower probability of success but greater rewards. Its evaluation has been approached from various perspectives, from self-report questionnaires to objective tests. Self-report questionnaires have often been criticized due to interference from voluntary and involuntary biases, in addition to their lack of predictive value. Objective tests, on the other hand, require resources that make them difficult to administer to large samples. This paper presents an easy-to-administer, 30-item risk propensity test. Each item is itself an objective test describing a hypothetical situation in which the subject must choose among three options, each with a different gain function but equivalent in expected value. To assess its psychometric properties, we administered the questionnaire to 222 subjects and performed a reliability analysis as well as an exploratory factor analysis. The results supported a three-factor model of risk (Sports and Gambling, Long-term Plans, and Loss Management). After making the necessary adjustments and incorporating a global factor of risk propensity, confirmatory factor analysis was done, revealing that the data exhibited adequate goodness of fit. PMID:21568196

  5. Quantitative risk assessment of Listeriosis due to consumption of raw milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objectives of this study were to estimate the risk of illnesses for raw milk consumers due to L. monocytogenes contamination in raw milk sold by permitted raw milk dealers, and the risk of listeriosis for people on farms who consume raw milk. Three scenarios were evaluated for raw milk sold by ...

  6. A quantitative assessment of risks of heavy metal residues in laundered shop towels and their use by workers.

    PubMed

    Connor, Kevin; Magee, Brian

    2014-10-01

    This paper presents a risk assessment of worker exposure to metal residues in laundered shop towels. The concentrations of 27 metals measured in a synthetic sweat leachate were used to estimate the releasable quantity of metals which could be transferred to workers' skin. Worker exposure was evaluated quantitatively with an exposure model that focused on towel-to-hand transfer and subsequent hand-to-food or hand-to-mouth transfers. The exposure model was based on conservative but reasonable assumptions regarding towel use and on default exposure factor values from the published literature or regulatory guidance. Transfer coefficients were derived from studies representative of the exposures of towel users. Contact frequencies were based on assumed high-end use of shop towels, but constrained by a theoretical maximum dermal loading. The risk estimates developed for all metals were below applicable regulatory risk benchmarks. The risk assessment for lead utilized the Adult Lead Model and concluded that predicted lead intakes do not constitute a significant health hazard based on potential worker exposures. Uncertainties are discussed in relation to the overall confidence in the exposure estimates developed for each exposure pathway and the likelihood that the exposure model under- or overestimates worker exposures and risk. PMID:24973502
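
    The hand-transfer pathway reduces to a simple intake calculation; below is a sketch under stated assumptions. All transfer coefficients, contact counts, and the body weight are illustrative defaults, and the paper's actual model is more detailed (leachate-based releasable quantities, a dermal loading cap).

    ```python
    def average_daily_dose(conc_mg_per_towel, towel_to_hand_tc, hand_to_mouth_tc,
                           contacts_per_day, body_weight_kg=70.0):
        """Highly simplified hand-transfer exposure sketch (mg/kg-day):
        intake = releasable mass x transfer fractions x contact frequency."""
        intake_mg = (conc_mg_per_towel * towel_to_hand_tc
                     * hand_to_mouth_tc * contacts_per_day)
        return intake_mg / body_weight_kg

    # Hypothetical inputs: 0.5 mg releasable metal per towel, 10% towel-to-hand
    # transfer, 30% hand-to-mouth transfer, 20 contacts per workday
    print(average_daily_dose(0.5, 0.1, 0.3, 20))
    ```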

  7. [Quantitative evaluation of the nitroblue tetrazolium reduction test].

    PubMed

    Vagner, V K; Nasonkin, O S; Boriskina, N D

    1989-01-01

    The results of the NBT test were assessed both by the visual cytochemical method and by the quantitative spectrophotometric technique developed by the authors for the NBT test. The results demonstrate the higher sensitivity and informative value of the new method when neutrophil tetrazolium activity is rather high; this recommends the spectrophotometric variant of the NBT test for wide clinical application in studies of the functional and metabolic activity of blood leukocytes. PMID:2483198

  8. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  9. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)]

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics is discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  10. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible

  11. Quantitative ultrasound does not identify patients with an inflammatory disease at risk of vertebral deformities

    PubMed Central

    Heijckmann, A Caroline; Dumitrescu, Bianca; Nieuwenhuijzen Kruseman, Arie C; Geusens, Piet; Wolffenbuttel, Bruce HR; De Vries, Jolanda; Drent, Marjolein; Huijberts, Maya SP

    2008-01-01

    Background Previous studies from our group have shown that a high prevalence of vertebral deformities suggestive of fracture can be found in patients with an inflammatory disease, despite a near normal bone mineral density (BMD). As quantitative ultrasound (QUS) of the heel can be used for refined assessment of bone strength, we evaluated whether QUS can be used to identify subjects with an inflammatory disease with an increased chance of having a vertebral fracture. Methods 246 patients (mean age: 44 ± 12.4 years) with an inflammatory disease (sarcoidosis or inflammatory bowel disease (IBD)) were studied. QUS of the heel and BMD of the hip (by dual X-ray absorptiometry (DXA)) were measured. Furthermore, lateral single-energy densitometry of the spine was performed for assessment of vertebral deformities. Logistic regression analysis was performed to assess the strength of association between the prevalence of a vertebral deformity and BMD and QUS parameters, adjusted for gender and age. Results Vertebral deformities (ratio of <0.80) were found in 72 vertebrae of 54 subjects (22%). In contrast to the QUS parameters BUA (broadband ultrasound attenuation) and SOS (speed of sound), the T-score of QUS and the T-scores of the femoral neck and trochanter (DXA) were lower in the group of patients with vertebral deformities. Logistic regression analysis showed that the vertebral deformity risk increases by about 60 to 90% per 1 SD reduction in BMD (T-score) determined with DXA, but not with QUS. Conclusion Our findings imply that QUS measurements of the calcaneus in patients with an inflammatory condition, such as sarcoidosis and IBD, are likely of limited value for identifying patients with a vertebral fracture. PMID:18492278

  12. Evaluation of the "Respect Not Risk" Firearm Safety Lesson for 3rd-Graders

    ERIC Educational Resources Information Center

    Liller, Karen D.; Perrin, Karen; Nearns, Jodi; Pesce, Karen; Crane, Nancy B.; Gonzalez, Robin R.

    2003-01-01

    The purpose of this study was to evaluate the MORE HEALTH "Respect Not Risk" Firearm Safety Lesson for 3rd-graders in Pinellas County, Florida. Six schools representative of various socioeconomic levels were selected as the test sites. Qualitative and quantitative data were collected. A total of 433 matched pretests/posttests were used to…

  13. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the PBPK and BBDR models currently used in risk assessment. The work will include a literature search to identify...

  14. Evaluation of residue drum storage safety risks

    SciTech Connect

    Conner, W.V.

    1994-06-17

    A study was conducted to determine whether any potential safety problems exist in the residue drum backlog at the Rocky Flats Plant. Plutonium residues stored in 55-gallon drums were packaged for short-term storage until the residues could be processed for plutonium recovery. These residues have now been determined by the Department of Energy to be waste materials, and they will remain in storage until plans for disposal of the material can be developed. The packaging configurations which were safe for short-term storage may not be safe for long-term storage. Interviews with Rocky Flats personnel involved with packaging the residues revealed that more than one packaging configuration was used for some of the residues. A tabulation of packaging configurations was developed based on the information obtained from the interviews. A number of potential safety problems were identified during this study, including hydrogen generation from some residues and residue packaging materials, loss of contamination containment, corrosion of metal residue packaging containers, and pyrophoric plutonium compound formation. Risk factors were developed for evaluating the risk potential of the various residue categories, and the residues in storage at Rocky Flats were ranked by risk potential. Preliminary drum head-space gas sampling studies have demonstrated the potential for formation of flammable hydrogen-oxygen mixtures in some residue drums.

  15. Designs for Risk Evaluation and Management

    SciTech Connect

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
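
    A generic simulated annealing loop of the kind described, applied to a toy one-dimensional placement problem; the mutate/cost functions and schedule parameters are stand-ins for DREAM's monitoring-scheme mutations and time-to-detection objective.

    ```python
    import math
    import random

    def anneal(initial, mutate, cost, t0=1.0, cooling=0.995, steps=5000, seed=0):
        """Iteratively mutate a candidate, accepting worse candidates with a
        temperature-dependent probability that shrinks as the schedule cools."""
        rng = random.Random(seed)
        best = current = initial
        t = t0
        for _ in range(steps):
            candidate = mutate(current, rng)
            delta = cost(candidate) - cost(current)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                current = candidate
                if cost(current) < cost(best):
                    best = current
            t *= cooling
        return best

    # Toy stand-in problem: choose an integer "sensor location" minimizing |x - 42|
    result = anneal(0,
                    mutate=lambda x, r: x + r.choice([-3, -1, 1, 3]),
                    cost=lambda x: abs(x - 42))
    print(result)
    ```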

  16. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
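
    The two-stage (nested) Monte Carlo separation of uncertainty from variability can be sketched generically; the lognormal parameters and the exponential dose-response below are invented for illustration and do not come from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_outer, n_inner = 200, 1000           # uncertainty loop, variability loop
    risks = np.empty((n_outer, n_inner))

    for i in range(n_outer):
        # Outer loop: sample an *uncertain* parameter (e.g., a dose-response slope)
        slope = rng.lognormal(-7.0, 0.5)
        # Inner loop: sample *variable* quantities (e.g., individual exposure)
        dose = rng.lognormal(0.0, 1.0, n_inner)
        risks[i] = 1.0 - np.exp(-slope * dose)   # exponential dose-response

    # Each outer iteration yields one variability distribution; the spread
    # across outer iterations expresses uncertainty about that distribution.
    print("median risk:", np.median(risks))
    print("uncertainty band on the 95th pct of variability:",
          np.percentile(np.percentile(risks, 95, axis=1), [2.5, 97.5]))
    ```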

  17. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    PubMed

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops is a central priority of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens (spp., spp., and O157:H7) both on the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance be provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health. PMID:27065414

  18. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model was constructed for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds. A log-normal probability distribution function was fitted to a large database from comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten-thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10^-3 and 10^-4 pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates. PMID:23508143

  19. Evaluation of reference genes for quantitative RT-PCR in Lolium perenne

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative real-time RT-PCR provides an important tool for analyzing gene expression if proper internal standards are used. The aim of this study was to identify and evaluate reference genes for use in real-time quantitative RT-PCR in perennial ryegrass (Lolium perenne L.) during plant developmen...

  20. Benchmarking on the evaluation of major accident-related risk assessment.

    PubMed

    Fabbri, Luciano; Contini, Sergio

    2009-03-15

    This paper summarises the main results of a European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study associated with a certain chemical establishment could differ from each other and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree on the interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This effort was then followed by a study of the impact of the different interpretations on the estimate of the overall risk on the area concerned. In order to improve the inter-comparability of the results, this exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for the implementation of the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), that aims at providing scientific and technological support for promoting integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union. PMID:18657363

  1. Anesthesia and the quantitative evaluation of neurovascular coupling

    PubMed Central

    Masamoto, Kazuto; Kanno, Iwao

    2012-01-01

    Anesthesia has broad actions that include changing neuronal excitability, vascular reactivity, and other baseline physiologies and eventually modifies the neurovascular coupling relationship. Here, we review the effects of anesthesia on the spatial propagation, temporal dynamics, and quantitative relationship between the neural and vascular responses to cortical stimulation. Previous studies have shown that the onset latency of evoked cerebral blood flow (CBF) changes is relatively consistent across anesthesia conditions compared with variations in the time-to-peak. This finding indicates that the mechanism of vasodilation onset is less dependent on anesthesia interference, while vasodilation dynamics are subject to this interference. The quantitative coupling relationship is largely influenced by the type and dosage of anesthesia, including the actions on neural processing, vasoactive signal transmission, and vascular reactivity. The effects of anesthesia on the spatial gap between the neural and vascular response regions are not fully understood and require further attention to elucidate the mechanism of vascular control of CBF supply to the underlying focal and surrounding neural activity. The in-depth understanding of the anesthesia actions on neurovascular elements allows for better decision-making regarding the anesthetics used in specific models for neurovascular experiments and may also help elucidate the signal source issues in hemodynamic-based neuroimaging techniques. PMID:22510601

  2. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  3. Risk Evaluation, Driving, and Adolescents: A Typology.

    ERIC Educational Resources Information Center

    Harre, Niki

    2000-01-01

    Presents a typology outlining five psychological risk states that may be experienced by adolescent drivers. Identifies the habitually cautious driving and active risk avoidance states as desirable from a traffic safety viewpoint. Identifies reduced risk perception, acceptance of risk at a cost, and risk seeking states as undesirable. Examines…

  4. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM. The sample was then stained, and the same cells were imaged using bright-field microscopy (BFM). We identified the acrosome in the stained BFM image and used it to define the corresponding area in the IPM image and to determine a quantitative threshold for evaluating the volume of the acrosome.

  5. A New Simple Interferometer for Obtaining Quantitatively Evaluable Flow Patterns

    NASA Technical Reports Server (NTRS)

    Erdmann, S F

    1953-01-01

    The method described in the present report makes it possible to obtain interferometer records with the aid of any of the available Schlieren optics through the addition of very simple expedients; these records fundamentally need not be inferior to those obtained by other methods, such as the Mach-Zehnder interferometer. The method is based on the fundamental concept of the phase-contrast process developed by Zernike, but in principle it has been extended to such a degree that it practically represents an independent interference method for general applications. Moreover, the method offers the possibility, in case of necessity, of superposing an arbitrary apparent wedge field on the density field to be gauged. The theory is explained on a purely physical basis and is illustrated and supported by experimental data. A number of typical cases are cited and some quantitative results reported.

  6. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was utilized based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release coated panel compared to that for a well bonded sample. Also, the local value of the probe force or force returned to the probe after impact was observed to be lower for the release coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable cure NDT standards and lap shear strength measurements taken of mechanical test specimens were compared and analyzed. Finally, experimental results have been compared to a finite element analysis to understand the visco-elastic behavior of the laminates during percussion testing. This comparison shows how a lower quality bond leads to a reduction in the percussion force by biasing strain in the percussion tested side of the panel.

  7. The Children, Youth, and Families at Risk (CYFAR) Evaluation Collaboration.

    ERIC Educational Resources Information Center

    Marek, Lydia I.; Byrne, Richard A. W.; Marczak, Mary S.; Betts, Sherry C.; Mancini, Jay A.

    1999-01-01

    The Cooperative Extension Service's Children, Youth, and Families at Risk initiative is being assessed by the Evaluation Collaboration's three projects: state-strengthening evaluation project (resources to help states evaluate community programs); NetCon (evaluation of electronic and other networks); and National Youth at Risk Sustainability Study…

  8. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  9. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process. PMID:21197601

  10. Designs for Risk Evaluation and Management

    Energy Science and Technology Software Center (ESTSC)

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.

  11. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  12. Range sensors on marble surfaces: quantitative evaluation of artifacts

    NASA Astrophysics Data System (ADS)

    Guidi, Gabriele; Remondino, Fabio; Russo, Michele; Spinetti, Alessandro

    2009-08-01

    While 3D imaging systems are widely available and used, clear statements about the possible influence of material properties on the acquired geometrical data are still rather few. In particular, a material very often used in Cultural Heritage is marble, which is known to produce geometrical errors with range-sensor technologies, and the magnitude of these errors reported in the literature seems to vary considerably across different works. In this article, a thorough investigation with different types of active range sensors used on four types of marble surfaces has been performed. Two triangulation-based active sensors, employing laser-stripe and white-light pattern projection respectively, and one PW-TOF laser scanner were used in the experimentation. The analysis gave rather different results for the two categories of instruments. Negligible light penetration emerged for the triangulation-based equipment (below 50 microns with the laser stripe and even less with the pattern-projection device), while with the TOF system the penetration turned out to be two orders of magnitude larger, quantitatively evidencing a source of systematic error that any surveyor engaged in 3D scanning of Cultural Heritage sites and objects should take into account and correct.

  13. A lighting metric for quantitative evaluation of accent lighting systems

    NASA Astrophysics Data System (ADS)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting for stage, film, and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid-state lighting. In this work, we propose an easy-to-apply quantitative measure of a scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
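
    The following minimal Python sketch illustrates one way such an entropy-of-colors metric could be computed from a viewer-perspective image; the bin count and the joint RGB histogram are assumptions for illustration, not the authors' exact implementation.

      import numpy as np

      def color_entropy(image, bins_per_channel=8):
          """Shannon entropy of the quantized color distribution of an image.

          image: H x W x 3 uint8 array (a photo of the lit scene from the
          viewer's perspective). Higher entropy ~ more color information.
          """
          # Quantize each RGB channel into coarse bins and build a joint histogram.
          q = (image // (256 // bins_per_channel)).reshape(-1, 3)
          idx = q[:, 0] * bins_per_channel**2 + q[:, 1] * bins_per_channel + q[:, 2]
          counts = np.bincount(idx, minlength=bins_per_channel**3)
          p = counts[counts > 0] / counts.sum()
          return float(-(p * np.log2(p)).sum())

    Optimizing the illumination configuration then amounts to maximizing this scalar over lamp positions, orientations, and spectra, e.g., with any black-box optimizer.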

  14. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  15. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...

  16. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise increased by only about 3% to 30%, depending on the target and attacker skill level.
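
    A minimal sketch of the core graph computation follows, assuming edge weights are expected days to compromise for one attacker skill level; the stage names and numbers are invented for illustration, and Dijkstra's algorithm stands in for whatever path analysis the authors used.

      import heapq

      def fastest_compromise(graph, start, target):
          """Dijkstra over a compromise graph.

          graph: {stage: [(next_stage, expected_days_to_compromise), ...]}
          Returns the minimum expected time for an attacker to reach `target`.
          """
          dist, frontier = {start: 0.0}, [(0.0, start)]
          while frontier:
              t, node = heapq.heappop(frontier)
              if node == target:
                  return t
              if t > dist.get(node, float("inf")):
                  continue
              for nxt, dt in graph.get(node, []):
                  nt = t + dt
                  if nt < dist.get(nxt, float("inf")):
                      dist[nxt] = nt
                      heapq.heappush(frontier, (nt, nxt))
          return float("inf")

      # Illustrative stages and edge weights (days), not data from the study:
      g = {"internet": [("dmz", 5.0)], "dmz": [("control_lan", 12.0)],
           "control_lan": [("plc", 3.0)]}
      print(fastest_compromise(g, "internet", "plc"))  # 20.0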

  17. A quantitative evaluation of the AVITEWRITE model of handwriting learning.

    PubMed

    Paine, R W; Grossberg, S; Van Gemmert, A W A

    2004-12-01

    Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. The adaptive vector integration to endpoint handwriting (AVITEWRITE) model of Grossberg and Paine (2000) [A neural model of corticocerebellar interactions during attentive imitation and predictive learning of sequential handwriting movements. Neural Networks, 13, 999-1046] addressed how such complex movements may be learned through attentive imitation. The model suggested how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively-timed, predictive cerebellar learning during movement imitation and predictive performance. Key psychophysical and neural data about learning to make curved movements were simulated, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size scaling with isochrony, and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a two-thirds power law relation between angular velocity and curvature. However, the model learned from letter trajectories of only one subject, and only qualitative kinematic comparisons were made with previously published human data. The present work describes a quantitative test of AVITEWRITE through direct comparison of a corpus of human handwriting data with the model's performance when it learns by tracing the human trajectories. The results show that model performance was variable across the subjects, with an average correlation between the model and human data of 0.89 ± 0.10. The present data from simulations using the AVITEWRITE model
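
    As a brief aside connecting the two kinematic regularities named above (these are standard motor-control relations, not equations specific to AVITEWRITE): writing $\kappa$ for curvature, $v$ for tangential velocity, and $\omega = v\kappa$ for angular velocity, the two-thirds power law

        $\omega(t) = k\,\kappa(t)^{2/3}$

    is equivalent to an inverse one-third power relation between speed and curvature,

        $v(t) = \omega(t)/\kappa(t) = k\,\kappa(t)^{-1/3},$

    so the inverse curvature-speed relation and the two-thirds law are one regularity viewed in tangential and angular form.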

  18. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of impacts of natural disasters is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects a complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to quantitative analysis of human behavior and risk perception. Risk perception issues were described in the framework of risk analysis models. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As was shown, risk perception essentially influences the damage function. Based on prospect theory and decision making under uncertainty, with its account of cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), which is a system of a risk perception function (rp) and a function of education and long-term experience (c), as α(t) → (c − rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
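
    A toy numeric reading of the superposition formula above, with the awareness term folded into the damage function as a single scalar; this is a schematic illustration under assumed toy distributions, not the authors' model.

      import numpy as np

      def risk(f, p, alpha=1.0):
          """Risk as the superposition of an event distribution f(x, y) and a
          damage function p(x, y), P = sum_{x,y} f * p. The scalar alpha is a
          stand-in for the paper's awareness function alpha(t) = c - rp and
          simply modulates the effective damage; purely illustrative."""
          return float(np.sum(f * (p * alpha)))

      # Toy 2-D grid: uniform event distribution, damage rising across the grid.
      f = np.full((10, 10), 1.0 / 100.0)
      p = np.linspace(0.0, 1.0, 100).reshape(10, 10)
      print(risk(f, p))             # baseline risk
      print(risk(f, p, alpha=0.7))  # effective damage modulated by awareness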

  19. Photoacoustic microscopy for quantitative evaluation of angiogenesis inhibitor

    NASA Astrophysics Data System (ADS)

    Chen, Sung-Liang; Burnett, Joseph; Sun, Duxin; Xie, Zhixing; Wang, Xueding

    2014-03-01

    We present photoacoustic microscopy (PAM) for the evaluation of angiogenesis inhibitors on a chick embryo model. Microvasculature in the chorioallantoic membrane (CAM) of the chick embryos was imaged by PAM, and optical microscopy (OM) images of the same set of CAMs were also acquired for comparison, serving to validate the results from PAM. The angiogenesis inhibitor Sunitinib, applied to the CAM at different concentrations, resulted in changes in microvascular density, which were quantified by both PAM and OM imaging. Similar changes in microvascular density in response to the angiogenesis inhibitor at different doses were observed with PAM and OM imaging, demonstrating that PAM has the potential to provide objective evaluation of anti-angiogenesis medication. In addition, PAM is advantageous for three-dimensional and functional imaging compared with OM, so the emerging PAM technique may offer unique information on the efficacy of angiogenesis inhibitors and could benefit applications related to antiangiogenesis treatments.

  20. Quantitative evaluation of phonetograms in the case of functional dysphonia.

    PubMed

    Airainer, R; Klingholz, F

    1993-06-01

    According to the laryngeal clinical findings, figures making up a scale were assigned to vocally trained and vocally untrained persons suffering from different types of functional dysphonia. The different types of dysphonia, from manifest hypofunctional to extreme hyperfunctional dysphonia, were classified by means of this scale. In addition, the subjects' phonetograms were measured and approximated by three ellipses, which made it possible to define phonetogram parameters. Selected phonetogram parameters were combined into linear combinations for the purpose of phonetographic evaluation. The linear combinations were intended to bring the phonetographic and clinical evaluations into correspondence as accurately as possible. Different linear combinations had to be used for male and female singers and nonsingers. As a result of the reclassification of 71 patients and the new classification of 89 patients, it was possible to grade the types of functional dysphonia by means of computer-aided phonetogram evaluation with a clinically acceptable error rate. This method proved to be an important supplement to the conventional diagnostics of functional dysphonia. PMID:8353627

  1. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, varying greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation), and can be important for pre-surgical planning. We propose a height compass to evaluate the axial-plane spatial distribution of compression injury (anterior, posterior, lateral, and central), and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation, and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings which are divided into 8 equal-length arcs by rays subtended by 8 common central angles. The radius of each ring increases multiplicatively, with the resulting structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against the corresponding octants in neighboring vertebrae. The height compass provides an intuitive display of the height distribution and can be used to easily identify fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
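
    A minimal sketch of how a point in the axial plane might be assigned to one of the 17 compass cells (a central node plus two octant bands); the ring radii, growth factor, and cell indexing are assumptions for illustration, not the paper's calibrated geometry.

      import math

      def compass_cell(x, y, r0=5.0, growth=2.0):
          """Map an axial-plane offset (x, y) from the vertebral centroid to one
          of 17 height-compass cells: cell 0 is the central node; cells 1-8 and
          9-16 are the octants of the inner and outer bands. The radii r0 and
          r0*growth are illustrative, not the paper's calibrated values."""
          r = math.hypot(x, y)
          if r <= r0:
              return 0
          octant = int(((math.degrees(math.atan2(y, x)) + 360.0) % 360.0) // 45.0)
          band = 0 if r <= r0 * growth else 1
          return 1 + band * 8 + octant

      print(compass_cell(2.0, 1.0))    # 0: central node
      print(compass_cell(7.0, 0.5))    # inner band octant
      print(compass_cell(-12.0, 3.0))  # outer band octant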

  2. How does the general public evaluate risk information? The impact of associations with other risks.

    PubMed

    Visschers, Vivianne H M; Meertens, Ree M; Passchier, Wim F; Devries, Nanne K

    2007-06-01

    There is a considerable body of knowledge about the way people perceive risks using heuristics and qualitative characteristics, and about how risk information should be communicated to the public. However, little is known about the way people use the perception of known risks (associated risks) to judge an unknown risk. In a first, qualitative study, six different risks were discussed in in-depth interviews and focus group interviews. The interviews showed that risk associations played a prominent role in forming risk perceptions. Associated risks were often mentioned spontaneously. Second, a survey study was conducted to confirm the importance of risk associations quantitatively. This study investigated whether people related unknown risks to known risks. This was indeed confirmed. Furthermore, some insight was gained into how and why people form risk associations. Results showed that the semantic category of the unknown risks was more important in forming associations than the perceived level of risk or specific risk characteristics. These findings were in line with the semantic network theory. Based on these two studies, we recommend using the mental models approach in developing new risk communications. PMID:17640218

  3. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
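
    The abstract does not give the RRET formula, but one plausible reconstruction of an indicator measuring the reduction in baseline reliability due to residual risks is sketched below; treat it as illustrative only, not the documented KSC method.

      def reliability_impact(r_baseline, residual_risk_reliabilities):
          """Illustrative reading of a reliability impact indicator: residual
          risks enter as independent series reliability terms, and the indicator
          is the fractional reduction from baseline. This is a reconstruction,
          not the documented KSC RRET formula."""
          r_residual = r_baseline
          for r in residual_risk_reliabilities:
              r_residual *= r
          return (r_baseline - r_residual) / r_baseline

      # e.g., baseline mission reliability 0.98, two residual risks at 0.999 and 0.995:
      print(reliability_impact(0.98, [0.999, 0.995]))  # ~0.006 fractional reduction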

  4. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    NASA Astrophysics Data System (ADS)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depend highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to certain long-term natural processes, sudden events, and human impact (the macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements due to physical and chemical procedures. Obviously, these factors cover different scales of the problem. The deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical variations of material properties and characteristics. Risk evaluation should therefore aim to reveal the specific active decay and damage mechanisms at both the mesoscale [type of decay and damage] and microscale [mechanism of the decay phenomenon] levels. A prerequisite for the identification and development of risk indicators is the existence of an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through the diagnosis of the conservation state of building materials and structure, through the identification of all potential hazards affecting these, and through mapping of possible alterations during the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help

  5. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  6. RISK MANAGEMENT EVALUATION FOR CONCENTRATED ANIMAL FEEDING OPERATIONS

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) developed a Risk Management Evaluation (RME) to provide information needed to help plan future research in the Laboratory dealing with the environmental impact of concentrated animal feeding operations (CAFOs). Agriculture...

  7. Consistencies and inconsistencies underlying the quantitative assessment of leukemia risk from benzene exposure

    SciTech Connect

    Lamm, S.H.; Walters, A.S. ); Wilson, R. ); Byrd, D.M. ); Grunwald, H. )

    1989-07-01

    This paper examines recent risk assessments for benzene and observes a number of inconsistencies within the study and consistencies between studies that should affect the quantitative determination of the risk from benzene exposure. Comparisons across studies show that only acute myeloid leukemia (AML) is consistently found in excess with significant benzene exposure. The data from the Pliofilm study that forms the basis of most quantitative assessments reveal that all the AML cases came from only one of the three studied plants and that all the benzene exposure data came from the other plants. Hematological data from the 1940s from the plant from which almost all of the industrial hygiene exposure data come do not correlate well with the originally published exposure estimates, but do correlate well with an alternative set of exposure estimates that are much greater than those originally published. Temporal relationships within the study are not consistent with those of other studies. The dose-response relationship is strongly nonlinear. Other data suggest that the leukemogenic effect of benzene is nonlinear and may derive from a threshold toxicity.

  8. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  9. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill.

    PubMed

    Harwell, Mark A; Gentile, John H; Parker, Keith R; Murphy, Stephen M; Day, Robert H; Bence, A Edward; Neff, Jerry M; Wiens, John A

    2012-03-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001-2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400-4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  10. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  11. GWAS implicates a role for quantitative immune traits and threshold effects in risk for human autoimmune disorders

    PubMed Central

    Gregersen, Peter K.; Diamond, Betty; Plenge, Robert M.

    2016-01-01

    Genome-wide association studies in human autoimmune disorders have provided a long list of alleles with rather modest degrees of risk. A large fraction of these associations are likely due to either quantitative differences in gene expression or amino acid changes that regulate quantitative aspects of the immune response. While functional studies are still lacking for most of these associations, we present examples of autoimmune disease risk alleles that influence quantitative changes in lymphocyte activation, cytokine signaling and dendritic cell function. The analysis of immune quantitative traits associated with autoimmune loci is clearly going to be an important component of understanding the pathogenesis of autoimmunity. This will require both new and more efficient ways of characterizing the normal immune system, as well as large population resources with which genotype-phenotype correlations can be convincingly demonstrated. Future development of new therapies will depend on understanding the mechanistic underpinnings of immune regulation by these new risk loci. PMID:23026397

  12. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor, which was modified to display and edit the fault trees.
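
    A minimal sketch of the idea in Python (standing in for the original Flavors/LISP implementation): events and gates are objects, and evaluating the top event walks the tree. Statistical independence of basic events is assumed for the probability arithmetic.

      class Event:
          """Basic event with a failure probability."""
          def __init__(self, name, p):
              self.name, self.p = name, p
          def probability(self):
              return self.p

      class Gate:
          """AND/OR gate over child events or gates; assumes independent inputs."""
          def __init__(self, kind, children):
              self.kind, self.children = kind, children
          def probability(self):
              ps = [c.probability() for c in self.children]
              out = 1.0
              if self.kind == "AND":
                  for p in ps:
                      out *= p
                  return out
              # OR gate: 1 minus the product of complements.
              for p in ps:
                  out *= (1.0 - p)
              return 1.0 - out

      # Illustrative top event: pump fails AND (valve fails OR sensor fails).
      tree = Gate("AND", [Event("pump", 0.01),
                          Gate("OR", [Event("valve", 0.02), Event("sensor", 0.05)])])
      print(tree.probability())  # ~0.00069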

  13. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

    Purpose: To evaluate reticuloendothelial system (RES) function by real-time imaging of blood clearance as well as hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods: Kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after the injection of different doses of SPIO nanoparticles (0.5, 3 or 10 mg Fe/kg). The association parameter Kin, dissociation parameter Kout, and elimination constant Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosomes and liposomes were utilized to deplete macrophages and block RES function in order to evaluate the capability of the kinetic parameters to probe macrophage function and density. Results: The two-compartment model provided a good description of all data and showed a low sum of squared residuals for all mice (0.27±0.03). A lower Kin, a lower Kout and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout and a lower Ke were observed after liposome treatment, in comparison to saline treatment (P<0.005). Conclusion: Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both the cell-number and receptor-function level. PMID:25090653
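
    A minimal sketch of one linear two-compartment form consistent with the parameters named above (Kin for hepatic uptake, Kout for release, Ke for extrahepatic elimination); the actual model equations and parameter values in the study may differ.

      import numpy as np
      from scipy.integrate import odeint

      def two_compartment(y, t, kin, kout, ke):
          """Blood (b) and liver (l) SPIO concentrations:
          db/dt = -kin*b + kout*l - ke*b   (uptake, release, elimination)
          dl/dt =  kin*b - kout*l
          """
          b, l = y
          return [-kin * b + kout * l - ke * b, kin * b - kout * l]

      t = np.linspace(0, 120, 241)  # minutes after injection
      # Illustrative rate constants (per minute), not fitted values from the study:
      b, l = odeint(two_compartment, [1.0, 0.0], t, args=(0.08, 0.01, 0.005)).T
      print(f"liver fraction at 60 min: {l[120]:.2f}")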

  14. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of regions of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck objects were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images with respect to whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
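
    The following sketch illustrates the Mahalanobis-distance classification step for fiber versus mass candidates; the two-dimensional feature set and the class statistics are invented for illustration.

      import numpy as np

      def mahalanobis(x, mean, cov):
          """Mahalanobis distance of feature vector x from a class distribution."""
          d = x - mean
          return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

      def classify(features, class_stats):
          """Assign the feature vector to the class (e.g., 'fiber' or 'mass')
          with the smallest Mahalanobis distance."""
          return min(class_stats,
                     key=lambda c: mahalanobis(features, *class_stats[c]))

      # Illustrative 2-D features (elongation, mean contrast) per class:
      stats = {"fiber": (np.array([8.0, 0.2]), np.diag([4.0, 0.01])),
               "mass":  (np.array([1.5, 0.5]), np.diag([0.3, 0.04]))}
      print(classify(np.array([6.5, 0.25]), stats))  # fiber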

  15. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, E.; Sitchler, A.; Maxwell, R. M.; McCray, J. E.

    2010-12-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters, and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding given the greater toxicity of lead at lower doses than arsenic. It was also found that higher background groundwater gradients yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and
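
    A minimal sketch of the two-stage (nested) Monte Carlo structure described above, with the outer loop sampling uncertain parameters and the inner loop sampling variable ones; all distributions and constants are placeholders, not the study's inputs.

      import numpy as np
      rng = np.random.default_rng(7)

      def nested_risk(n_outer=200, n_inner=1000):
          """Two-stage Monte Carlo: the outer loop samples *uncertain* parameters
          (e.g., a toxicity slope factor known imprecisely); the inner loop samples
          *variable* ones (e.g., person-to-person water intake). Returning one risk
          distribution per outer draw keeps uncertainty and variability separate."""
          out = []
          for _ in range(n_outer):
              slope = rng.lognormal(mean=-7.0, sigma=0.5)                 # uncertain
              intake = rng.lognormal(mean=0.0, sigma=0.6, size=n_inner)   # variable
              conc = rng.lognormal(mean=-2.0, sigma=0.8, size=n_inner)    # variable
              out.append(slope * intake * conc)                           # individual risks
          return np.array(out)

      risks = nested_risk()
      # Uncertainty band on the 95th-percentile individual risk:
      p95 = np.percentile(risks, 95, axis=1)
      print(np.percentile(p95, [5, 50, 95]))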

  16. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, and (4) procedures for conducting the field monitoring program, as well as (5) the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
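
    The core of the paired-station analysis can be reduced to testing log-transformed density differences between paired stations, as in the following sketch; a paired t-test stands in here for the full analysis of variance used in the study, and the densities shown are invented.

      import numpy as np
      from scipy import stats

      def paired_station_test(control, impact):
          """Compare log-transformed zooplankton densities at paired stations.
          The log transform stabilizes variance; differencing within each pair
          removes shared lake-wide fluctuations, reducing serial correlation."""
          diff = np.log(np.asarray(impact) + 1.0) - np.log(np.asarray(control) + 1.0)
          return stats.ttest_1samp(diff, 0.0)

      control = [120, 340, 95, 410, 220, 180]  # density, control station
      impact  = [100, 300, 70, 350, 150, 160]  # density, near-plant station
      print(paired_station_test(control, impact))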

  17. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of a point on the tooth flank is tracked by the cross-correlation method, and then the strain is calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. In the case of measurements on structures with complicated profiles such as fillets, the interrogation window must remain large and the overlap amount should be large. The surface condition also affects the accuracy. A white-painted surface with small black particles is suitable for measurement.
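
    A minimal sketch of the cross-correlation tracking step: one interrogation window from the reference image is matched against a search region in the loaded image by maximizing normalized cross-correlation. Subpixel refinement and the strain computation itself are omitted; window and search sizes are illustrative. Inputs are 2-D numpy grayscale arrays.

      def track_window(ref, cur, top, left, size, search=10):
          """Integer-pixel displacement of one interrogation window between a
          reference image `ref` and a loaded image `cur`, found by maximizing
          normalized cross-correlation. A strain field follows by numerically
          differentiating the displacements of neighboring windows."""
          win = ref[top:top + size, left:left + size].astype(float)
          win = (win - win.mean()) / win.std()
          best, best_du, best_dv = -2.0, 0, 0
          for dv in range(-search, search + 1):
              for du in range(-search, search + 1):
                  r0, c0 = top + dv, left + du
                  if r0 < 0 or c0 < 0:
                      continue
                  sub = cur[r0:r0 + size, c0:c0 + size].astype(float)
                  if sub.shape != win.shape or sub.std() == 0.0:
                      continue
                  ncc = float(((sub - sub.mean()) / sub.std() * win).mean())
                  if ncc > best:
                      best, best_du, best_dv = ncc, du, dv
          return best_du, best_dv, best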

  18. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  19. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. PMID:25951756

  20. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue-crack phenomena in relation to the size-detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results, although predictions from analytic models based on finite-element computer analysis do not agree with respect to certain features. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  1. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  2. What are the chances? Evaluating risk and benefit information in consumer health materials

    PubMed Central

    Burkell, Jacquelyn

    2004-01-01

    Much consumer health information addresses issues of disease risk or treatment risks and benefits, addressing questions such as “How effective is this treatment?” or “What is the likelihood that this test will give a false positive result?” Insofar as it addresses outcome likelihood, this information is essentially quantitative in nature, which is of critical importance, because quantitative information tends to be difficult to understand and therefore inaccessible to consumers. Information professionals typically examine reading level to determine the accessibility of consumer health information, but this measure does not adequately reflect the difficulty of quantitative information, including materials addressing issues of risk and benefit. As a result, different methods must be used to evaluate this type of consumer health material. There are no standard guidelines or assessment tools for this task, but research in cognitive psychology provides insight into the best ways to present risk and benefit information to promote understanding and minimize interpretation bias. This paper offers an interdisciplinary bridge that brings these results to the attention of information professionals, who can then use them to evaluate consumer health materials addressing risks and benefits. PMID:15098049

  3. Risk Evaluation of Endocrine-Disrupting Chemicals

    PubMed Central

    Gioiosa, Laura; Palanza, Paola; vom Saal, Frederick S.

    2015-01-01

    We review here our studies of the effects of early exposure to low doses of the estrogenic endocrine-disrupting chemical bisphenol A (BPA) on behavior and metabolism in CD-1 mice. Mice were exposed in utero from gestation day (GD) 11 to delivery (prenatal exposure) or via maternal milk from birth to postnatal day 7 (postnatal exposure) to 10 µg/kg body weight/d of BPA, or to no BPA (controls). Bisphenol A exposure resulted in long-term disruption of sexually dimorphic behaviors. Females exposed to BPA pre- and postnatally showed increased anxiety and behavioral profiles similar to those of control males. We also evaluated metabolic effects in prenatally exposed adult male offspring of dams fed BPA (from GD 9 to 18) at doses ranging from 5 to 50 000 µg/kg/d. The males showed age-related significant changes in a number of metabolic indexes, ranging from food intake to glucose regulation, at BPA doses below the no-observed-adverse-effect level (5000 µg/kg/d). Consistent with prior findings, low but not high BPA doses produced significant effects for many outcomes. These findings provide further evidence of the potential risks that developmental exposure to low doses of the endocrine disrupter BPA may pose to human health, with fetuses and infants being highly vulnerable. PMID:26740806

  4. EVALUATING TOOLS AND MODELS USED FOR QUANTITATIVE EXTRAPOLATION OF IN VITRO TO IN VIVO DATA FOR NEUROTOXICANTS*

    EPA Science Inventory

    There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...

  5. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  6. Assessment of Semi-Quantitative Health Risks of Exposure to Harmful Chemical Agents in the Context of Carcinogenesis in the Latex Glove Manufacturing Industry.

    PubMed

    Yari, Saeed; Fallah Asadi, Ayda; Varmazyar, Sakineh

    2016-01-01

    Excessive exposure to chemicals in the workplace can cause poisoning and various diseases. Thus, to protect workers, it is necessary to examine people's exposure to chemicals and the risks posed by these materials. The purpose of this study is to evaluate the semi-quantitative health risks of exposure to harmful chemical agents, in the context of carcinogenesis, in a latex glove manufacturing industry. In this cross-sectional study, the semi-quantitative risk assessment method provided by the Department of Occupational Health of Singapore was used; indices of LD50, carcinogenicity (ACGIH and IARC) and corrosive capacity were applied to calculate the hazard rate, with the largest index taken as the basis of the risk. To calculate the exposure rate, two methods were employed: an exposure index and the actual level of exposure. After identifying the risks, groups H (high) and E (very high), classified as high-risk, were considered. Of the total of 271 risks, only 39 (15%) were at a high level (H) and 3% were very high (E). These risks related to only 7 materials, with only sulfuric acid placed in group E; the 6 other materials were in group H: nitric acid (48.3%), chromic acid (6.9%), hydrochloric acid (10.3%), ammonia (3.4%), potassium hydroxide (20.7%) and chlorine (10.3%). Overall, the average hazard rate was estimated to be 4 and the average exposure rate to be 3.5. The health risks identified in this study show that the latex glove manufacturing industry has a high level of risk because of carcinogens, acids, strong alkalis and dangerous drugs. Given the average level of risk, it would also be advisable to place a safety design strategy for the latex glove production industry on the agenda. PMID:27165227
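
    For context, the Singapore semi-quantitative scheme is commonly described as combining a hazard rate and an exposure rate as Risk = √(HR × ER); the sketch below applies that commonly cited form to the study's average ratings, with an illustrative risk banding.

      import math

      def risk_index(hazard_rate, exposure_rate):
          """Semi-quantitative risk as the geometric mean of a hazard rate (1-5)
          and an exposure rate (1-5), Risk = sqrt(HR * ER), rounded up. The
          square-root form is the commonly cited Singapore formula; the banding
          labels below are illustrative."""
          risk = math.ceil(math.sqrt(hazard_rate * exposure_rate))
          bands = {1: "Negligible", 2: "Low", 3: "Medium",
                   4: "High (H)", 5: "Very high (E)"}
          return risk, bands[min(risk, 5)]

      print(risk_index(4, 3.5))  # the study's average hazard and exposure rates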

  7. Quantitative evaluation of proximal contacts in posterior composite restorations. Part I. Methodology.

    PubMed

    Wang, J C; Hong, J M

    1989-07-01

    An in vivo method for quantitatively measuring intertooth distance before and after placement of a Class 2 composite resin restoration has been developed. A Kaman Sciences KD-2611 non-contact displacement measuring system with a 1U unshielded sensor, based upon the variable resistance of eddy currents, was used for the intraoral measurement. Quantitative evaluation of proximal wear, therefore, can be made preoperatively, postoperatively, and at subsequent recall intervals for posterior composite resin restorations. PMID:2810447

  8. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  9. Quantitative Microbial Risk Assessment for Campylobacter spp. on Ham in Korea

    PubMed Central

    2015-01-01

    The objective of this study was to evaluate the risk of illness from Campylobacter spp. on ham. To identify the hazards of Campylobacter spp. on ham, the general characteristics of and microbial criteria for Campylobacter spp., and campylobacteriosis outbreaks, were investigated. In the exposure assessment, the prevalence of Campylobacter spp. on ham was evaluated, and probabilistic distributions for the temperature of ham surfaces in retail markets and home refrigerators were prepared. In addition, the raw data from the Korea National Health and Nutrition Examination Survey (KNHNES) 2012 were used to estimate the consumption amount and frequency of ham. In the hazard characterization, the Beta-Poisson model for Campylobacter spp. infection was used. For risk characterization, a simulation model was developed using the collected data, and the risk of Campylobacter spp. on ham was estimated with @RISK. The Campylobacter spp. cell counts on ham samples were below the detection limit (<0.70 Log CFU/g). The daily consumption of ham was 23.93 g per person, and the consumption frequency was 11.57%. The simulated mean initial contamination level of Campylobacter spp. on ham was −3.95 Log CFU/g, and the mean probable risk per person per day was 2.20×10−12. The risk of foodborne illness from Campylobacter spp. is therefore considered low. Furthermore, these results indicate that the microbial risk assessment of Campylobacter spp. in this study should be useful in providing scientific evidence to set up criteria for Campylobacter spp. PMID:26761897
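
    The approximate Beta-Poisson dose-response form referred to above is sketched below, using parameter values commonly cited in the QMRA literature for C. jejuni rather than values fitted in this study; note that the full model also folds in prevalence, consumption frequency, and storage effects, so this single-dose probability is not comparable to the reported per-person-per-day risk.

      def beta_poisson(dose, alpha=0.145, beta=7.59):
          """Approximate Beta-Poisson dose-response:
          P(infection) = 1 - (1 + dose/beta)**(-alpha).
          alpha and beta are values often cited for C. jejuni in the QMRA
          literature; the parameters used in this study may differ."""
          return 1.0 - (1.0 + dose / beta) ** (-alpha)

      # Illustrative dose at the simulated mean contamination (-3.95 Log CFU/g)
      # for a 23.93 g serving:
      dose = 10 ** (-3.95) * 23.93
      print(f"dose = {dose:.2e} cells, P(infection) = {beta_poisson(dose):.2e}")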

  10. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.