Science.gov

Sample records for quantitative risk evaluation

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  2. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  3. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
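
    The separation of variability and uncertainty described above can be illustrated with a two-dimensional (second-order) Monte Carlo loop. The sketch below is in Python rather than R, and its model form and parameter values are hypothetical placeholders, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_unc, n_var = 500, 10_000  # outer (uncertainty) and inner (variability) sample sizes

    # Outer dimension: uncertainty about the dose-response parameter r of an
    # exponential model P(illness | dose) = 1 - exp(-r * dose), e.g. from bootstrap.
    r_unc = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_unc)

    risks = np.empty(n_unc)
    for i, r in enumerate(r_unc):
        # Inner dimension: variability of the ingested dose across servings.
        dose = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n_var)
        risks[i] = np.mean(1.0 - np.exp(-r * dose))  # variability-averaged risk

    # The spread of `risks` now reflects uncertainty only, variability having
    # been integrated out in the inner loop.
    print(np.percentile(risks, [2.5, 50.0, 97.5]))
    ```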

  4. Evaluation of skeletal status by quantitative ultrasonometry in postmenopausal women without known risk factors for osteoporosis.

    PubMed

    Mandato, Vincenzo Dario; Sammartino, Annalidia; Di Carlo, Costantino; Tommaselli, Giovanni A; Tauchmanovà, Libuse; D'Elia, Antonio; Nappi, Carmine

    2005-09-01

    The objective of our study was to evaluate bone density in Italian postmenopausal women without clinical risk factors for osteoporosis resident in the Naples area using quantitative ultrasonometry of bone (QUS). Subjects were 1149 Italian postmenopausal women (age: 54.9 +/- 5.0 years (mean +/- standard deviation); range: 45-74 years) resident in the Naples area. Clinical risk factors for osteoporosis resulting in exclusion from the study were family history of osteoporosis, dietary, smoking and alcohol habits, personal history of fractures and/or metabolic diseases. The following QUS parameters were calculated: amplitude-dependent speed of sound (AD-SoS), T-score and Z-score. We found significant inverse correlations between AD-SoS and age (r = - 0.23), time since menopause (r = - 0.25) and body mass index (BMI) (r = - 0.16). The same was observed for T-score. In contrast, Z-score showed a significant positive correlation with age and time since menopause, and a negative correlation with BMI. A T-score suggestive of high risk for osteoporosis (less than -3.2) was found in 1.6% of subjects, while a T-score suggestive of moderate risk for osteoporosis (between -3.2 and -2) was found in 19.3% of patients. In this group of women without clinical risk factors for osteoporosis we found a very low prevalence of QUS results suggesting a high risk for osteoporosis. However, a condition of 'moderate' risk for osteoporosis was present in a remarkable percentage of these women.
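
    For reference, the T-score and Z-score reported in this record follow the usual densitometric definitions (stated here as background, not quoted from the paper): for a measured value x (here AD-SoS),

    $$T = \frac{x - \mu_{\text{young adult}}}{\sigma_{\text{young adult}}}, \qquad Z = \frac{x - \mu_{\text{age-matched}}}{\sigma_{\text{age-matched}}}$$

    so the T-score compares the patient against a healthy young-adult reference population, while the Z-score uses an age-matched reference.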

  5. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by (1) considering the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which establishment of acceptable and safe exposure levels were demonstrated.
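
    The margin-of-safety calculation described above can be sketched as follows. All names and numbers in this Python fragment are hypothetical illustrations, not values from the study:

    ```python
    # Margin of safety (MOS) for a hypothetical diaper component.
    noael = 50.0                 # NOAEL from a toxicology study, mg/kg bw/day
    content_mg = 2.0             # amount of the component in the relevant layer, mg
    skin_contact_fraction = 0.1  # fraction of the layer in contact with skin
    extractable_fraction = 0.05  # fraction extractable by urine and delivered to skin
    diapers_per_day = 5
    body_weight_kg = 8.0         # typical infant body weight

    exposure = (content_mg * skin_contact_fraction * extractable_fraction
                * diapers_per_day) / body_weight_kg  # mg/kg bw/day

    mos = noael / exposure
    print(f"exposure = {exposure:.4f} mg/kg/day, MOS = {mos:.0f}")  # MOS = 8000
    ```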

  6. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... model used by the Center for Biologics Evaluation and Research (CBER) and suggestions for further...: Richard Forshee, Center for Biologics Evaluation and Research (HFM-210), Food and Drug Administration... disease computer simulation models to generate quantitative estimates of the benefits and risks...

  7. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  8. Quantitative Evaluation of the Mechanical Risks Caused by Focal Cartilage Defects in the Knee

    PubMed Central

    Venäläinen, Mikko S.; Mononen, Mika E.; Salo, Jari; Räsänen, Lasse P.; Jurvelin, Jukka S.; Töyräs, Juha; Virén, Tuomas; Korhonen, Rami K.

    2016-01-01

    Focal cartilage lesions can progress to severe osteoarthritis or remain unaltered even for years. A method to identify high-risk defects would be of utmost importance to guide clinical decision making and to identify the patients at the highest risk for the onset and progression of osteoarthritis. Based on cone beam computed tomography arthrography, we present a novel computational model for evaluating changes in local mechanical responses around cartilage defects. Our model, based on data obtained from a human knee in vivo, demonstrated that the most substantial alterations around the defect, as compared to the intact tissue, were observed in minimum principal (compressive) strains and shear strains. Both strain values experienced up to a 3-fold increase, exceeding levels previously associated with chondrocyte apoptosis and failure of collagen crosslinks. Furthermore, defects at the central regions of medial tibial cartilage with direct cartilage-cartilage contact were the most vulnerable to loading. Locations under the meniscus also experienced substantially increased minimum principal strains. We suggest that during knee joint loading, minimum principal and shear strains in particular are increased above tissue failure limits around cartilage defects, which might lead to osteoarthritis. However, this increase in strains is highly location-specific on the joint surface. PMID:27897156

  9. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    PubMed

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience to make subjective decisions on risk assessment without systematic decision making. Indeed, understanding the underlying principles of risk assessment is significant. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions towards it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of work, their recognition of the occurrence of different types of accidents, and the inter-relationships of these factors with safety performance in terms of accident rates. At the second stage, the deficiencies of the current general practice for risk assessment are first identified. Based on the findings from Stage 1 and 3-year average historical accident data from 15 large-scale construction projects, a risk evaluation model is developed quantitatively that prioritizes the risk levels of different trades of work and relates them to different types of site accidents arising from various accident causes. With the suggested systematic accident recording techniques, this model can be implemented in the construction industry at both the project level and the organizational level. The model (Q(2)REM) not only acts as a useful supplementary guideline of risk assessment for construction safety professionals, but also assists them in pinpointing the potential risks on site for construction workers under the respective trades of work through safety training and education. It, in turn, raises their awareness of safety risk. As the Q(2)REM can clearly show the potential accident causes leading to

  10. Quantitative microbiological risk assessment.

    PubMed

    Hoornstra, E; Notermans, S

    2001-05-21

    The production of safe food is increasingly based on the use of risk analysis, and this process is now in use to establish national and international food safety objectives. It is also being used more frequently to guarantee that safety objectives are met and that such guarantees are achieved in a cost-effective manner. One part of the overall risk analysis procedure, risk assessment, is the scientific process in which the hazards and risk factors are identified, and the risk estimate or risk profile is determined. Risk assessment is an especially important tool for governments when food safety objectives have to be developed in the case of 'new' contaminants in known products or known contaminants causing trouble in 'new' products. Risk assessment is also an important approach for food companies (i) during product development, (ii) during (hygienic) process optimization, and (iii) as an extension (validation) of the more qualitative HACCP-plan. This paper discusses these two different types of risk assessment, and uses probability distribution functions to assess the risks posed by Escherichia coli O157:H7 in each case. Such approaches are essential elements of risk management, as they draw on all available information to derive accurate and realistic estimations of the risk posed. The paper also discusses the potential of scenario analysis in simulating the impact of different or modified risk factors during the consideration of new or improved control measures.

  11. Quantitative Risk - Phase 1

    DTIC Science & Technology

    2013-09-03

    Report No. SERC-2013-TR-040-2, Revised September 3, 2013. 189. Mockus, A., Weiss, D., "Predicting Risk of Software Changes", Bell Labs...the Council for International Organizations of Medical Sciences (CIOMS), 1998 238. Thompson, K., Graham, J., Zellner, J., "Risk-Benefit Analysis...International Journal on Engineering Performance-Based Fire Codes, Volume 6, Number 1, 2004 254. Bell, J., Holroyd, J., "Review of human reliability

  12. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
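
    The event-sequence quantification QRAS performs can be illustrated with a generic event tree, in which an initiating-event frequency is propagated through branch probabilities to end states. The structure and numbers below are a hypothetical sketch, not the QRAS implementation:

    ```python
    # Generic event-tree quantification (all values hypothetical).
    initiating_freq = 1e-3   # initiating events per mission
    p_detect_fail = 0.01     # failure probability of the detection function
    p_isolate_fail = 0.05    # failure probability of the isolation function

    end_states = {
        "detected and isolated":  initiating_freq * (1 - p_detect_fail) * (1 - p_isolate_fail),
        "detected, not isolated": initiating_freq * (1 - p_detect_fail) * p_isolate_fail,
        "undetected":             initiating_freq * p_detect_fail,
    }
    for name, freq in sorted(end_states.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {freq:.2e} per mission")  # ranking of scenario frequencies
    ```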

  13. Evaluation of New Zealand's high-seas bottom trawl closures using predictive habitat models and quantitative risk assessment.

    PubMed

    Penney, Andrew J; Guinotte, John M

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) poses three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur? 2) What is the likelihood of fisheries interaction with these VMEs? and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost:benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. The distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost:benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas.

  14. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change and other causes. Immediate measures such as strengthening of drainage capacity are needed to minimize the damage caused by more frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of drainage analysis with a design rainfall (e.g., the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even when the total rainfall amount is equal, and the flood risk may differ between rainfall patterns. It is therefore important to assume various patterns of heavy rainfall for flood risk assessment. A rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method, called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal pattern. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale using the diurnal rainfall pattern generator. In addition, we estimated flood damage with a focus on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for drainage analysis were generated: 3-day rainfall amounts for nine return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount using the diurnal rainfall pattern generator. Finally, all data

  15. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years for both spatially-distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in collecting the required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets in the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and the related fragility curves, both as functions of block velocity and size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
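
    Schematically, the risk components listed above combine in the usual QRA form into an expected annual cost (the notation is ours, not taken from the tools' documentation):

    $$R = \sum_{\text{scenarios}} P(\text{event}) \; P(\text{reach} \mid \text{event}) \; P(\text{impact} \mid \text{reach}) \; V(\text{intensity}) \; W$$

    where V is the vulnerability of the element at risk and W its worth, i.e. the cost incurred given full loss.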

  16. Evaluating quantitative research reports.

    PubMed

    Russell, Cynthia L

    2005-01-01

    As a novice reviewer, it is often difficult to trust your evaluation of a research report. You may feel uncertain in your interpretations. These are common concerns and can be remedied by reading and discussing research reports on research listservs, through journal clubs, or with other nephrology nurses. Practice using the criteria for research report evaluation and you too can perfect critiquing a research report!

  17. A quantitative screening-level approach to incorporate chemical exposure and risk/safety into alternative assessment evaluations.

    PubMed

    Arnold, Scott M; Greggs, Bill; Goyak, Katy O; Landenberger, Bryce D; Mason, Ann M; Howard, Brett; Zaleski, Rosemary T

    2017-03-10

    As the general public and retailers ask for disclosure of chemical ingredients in the marketplace, a number of hazard screening tools have been developed to evaluate the so-called "greenness" of individual chemical ingredients and/or formulations. The majority of these tools focus only on hazard, often using chemical lists, ignoring the other part of the risk equation: exposure. A hazard-only focus can result in regrettable substitutions: changing one chemical ingredient for another that turns out to be more hazardous, or shifting the toxicity burden to others. To minimize such regrettable substitutions, BizNGO describes 'Common Principles' to frame a process for informed substitution. Two of the six principles are to reduce hazard and to minimize exposure. A number of frameworks have emerged to evaluate and assess alternatives. One framework, developed by leading experts under the auspices of the U.S. National Academy of Sciences, recommended that hazard and exposure be addressed in the same step when assessing candidate alternatives. For the alternative assessment community, this paper serves as an informational resource for considering exposure in an alternatives assessment using elements of problem formulation; product identity, use, and composition; hazard analysis; exposure analysis; and risk characterization. These conceptual elements build upon practices from government, academia, and industry and are exemplified through two hypothetical case studies demonstrating the questions asked and decisions faced in new product development. These two case studies - inhalation exposure to a generic paint product and environmental exposure to a shampoo rinsed down the drain - demonstrate the criteria, considerations, and methods required to combine exposure models addressing human health and environmental impacts to provide a screening-level hazard/exposure (risk) analysis. This paper informs practices for these elements within a comparative risk

  18. Development of quantitative risk acceptance criteria

    SciTech Connect

    Griesmeyer, J. M.; Okrent, D.

    1981-01-01

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  19. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  20. Evaluating the effectiveness of pasteurization for reducing human illnesses from Salmonella spp. in egg products: results of a quantitative risk assessment.

    PubMed

    Latimer, Heejeong K; Marks, Harry M; Coleman, Margaret E; Schlosser, Wayne D; Golden, Neal J; Ebel, Eric D; Kause, Janell; Schroeder, Carl M

    2008-02-01

    As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human illnesses from Salmonella from each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log(10) reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest increased pasteurization can make them safer.
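
    The pasteurization effect evaluated above follows standard log-reduction arithmetic (stated here as background): a process achieving an L log10 reduction leaves

    $$N = N_0 \times 10^{-L}$$

    viable cells of an initial count N_0, so a 6-log10 process leaves one cell per million initially present. The quoted illness estimates combine this survival factor with the exposure and dose-response components of the risk assessment model.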

  1. Risk Assessment: A Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Baert, K.; Francois, K.; de Meulenaer, B.; Devlieghere, F.

    A risk can be defined as a function of the probability of an adverse health effect and the severity of that effect, consequential to a hazard in food (Codex Alimentarius, 1999). During a risk assessment, an estimate of the risk is obtained. The goal is to estimate the likelihood and the extent of adverse effects occurring to humans due to possible exposure(s) to hazards. Risk assessment is a scientifically based process consisting of the following steps: (1) hazard identification, (2) hazard characterization, (3) exposure assessment, and (4) risk characterization (Codex Alimentarius, 1999).
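
    In the simplest quantitative reading of this definition (a common convention, not a formula quoted from the chapter), the risk estimate is the product of the two factors:

    $$\text{Risk} = P(\text{adverse effect}) \times \text{Severity}(\text{effect})$$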

  2. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  3. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran

    PubMed Central

    2013-01-01

    Regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran’s Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and further carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010 and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in the Guilaklu and Qezeljakand villages, with 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity within the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, as all calculated chronic daily intakes are greater than the arsenic reference dose. Such a drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one. PMID:23574885

  4. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran.

    PubMed

    Nasrabadi, Touraj; Bidabadi, Niloufar Shirani

    2013-01-01

    Regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran's Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and further carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010 and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in the Guilaklu and Qezeljakand villages, with 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity within the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, as all calculated chronic daily intakes are greater than the arsenic reference dose. Such a drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one.
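
    The chronic daily intake, carcinogenic risk, and hazard quotient referred to in both versions of this record are conventionally computed with the standard USEPA expressions (stated here as background, not quoted from the paper):

    $$\mathrm{CDI} = \frac{C \times \mathrm{IR} \times \mathrm{EF} \times \mathrm{ED}}{\mathrm{BW} \times \mathrm{AT}}, \qquad \text{Risk} = \mathrm{CDI} \times \mathrm{SF}, \qquad \mathrm{HQ} = \frac{\mathrm{CDI}}{\mathrm{RfD}}$$

    where C is the arsenic concentration in water, IR the ingestion rate, EF and ED the exposure frequency and duration, BW the body weight, AT the averaging time, SF the oral slope factor, and RfD the reference dose; HQ > 1 indicates a potential non-carcinogenic hazard.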

  5. Quantitative Risk - Phases 1 & 2

    DTIC Science & Technology

    2013-11-12

    Bell Labs Technical Journal, April–June 2000 190. Guszcza, J., "Session 2 – How to Build a Risk Based Analytical Model for Life Insurance...Safety Signals", the Council for International Organizations of Medical Sciences (CIOMS), 1998 238. Thompson, K., Graham, J., Zellner, J., "Risk...Methodologies", International Journal on Engineering Performance-Based Fire Codes, Volume 6, Number 1, 2004 254. Bell, J., Holroyd, J., "Review of

  6. Quantitative evaluation of hepatitis B virus mutations and hepatocellular carcinoma risk: a meta-analysis of prospective studies

    PubMed Central

    Yang, Yang; Sun, Jiang-Wei; Zhao, Long-Gang; Bray, Freddie

    2015-01-01

    Background The temporal relationship between hepatitis B virus (HBV) mutations and hepatocellular carcinoma (HCC) remains unclear. Methods We conducted a meta-analysis including cohort and nested case-control studies to prospectively examine the HCC risk associated with common variants of HBV in the PreS, Enhancer II, basal core promoter (BCP) and precore regions. Pertinent studies were identified by searching PubMed, Web of Science and the Chinese Biological Medicine databases through to November 2014. Study-specific risk estimates were combined using fixed or random effects models depending on whether significant heterogeneity was detected. Results Twenty prospective studies were identified, which included 8 cohort and 12 nested case-control studies. There was an increased risk of HCC associated with any PreS mutations with a pooled relative risk (RR) of 3.82 [95% confidence interval (CI): 2.59-5.61]. The pooled-RR for PreS deletion was 3.98 (95% CI: 2.28-6.95), which was higher than that of PreS2 start codon mutation (pooled-RR=2.63, 95% CI: 1.30-5.34). C1653T in Enhancer II was significantly associated with HCC risk (pooled-RR=1.83; 95% CI: 1.21-2.76). For mutations in BCP, statistically significant pooled-RRs of HCC were obtained for T1753V (pooled-RR=2.09; 95% CI: 1.49-2.94) and A1762T/G1764A double mutations (pooled-RR=3.11; 95% CI: 2.08-4.64). No statistically significant association with HCC risk was observed for G1896A in the precore region (pooled-RR=0.77; 95% CI: 0.47-1.26). Conclusions This study demonstrated that PreS mutations, C1653T, T1753V, and A1762T/G1764A were associated with an increased risk of HCC. Clinical practice concerning HCC risk prediction and diagnosis may wish to focus on patients with these mutations. PMID:26543337
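
    The fixed-effect pooling referred to in the Methods is conventionally the inverse-variance weighted mean of the study-specific log relative risks (standard meta-analytic form, stated as background):

    $$\hat{\theta} = \frac{\sum_i w_i \hat{\theta}_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\hat{v}_i}$$

    where each $\hat{\theta}_i = \log \mathrm{RR}_i$ and $\hat{v}_i$ is its variance; random-effects weights replace $\hat{v}_i$ with $\hat{v}_i + \tau^2$, the between-study variance entering when significant heterogeneity is detected.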

  7. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in

  8. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  9. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance between a given subject's eigencoordinates and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots and Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.

  10. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify the points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best-estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
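
    Written out, the risk measure described in this abstract reads

    $$R = \sum_{t} \sum_{n} C_{n,t} \, \Delta P_{n,t}$$

    with the consequence C the deviation of the LCOE from its best estimate, and ΔP the probability range at exceedance-probability level n and time t.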

  11. A Comparison of Remediation Priorities Developed By The Defense Priority Model, The Relative Risk Evaluation Method, and A Quantitative Risk Assessment Approach

    DTIC Science & Technology

    1995-12-01

    SPEARMAN COEFFICIENT OF RANK CORRELATION (R)...Abstract The Superfund, established by the Comprehensive...Environmental Response, Compensation, and Liability Act (CERCLA) of 1980, seriously underestimated both the number of severely contaminated sites and the...Risk Assessment Approach 1.0 Introduction 1.1 General Issue The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980

  12. Breach Risk Magnitude: A Quantitative Measure of Database Security

    PubMed Central

    Yasnoff, William A.

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches. PMID:28269923
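
    The BRM as defined in this abstract is straightforward to compute; a minimal Python sketch (the user list is a hypothetical illustration):

    ```python
    import math

    def breach_risk_magnitude(users):
        """BRM: max over users of log10(accessible records) / authentication steps."""
        return max(math.log10(records) / steps for records, steps in users)

    # Hypothetical access profiles for a one-million-record database.
    users = [
        (1_000_000, 1),  # admin: every record behind a single login step
        (50_000, 2),     # analyst: a subset behind login plus a second factor
    ]
    print(breach_risk_magnitude(users))  # 6.0, driven by the admin account
    ```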

  13. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today, in some cases orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber types (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison, e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), and 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require the participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.

  14. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball.
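
    The roll-up of the three strands into an overall probability of success can be sketched as a simple probability chain; a full Bayesian network would add dependencies and evidence propagation. All probabilities below are hypothetical illustrations, not values from the paper:

    ```python
    # Toy three-stage chain (all probabilities hypothetical).
    p_manufacture = 0.8  # P(manufacturing route is feasible and scalable)
    p_clinical = 0.3     # P(clinical effectiveness | manufacturable)
    p_financial = 0.6    # P(adequate financial returns | clinically effective)

    p_success = p_manufacture * p_clinical * p_financial
    print(f"P(viable pharmaceutical agent) = {p_success:.3f}")  # 0.144
    ```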

  15. IWGT report on quantitative approaches to genotoxicity risk ...

    EPA Pesticide Factsheets

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast

  16. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

    In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for the safety and reliability of their systems. The FMEAs were reviewed to identify critical items using a set of qualitative criteria. Hazards and critical items judged the worst by a qualitative method were to be either eliminated by a design change or controlled by the addition of a safeguard. However, limitations of space, weight, technical feasibility, and cost frequently meant that critical items and hazards could be neither eliminated nor controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown, because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: 1. the combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; 2. the use of Bayesian analysis of system and component failure data; 3. the evaluation and presentation of uncertainties in the risk estimates.

  17. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, an important issue for risk analysts to address nowadays.

  18. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade ('domino') effects. The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
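
    The empirical fragility curves mentioned are commonly parameterized as lognormal functions of the ground-motion intensity; a minimal Python sketch (the median capacity and dispersion are hypothetical, not the paper's values):

    ```python
    import math

    def lognormal_fragility(pga, median, beta):
        """P(loss of containment | PGA), standard lognormal fragility form."""
        z = math.log(pga / median) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Hypothetical atmospheric steel tank: median capacity 0.6 g, dispersion 0.45.
    for pga in (0.1, 0.3, 0.6, 1.0):
        print(f"PGA {pga:.1f} g -> P(failure) = {lognormal_fragility(pga, 0.6, 0.45):.3f}")
    ```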

  19. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time of untenable conditions, a range of design fires is considered based on different fire growth rates, after which the uncertainty of the onset time of untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is modeled as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety is finally evaluated from the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and the assessment result is compared with fire statistics.
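
    The consequence evaluation described, comparing evacuation time against the onset of untenable conditions, lends itself to a Monte Carlo sketch; the distributions below are hypothetical placeholders, not those of the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical distributions (seconds).
    onset_untenable = rng.lognormal(np.log(300.0), 0.3, n)  # onset of untenable conditions
    premovement = rng.lognormal(np.log(60.0), 0.5, n)       # occupant pre-movement time
    travel = rng.normal(120.0, 20.0, n)                     # movement time to exit

    evacuation = premovement + travel
    p_failure = np.mean(evacuation > onset_untenable)  # occupants caught by untenability
    print(f"P(evacuation not completed in time) = {p_failure:.3f}")
    ```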

  20. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  1. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing the inspection activities.
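
    The probability/consequence categories such as "4C" come from a risk-matrix lookup; the toy banding below only mimics the spirit of the API 581 semi-quantitative matrix and does not reproduce the standard's actual boundaries.

```python
# Toy risk-matrix lookup in the spirit of API 581 (banding is illustrative only).
COF_CATS = "ABCDE"   # consequence-of-failure categories

def risk_level(pof: int, cof: str) -> str:
    """Map a (probability, consequence) cell such as (4, 'C') to a risk band."""
    score = pof + COF_CATS.index(cof)
    if score <= 3:
        return "low"
    if score <= 5:
        return "medium"
    if score <= 7:
        return "medium-high"
    return "high"

print(risk_level(4, "C"))   # -> medium-high, like the HP superheater above
print(risk_level(3, "C"))   # -> medium, like the HP economizer above
```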

  2. Quantitative Methods for Software Selection and Evaluation

    DTIC Science & Technology

    2006-09-01

    Quantitative Methods for Software Selection and Evaluation. Michael S. Bandor, September 2006, Acquisition Support Program. Abstract: When performing a "buy" analysis and selecting a product as part of a software acquisition strategy, most organizations will consider primarily

  3. Evaluating Mandibular Cortical Index Quantitatively

    PubMed Central

    Yasar, Fusun; Akgunlu, Faruk

    2008-01-01

    Objectives The aim was to assess whether Fractal Dimension (FD) and Lacunarity (L) analysis can discriminate patients having different mandibular cortical shapes. Methods Panoramic radiographs of 52 patients were evaluated for mandibular cortical index. Weighted Kappa between the observations varied between 0.718 and 0.805. These radiographs were scanned and converted to binary images. FD and L were calculated from the regions that best represent the cortical morphology. Results There were statistically significant differences between the FD and L of radiographs classified as having Cl 1 and Cl 2 cortical morphology (FD P:0.000; L P:0.003) and Cl 1 and Cl 3 cortical morphology (FD P:0.008; L P:0.001), but there was no statistically significant difference between the FD and L of radiographs classified as having Cl 2 and Cl 3 cortical morphology (FD P:1.000; L P:0.758). Conclusions FD and L can differentiate Cl 1 mandibular cortical shape from both Cl 2 and Cl 3 mandibular cortical shape but cannot differentiate Cl 2 from Cl 3 on panoramic radiographs. PMID:19212535
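
    Fractal Dimension on binary images of the cortex is typically estimated by box counting; the sketch below is a generic implementation with arbitrarily chosen box sizes, not the authors' exact procedure.

```python
# Generic box-counting estimate of the fractal dimension of a binary image.
import numpy as np

def box_counting_dimension(img: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
    counts = []
    for s in sizes:
        h, w = img.shape
        trimmed = img[: h - h % s, : w - w % s]          # crop to a multiple of s
        blocks = trimmed.reshape(h // s, s, w // s, s)   # tile into s x s boxes
        counts.append(blocks.any(axis=(1, 3)).sum())     # boxes touching foreground
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope   # FD is the negative slope of log N(s) vs log s
```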

  4. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the first step in finding out what the problems are; it comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper, and ideally a fourth step would address the need for feedback and post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, a new index based on hydrometeorological parameters such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features such as severity, duration, areal extent, onset and end time; it also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece.
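
    The RDI itself is simple to compute from aggregated precipitation and potential evapotranspiration; the sketch below follows the commonly published form of the index (initial alpha ratio, then log-standardization over the years) and is a generic illustration rather than the paper's code.

```python
# RDI in its initial (alpha) and standardized forms, per the common definition.
import numpy as np

def rdi_alpha(precip_mm: np.ndarray, pet_mm: np.ndarray) -> float:
    """Initial RDI for one period: aggregated P over aggregated PET."""
    return precip_mm.sum() / pet_mm.sum()

def rdi_standardized(alphas: np.ndarray) -> np.ndarray:
    """Standardized RDI across years, assuming ln(alpha) is ~normal."""
    y = np.log(alphas)
    return (y - y.mean()) / y.std()
```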

  5. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used, additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10^-4 per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used.
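
    A minimal Monte Carlo version of the annual-risk calculation described above might look like the following; the contamination and treatment distributions are invented, and the exponential dose-response parameter is a commonly cited literature value, so every number is illustrative rather than the paper's.

```python
# Minimal sketch: annual infection risk from oocysts surviving treatment.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
conc = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)  # raw water, oocysts/L (assumed)
log_removal = rng.normal(3.0, 0.5, size=n)                 # whole-treatment log10 removal (assumed)
dose = conc * 10.0 ** (-log_removal)                       # oocysts per litre consumed per day

r = 0.004                                   # exponential dose-response parameter (literature-style)
p_day = 1.0 - np.exp(-r * dose)
p_year = 1.0 - (1.0 - p_day) ** 365
print("fraction of simulations under 1e-4/yr:", np.mean(p_year < 1e-4))
```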

  6. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative approaches, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys several risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to those of the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.

  7. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability, and discusses the application of the risk model to the software development life cycle.

  8. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for the cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
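
    The definition "risk = probability of a successful attack x value of the resulting loss", driven by an attack-tree engine, reduces to a few lines; the tree structure, probabilities and loss value below are invented for illustration.

```python
# Toy attack-tree evaluation of cyber risk (all numbers invented).
def or_node(*p: float) -> float:
    """Attack succeeds if any sub-attack succeeds (independence assumed)."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_node(*p: float) -> float:
    """Attack succeeds only if every sub-attack succeeds."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# compromise = (phish operator OR exploit VPN) AND pivot to the controller
p_attack = and_node(or_node(0.05, 0.02), 0.30)
risk_dollars = p_attack * 5e6    # probability times value of the loss
```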

  9. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time they were included in the standard documents. As a result, some of these standards may have been overestimated, while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standard documents. Therefore, this study attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will help in estimating safe and efficient standard dimensions of the OLS and in assessing the risk posed by objects to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  10. Quantitative evaluation of signal integrity for magnetocardiography.

    PubMed

    Zhang, Shulin; Wang, Yongliang; Wang, Huiwu; Jiang, Shiqin; Xie, Xiaoming

    2009-08-07

    Magnetocardiography (MCG) is a non-invasive diagnostic tool used to investigate the activity of the heart. For applications in an unshielded environment, dedicated hardware configurations and sophisticated signal processing techniques have been developed over the last decades to extract the very weak signal of interest from the much higher background noise. Although powerful in noise rejection, the signal processing may introduce signal distortions if not properly designed and applied. At present, however, there is no effective tool to quantitatively evaluate signal integrity for MCG. In this paper, we have introduced a very simple method that uses a small coil driven by a human ECG signal to generate a simulated MCG signal. Three key performance indexes were proposed to evaluate the MCG system performance quantitatively: correlation in the time domain, relative heights of different peaks, and correlation in the frequency domain. This evaluation method was applied to a synthetic gradiometer consisting of a second-order axial gradiometer and three orthogonal reference magnetometers. The evaluation turned out to be very effective in optimizing the parameters for signal processing. In addition, the method can serve as a useful tool for hardware improvement.
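
    The three indexes can be approximated with standard signal-processing primitives; in the sketch below, the peak-prominence threshold and the assumption that at least two peaks (e.g. R and T waves) are detectable are mine rather than the authors'.

```python
# One plausible implementation of the three signal-integrity indexes.
import numpy as np
from scipy.signal import find_peaks

def integrity_metrics(ref: np.ndarray, out: np.ndarray) -> dict:
    t_corr = np.corrcoef(ref, out)[0, 1]            # time-domain correlation

    def top2_ratio(x):                              # height of 2nd peak over 1st
        peaks, _ = find_peaks(x, prominence=0.1 * x.max())
        heights = np.sort(x[peaks])[::-1]
        return heights[1] / heights[0]              # assumes >= 2 peaks found

    peak_drift = top2_ratio(out) / top2_ratio(ref)  # 1.0 means shape preserved
    spec = lambda x: np.abs(np.fft.rfft(x))
    f_corr = np.corrcoef(spec(ref), spec(out))[0, 1]  # frequency-domain correlation
    return {"time_corr": t_corr, "peak_drift": peak_drift, "freq_corr": f_corr}
```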

  11. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of the world's demand for rare-earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China's recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country's commitment to rule of law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
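
    One standard concentration measure fitting the description above is the Herfindahl-Hirschman index over country production shares; the shares, political-risk scores and the composite below are invented to show the arithmetic, not taken from the abstract.

```python
# Supply concentration (HHI) combined with per-country political-risk scores.
def hhi(shares):
    """Herfindahl-Hirschman index; shares sum to 1. Near 1 = one dominant supplier."""
    return sum(s * s for s in shares)

ree_shares = [0.95, 0.03, 0.02]       # hypothetical country shares of REE output
political = [0.6, 0.2, 0.1]           # hypothetical 0-1 political-risk scores

concentration = hhi(ree_shares)
composite = sum(s * s * p for s, p in zip(ree_shares, political))
print(concentration, composite)
```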

  12. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  13. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  14. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    PubMed

    Liu, Yao; Li, Jianying; Liu, Xiaoyong; Liu, Xudong; Khawar, Waqaar; Zhang, Xinyan; Wang, Fan; Chen, Xiaoxin; Sun, Zheng

    2015-01-01

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). The test outcome is reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), or "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage the "atypical" patients in order to diagnose OSCC early and prevent OSCC. In this study, we collected exfoliative cytology, histopathology, and clinical data of normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, the Support Vector Machine (SVM) was found to be optimal, with high sensitivity (median>0.98) and specificity (median>0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve the cost-effectiveness of clinical follow-up of OLK patients, and help design clinical chemoprevention trials for high-risk populations.
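
    The modeling step, an SVM on the reconstructed cytology features with resampling-based evaluation, can be sketched with scikit-learn; the feature matrix, labels and hyperparameters below are placeholders, not the study's data or settings.

```python
# Sketch of the SVM risk-stratification step (placeholder data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 12))              # reconstructed features after EdTAR (placeholder)
y = rng.integers(0, 2, 200)            # 0 = benign course, 1 = progression (placeholder)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
model.fit(X, y)
ocri = model.predict_proba(X[:1])[0, 1]   # an OCRI-like risk score in [0, 1]
```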

  15. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety.

    PubMed

    van Gerwen, S J; te Giffel, M C; van't Riet, K; Beumer, R R; Zwietering, M H

    2000-06-01

    This paper describes a system for microbiological quantitative risk assessment of food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important ones. First, risks are assessed broadly, using order-of-magnitude estimates. Characteristic numbers are used to quantitatively characterize microbial behaviour during the production process. These numbers help to highlight the major risk-determining phenomena and to find negligible aspects. Second, the risk-determining phenomena are studied in more detail. General and/or specific models can be used for this, and varying situations can be simulated to quantitatively describe the risk-determining phenomena. Third, even more detailed studies can be performed where necessary, for instance by using stochastic variables. The system for quantitative risk assessment has been implemented as a decision-supporting expert system called SIEFE: Stepwise and Interactive Evaluation of Food safety by an Expert System. SIEFE performs bacterial risk assessments in a structured manner, using various information sources. Because all steps are transparent, every step can easily be scrutinized. In the current study the effectiveness of SIEFE is shown for a cheese spread. With this product, quantitative data concerning the major risk-determining factors were not completely available to carry out a full detailed assessment. However, this did not necessarily hamper adequate risk estimation: using ranges of values instead helped identify the quantitatively most important parameters and the magnitude of their impact. This example shows that SIEFE provides quantitative insights into production processes and their risk-determining factors to both risk assessors and decision makers, and highlights critical gaps in knowledge.

  16. Quantitative framework for prospective motion correction evaluation

    PubMed Central

    Pannetier, Nicolas; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert

    2014-01-01

    Purpose To establish a framework for evaluating the performance of prospective motion correction (PMC) in MRI, considering motion variability between scans. Method A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Results Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance of comparisons between marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC; the mouth guard achieved better PMC performance. Conclusion Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. PMID:25761550

  17. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  18. Status and future of Quantitative Microbiological Risk Assessment in China.

    PubMed

    Dong, Q L; Barker, G C; Gorris, L G M; Tian, M S; Song, X Y; Malakar, P K

    2015-03-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives.

  20. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  1. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton(1) and Carey N. Pope(2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  2. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by improvements in computer hardware and software capability and by novel computational approaches slowly being recognized by regulatory agencies. These events have helped reduce reliance on experimental animals and improve the utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches, as described in the guidance documents of several regulatory agencies, as they pertain to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.
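
    As one concrete instance of deriving a quantitative toxicity index from animal dose-response data, the sketch below fits a simple one-hit model and solves for a benchmark dose at 10% extra risk; the dose-incidence numbers and the model choice are illustrative, not taken from any guidance document.

```python
# Illustrative benchmark-dose calculation from quantal dose-response data.
import numpy as np
from scipy.optimize import brentq, curve_fit

doses = np.array([0.0, 10.0, 30.0, 100.0, 300.0])        # mg/kg-day (made up)
incidence = np.array([0.02, 0.05, 0.10, 0.30, 0.62])     # fraction responding (made up)

def one_hit(d, bg, b):
    return bg + (1.0 - bg) * (1.0 - np.exp(-b * d))

(bg, b), _ = curve_fit(one_hit, doses, incidence, p0=[0.02, 0.01])
extra_risk = lambda d: (one_hit(d, bg, b) - bg) / (1.0 - bg)
bmd10 = brentq(lambda d: extra_risk(d) - 0.10, 1e-6, 1e4)  # dose at 10% extra risk
```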

  3. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is composed of an index system that includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while for the quantitative method the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision requirements of the risk assessment.
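
    The individual-risk output of the quantitative method is, at its core, a sum over accident scenarios of annual frequency times the conditional probability of fatality at a location; the skeleton below uses one invented jet-fire lethality field purely to show the shape of the calculation.

```python
# Skeleton of an individual-risk calculation at a point (numbers invented).
import math

def jet_fire_lethality(x, y, src=(0.0, 0.0), r50=15.0):
    """Crude lethality field: 0.5 at r50 metres from the source, decaying outward."""
    d = math.hypot(x - src[0], y - src[1])
    return min(1.0, 0.5 * math.exp(-(d - r50) / r50))

scenarios = [(1e-5, jet_fire_lethality)]   # (annual frequency, lethality field)

def individual_risk(x, y):
    return sum(freq * leth(x, y) for freq, leth in scenarios)

print(individual_risk(10.0, 0.0))          # risk per year at 10 m from the source
```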

  4. Quantitative evaluation of chemisorption processes on semiconductors

    NASA Astrophysics Data System (ADS)

    Rothschild, A.; Komem, Y.; Ashkenasy, N.

    2002-12-01

    This article presents a method for numerical computation of the degree of coverage of chemisorbates and the resultant surface band bending as a function of the ambient gas pressure, temperature, and semiconductor doping level. This method enables quantitative evaluation of the effect of chemisorption on the electronic properties of semiconductor surfaces, such as the work function and surface conductivity, which is of great importance for many applications such as solid-state chemical sensors and electro-optical devices. The method is applied to simulating the chemisorption behavior of oxygen on n-type CdS, a process that has been investigated extensively due to its impact on the photoconductive properties of CdS photodetectors. The simulation demonstrates that the chemisorption of adions saturates when the Fermi level becomes aligned with the chemisorption-induced surface states, limiting their coverage to a small fraction of a monolayer. The degree of coverage of chemisorbed adions is proportional to the square root of the doping level, while that of neutral adsorbates is independent of the doping level. It is shown that the chemisorption of neutral adsorbates behaves according to the well-known Langmuir model, regardless of the existence of charged species on the surface, while charged adions do not obey Langmuir's isotherm. In addition, it is found that in depletive chemisorption processes the resultant surface band bending increases by 2.3kT (where k is the Boltzmann constant and T is the temperature) when the gas pressure increases by one order of magnitude or when the doping level increases by two orders of magnitude.
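
    The quoted 2.3kT-per-decade band-bending result follows if the charged-adsorbate equilibrium ties the Boltzmann factor of the surface barrier to the gas pressure; a hedged reconstruction of that step:

```latex
% If equilibrium of the charged species imposes e^{qV_s/kT} \propto P, then
\[
  qV_s = kT\,\ln\!\left(\frac{P}{P_0}\right) + \mathrm{const}
  \quad\Longrightarrow\quad
  \Delta(qV_s) = kT\,\ln 10 \approx 2.3\,kT \ \text{per decade of pressure.}
\]
```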

  5. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  6. Molecular sensitivity threshold of wet mount and an immunochromatographic assay evaluated by quantitative real-time PCR for diagnosis of Trichomonas vaginalis infection in a low-risk population of childbearing women.

    PubMed

    Leli, Christian; Castronari, Roberto; Levorato, Lucia; Luciano, Eugenio; Pistoni, Eleonora; Perito, Stefano; Bozza, Silvia; Mencacci, Antonella

    2016-06-01

    Vaginal trichomoniasis is a sexually transmitted infection caused by Trichomonas vaginalis, a flagellated protozoan. Diagnosis of T. vaginalis infection is mainly performed by wet mount microscopy, with a sensitivity ranging from 38% to 82% compared to culture, which is still considered the gold standard. Commercial immunochromatographic tests for monoclonal-antibody-based detection have been introduced as alternative methods for diagnosis of T. vaginalis infection and have been reported in some studies to be more sensitive than wet mount. Real-time PCR methods have recently been developed, with optimal sensitivity and specificity. The aim of this study was to evaluate whether there is a molecular sensitivity threshold for both the wet mount and immunochromatographic assays. To this aim, a total of 1487 low-risk childbearing women (median age 32 years, interquartile range 27-37) were included in the study and underwent vaginal swabbing for T. vaginalis detection by means of a quantitative real-time PCR assay, wet mount and an immunochromatographic test. Comparing the results, the prevalence values observed were 1.3% for real-time PCR, 0.5% for microscopic examination, and 0.8% for the immunochromatographic test. Compared to real-time PCR, wet mount sensitivity was 40% (95% confidence interval 19.1% to 63.9%) and specificity was 100% (95% CI 99.7% to 100%). The sensitivity and specificity of the immunochromatographic assay were 57.9% (95% CI 33.5% to 79.8%) and 99.9% (95% CI 99.6% to 100%), respectively. Evaluation of the wet mount and immunochromatographic results in relation to the number of T. vaginalis DNA copies detected in vaginal samples showed that the lower identification threshold for both wet mount (chi-square 6.1; P = 0.016) and the immunochromatographic assay (chi-square 10.7; P = 0.002) was ≥100 copies of T. vaginalis DNA/5 µl of eluted DNA.

  7. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the problem that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived through analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process (AHP). The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depend greatly on the research emphasis chosen. There were also differences between the ranking of the three representative contaminants' hazards and the rankings of their corresponding properties. This suggests that the subjective tendency of the research emphasis has a decisive impact on the calculation results. In addition, using rank order to normalize the three properties and to unify the quantified property results would scale the relative characteristics of the different representative contaminants up or down.
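
    The AHP step for setting the research emphasis reduces to extracting priority weights from a pairwise comparison matrix via its principal eigenvector; the comparison values below are invented for illustration.

```python
# AHP priority weights from a pairwise comparison of three contaminant properties.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # invented pairwise judgments
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                                # priority weights
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)       # consistency index
print(weights, ci)
```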

  8. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  10. Production Risk Evaluation Program (PREP) - summary

    SciTech Connect

    Kjeldgaard, E.A.; Saloio, J.H.; Vannoni, M.G.

    1997-03-01

    Nuclear weapons have been produced in the US since the early 1950s by a network of contractor-operated Department of Energy (DOE) facilities collectively known as the Nuclear Weapon Complex (NWC). Recognizing that the failure of an essential process might stop weapon production for a substantial period of time, the DOE Albuquerque Operations Office initiated the Production Risk Evaluation Program (PREP) at Sandia National Laboratories (SNL) to assess quantitatively the potential for serious disruptions in the NWC weapon production process. PREP was conducted from 1984 to 1989. This document is an unclassified summary of the effort.

  11. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and a Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
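
    A simplified Monte Carlo of the kind described, propagating parameter uncertainty through a response model to a probability of failing the specification, can be sketched as follows; the parameters, response model and specification limit are invented, not the ciprofloxacin study's.

```python
# Sketch: probability of failure inside a candidate design space (all values invented).
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
force = rng.normal(15.0, 1.5, n)        # compression force, kN (uncertainty assumed)
lubricant = rng.normal(1.0, 0.15, n)    # lubricant fraction, % w/w (uncertainty assumed)

# Hypothetical response model for 30-min dissolution (%), with residual noise
dissolution = 85 + 0.8 * (force - 15.0) - 6.0 * (lubricant - 1.0) + rng.normal(0, 2, n)

p_fail = np.mean(dissolution < 80.0)    # probability of missing the spec limit
print(f"P(failure) = {p_fail:.3%}")
```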

  12. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used.

  13. Quantitative risk analysis for landslides -- Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2004-03-01

    Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focusing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, only risk analysis is considered, focusing on risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. All existing vector data are transferred into raster data using a resolution of 1 m x 1 m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process. Within the quantitative landslide risk analysis the
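
    The risk function described, a cell-by-cell product of hazard, impact probabilities and vulnerability, maps directly onto raster arithmetic; the layers below are random placeholders on an arbitrary grid.

```python
# Raster-style individual risk as a product of per-cell layers (placeholder values).
import numpy as np

rng = np.random.default_rng(2)
hazard = rng.random((100, 100)) * 1e-2   # annual probability of a damaging event per cell
p_spatial = rng.random((100, 100))       # probability the event actually hits the cell
p_temporal = 0.6                         # probability a person is present (assumed)
vulnerability = 0.3                      # probability of death given impact (assumed)

individual_risk = hazard * p_spatial * p_temporal * vulnerability
print(individual_risk.max())
```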

  14. Milankovitch radiation variations: a quantitative evaluation.

    PubMed

    Shaw, D M; Donn, W L

    1968-12-13

    A quantitative determination of changes in the surface temperature caused by variations in insolation calculated by Milankovitch has been made through the use of the thermodynamic model of Adem. Under extreme conditions, mean coolings of 3.1 degrees and 2.7 degrees C, respectively, at latitudes 25 degrees and 65 degrees N are obtained for Milankovitch radiation cycles. At the sensitive latitude 65 degrees N, a mean cooling below the present temperature for each of the times of radiation minimum is only 1.4 degrees C. This result indicates that the Milankovitch effect is rather small to have triggered glacial climates.

  15. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guide the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadow maps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position to the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete, mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams.
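
    The "Fourier-based algorithm" for integrating the estimated slopes is plausibly of the Frankot-Chellappa type; the sketch below implements that classic least-squares integrator (with its periodic-boundary assumption) as one possible instance, not necessarily the authors' variant.

```python
# Frankot-Chellappa style integration of slopes p = dz/dx, q = dz/dy into a surface z.
import numpy as np

def integrate_gradients(p: np.ndarray, q: np.ndarray) -> np.ndarray:
    rows, cols = p.shape
    u = np.fft.fftfreq(cols) * 2.0 * np.pi
    v = np.fft.fftfreq(rows) * 2.0 * np.pi
    U, V = np.meshgrid(u, v)
    denom = U**2 + V**2
    denom[0, 0] = 1.0                      # avoid 0/0 at the DC term
    Z = (-1j * U * np.fft.fft2(p) - 1j * V * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                          # mean height is unconstrained
    return np.real(np.fft.ifft2(Z))
```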

  16. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in nuclear facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other nuclear facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  17. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power, for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays, for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  18. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and the prior conditions of each study is an important approach for researchers.

  19. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
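
    A toy discrete-event model of a single retail shelf shows how a storage-time distribution, the quantity usually missing from QMRA, emerges from replenishment and demand mechanisms; the cycle length, order size and demand rate below are invented.

```python
# Toy discrete-event shelf model producing a storage-time distribution (FIFO picking).
from collections import deque
import numpy as np

rng = np.random.default_rng(5)
shelf, ages = deque(), []
for day in range(365):
    if day % 3 == 0:                  # replenish 30 packs every 3 days (invented)
        shelf.extend([day] * 30)
    for _ in range(rng.poisson(9)):   # daily demand, oldest pack sold first
        if shelf:
            ages.append(day - shelf.popleft())

print(np.mean(ages), np.percentile(ages, 95))  # feed these times into growth models
```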

  20. A quantitative risk assessment model for Salmonella and whole chickens.

    PubMed

    Oscar, Thomas P

    2004-06-01

    Existing data and predictive models were used to define the input settings of a previously developed but modified quantitative risk assessment model (QRAM) for Salmonella and whole chickens. The QRAM was constructed in an Excel spreadsheet and was simulated using @Risk. The retail-to-table pathway was modeled as a series of unit operations and associated pathogen events that included initial contamination at retail, growth during consumer transport, thermal inactivation during cooking, cross-contamination during serving, and dose response after consumption. Published data as well as predictive models for growth and thermal inactivation of Salmonella were used to establish input settings. Noncontaminated chickens were simulated so that the QRAM could predict changes in the incidence of Salmonella contamination. The incidence of Salmonella contamination changed from 30% at retail to 0.16% after cooking to 4% at consumption. Salmonella growth on chickens during consumer transport was the only pathogen event that did not impact the risk of salmonellosis. For the scenario simulated, the QRAM predicted 0.44 cases of salmonellosis per 100,000 consumers, which was consistent with recent epidemiological data that indicate a rate of 0.66-0.88 cases of salmonellosis per 100,000 consumers of chicken. Although the QRAM was in agreement with the epidemiological data, surrogate data and models were used, assumptions were made, and potentially important unit operations and pathogen events were not included because of data gaps; thus, further refinement of the QRAM is needed.
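
    The retail-to-table chain of unit operations translates naturally into a vectorized Monte Carlo; all distributions and the beta-Poisson dose-response parameters below are placeholders in the spirit of the model, not the QRAM's calibrated inputs.

```python
# Minimal retail-to-table Monte Carlo for Salmonella on chicken (all inputs invented).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
positive = rng.random(n) < 0.30                                 # retail prevalence ~30%
log_cfu = np.where(positive, rng.normal(1.0, 0.8, n), -np.inf)  # log10 CFU at retail
log_cfu += rng.normal(0.1, 0.05, n)                   # growth during consumer transport
log_cfu -= rng.normal(6.0, 1.0, n)                    # log reduction during cooking
dose = 10.0 ** log_cfu * 0.01                         # fraction ingested via cross-contamination

alpha, beta = 0.3126, 2884.0                          # beta-Poisson parameters (assumed)
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)
print("cases per 100,000 consumers:", 1e5 * np.mean(p_ill))
```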

  1. Software design for professional risk evaluation

    NASA Astrophysics Data System (ADS)

    Ionescu, V.; Calea, G.; Amza, G.; Iacobescu, G.; Nitoi, D.; Dimitrescu, A.

    2016-08-01

    Professional risk evaluation is a complex activity involving every economic operator, with important repercussions on health and safety at work. This article presents an innovative study method for professional risk analysis in which cumulative working posts are evaluated. The work presents new software that helps to bring together all the working positions of a complex organizational system and analyze them in order to evaluate the possible risks. Using this software, multiple analyses can be performed: risk estimation, risk evaluation, estimation of residual risks and, finally, identification of risk reduction measures.

  2. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    PubMed Central

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and to an evolving understanding of risk. Methods This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors – device, structure, load and special operation – a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion Distribution networks are exposed to their environment and can be affected by many factors. The topology and the operating mode of a distribution network are dynamic, so faults and their consequences are probabilistic. PMID:25789859
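
    The Kullback-Leibler relative entropy between the symbol distribution of a risk sub-factor series and that of a main factor is direct to compute; the histograms below are invented.

```python
# KL relative entropy D(p || q) between two symbol histograms (invented counts).
import numpy as np

def kl_divergence(p, q, eps: float = 1e-12) -> float:
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

sub_factor = [10, 40, 30, 20]     # symbol counts from a sub-factor series
main_factor = [12, 38, 28, 22]    # symbol counts from the main-factor series
print(kl_divergence(sub_factor, main_factor))
```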

  3. A quantitative evaluation of alcohol withdrawal tremors.

    PubMed

    Aarabi, Parham; Norouzi, Narges; Dear, Taylor; Carver, Sally; Bromberg, Simon; Gray, Sara; Kahan, Mel; Borgundvaag, Bjug

    2015-01-01

    This paper evaluates the relationship between Alcohol Withdrawal Syndrome tremors in the left and right hands of patients. By analyzing 122 recordings from 61 patients in emergency departments, we found a weak relationship between the left and right hand tremor frequencies (correlation coefficient of 0.63). We found a much stronger relationship between the expert physician tremor ratings (on the CIWA-Ar 0-7 scale) of the two hands, with a correlation coefficient of 0.923. Next, using a smartphone to collect the tremor data and using a previously developed model for obtaining estimated tremor ratings, we also found a strong correlation (correlation coefficient of 0.852) between the estimates of each hand. Finally, we evaluated different methods of combining the data from the two hands to obtain a single tremor rating estimate, and found that simply averaging the tremor ratings of the two hands results in the lowest tremor estimate error (an RMSE of 0.977). Looking at the frequency dependence of this error, we found that higher frequency tremors had a much lower estimation error (an RMSE of 1.102 for tremors with frequencies in the 3-6 Hz range as compared to 0.625 for tremors with frequencies in the 7-10 Hz range).

  4. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid.

  5. Evaluation (not validation) of quantitative models.

    PubMed Central

    Oreskes, N

    1998-01-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid.

  6. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  7. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... HUMAN SERVICES Food and Drug Administration Quantitative Summary of the Benefits and Risks of... ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review'' (literature review... FDA is announcing the availability of a draft report entitled ``Quantitative Summary of the...

  8. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiology risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a level of contamination between 0.03 and 460 CFU g⁻¹ was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10⁻⁴, and the risk assessed per month was 1.61 × 10⁻³. Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiology risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
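    The second-order (two-dimensional) Monte Carlo structure mentioned above can be sketched as a nested loop: uncertainty about cooking time is sampled in an outer loop, and variability in contamination in an inner loop. The distributions and the inactivation and dose-response parameters below are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(42)
        risks = []
        for _ in range(100):                         # outer loop: uncertainty
            cook_time = rng.uniform(14.0, 19.0)      # uncertain cooking time (min)
            log_red = 0.8 * cook_time                # assumed log reductions vs. time
            cfu_g = 10 ** rng.uniform(np.log10(0.03), np.log10(460), 10_000)  # variability
            dose = cfu_g * 100 * 10 ** (-log_red)    # 100 g serving, illustrative
            risks.append(np.mean(1 - np.exp(-2e-4 * dose)))
        risks = np.array(risks)
        print(f"risk per barbecue: median {np.median(risks):.2e}, "
              f"95th percentile {np.percentile(risks, 95):.2e}")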

  9. Quantitative evaluation of ocean thermal energy conversion (OTEC): executive briefing

    SciTech Connect

    Gritton, E.C.; Pei, R.Y.; Hess, R.W.

    1980-08-01

    Documentation is provided of a briefing summarizing the results of an independent quantitative evaluation of Ocean Thermal Energy Conversion (OTEC) for central station applications. The study concentrated on a central station power plant located in the Gulf of Mexico and delivering power to the mainland United States. The evaluation of OTEC is based on three important issues: resource availability, technical feasibility, and cost.

  10. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of Campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of Campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for those carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. Because nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
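    The per-stage logic the abstract describes can be made concrete with a toy mass balance in which each stage removes or inactivates part of the load on the carcass and exchanges cells with the processing environment; the stage names and transfer fractions below are illustrative assumptions, not the published parameter values.

        def process_stage(on_carcass, in_env, p_removal=0.9, p_to_env=0.05, p_from_env=0.01):
            """One processing stage: shedding, pickup and removal/inactivation."""
            to_env = on_carcass * p_to_env                # shed to the environment
            from_env = in_env * p_from_env                # picked up from the environment
            survivors = (on_carcass - to_env) * (1 - p_removal)
            return survivors + from_env, in_env + to_env - from_env

        carcass, env = 1e6, 0.0                           # high initial load (CFU)
        for stage in ["scalding", "defeathering", "evisceration", "washing", "chilling"]:
            carcass, env = process_stage(carcass, env)
            print(f"{stage:>13}: {carcass:12.1f} CFU on carcass")

    For carcasses entering with low loads, the pickup term dominates the balance, which is the nonlinearity the authors point to.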

  11. Quantitative evaluation of cerebrospinal fluid shunt flow

    SciTech Connect

    Chervu, S.; Chervu, L.R.; Vallabhajosyula, B.; Milstein, D.M.; Shapiro, K.M.; Shulman, K.; Blaufox, M.D.

    1984-01-01

    The authors describe a rigorous method for measuring the flow of cerebrospinal fluid (CSF) in shunt circuits implanted for the relief of obstructive hydrocephalus. Clearance of radioactivity for several calibrated flow rates was determined with a Harvard infusion pump by injecting the Rickham reservoir of a Rickham-Holter valve system with 100 μCi of Tc-99m as pertechnetate. The elliptical and the cylindrical Holter valves used as adjunct valves with the Rickham reservoir yielded two different regression lines when the clearances were plotted against flow rates. The experimental regression lines were used to determine the in vivo flow rates from clearances calculated after injecting the Rickham reservoirs of the patients. The unique clearance characteristics of the individual shunt systems available require that calibration curves be derived for an entire system identical to the one implanted in the patient being evaluated, rather than just for the injected chamber. Excellent correlation between flow rates and the clinical findings supports the reliability of this method of quantifying CSF shunt flow, and the results are fully accepted by neurosurgeons.
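    A hypothetical version of the calibration step might look as follows: fit clearance against known pump flow rates for one valve system, then invert the fit to estimate in vivo flow from a patient's measured clearance. All numbers are invented for illustration.

        import numpy as np

        pump_flow = np.array([0.05, 0.10, 0.20, 0.30, 0.40])       # mL/min, calibrated
        clearance = np.array([0.012, 0.021, 0.043, 0.061, 0.084])  # measured fraction/min

        slope, intercept = np.polyfit(pump_flow, clearance, 1)     # calibration line
        patient_clearance = 0.050                                  # hypothetical patient value
        est_flow = (patient_clearance - intercept) / slope
        print(f"estimated shunt flow: {est_flow:.3f} mL/min")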

  12. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: I. Quantitative exposure assessment.

    PubMed

    Pouillot, Régis; Miconnet, Nicolas; Afchain, Anne-Laure; Delignette-Muller, Marie Laure; Beaufort, Annie; Rosso, Laurent; Denis, Jean-Baptiste; Cornu, Marie

    2007-06-01

    A quantitative assessment of the exposure to Listeria monocytogenes from cold-smoked salmon (CSS) consumption in France is developed. The general framework is a second-order (or two-dimensional) Monte Carlo simulation, which characterizes the uncertainty and variability of the exposure estimate. The model takes into account the competitive bacterial growth between L. monocytogenes and the background flora from the end of the production line to the consumer phase. An original algorithm is proposed to integrate this growth under conditions of varying temperature. As part of a more general project led by the French Food Safety Agency (Afssa), specific data were acquired and modeled for this quantitative exposure assessment model, particularly time-temperature profiles, prevalence data, and contamination-level data. The sensitivity analysis points to the dominant influence of the mean temperature in household refrigerators and of the prevalence of contaminated CSS on the exposure level. The outputs of this model can be used as inputs for further risk assessment.
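    The idea of integrating growth over a varying temperature profile can be sketched by re-evaluating a growth-rate model on each time step of the profile; the square-root-type model and its parameters below are assumptions, not the Afssa project's algorithm.

        import numpy as np

        def mu(temp_c, t_min=-2.9, b=0.028):
            """Assumed square-root-type growth rate (ln units/h); zero below t_min."""
            return np.maximum(b * (temp_c - t_min), 0.0) ** 2

        hours = np.arange(24 * 7)                          # one week, hourly steps
        temp = 4 + 3 * np.sin(hours / 24 * 2 * np.pi)      # refrigerator cycling 1-7 °C
        log10_increase = mu(temp).sum() / np.log(10)       # integrate rate over the profile
        print(f"predicted growth over the week: {log10_increase:.2f} log10 units")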

  13. A Comprehensive Quantitative Assessment of Bird Extinction Risk in Brazil

    PubMed Central

    Machado, Nathália; Loyola, Rafael Dias

    2013-01-01

    In an effort to avoid species loss, scientists have focused their efforts on the mechanisms making some species more prone to extinction than others. However, species show different responses to threats given their evolutionary history, behavior, and intrinsic biological features. We used bird biological features and external threats to (1) understand the multiple pathways driving Brazilian bird species to extinction, (2) to investigate if and how extinction risk is geographically structured, and (3) to quantify how much diversity is currently represented inside protected areas. We modeled the extinction risk of 1557 birds using classification trees and evaluated the relative contribution of each biological feature and external threat in predicting extinction risk. We also quantified the proportion of species and their geographic range currently protected by the network of Brazilian protected areas. The optimal classification tree showed different pathways to bird extinction. Habitat conversion was the most important predictor driving extinction risk though other variables, such as geographic range size, type of habitat, hunting or trapping and trophic guild, were also relevant in our models. Species under higher extinction risk were concentrated mainly in the Cerrado Biodiversity Hotspot and were not quite represented inside protected areas, neither in richness nor range. Predictive models could assist conservation actions, and this study could contribute by highlighting the importance of natural history and ecology in these actions. PMID:23951302

  14. A comprehensive quantitative assessment of bird extinction risk in Brazil.

    PubMed

    Machado, Nathália; Loyola, Rafael Dias

    2013-01-01

    In an effort to avoid species loss, scientists have focused their efforts on the mechanisms making some species more prone to extinction than others. However, species show different responses to threats given their evolutionary history, behavior, and intrinsic biological features. We used bird biological features and external threats to (1) understand the multiple pathways driving Brazilian bird species to extinction, (2) to investigate if and how extinction risk is geographically structured, and (3) to quantify how much diversity is currently represented inside protected areas. We modeled the extinction risk of 1557 birds using classification trees and evaluated the relative contribution of each biological feature and external threat in predicting extinction risk. We also quantified the proportion of species and their geographic range currently protected by the network of Brazilian protected areas. The optimal classification tree showed different pathways to bird extinction. Habitat conversion was the most important predictor driving extinction risk though other variables, such as geographic range size, type of habitat, hunting or trapping and trophic guild, were also relevant in our models. Species under higher extinction risk were concentrated mainly in the Cerrado Biodiversity Hotspot and were not quite represented inside protected areas, neither in richness nor range. Predictive models could assist conservation actions, and this study could contribute by highlighting the importance of natural history and ecology in these actions.

  15. Quantitative risk assessment of thermophilic Campylobacter spp. and cross-contamination during handling of raw broiler chickens evaluating strategies at the producer level to reduce human campylobacteriosis in Sweden.

    PubMed

    Lindqvist, Roland; Lindblad, Mats

    2008-01-15

    Campylobacter is a major bacterial cause of infectious diarrheal illness in Sweden and in many other countries. Handling and consumption of chicken have been identified as important risk factors. The purpose of the present study was to use data from a national baseline study of thermophilic Campylobacter spp. in raw Swedish broiler chickens in order to evaluate some risk management strategies and the frequency of consumer mishandling, i.e., handling leading to possible cross-contamination. A probabilistic model describing variability but not uncertainty was developed in Excel and @Risk. The output of the model was the probability of illness per handling if the chicken was mishandled. Uncertainty was evaluated by performing repeated simulations and substituting model parameters, distributions and software (Analytica). The effect of uncertainty was within a factor of 3.2 compared to the baseline scenario. For Campylobacter spp. prevalence but not concentration, there was a one-to-one relation with risk. A 100-fold reduction in the levels of Campylobacter spp. on raw chicken reduced the risk by a factor of 12 (fresh chicken) to 30 (frozen chicken). Highly contaminated carcasses contributed most to risk, and it was estimated that by limiting the contamination to less than 4 log CFU per carcass, the risk would be reduced to less than 17% of the baseline scenario. Diverting all positive flocks to freezing was estimated to result in 43% as many cases as the baseline. The second best diversion option (54% of baseline cases) was to direct all chickens from the two worst groups of producers, in terms of percentages of positive flocks delivered, to freezing. The improvement from diverting was estimated to correspond to between 5 and 767 fewer reported cases for the different strategies, depending on the assumed proportion of reported cases (1 to 50%) caused by Campylobacter spp. from Swedish chicken. The estimated proportion of consumer mishandlings

  16. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved from using different soap formulations after diaper changing, using a microbial quantitative risk assessment approach. To achieve this, a probability of infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing the diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, and the risk can be reduced by a further 20% through the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
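    The exposure-to-infection chain described above can be sketched in a few lines: cells on the hands are reduced by washing, a fraction is transferred to the mouth, and the resulting dose is run through a dose-response model. The transfer fractions and the beta-Poisson parameters below are assumptions chosen for illustration.

        import numpy as np

        cells_on_hands = 1e5            # CFU picked up during a diaper change (assumed)
        log_reduction = 2.0             # assumed efficacy of the soap used
        hand_to_mouth = 0.1             # assumed post-wash transfer fraction

        dose = cells_on_hands * 10 ** (-log_reduction) * hand_to_mouth
        alpha, n50 = 0.265, 1480        # assumed beta-Poisson parameters for Shigella
        p_inf = 1 - (1 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)
        print(f"probability of infection: {p_inf:.2f}")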

  17. Quantitative landslide risk analysis Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2003-04-01

    Risk analysis, risk evaluation and risk management are integrated in the holistic concept of risk assessment. Internationally, various quantitative, semiquantitative and qualitative approaches exist to analyse the risk to life and/or the economic risk caused by landslides. In Iceland, a method to carry out snow avalanche risk analysis was developed in 1999, followed in 2002 by rough guidelines on how to integrate results from landslide hazard assessments into a comprehensive landslide and snow avalanche risk assessment. The Icelandic regulation on hazard zoning due to snow avalanches and landslides, issued by the Icelandic Ministry of the Environment in the year 2000, aims, by 2010, to prevent people from living or working within the areas most at risk. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, an approach for calculating landslide risk in detail is still missing. Therefore, the ultimate goal of this study is to develop such a method and apply it in Bildudalur, NW-Iceland. Within this presentation, the risk analysis focuses on the risk of loss of life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, must be determined, taking changing vulnerabilities into consideration. Based on existing debris flow and rock fall run-out maps, hazard maps are derived and the respective risks are calculated. Already digitized elements at risk (people in houses) are verified and updated. The damage potential (the number of all of the people living or working at a specific location), derived from official statistics and our own investigations, is attributed to each house. The vulnerability of the elements at risk is mainly based on literature studies. The probability of spatial impact (i.e. of the hazardous event impacting a building) is estimated using benchmarks given in the literature, results from field

  18. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment.
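    The reassessment factor named above has a simple form; as a reminder (standard notation, not specific to this paper), the coefficient of variation of the per-tablet coating mass is

        \mathrm{CV} = \frac{\sigma}{\mu}

    where σ is the standard deviation and μ the mean of the coating mass across tablets; a smaller CV indicates more uniform coating, so process settings that lower the CV reduce the risk priority of the corresponding failure mode.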

  19. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk by combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Because the estimated probability of some intermediate events may have large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in mitigating casualty risk.
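    A toy event-tree fragment in the spirit of the model above shows how a crash frequency is propagated through conditional branch probabilities to a fatality frequency on one path; all values are invented, not the Michigan data.

        crash_freq = 120.0               # work zone crashes per year (assumed)
        branches = {                     # conditional probabilities along one path
            "crash unit: single vehicle": 0.70,
            "alcohol involved": 0.08,
            "light condition: dark": 0.25,
            "severity: fatal": 0.012,
        }
        path_prob = 1.0
        for event, p in branches.items():
            path_prob *= p               # multiply down the tree
        print(f"fatalities per year on this path: {crash_freq * path_prob:.4f}")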

  20. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures, and at the time of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log CFU/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64 × 10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean values for the probability of illness per person per day were higher in processed cheese (mean: 2.24 × 10⁻⁹; maximum: 7.97 × 10⁻⁶) than in natural cheese (mean: 7.84 × 10⁻¹⁰; maximum: 2.32 × 10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
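    Written out, the exponential dose-response model above is

        P_{\mathrm{ill}} = 1 - \exp\left(-7.64 \times 10^{-8} \, N\right)

    so that, for example, a dose of N = 10⁶ cells gives P = 1 - e^{-0.0764} ≈ 0.074, and at the low doses relevant here the risk is approximately linear in dose (P ≈ 7.64 × 10⁻⁸ N).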

  1. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

    Tablets are the most common form of solid oral dosage produced by pharmaceutical industries. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defect would greatly aid in quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets.

  2. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economic efficiency, safety is a very important factor for evaluating the use of construction robots on construction sites. However, the quantitative evaluation of safety is difficult compared with that of economic efficiency. In this study, we suggested a safety evaluation methodology that defines the 'worker' and 'work conditions' as two risk factors: the 'worker' factor is posture load, and the 'work conditions' factor comprises the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders that can be caused by work posture and the risk of accidents that can be caused by reduced concentration. We evaluated the risk factors that may cause various accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. With the results of the evaluations for each factor, we calculated the overall operational risk and deduced the improvement in operational safety achieved by introducing a construction robot. To verify these results, we compared the safety of the existing manual labour method and the proposed robotic construction method for manipulating large glass panels.

  3. Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi-Directional Road Tunnel.

    PubMed

    Caliendo, Ciro; De Guglielmo, Maria Luisa

    2017-01-01

    A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP, the percentage of HGVs, and a failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary of assuming that an itinerary running completely in the open air should be preferred for transporting dangerous goods if it passes through a populated area. The QRA may help decision-makers both to implement additional safety measures and to understand whether to allow, forbid, or limit the circulation of DGVs.

  4. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
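    Steps (b) and (c) of the pipeline can be sketched as follows under simplifying assumptions: given a breast mask from step (a), Otsu's threshold splits dense from fatty voxels and the density is the dense-voxel fraction. Synthetic data stand in for a real pre-contrast volume, and the assumption that fibroglandular tissue is the darker class is illustrative.

        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(7)
        volume = rng.normal(100, 20, size=(64, 64, 32))   # synthetic MRI intensities
        breast_mask = np.zeros(volume.shape, dtype=bool)
        breast_mask[16:48, 16:48, 8:24] = True            # stand-in for step (a)

        voxels = volume[breast_mask]
        thresh = threshold_otsu(voxels)                   # step (b): dense vs. fatty
        dense_fraction = (voxels < thresh).mean()         # step (c): volumetric density
        print(f"volumetric breast density: {dense_fraction:.2%}")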

  5. The Pesticide Risk Beliefs Inventory: a quantitative instrument for the assessment of beliefs about pesticide risks.

    PubMed

    LePrevost, Catherine E; Blanchard, Margaret R; Cope, W Gregory

    2011-06-01

    Recent media attention has focused on the risks that agricultural pesticides pose to the environment and human health; thus, these topics provide focal areas for scientists and science educators to enhance public understanding of basic toxicology concepts. This study details the development of a quantitative inventory to gauge pesticide risk beliefs. The goal of the inventory was to characterize misconceptions and knowledge gaps, as well as expert-like beliefs, concerning pesticide risk. This study describes the development and field testing of the Pesticide Risk Beliefs Inventory with an important target audience: pesticide educators in a southeastern U.S. state. The 19-item, Likert-type inventory was found to be psychometrically sound with a Cronbach's alpha of 0.780 and to be a valuable tool in capturing pesticide educators' beliefs about pesticide risk, assessing beliefs in four key categories. The Pesticide Risk Beliefs Inventory could be useful in exploring beliefs about pesticide risks and in guiding efforts to address misconceptions held by a variety of formal and informal science learners, educators, practitioners, the agricultural labor force, and the general public.

  6. Quantitative Synthesis: An Actuarial Base for Planning Impact Evaluations.

    ERIC Educational Resources Information Center

    Cordray, David S.; Sonnefeld, L. Joseph

    1985-01-01

    There are numerous micro-level methodological decisions associated with planning an impact evaluation. Quantitative synthesis methods can be used to construct an actuarial database for establishing the likelihood of achieving desired sample sizes, statistical power, and measurement characteristics. (Author/BS)
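    A small sketch of the actuarial idea, under stated assumptions: with a plausible effect size taken from prior syntheses, a planner can compute the sample size a new impact evaluation would need. The effect size and targets below are assumed.

        from statsmodels.stats.power import TTestIndPower

        effect_size = 0.4      # standardized mean difference from prior syntheses (assumed)
        n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                                  alpha=0.05, power=0.80)
        print(f"required sample size per group: {n_per_group:.0f}")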

  7. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collecting. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  8. Quantitative Integrated Evaluation in the Mars Basin, Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Tichelaar, B. W.; Detomo, R.

    2005-05-01

    Today's exploitation of hydrocarbons in the Deepwater Gulf of Mexico requires a subtle, sophisticated class of opportunities for which uncertainties must be quantified to reduce risk. The explorer is often faced with non-amplitude supported hydrocarbon accumulations, limitations of seismic imaging, and uncertainty in stratigraphy and hydrocarbon kitchens, all in an environment of still-maturing technology and rising drilling costs. However, many of the fundamental Exploration processes that drove the industry in the past in the Gulf of Mexico still apply today. Integration of these historically proven processes with each other and with new technologies, supported by a growing body of knowledge, has provided a significant new methodology for wildcat and near-field Exploration. Even in mature fields, additional opportunities are seldom characterized by unambiguous attributes of direct hydrocarbon indicators or amplitude support. Shell's Quantitative Integrated Evaluation process relies upon visualization of integrated volume-based stratigraphic models of rock and fluid properties, and by relating these properties to measured and predicted seismic responses. An attribute referred to as the Differential Generalized Attribute, which summarizes the differences between multiple scenario response predictions and actual measured data, can then be used to distinguish likely scenarios from unlikely scenarios. This methodology allows competing scenarios to be rapidly tested against the data, and is built upon proprietary knowledge of the physical processes and relationships that likely drive vertical and lateral variation in these models. We will demonstrate the methodology by showing a portion of the Mars Basin and describing the integrated capability that is emplaced at the Exploration phase, and matured throughout the Appraisal, Development and Production life cycle of a basin discovery.

  9. Quantitative Evaluation of Plant Actin Cytoskeletal Organization During Immune Signaling.

    PubMed

    Lu, Yi-Ju; Day, Brad

    2017-01-01

    High spatial and temporal resolution microscopy-based methods are valuable tools for the precise real-time imaging of changes in cellular organization in response to stimulus perception. Here, we describe a quantitative method for the evaluation of the plant actin cytoskeleton during immune stimulus perception and the activation of defense signaling. As a measure of the biotic stress-induced changes in actin filament organization, we present methods for analyzing changes in actin filament organization following elicitation of pattern-triggered immunity and effector-triggered immunity. Using these methods, it is possible to not only quantitatively evaluate changes in actin cytoskeletal organization following biotic stress perception, but to also use these protocols to assess changes in actin filament organization following perception of a wide range of stimuli, including abiotic and developmental cues. As described herein, we present an example application of this method, designed to evaluate changes in actin cytoskeletal organization following pathogen perception and immune signaling.

  10. Quantitative and Public Perception of Landslide Risk in Badulla, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Gunasekera, R.; Bandara, R. M. S.; Mallawatantri, A.; Saito, K.

    2009-04-01

    Landslides are often triggered by intense precipitation and are exacerbated by increased urbanisation and human activity. There is a significant risk of large-scale landslides in Sri Lanka, and when they do occur, they have the potential to cause devastation to property, lives and livelihoods. There are several high landslide risk areas in seven districts (Nuwara Eliya, Badulla, Ratnapura, Kegalle, Kandy, Matale and Kalutara) in Sri Lanka. These are also some of the poorest areas in the country, and consequently the recovery process after catastrophic landslides becomes more problematic. Therefore landslide risk management is an important concern in poverty reduction strategies. We focused on the district of Badulla, Sri Lanka, to evaluate (a) the quantitative scientific analysis of landslide risk and (b) the qualitative public perception of landslides in the area. Combining high-resolution hazard and susceptibility data, we quantified the risk of landslides in the area. We also evaluated the public perception of landslides in the area using participatory GIS techniques. The evaluation of public perception of landslide risk has been complemented by the use of LandScan data. The LandScan methodology is based on second-order administrative population data from the census: each 30 arc-second cell within the administrative units receives a probability coefficient based on slope, proximity to roads and land cover. Providing this information from these complementary methods to regional planners helps strengthen disaster risk reduction options and improve sustainable land use practices through enhanced public participation in decision making and governance processes.

  11. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  12. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  13. Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.

    PubMed

    Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K

    2017-02-01

    Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
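    The kind of sensitivity analysis reported above can be sketched with rank correlations between sampled model inputs and the simulated risk; the toy model and all values below are assumptions, not the published QMRA.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(5)
        n = 10_000
        storage_temp = rng.normal(7, 3, n)          # retail/home storage temperature (°C)
        wash_logred = rng.uniform(0, 2, n)          # consumer washing (log reduction)
        initial = rng.normal(-1, 1, n)              # initial contamination (log CFU/g)

        log_dose = initial + 0.2 * np.maximum(storage_temp - 5, 0) - wash_logred
        risk = 1 - np.exp(-1e-3 * 10 ** log_dose)   # assumed dose-response

        for name, x in [("storage temp", storage_temp), ("washing", wash_logred),
                        ("initial level", initial)]:
            rho, _ = spearmanr(x, risk)
            print(f"{name:>13}: Spearman rho = {rho:+.2f}")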

  14. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index and the effective thickness of the coal seam. Making full use of logging data and laboratory analyses of a coal core, the logging evaluation methods for the five parameters were discussed in detail, and a comprehensive evaluation model of the CBM reservoir was established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that the region with a good reservoir has high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir more effectively.
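    The comprehensive evaluation model is not spelled out in the abstract; one plausible form, shown purely as a hypothetical sketch, is a weighted sum of the five normalized parameters, with seams ranked by the composite score.

        # Hypothetical weighted-sum composite; weights and values are invented.
        weights = {"porosity": 0.20, "permeability": 0.25, "gas_content": 0.25,
                   "coal_structure": 0.15, "effective_thickness": 0.15}
        seam = {"porosity": 0.6, "permeability": 0.4, "gas_content": 0.8,
                "coal_structure": 0.7, "effective_thickness": 0.5}  # normalized 0-1
        score = sum(weights[k] * seam[k] for k in weights)
        print(f"comprehensive reservoir score: {score:.2f}")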

  15. Quantitative evaluation of CBM reservoir fracturing quality using logging data

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoyan

    2017-03-01

    This paper presents a method for the quantitative evaluation of the fracturing quality of coalbed methane (CBM) reservoirs using logging data, which will help optimize the selection of the fracturing layer. First, to make full use of logging data and laboratory analyses of coal cores, a method to determine the brittleness index of CBM reservoirs is deduced using coal industrial components. Second, the paper briefly introduces the methodology used to compute the horizontal principal stress difference coefficient of coal seams and the minimum horizontal principal stress difference between the coal seams and the roof and floor. Third, an evaluation model for the coal structure index is established using logging data, which fully considers how the coal structure affects the fracturing quality of CBM reservoirs. Fourth, the degree of development of the coal reservoir is evaluated. An evaluation standard for the fracturing quality of CBM reservoirs based on these five evaluation parameters is then used for quantitative evaluation. The results show that the combination of methods proposed in this paper is effective and consistent with the dynamic drainage observed after fracturing. A coal seam with a large brittleness index, a large stress difference between the coal seam and the roof and floor, a small stress difference coefficient and a high coal structure index has strong fracturing quality.

  16. Reducing hospital-acquired infection by quantitative risk modeling of intravenous bag preparation.

    PubMed

    Tidswell, Edward C; Rockwell, Jim; Wright, Marc-Oliver

    2010-01-01

    Vascular access of patients by peripheral and central venous catheters for the delivery of sterile or aseptically manufactured parenterals is commonly regarded as one of the major causes of bloodstream infections. Rigorous evaluation and management of the risks of microbial infection originating from the administration of aseptically manufactured therapies remain imperative to reduce patient infection risks. Healthcare clinicians are continually faced with choosing intravenous (IV) parenteral administration strategies to minimize patient bloodstream infection risk. Data facilitating such decisions are often difficult to obtain. Analysis and interpretation of the available, reported hospital infection rate data to evaluate medical device- and therapy-associated infection rates are constrained by the variability and uncertainty associated with each individual administration scenario. Moreover, clinical trials quantifying infection risk are constrained by their practicality, cost, and the control of the exacting requisite trial criteria. Furthermore, it is ethically inappropriate to systematically conduct clinical evaluations incorporating conditions that do not favor the best possible patient outcomes. Quantitative risk modeling (QRM) is a unique tool offering an alternative and effective means of assessing the effects of design and clinical use, in the context of the clinical environment, on medical device and combination therapy infection rates. Here, we report the generation of QRMs and the evaluation of the effect of manually admixed IV bags used in IV administration sets on patient infection rates. The manual admixing of IV bags was assessed for the opportunity and risk of microbial ingress across the sterile barrier during clinical preparation, contaminating the IV solution. The risk of microbial contamination was evaluated under (a) ISO 5 compounding conditions adopting ideal aseptic technique (in compliance with USP 〈797〉) and (b) realistic worst-case point

  17. Quantitative risk assessment of noroviruses in drinking water based on qualitative data in Japan.

    PubMed

    Masago, Yoshifumi; Katayama, Hiroyuki; Watanabe, Toru; Haramoto, Eiji; Hashimoto, Atsushi; Omura, Tatsuo; Hirata, Tsuyoshi; Ohgaki, Shinichiro

    2006-12-01

    Noroviruses are one of the major causes of viral gastroenteritis in Japan. A quantitative risk assessment was conducted to evaluate the health risk caused by this virus in drinking water. A Monte Carlo analysis was used to calculate both the probability of infection and the disease burden using disability-adjusted life years (DALYs). The concentration of noroviruses in tap water was estimated based on qualitative data and a most probable number (MPN) method with an assumed Poisson-lognormal distribution. This numerical method was evaluated using two sets of available count data for Cryptosporidium: data collected from a river and data found in tap water in Japan. The dose-response relationships for noroviruses were estimated using assumed ID50 values (10 or 100). The annual risk was higher than the US-EPA acceptable level (10⁻⁴ [infections/person-year]) but around the WHO level (10⁻⁶ [DALYs/person-year]). As suggested by others, since microbial concentrations are generally lognormally distributed, the arithmetic mean was directly related to the annual risk, suggesting that the arithmetic mean is more useful than the geometric mean in representing the degree of microbial contamination.
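    The benchmark comparison above rests on simple compounding arithmetic; a minimal sketch, with assumed inputs, converts a per-day infection risk into an annual risk and a DALY burden.

        p_daily = 5e-7                          # assumed infection risk per person per day
        p_annual = 1 - (1 - p_daily) ** 365     # assumes independent daily exposures
        daly_per_case = 9e-4                    # assumed burden per infection (DALYs)
        print(f"annual infection risk: {p_annual:.2e}")    # compare with 1e-4 (US-EPA)
        print(f"disease burden: {p_annual * daly_per_case:.2e} DALYs/person-year")  # vs 1e-6 (WHO)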

  18. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
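    The abstract's central claim is easy to reproduce in a toy simulation: push a contaminated unit through a drastic inactivation followed by large growth, once treating the load as a continuous concentration and once as integer cell numbers. All parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n_units, n0 = 100_000, 50                # initial cells per contaminated unit
        log_kill, log_growth = -4.0, 3.0         # drastic inactivation, then regrowth

        # concentration approach: log arithmetic lets fractions of a cell "survive"
        conc_dose = n0 * 10.0 ** (log_kill + log_growth)

        # number approach: survival is binomial, and extinct units can never regrow
        survivors = rng.binomial(n0, 10.0 ** log_kill, n_units)
        number_dose = survivors * 10.0 ** log_growth

        r = 1e-2                                 # assumed exponential dose-response
        risk_conc = 1 - np.exp(-r * conc_dose)
        risk_num = (1 - np.exp(-r * number_dose)).mean()
        print(f"risk, concentration model: {risk_conc:.4f}")
        print(f"risk, number model:        {risk_num:.4f}   # roughly 10-fold lower")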

  19. The quantitative estimation of IT-related risk probabilities.

    PubMed

    Herrmann, Andrea

    2013-08-01

    How well can people estimate IT-related risk? Although estimating risk is a fundamental activity in software management and risk is the basis for many decisions, little is known about how well IT-related risk can be estimated at all. Therefore, we executed a risk estimation experiment with 36 participants. They estimated the probabilities of IT-related risks, and we investigated the effect of the following factors on the quality of the risk estimation: the estimator's age, work experience in computing, (self-reported) safety awareness and previous experience with this risk, the absolute value of the risk's probability, and the effect of knowing the estimates of the other participants (see the Delphi method). Our main findings are: risk probabilities are difficult to estimate. Younger and less experienced estimators were not significantly worse than older and more experienced estimators, but the older and more experienced subjects made better use of the knowledge gained from the other estimators' results. Persons with higher safety awareness tend to overestimate risk probabilities, but can better estimate the ordinal ranks of risk probabilities. Previous personal experience with a risk leads to an overestimation of its probability (unlike in other fields such as medicine or disaster management, where experience with a disease leads to more realistic probability estimates and lack of experience leads to underestimation).

  20. LSST Painting Risk Evaluation Memo

    SciTech Connect

    Wolfe, Justin E.

    2016-11-10

    The optics subsystem is required to paint the edges of optics black where possible. Due to the risks in applying the paint, LSST requests a review of the impact of removing this requirement for the filters and L3.

  1. Approach for evaluating inundation risks in urban drainage systems.

    PubMed

    Zhu, Zhihua; Chen, Zhihe; Chen, Xiaohong; He, Peiying

    2016-05-15

    Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers. Hence, inundation evaluation is becoming increasingly important around the world. This comprehensive assessment involves numerous indices in urban catchments, but the high-dimensional and non-linear relationship between the indices and the risk presents an enormous challenge for accurate evaluation. Therefore, an approach is hereby proposed to qualitatively and quantitatively evaluate inundation risks in urban drainage systems based on a storm water management model, the projection pursuit method, the ordinary kriging method and the K-means clustering method. This approach is tested using a residential district in Guangzhou, China. Seven evaluation indices were selected, and twenty rainfall-runoff events were used to calibrate and validate the parameters of the rainfall-runoff model. The inundation risks in the study area drainage system were evaluated under different rainfall scenarios. The following conclusions are reached. (1) The proposed approach, free of subjective factors, can identify the main driving factors, i.e., inundation duration, largest water flow and total flood amount in this study area. (2) The inundation risk of each manhole can be qualitatively analyzed and quantitatively calculated; there are 1, 8, 11, 14, 21, and 21 manholes at risk under the return periods of 1 year, 5 years, 10 years, 20 years, 50 years and 100 years, respectively. (3) The areas of levels III, IV and V increase with increasing rainfall return period, based on an analysis of the inundation risks across a variety of characteristics. (4) The relationships between rainfall intensity and inundation-affected areas are revealed by a logarithmic model. This study proposes a novel and successful approach to assessing risk in urban drainage systems and provides guidance for improving urban drainage systems and inundation preparedness.
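
    The final classification step, grouping per-manhole indices into discrete risk levels, can be sketched with K-means. The three indices and all values below are hypothetical placeholders; in the study the indices come from calibrated SWMM simulations combined through projection pursuit and ordinary kriging:

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      # hypothetical per-manhole indices: inundation duration (min),
      # largest water flow (m3/s), total flood amount (m3)
      X = np.column_stack([
          rng.gamma(2.0, 15.0, 60),
          rng.gamma(2.0, 0.4, 60),
          rng.gamma(2.0, 40.0, 60),
      ])
      Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize the indices

      km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(Xz)
      # order clusters by mean standardized score so labels map to levels I-V
      order = np.argsort(km.cluster_centers_.mean(axis=1))
      level = np.empty_like(km.labels_)
      for rank, c in enumerate(order):
          level[km.labels_ == c] = rank + 1
      print("manholes per risk level I-V:", np.bincount(level)[1:])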

  2. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed

    Hertzberg, Richard C; Teuschler, Linda K

    2002-12-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions.
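
    The simplest of these formulas, dose addition, reduces in practice to a hazard index: the sum over components of dose divided by reference dose. A minimal sketch with hypothetical doses and reference doses:

      # dose addition as a hazard index, HI = sum(dose_i / RfD_i);
      # all values are hypothetical (mg/kg-day)
      doses = {"chem_A": 0.02, "chem_B": 0.15, "chem_C": 0.01}
      rfds  = {"chem_A": 0.05, "chem_B": 0.50, "chem_C": 0.004}

      hi = sum(doses[c] / rfds[c] for c in doses)
      print(f"hazard index = {hi:.2f} (HI > 1 flags potential concern)")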

  3. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    PubMed

    Bruneton, Eric

    2016-10-27

    We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model, and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer the simplifications and approximations used to solve the physical equations, the more accurate the results. We conclude with a discussion of the advantages and drawbacks of each model, and how to further improve their accuracy.

  4. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative method for evaluating skin barrier function using an Optical Coherence Microscopy (OCM) system based on the coherency of near-infrared light. Many skin problems, such as itching and irritation, are recognized to be caused by impairment of the skin barrier function, which protects the body from various external stimuli and prevents loss of water. Skin barrier function is commonly evaluated by observing the skin surface and asking patients about their skin condition; such judgements are subjective and depend on the examiner's experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. An objective, quantitative evaluation method is therefore needed, one that is non-invasive and non-destructive and that can follow changes over time; in vivo measurement is thus crucial for evaluating skin barrier function. In this study, we evaluate changes in the structure of the stratum corneum, which is central to skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue in a non-invasive and non-destructive manner. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system used in this work in order to discuss how to improve it.

  5. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict the fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that the fire fatality risk of CNG buses is about 2.5 times that of diesel buses, with bus passengers being at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 x 10(-5) fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 x 10(-6) and 4.0 x 10(-5), respectively. The risk result was found to be affected most by the failure rates of pressure relief valves, CNG cylinders, and fuel piping.
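
    Results like the mean of 2.2 x 10(-5) with 5% and 95% bounds of 9.1 x 10(-6) and 4.0 x 10(-5) are often summarized by a lognormal uncertainty distribution. As a consistency check (not the paper's actual uncertainty propagation), fitting a lognormal to the two reported percentiles recovers a mean close to the reported one:

      import numpy as np
      from scipy.stats import norm

      p05, p95 = 9.1e-6, 4.0e-5            # reported 5th/95th percentiles
      z = norm.ppf(0.95)                   # ~1.645
      sigma = (np.log(p95) - np.log(p05)) / (2 * z)
      mu = 0.5 * (np.log(p95) + np.log(p05))

      mean = np.exp(mu + 0.5 * sigma**2)
      print(f"implied mean = {mean:.2e} fatalities/bus-year")  # ~2.1e-5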

  6. Evaluation of protective action risks

    SciTech Connect

    Witzig, W.F.; Shillenn, J.K.

    1987-06-01

    The purpose of this study is to determine how the risks of the protective action of evacuation compare with the radiological risks from a radiation release if no protective actions are taken. Evacuation risks of death and injury have been determined by identifying, from newspapers and other sources, 902 possible evacuation events which occurred in the US during the period January 1, 1973 through April 30, 1986. A survey form was developed to determine evacuation risks and other information relating to the evacuation events, and sent to local emergency management personnel located in the vicinity of 783 events. A total of 310 completed surveys were received, and the data were summarized. This study found that the key factors for a successful evacuation included an emergency plan, good communications and coordination, practice drills, and defined authority. Few successful evacuations used the emergency broadcasting system or warning sirens to communicate the need to evacuate. Reports of panic and traffic jams during an evacuation were very few. Traffic jams occurring during reentry were more likely than during the evacuation exodus. A summary of potential societal consequences of evacuation is included in this study. 5 refs., 9 figs., 20 tabs.

  7. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and limitations of the MSFC QRA approach and of QRA technology in general.

  8. Modelling bacterial growth in quantitative microbiological risk assessment: is it possible?

    PubMed

    Nauta, Maarten J

    2002-03-01

    Quantitative microbiological risk assessment (QMRA), predictive modelling and HACCP may be used as tools to increase food safety and can be integrated fruitfully for many purposes. However, when QMRA is applied for public health issues like the evaluation of the status of public health, existing predictive models may not be suited to model bacterial growth. In this context, precise quantification of risks is more important than in the context of food manufacturing alone. In this paper, the modular process risk model (MPRM) is briefly introduced as a QMRA modelling framework. This framework can be used to model the transmission of pathogens through any food pathway, by assigning one of six basic processes (modules) to each of the processing steps. Bacterial growth is one of these basic processes. For QMRA, models of bacterial growth need to be expressed in terms of probability, for example to predict the probability that a critical concentration is reached within a certain amount of time. In contrast, available predictive models are developed and validated to produce point estimates of population sizes and therefore do not fit with this requirement. Recent experience from a European risk assessment project is discussed to illustrate some of the problems that may arise when predictive growth models are used in QMRA. It is suggested that a new type of predictive models needs to be developed that incorporates modelling of variability and uncertainty in growth.
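
    A growth model "expressed in terms of probability" can be sketched by treating the growth rate as a random variable across product units and asking how often a critical concentration is reached within the storage time. All parameters below are hypothetical, not taken from a validated predictive model:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      n0, n_crit = 1.0, 6.0        # initial and critical levels (log10 CFU/g)
      hours = 24.0                 # storage time (assumed)

      # variable exponential growth rate across units (log10 CFU/g per h);
      # mean and spread are illustrative assumptions
      mu = rng.normal(loc=0.15, scale=0.06, size=n).clip(min=0.0)

      p_exceed = np.mean(n0 + mu * hours >= n_crit)
      print(f"P(critical concentration reached in {hours:.0f} h) = {p_exceed:.3f}")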

  9. Quantitative microbial risk assessment of distributed drinking water using faecal indicator incidence and concentrations.

    PubMed

    van Lieverloo, J Hein M; Blokker, E J Mirjam; Medema, Gertjan

    2007-01-01

    Quantitative Microbial Risk Assessments (QMRA) have focused on drinking water system components upstream of distribution to customers, for nominal and event conditions. Yet some 15-33% of waterborne outbreaks are reported to be caused by contamination events in distribution systems. In the majority of these cases, and probably in all non-outbreak contamination events, no pathogen concentration data were available. Faecal contamination events are usually detected or confirmed by the presence of E. coli or other faecal indicators, although the absence of this indicator is no guarantee of the absence of faecal pathogens. In this paper, the incidence and concentrations of various coliforms and sources of faecal contamination were used to estimate the possible concentrations of faecal pathogens and, consequently, the infection risks to consumers in event-affected areas. The results indicate that the infection risks may be very high, especially from Campylobacter and enteroviruses, but also that the uncertainties are very high. The high variability of the pathogen to thermotolerant coliform ratios estimated in environmental samples severely limits the applicability of the approach described. Importantly, the highest ratios of enteroviruses to thermotolerant coliforms were suggested for soil and shallow groundwaters, the most likely sources of the faecal contamination that is detected in distribution systems. Epidemiological evaluations of non-outbreak faecal contamination of drinking water distribution systems and thorough tracking and characterisation of the contamination sources are necessary to assess the actual risks of these events.
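
    The ratio-based estimation described above can be sketched as follows: multiply the measured indicator concentration by an uncertain pathogen-to-indicator ratio, then push the resulting dose through a dose-response model. The indicator level, the ratio distribution and the ingested volume are hypothetical; the beta-Poisson parameters are values commonly cited for Campylobacter:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 100_000

      e_coli = 10.0                        # thermotolerant coliforms/L (assumed)
      # pathogen:indicator ratio, spanning orders of magnitude as noted above
      ratio = rng.lognormal(mean=np.log(1e-3), sigma=2.0, size=n)
      dose = e_coli * ratio * 1.0          # organisms ingested in 1 L

      # approximate beta-Poisson dose-response, often cited for Campylobacter
      alpha, beta = 0.145, 7.59
      p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)
      print(f"mean infection risk per litre = {p_inf.mean():.2e}")
      print(f"95th percentile               = {np.percentile(p_inf, 95):.2e}")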

  10. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative-risk-assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization-assessment-factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of the consumer-exposure-level (CEL). Based on these elements, an acceptable-exposure-level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to the risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured-exposure-level (MEL). For that reason, a direct comparison is possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement over a hazard-based classification of hair dye ingredients.
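
    The arithmetic of the proposed comparison is simple enough to state directly. The sketch below uses hypothetical numbers throughout (the NESIL, SAFs and MEL are placeholders, not the values established for p-phenylenediamine or resorcinol):

      from math import prod

      nesil = 100.0            # ug/cm2, no-expected-sensitisation-induction-level
      safs = [10.0, 1.0, 3.0]  # inter-individual, matrix, use-pattern factors

      ael = nesil / prod(safs)   # acceptable exposure level
      mel = 1.5                  # ug/cm2, measured exposure level (assumed)

      print(f"AEL = {ael:.1f} ug/cm2 vs MEL = {mel:.1f} ug/cm2 ->",
            "acceptable" if mel <= ael else "risk management needed")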

  11. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  12. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  13. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  14. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  15. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  16. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  17. Factors Distinguishing between Achievers and At Risk Students: A Qualitative and Quantitative Synthesis

    ERIC Educational Resources Information Center

    Eiselen, R.; Geyser, H.

    2003-01-01

    The purpose of this article is to identify factors that distinguish between Achievers and At Risk Students in Accounting 1A, and to explore how qualitative and quantitative research methods complement each other. Differences between the two groups were explored from both a quantitative and a qualitative perspective, focusing on study habits,…

  18. Review of progress in quantitative nondestructive evaluation. Vol. 3B

    SciTech Connect

    Thompson, D.O.; Chimenti, D.E.

    1984-01-01

    This two-book volume constitutes the Proceedings of the Tenth Annual Review of Progress in Quantitative Nondestructive Evaluation held in California in 1983. Topics considered include nondestructive evaluation (NDE) reliability, ultrasonics (probability of detection, scattering, sizing, transducers, signal processing, imaging and reconstruction), eddy currents (probability of detection, modeling, sizing, probes), acoustic emission, thermal wave imaging, optical techniques, new techniques (e.g., maximum entropy reconstruction, near-surface inspection of flaws using bulk ultrasonic waves, inversion and reconstruction), composite materials, material properties, acoustoelasticity, residual stress, and new NDE systems (e.g., retirement-for-cause procedures for gas turbine engine components, pulsed eddy current flaw detection and characterization, an ultrasonic inspection protocol for IN100 jet engine materials, electromagnetic on-line monitoring of rotating turbine-generator components). Basic research and early engineering applications are emphasized.

  19. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  20. The Nuclear Renaissance - Implications on Quantitative Nondestructive Evaluations

    SciTech Connect

    Matzie, Regis A.

    2007-03-21

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  1. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    PubMed

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

    There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations with zero illness cases, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10(6) Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
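
    The simulation layer can be sketched compactly: a fixed hand load, a treatment-specific log reduction, transfer to the food, a dose-response step, and a binomial draw over the 100 exposed people. The transfer fraction and dose-response slope below are hypothetical, tuned only to show the qualitative contrast reported above:

      import numpy as np

      rng = np.random.default_rng(5)
      n_sim, exposed = 10_000, 100
      hand_load = 1e6          # Shigella per hand before washing (scenario above)
      transfer = 0.01          # hand-to-melon transfer fraction (assumed)
      r = 0.008                # exponential dose-response slope (assumed)

      for label, log_red in [("nonantibacterial", 2.0), ("antibacterial", 3.5)]:
          dose = hand_load * 10.0 ** (-log_red) * transfer
          p_ill = 1.0 - np.exp(-r * dose)
          cases = rng.binomial(exposed, p_ill, size=n_sim)
          print(f"{label:16s}: P(ill) = {p_ill:.3f}, "
                f"modal cases per 100 = {np.bincount(cases).argmax()}")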

  2. A quantitative analysis of fish consumption and stroke risk.

    PubMed

    Bouzan, Colleen; Cohen, Joshua T; Connor, William E; Kris-Etherton, Penny M; Gray, George M; König, Ariane; Lawrence, Robert S; Savitz, David A; Teutsch, Steven M

    2005-11-01

    Although a rich source of n-3 polyunsaturated fatty acids (PUFAs) that may confer multiple health benefits, some fish contain methyl mercury (MeHg), which may harm the developing fetus. U.S. government recommendations for women of childbearing age are to modify consumption of high-MeHg fish to reduce MeHg exposure, while recommendations encourage fish consumption among the general population because of the nutritional benefits. The Harvard Center for Risk Analysis convened an expert panel (see acknowledgements) to quantify the net impact of resulting hypothetical changes in fish consumption across the population. This paper estimates the impact of fish consumption on stroke risk. Other papers quantify coronary heart disease mortality risk and the impacts of both prenatal MeHg exposure and maternal intake of n-3 PUFAs on cognitive development. This analysis identified articles in a recent qualitative literature review that are appropriate for the development of a dose-response relationship between fish consumption and stroke risk. Studies had to satisfy quality criteria, quantify fish intake, and report the precision of the relative risk estimates. The analysis combined the relative risk results, weighting each proportionately to its precision. Six studies were identified as appropriate for inclusion in this analysis, including five prospective cohort studies and one case-control study (total of 24 exposure groups). Our analysis indicates that any fish consumption confers substantial relative risk reduction compared to no fish consumption (12% for the linear model), with the possibility that additional consumption confers incremental benefits (central estimate of 2.0% per serving per week).
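
    Combining "relative risk results, weighting each proportionately to its precision" is standard inverse-variance pooling on the log scale, with the standard error recovered from each confidence interval. The six relative risks below are hypothetical, not the studies analyzed in the paper:

      import numpy as np

      rr = np.array([0.85, 0.90, 0.80, 0.95, 0.88, 0.82])   # hypothetical RRs
      lo = np.array([0.70, 0.75, 0.62, 0.80, 0.70, 0.60])   # lower 95% CI
      hi = np.array([1.03, 1.08, 1.03, 1.13, 1.11, 1.12])   # upper 95% CI

      log_rr = np.log(rr)
      se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE from the CI width
      w = 1.0 / se**2                               # precision weights

      pooled = np.exp(np.sum(w * log_rr) / np.sum(w))
      print(f"pooled relative risk = {pooled:.2f}")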

  3. QMRAspot: a tool for Quantitative Microbial Risk Assessment from surface water to potable water.

    PubMed

    Schijven, Jack F; Teunis, Peter F M; Rutjes, Saskia A; Bouwknegt, Martijn; de Roda Husman, Ana Maria

    2011-11-01

    In the Netherlands, a health-based target for microbially safe drinking water is set at less than one infection per 10,000 persons per year. For the assessment of the microbial safety of drinking water, Dutch drinking water suppliers must conduct a Quantitative Microbial Risk Assessment (QMRA) at least every three years for the so-called index pathogens enterovirus, Campylobacter, Cryptosporidium and Giardia. In order to collect raw data in the proper format and to automate the process of QMRA, an interactive user-friendly computational tool, QMRAspot, was developed to analyze and conduct QMRA for drinking water produced from surface water. This paper gives a description of the raw data requirements for QMRA as well as a functional description of the tool. No extensive prior knowledge about QMRA modeling is required by the user, because QMRAspot provides guidance on the quantity, type and format of raw data and performs a complete analysis of the raw data to yield a risk outcome for drinking water consumption that can be compared with other production locations, a legislative standard or an acceptable health-based target. The uniform approach promotes proper collection and usage of raw data, safeguards the quality of the risk assessment, and enhances efficiency, i.e., less time is required. QMRAspot may facilitate QMRA for drinking water suppliers worldwide. The tool aids policy makers and other involved parties in formulating mitigation strategies, and in the prioritization and evaluation of effective preventive measures as an integral part of water safety plans.

  4. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
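
    The exposure chain (airborne concentration x breathing rate x exposure time, followed by a dose-response step) can be sketched per setting. Every number below is an illustrative assumption, including the dose-response slope; the study's monitoring data and model choices may differ:

      import numpy as np

      settings = {                 # (infectious HAdV per m3, exposure h/day)
          "toilet":           (5.0, 0.5),
          "wastewater plant": (0.2, 8.0),
          "landfill":         (0.05, 8.0),
      }
      breathing = 1.5              # m3 inhaled per hour, light activity (assumed)
      r = 0.41                     # exponential dose-response slope (assumed)

      for name, (conc, hours) in settings.items():
          dose = conc * breathing * hours
          p_inf = 1.0 - np.exp(-r * dose)
          print(f"{name:16s}: daily infection risk = {p_inf:.3f}")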

  5. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. The geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of heavy metals. The potential ecological risk index (RI) demonstrated that the overall risks caused by heavy metals were substantially lowered, from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal posed no or low risk to the environment after liquefaction. In short, the pollution hazards of heavy metals in LR were markedly mitigated.
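
    The geo-accumulation index used above has a closed form, Igeo = log2(Cn / (1.5 x Bn)), with Cn the measured concentration and Bn the geochemical background. The concentrations below are hypothetical, not the study's measurements:

      import math

      background = {"Pb": 30.0, "Zn": 95.0, "Cu": 38.0,
                    "Cd": 0.3, "Cr": 90.0, "Ni": 68.0}     # mg/kg, assumed
      measured = {"Pb": 60.0, "Zn": 400.0, "Cu": 120.0,
                  "Cd": 2.4, "Cr": 110.0, "Ni": 75.0}      # mg/kg, assumed

      for metal, cn in measured.items():
          igeo = math.log2(cn / (1.5 * background[metal]))
          print(f"{metal}: Igeo = {igeo:5.2f}")
      # Igeo <= 0: unpolluted; 0-1: unpolluted to moderately polluted; higher
      # classes indicate increasing contamination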

  6. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  7. D & D screening risk evaluation guidance

    SciTech Connect

    Robers, S.K.; Golden, K.M.; Wollert, D.A.

    1995-09-01

    The Screening Risk Evaluation (SRE) guidance document is a set of guidelines provided for the uniform implementation of SREs performed on decontamination and decommissioning (D&D) facilities. Although this method has been developed for D&D facilities, it can be used for transition (EM-60) facilities as well. The SRE guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated from the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the current risk to human health and the environment, exterior to the building, from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the current risk to workers, occupants and visitors inside contaminated D&D facilities due to contaminant exposure. The Future Release Index (FRI) calculates the hypothetical risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risks to human health due to factors other than contaminants. Criticality is treated as a modifying factor for the entire SRE, because criticality issues are strictly regulated by DOE. Screening risk results will be tabulated in matrix form, and Total Risk will be calculated with a weighted equation to produce a score on which to base early action recommendations. Other recommendations from the screening risk scores will be made based either on individual index scores or on reweighted Total Risk calculations. All recommendations based on the SRE will be made from a combination of screening risk scores, decision drivers, and other considerations, as determined on a project-by-project basis.
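
    A minimal sketch of the final scoring step, combining the index scores into a weighted Total Risk with criticality as a modifier; the scores, weights and modifier below are hypothetical, since the guidance defines its own scales and weighting:

      # hypothetical index scores and weights for one D&D facility
      indices = {"CRI": 3, "WEI": 4, "FRI": 2, "PHI": 1}
      weights = {"CRI": 0.3, "WEI": 0.3, "FRI": 0.2, "PHI": 0.2}
      criticality_modifier = 1.0   # >1 would escalate the entire SRE

      total_risk = criticality_modifier * sum(
          indices[k] * weights[k] for k in indices)
      print(f"Total Risk = {total_risk:.2f}")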

  8. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  9. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  10. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  11. A qualitative and quantitative risk assessment of snuff dipping.

    PubMed

    Nilsson, R

    1998-08-01

    The presence of highly carcinogenic tobacco-specific nitrosamines (TSNA) in snuff has been a matter of serious concern. However, the levels of TSNA in such products may differ by orders of magnitude depending on origin and manner of processing, and the mere presence of such agents at low levels hardly constitutes a meaningful prerequisite for classifying all types of snuff as human carcinogens. Reviewing the available epidemiological evidence, a wide discrepancy is found between the cancer risks associated with snuff dipping estimated, on the one hand, from previous investigations conducted in the United States and, on the other, from recent extensive Swedish epidemiological studies. In spite of the fact that approximately 20% of all adult Swedish males use moist snuff, it has not been possible to detect any significant increase in the incidence of cancer of the oral cavity or pharynx, the prevalence of which remains low in Sweden by international standards. Further, there is insufficient evidence for a causal link between the use of Swedish snuff and increased risk for cardiovascular disease. Dissimilarities in the content of TSNA in oral snuff products may represent one important reason for the different outcomes of the epidemiological surveys conducted in the United States and Sweden. Bioassays using pure TSNA in rodents appear to give exaggerated risk estimates for humans, a discrepancy that could be ascribed to species-related differences in the relation between exposure and DNA target dose and/or adduct repair rates, as well as to the presence of anticarcinogens in snuff. Although a small risk cannot be excluded, the use of smokeless tobacco products low in TSNA which are now available on the market entails a risk that is at any rate more than 10 times lower than that associated with active smoking. Nevertheless, due to the decisive role of potent TSNA in determining possible cancer risks in users of smokeless tobacco, and due to the fact that large variations

  12. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides.

    PubMed

    Kniss, Andrew R; Coburn, Carl W

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman's rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact.
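
    The EIQ Field Use Rating itself is a one-line formula: EIQ x fraction of active ingredient x application rate. The EIQ values and rates below are hypothetical, but they illustrate why the rating tracks use rate so closely:

      from scipy.stats import spearmanr

      herbicides = {          # (EIQ, fraction a.i., lb product/acre), assumed
          "herb_A": (20.0, 0.41, 2.0),
          "herb_B": (35.0, 0.50, 0.5),
          "herb_C": (15.0, 0.75, 1.5),
          "herb_D": (25.0, 0.25, 4.0),
      }

      rates = [rate for _, _, rate in herbicides.values()]
      fur = [eiq * ai * rate for eiq, ai, rate in herbicides.values()]
      rho, _ = spearmanr(rates, fur)
      print("Field Use Ratings:", [round(x, 1) for x in fur])
      print(f"rank correlation with use rate: rho = {rho:.2f}")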

  13. A Quantitative Risk Analysis of Deficient Contractor Business System

    DTIC Science & Technology

    2012-04-30

    [Only fragments of this report were retrieved. The recoverable content references the DoD Risk Management Guide for Acquisition and, in Appendix A, the ANSI/EIA-748 Earned Value Management System (EVMS) guidelines (Fleming & Koppelmann, 2005, pp. 191-214), beginning with Criterion 1, defining the authorized work elements and the WBS; a future-research note observes that under ANSI/EIA-748, an EVMS must comply with 32 guidelines.]

  14. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  15. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    NASA Astrophysics Data System (ADS)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, the experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection-point coordinates of ronchigram fringes with the x-axis. In that case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in the simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because these rulings better reproduce experimental ronchigram fringe profiles. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  16. Shadow photogrammetric apparatus for the quantitative evaluation of corneal buttons.

    PubMed

    Denham, D; Mandelbaum, S; Parel, J M; Holland, S; Pflugfelder, S; Parel, J M

    1989-11-01

    We have developed a technique for the accurate, quantitative, geometric evaluation of trephined and punched corneal buttons. A magnified shadow of the frontal and edge views of a corneal button mounted on the rotary stage of a modified optical comparator is projected onto the screen of the comparator and photographed. This process takes approximately three minutes. The diameters and edge profile at any meridian photographed can subsequently be analyzed from the film. The precision in measuring the diameters of well-cut corneal buttons is +/- 23 microns, and in measuring the angle of the edge profile it is +/- 1 degree. Statistical analysis of interobserver variability indicated excellent reproducibility of measurements. Shadow photogrammetry offers a standardized, accurate, and reproducible method for the analysis of corneal trephination.

  17. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  18. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in the evaluation of chemical analogs. GAPs provided useful data for the development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  19. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927

  20. Preoperative evaluation and risk factors of lung cancer.

    PubMed

    Gaballo, Annarita; Corbo, Giuseppe M; Valente, Salvatore; Ciappi, Giuliano

    2004-01-01

    Based on a review of the literature on resectable lung cancer, pulmonary risk factors before, during and after surgery are discussed. The role of preoperative evaluation in determining the patient's ability to withstand radical resection is considered. Spirometric indices such as forced expiratory volume in one second (FEV1) and diffusing capacity of the lung for carbon monoxide (DLCO) should be measured first. If FEV1 and DLCO are >60% of predicted, patients are at low risk for complications and can undergo pulmonary resection. However, if FEV1 and DLCO are <60% of predicted, further evaluation with a quantitative lung scan is required. If the predicted postoperative values for FEV1 and DLCO are >40%, patients can undergo lung resection; otherwise, exercise testing is necessary. If the latter shows a maximal oxygen uptake (VO2max) of >15 ml/kg/min, surgery can be performed; if VO2max is <15 ml/kg/min, patients are inoperable.
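
    The stepwise work-up reads naturally as a decision function. The sketch below encodes only the thresholds given above and is a simplification for illustration, not clinical guidance:

      def preoperative_assessment(fev1_pct, dlco_pct,
                                  ppo_fev1_pct=None, ppo_dlco_pct=None,
                                  vo2max=None):
          """Percentages are % of predicted; vo2max in ml/kg/min."""
          if fev1_pct > 60 and dlco_pct > 60:
              return "low risk: pulmonary resection possible"
          if ppo_fev1_pct is None or ppo_dlco_pct is None:
              return "quantitative lung scan needed (predicted postop values)"
          if ppo_fev1_pct > 40 and ppo_dlco_pct > 40:
              return "lung resection possible"
          if vo2max is None:
              return "exercise testing needed (VO2max)"
          return "surgery possible" if vo2max > 15 else "inoperable"

      print(preoperative_assessment(55, 58, ppo_fev1_pct=45, ppo_dlco_pct=48))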

  1. Abandoned metal mine stability risk evaluation.

    PubMed

    Bétournay, Marc C

    2009-10-01

    The abandoned mine legacy is critical in many countries around the world, where mine cave-ins and surface subsidence disruptions are perpetual risks that can affect the population, infrastructure, historical legacies, land use, and the environment. This article establishes abandoned metal mine failure risk evaluation approaches and quantification techniques based on the Canadian mining experience. These utilize clear geomechanics considerations, such as failure mechanisms, which depend on well-defined rock mass parameters. Quantified risk is computed as the probability of failure (obtained probabilistically from limit-equilibrium factors of safety or from applicable numerical-modeling factor-of-safety quantifications) multiplied by a consequence impact value. Semi-quantified risk can be based on empirical data from failure case studies used in calculating the probability of failure, while personal experience can provide qualitative hazard and impact consequence assessments. The article provides outlines for land use and the selection of remediation measures based on risk.

  2. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed, with federal support, several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database, which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase, which integrates genomic and other biological data including

  3. Quantitative relations between risk, return and firm size

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Stanley, H. E.

    2009-03-01

    We analyze, for a large set of stocks comprising four financial indices, the annual logarithmic growth rate R and the firm size, quantified by the market capitalization MC. For the Nasdaq Composite and the New York Stock Exchange Composite we find that the probability density functions of growth rates are Laplace distributions in the broad central region, where the standard deviation σ(R), as a measure of risk, decreases with the MC as a power law σ(R) ~ (MC)^(-β). For both the Nasdaq Composite and the S&P 500, we find that the average growth rate ⟨R⟩ decreases faster than σ(R) with MC, implying that the return-to-risk ratio ⟨R⟩/σ(R) also decreases with MC. For the S&P 500, ⟨R⟩ and ⟨R⟩/σ(R) also follow power laws. For a 20-year time horizon, for the Nasdaq Composite we find that σ(R) vs. MC exhibits a functional form called a volatility smile, while for the NYSE Composite, we find power-law stability between σ(R) and MC.
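
    The power-law exponent β in σ(R) ~ (MC)^(-β) is typically estimated by a least-squares fit in log-log space. A sketch on synthetic data standing in for the stock-level measurements (the generating β is an arbitrary choice, for illustration only):

      import numpy as np

      rng = np.random.default_rng(6)
      mc = 10 ** rng.uniform(7, 11, size=500)       # market capitalizations
      true_beta = 0.25                              # arbitrary, for illustration
      sigma = 0.5 * mc ** (-true_beta) * np.exp(rng.normal(0, 0.2, size=500))

      slope, _ = np.polyfit(np.log10(mc), np.log10(sigma), 1)
      print(f"estimated beta = {-slope:.3f}")       # recovers ~0.25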

  4. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  5. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Interest has increasingly focused on modeling techniques such as Bayesian Networks because of their capability of (1) combining, within a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information; and (3) making predictions even with incomplete data. In this work, we propose a comparison among Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests, and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great use in informing product safety design regulation. Data from the European Registry of Foreign Bodies Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well.
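
    For readers unfamiliar with the technique, the following is a minimal discrete Bayesian network of the kind compared above, written with the pgmpy library (the class is named BayesianModel in older pgmpy releases). The variables, states, and probabilities are invented for illustration and do not come from the registry data.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Hypothetical network: product shape and child age jointly influence injury severity.
        model = BayesianNetwork([("shape", "severe"), ("age", "severe")])
        model.add_cpds(
            TabularCPD("shape", 2, [[0.7], [0.3]]),        # P(round), P(non-round) -- invented
            TabularCPD("age", 2, [[0.6], [0.4]]),          # P(age >= 3 y), P(age < 3 y) -- invented
            TabularCPD("severe", 2,
                       [[0.95, 0.85, 0.80, 0.60],          # P(not severe | shape, age) -- invented
                        [0.05, 0.15, 0.20, 0.40]],
                       evidence=["shape", "age"], evidence_card=[2, 2]),
        )
        model.check_model()

        # Posterior probability of a severe injury for a non-round object and a young child.
        print(VariableElimination(model).query(["severe"], evidence={"shape": 1, "age": 1}))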

  6. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating the direct risk affecting the alignments, vehicles, and people, and the indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued, initiating from cut slopes along the railway and road alignments. The landslides were grouped into three magnitude classes based on landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car, and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to local business, and loss of revenue to the railway department. The results indicate that the total loss, including both direct and indirect loss, for return periods from 1 to 50 years, varies from US$ 90,840 to US$
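
    The return-period step described above can be sketched as follows: fit a Gumbel distribution to annual landslide counts and read off the T-year event from its quantile function. The counts below are invented; only the return periods echo the study (T = 1 is omitted because the corresponding quantile, ppf(0), is undefined).

        import numpy as np
        from scipy import stats

        annual_counts = np.array([3, 7, 2, 11, 5, 4, 9, 6, 14, 8, 5, 10])  # landslides/yr (invented)
        loc, scale = stats.gumbel_r.fit(annual_counts)

        for T in (3, 5, 15, 25, 50):                             # return periods in years
            x_t = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc, scale)  # T-yr event magnitude
            print(f"{T:>2}-yr event: ~{x_t:.1f} landslides")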

  7. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable that splits maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
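
    A common way to compute the fractal dimension of a skeletonized binary image, such as the suture skeletons above, is box counting. The sketch below is a minimal version under that assumption (the study does not specify its algorithm); the straight-line test image should yield a dimension near 1.

        import numpy as np

        def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
            counts = []
            for s in sizes:
                h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))  # boxes containing pixels
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)       # N(s) ~ s^(-D)
            return -slope

        img = np.zeros((128, 128), dtype=bool)
        img[64, :] = True                        # a straight line should give D ~ 1
        print(f"fractal dimension ~ {box_count_dimension(img):.2f}")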

  8. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  9. Quantitative evaluation of activation state in functional brain imaging.

    PubMed

    Hu, Zhenghui; Ni, Pengyu; Liu, Cong; Zhao, Xiaohu; Liu, Huafeng; Shi, Pengcheng

    2012-10-01

    Neuronal activity can evoke the hemodynamic change that gives rise to the observed functional magnetic resonance imaging (fMRI) signal. These increases are also regulated by the resting blood volume fraction (V0) associated with regional vasculature. The activation locus detected by means of the change in the blood-oxygen-level-dependent (BOLD) signal intensity thereby may deviate from the actual active site due to varied vascular density in the cortex. Furthermore, conventional detection techniques evaluate the statistical significance of the hemodynamic observations. In this sense, the significance level relies not only upon the intensity of the BOLD signal change, but also upon the spatially inhomogeneous fMRI noise distribution that complicates the expression of the results. In this paper, we propose a quantitative strategy for the calibration of activation states to address these challenging problems. The quantitative assessment is based on the estimated neuronal efficacy parameter [Formula: see text] of the hemodynamic model in a voxel-by-voxel way. It is partly immune to the inhomogeneous fMRI noise by virtue of the strength of the optimization strategy. Moreover, it is easy to incorporate regional vascular information into the activation detection procedure. By combining MR angiography images, this approach can remove large vessel contamination in fMRI signals, and provide more accurate functional localization than classical statistical techniques for clinical applications. It is also helpful to investigate the nonlinear nature of the coupling between synaptic activity and the evoked BOLD response. The proposed method might be considered as a potentially useful complement to existing statistical approaches.

  10. Land application of manure and Class B biosolids: an occupational and public quantitative microbial risk assessment.

    PubMed

    Brooks, John P; McLaughlin, Michael R; Gerba, Charles P; Pepper, Ian L

    2012-01-01

    Land application is a practical use of municipal Class B biosolids and manure that also promotes soil fertility and productivity. To date, no study exists comparing biosolids to manure microbial risks. This study used quantitative microbial risk assessment to estimate pathogen risks from occupational and public exposures during scenarios involving fomite, soil, crop, and aerosol exposures. The greatest one-time risks were from direct consumption of contaminated soil or exposure to fomites, with one-time risks greater than 10. Recent contamination and high exposure doses increased most risks. and enteric viruses provided the greatest single risks for most scenarios, particularly in the short term. All pathogen risks decreased with time, 1 d to 14 mo between land application and exposure; decreases in risk were typically over six orders of magnitude beyond 30 d. Nearly all risks were reduced to below 10 when using a 4-mo harvest delay for crop consumption. Occupational, more direct risks were greater than indirect public risks, which often occur after time and dilution have reduced pathogen loads to tolerable levels. Comparison of risks by pathogen group confirmed greater bacterial risks from manure, whereas viral risks were exclusive to biosolids. A direct comparison of the two residual types showed that biosolids use carried greater risk because of the high infectivity of viruses, whereas the presence of environmentally recalcitrant pathogens such as and maintained manure risk. Direct comparisons of shared pathogens resulted in greater manure risks. Overall, it appears that in the short term, risks were high for both types of residuals, but given treatment, attenuation, and dilution, risks can be reduced to near-insignificant levels. That being said, limited data sets, dose exposures, site-specific inactivation rates, pathogen spikes, environmental change, regrowth, and wildlife will increase risk and uncertainty and remain areas poorly understood.
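
    The time-attenuation effect noted above (large reductions in risk between application and exposure) is often modeled as first-order, log-linear die-off. The sketch below assumes an initial concentration and a decay rate that are purely illustrative, not values from the study.

        n0 = 1e6   # pathogens per gram of amended soil at application (assumed)
        k = 0.2    # log10 reductions per day (assumed first-order die-off rate)

        for days in (1, 30, 120):                  # 120 d ~ the 4-month harvest delay
            n = n0 * 10 ** (-k * days)
            print(f"day {days:>3}: {n:.2e} pathogens/g")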

  11. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K. )

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data.

  12. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementing such good water management practices is the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as an increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. The approach can quantitatively forecast the risk associated with water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability.

  13. Usefulness of quantitative versus qualitative ST-segment depression for risk stratification of non-ST elevation acute coronary syndromes in contemporary clinical practice.

    PubMed

    Yan, Raymond T; Yan, Andrew T; Granger, Christopher B; Lopez-Sendon, Jose; Brieger, David; Kennelly, Brian; Budaj, Andrzej; Steg, Ph Gabriel; Georgescu, Alina A; Hassan, Quamrul; Goodman, Shaun G

    2008-04-01

    The aim of this study was to assess the clinical utility of quantitative ST-segment depression (STD) for refining the risk stratification of non-ST elevation acute coronary syndromes in the prospective, multinational Global Registry of Acute Coronary Events (GRACE). Quantitative measurements of STD on admission electrocardiograms were evaluated independently by a core laboratory, and their predictive value for in-hospital and cumulative 6-month mortality was examined. Although more severe STD is a marker of increased short- and long-term mortality, it is also associated with higher-risk clinical features and biomarkers. Thus, after adjustment for these clinically important predictors, quantitative STD does not provide incremental prognostic value beyond simple dichotomous evaluation for the presence of STD. Furthermore, adopting quantitative instead of the prognostically proven qualitative evaluation of STD does not improve the risk discrimination afforded by the validated GRACE risk models. In conclusion, the findings do not support the quantification of STD in routine clinical practice beyond simple evaluation for the presence of STD as an integral part of comprehensive risk stratification using the GRACE risk score.

  14. Spatially quantitative models for vulnerability analyses and resilience measures in flood risk management: Case study Rafina, Greece

    NASA Astrophysics Data System (ADS)

    Karagiorgos, Konstantinos; Chiari, Michael; Hübl, Johannes; Maris, Fotis; Thaler, Thomas; Fuchs, Sven

    2013-04-01

    We will address spatially quantitative models for vulnerability analyses in flood risk management in the catchment of Rafina, 25 km east of Athens, Greece, and potential measures to reduce damage costs. The evaluation of flood damage losses is relatively advanced. Nevertheless, major problems arise because no market prices are available for the evaluation process. Moreover, there is a particular gap in quantifying the damages and the expenditures necessary for implementing mitigation measures with respect to flash floods. The key issue is to develop prototypes for assessing flood losses and the impact of mitigation measures on flood resilience by adjusting a vulnerability model, and to further develop the method in a Mediterranean region influenced by both mountain and coastal characteristics of land development. The objective of this study is to create a spatial and temporal analysis of the vulnerability factors based on a method combining spatially explicit loss data, data on the value of exposed elements at risk, and data on flood intensities. In this contribution, a methodology is presented for developing a flood damage assessment as a function of process intensity and degree of loss. It is shown that (1) such relationships for defined object categories depend on site-specific and process-specific characteristics, but there is a correlation between process types that have similar characteristics; (2) existing semi-quantitative approaches to vulnerability assessment for elements at risk can be improved based on the proposed quantitative method; and (3) the concept of risk can be enhanced with respect to a standardised and comprehensive implementation by applying the vulnerability functions to be developed within the proposed research. Therefore, loss data were collected from the responsible administrative bodies and analysed at the object level. The model used is based on a basin-scale approach as well as data on elements at risk exposed
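
    The intensity-loss relation at the core of this kind of study, degree of loss as a function of process intensity, can be sketched as a bounded curve fitted to empirical damage points. The data and the one-parameter exponential form below are assumptions for illustration; the study develops its own vulnerability functions from collected loss data.

        import numpy as np
        from scipy.optimize import curve_fit

        depth = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0])       # flood intensity as depth, m (invented)
        loss = np.array([0.05, 0.15, 0.35, 0.55, 0.70, 0.90])  # degree of loss in [0, 1] (invented)

        def vulnerability(i, a):
            return 1.0 - np.exp(-a * i)                        # bounded, monotone loss curve (assumed form)

        (a_hat,), _ = curve_fit(vulnerability, depth, loss, p0=[0.5])
        print(f"fitted a = {a_hat:.2f}; predicted loss at 2.5 m depth: {vulnerability(2.5, a_hat):.2f}")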

  15. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) remains a scientific challenge, as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putatively orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and the analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands the development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated for quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet-spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to the successive orthogonal CS protocols employed in metabolomic studies.

  16. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has attracted increasing interest in recent years, in relation to globally rising CO2 levels and the related consideration of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels using a Species Sensitivity Distribution (SSD). It became evident that the information currently available in the literature is mostly insufficient for such a quantitative approach. Most studies focus on the effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
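
    A species sensitivity distribution of the kind called for above is typically a lognormal fit to per-species effect concentrations, from which the hazardous concentration for 5% of species (HC5) is read off. The effect levels below are invented placeholders with arbitrary units.

        import numpy as np
        from scipy import stats

        effect_levels = np.array([1200, 3500, 800, 5200, 2100, 950, 4100, 1800])  # per-species (invented)
        shape, loc, scale = stats.lognorm.fit(effect_levels, floc=0)
        hc5 = stats.lognorm.ppf(0.05, shape, loc, scale)   # level affecting the most sensitive 5% of species
        print(f"HC5 ~ {hc5:.0f} (same arbitrary units as the inputs)")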

  17. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this genome length bias. Therefore, we have made a simple benchmark for evaluating the "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
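
    The genome length bias and its simplest correction can be shown in a few lines: with equal copy numbers, raw read counts scale with genome length, and dividing counts by length restores equal proportions. The genome lengths and read total below are illustrative, not the benchmark's actual genomes.

        import numpy as np

        genome_lengths = np.array([2.0e6, 4.5e6, 9.0e6])          # bp, three bacteria (invented)
        reads = genome_lengths / genome_lengths.sum() * 30_000    # equal copies => reads scale with length

        naive = reads / reads.sum()                               # biased toward long genomes
        corrected = (reads / genome_lengths) / (reads / genome_lengths).sum()
        print("naive:", naive.round(3), "corrected:", corrected.round(3))  # corrected ~ 1/3 each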

  18. Evaluating Potential Health Risks in Relocatable Classrooms.

    ERIC Educational Resources Information Center

    Katchen, Mark; LaPierre, Adrienne; Charlin, Cary; Brucker, Barry; Ferguson, Paul

    2001-01-01

    Only limited data exist describing potential exposures to chemical and biological agents when using portable classrooms or outlining how to assess and reduce associated health risks. Evaluating indoor air quality involves examining ventilation rates, volatile organic compounds, and microbiologicals. Open communication among key stakeholders is…

  19. Two criteria for evaluating risk prediction models.

    PubMed

    Pfeiffer, R M; Gail, M H

    2011-09-01

    We propose and study two criteria to assess the usefulness of models that predict risk of disease incidence for screening and prevention, or the usefulness of prognostic models for management following disease diagnosis. The first criterion, the proportion of cases followed, PCF (q), is the proportion of individuals who will develop disease who are included in the proportion q of individuals in the population at highest risk. The second criterion is the proportion needed to follow up, PNF (p), namely the proportion of the general population at highest risk that one needs to follow in order that a proportion p of those destined to become cases will be followed. PCF (q) assesses the effectiveness of a program that follows 100q% of the population at highest risk. PNF (p) assesses the feasibility of covering 100p% of cases by indicating how much of the population at highest risk must be followed. We show the relationship of these two criteria to the Lorenz curve and its inverse, and present distribution theory for estimates of PCF and PNF. We develop new methods, based on influence functions, for inference for a single risk model, and also for comparing the PCFs and PNFs of two risk models, both of which were evaluated in the same validation data.
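
    Both criteria have direct empirical versions: sort the population by descending model risk and accumulate cases. The sketch below computes PCF(q) and PNF(p) on simulated scores and outcomes; the simulation is an assumption used only to make the code self-contained.

        import numpy as np

        rng = np.random.default_rng(2)
        risk = rng.uniform(0.0, 1.0, 10_000)                    # model risk scores (simulated)
        case = rng.uniform(0.0, 1.0, risk.size) < 0.1 * risk    # outcomes, more likely at higher risk

        order = np.argsort(-risk)                               # population sorted by descending risk
        cum_cases = np.cumsum(case[order]) / case.sum()

        def pcf(q):   # fraction of cases captured by following the top-q fraction of the population
            return cum_cases[int(q * risk.size) - 1]

        def pnf(p):   # fraction of the population needed to capture a fraction p of the cases
            return (np.searchsorted(cum_cases, p) + 1) / risk.size

        print(f"PCF(0.2) = {pcf(0.2):.2f}, PNF(0.8) = {pnf(0.8):.2f}")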

  20. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge, and in practice there is an overreliance on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can accurately assess the risk state of a bogie system. PMID:25574159
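
    The entropy weight step used above has a compact standard form: column-normalize the indicator matrix, compute each indicator's Shannon entropy, and give more weight to low-entropy (more discriminating) indicators. The 4 x 3 indicator matrix below is invented; the actual index system comes from inspection data and expert evaluation.

        import numpy as np

        X = np.array([[0.8, 0.6, 0.9],   # rows: bogie samples; columns: risk indicators (invented)
                      [0.5, 0.9, 0.4],
                      [0.7, 0.7, 0.8],
                      [0.4, 0.8, 0.6]])

        P = X / X.sum(axis=0)                        # column-normalized proportions (entries must be > 0)
        k = 1.0 / np.log(X.shape[0])
        entropy = -k * (P * np.log(P)).sum(axis=0)   # Shannon entropy per indicator
        weights = (1.0 - entropy) / (1.0 - entropy).sum()
        print("entropy weights:", weights.round(3))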

  1. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations, the rate of phase accumulation varies widely across the image, resulting in phase wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminating this global phase variation. However, the filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
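
    The homodyne step being compared above amounts to dividing the complex image by a low-pass-filtered copy of itself, so slowly varying global phase cancels and only local susceptibility phase remains. The sketch below uses a central Hanning window in k-space on a synthetic image; the window size and test phantom are assumptions.

        import numpy as np

        def homodyne_highpass_phase(cimg, k=32):
            """Residual (local) phase after dividing by a k x k Hanning low-pass of the image."""
            ny, nx = cimg.shape
            win = np.zeros((ny, nx))
            win[(ny - k)//2:(ny + k)//2, (nx - k)//2:(nx + k)//2] = np.outer(np.hanning(k), np.hanning(k))
            lowpass = np.fft.ifft2(np.fft.ifftshift(np.fft.fftshift(np.fft.fft2(cimg)) * win))
            return np.angle(cimg / (lowpass + 1e-12))

        # Synthetic phantom: a steep global phase ramp plus a small local "vein" perturbation.
        y, x = np.mgrid[0:256, 0:256] / 256.0
        cimg = np.exp(1j * (20 * x + 3 * np.exp(-((x - 0.5)**2 + (y - 0.5)**2) / 0.002)))
        print(f"residual phase std = {homodyne_highpass_phase(cimg).std():.3f} rad")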

  2. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical cardiac CT imaging. Different image-reconstruction algorithms that attempt to achieve this goal are available on current commercial CT systems. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merit. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance for cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at the default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for cardiac CT systems. To simulate heart motion, a moving coronary-type phantom synchronized with an ECG signal was used. Three different plaque percentages embedded in a 3 mm vessel phantom were imaged multiple times under motion-free, 60 bpm, and 80 bpm heart rates. Static (motion-free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.

  3. [Quantitative evaluation of soil hyperspectra denoising with different filters].

    PubMed

    Huang, Ming-Xiang; Wang, Ke; Shi, Zhou; Gong, Jian-Hua; Li, Hong-Yi; Chen, Jie-Liang

    2009-03-01

    The noise distribution of soil hyperspectra measured by an ASD FieldSpec Pro FR was described, and a quantitative evaluation of spectral denoising with six filters was compared. From the interpretation of the soil hyperspectra and of the continuum-removed, first-order differential, and high-frequency curves, the UV/VNIR region (350-1050 nm) exhibits hardly any noise except for roughly the first 40 nm beginning at 350 nm. The SWIR region (1000-2500 nm), however, shows a different noise distribution. In particular, the latter half of SWIR 2 (1800-2500 nm) showed more noise, and the spectra at the junctions of the three spectrometers are noisier than the neighboring spectra. Six filters were chosen for spectral denoising. A smoothing index (SI), a horizontal feature reservation index (HFRI), and a vertical feature reservation index (VFRI) were designed for evaluating the denoising performance of these filters. A comparison of their indexes shows that the WD and MA filters are the optimal choice for filtering the noise, in terms of balancing the trade-off between smoothing ability and feature reservation. Furthermore, the first-order differential data of 66 denoised soil spectra produced by the six filters were respectively used as input to the same PLSR model to predict sand content. The different prediction accuracies caused by the different filters show that, compared with feature reservation ability, a filter's smoothing ability is the principal factor influencing accuracy. The study can benefit spectral preprocessing and analysis, and also provides a scientific foundation for related spectroscopy applications.

  4. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited.

  5. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis allows one to indicate to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
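
    The separation the paper preserves can be seen in the standard two-dimensional Monte Carlo skeleton: an outer loop samples uncertain parameters, an inner loop samples variability, and each outer draw yields one risk estimate. All distributions and the exponential dose-response parameter below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n_unc, n_var = 200, 5_000

        risks = np.empty(n_unc)
        for i in range(n_unc):                            # outer loop: uncertainty
            mu = rng.normal(2.0, 0.3)                     # uncertain mean log10 dose (assumed)
            log_dose = rng.normal(mu, 0.8, n_var)         # inner loop: variability across servings
            p_ill = 1.0 - np.exp(-1e-3 * 10.0**log_dose)  # exponential dose-response (assumed r)
            risks[i] = p_ill.mean()                       # per-serving risk for this uncertainty draw

        print(f"median risk {np.median(risks):.2e}, 95% uncertainty interval "
              f"[{np.percentile(risks, 2.5):.2e}, {np.percentile(risks, 97.5):.2e}]")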

  6. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high-risk young women. The groups of high-risk young women in this review include those who are street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, or diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence, and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  7. Use of global sensitivity analysis in quantitative microbial risk assessment: application to the evaluation of a biological time temperature integrator as a quality and safety indicator for cold smoked salmon.

    PubMed

    Ellouze, M; Gauchi, J-P; Augustin, J-C

    2011-06-01

    The aim of this study was to apply a global sensitivity analysis (SA) method to model simplification and to evaluate (eO)®, a biological Time Temperature Integrator (TTI), as a quality and safety indicator for cold smoked salmon (CSS). Models were thus developed to predict the evolution of Listeria monocytogenes and the indigenous food flora in CSS and to predict the TTI's endpoint. A global SA was then applied to the three models to identify the less important factors and simplify the models accordingly. Results showed that the subset of the most important factors of the three models was mainly composed of the durations and temperatures of two chill chain links that are out of the control of the manufacturers: the domestic refrigerator and the retail/cabinet links. The simplified versions of the three models were then run with 10^4 time-temperature profiles representing the variability associated with the microbial behavior, the TTI's evolution, and the French chill chain characteristics. The results were used to assess the distributions of the microbial contaminations obtained at the TTI endpoint and at the end of the simulated profiles, and proved that, in the case of poor storage conditions, TTI use could reduce the number of unacceptable foods by 50%.

  8. Quantitative risk assessment of Vibrio parahaemolyticus in finfish: a model of raw horse mackerel consumption in Japan.

    PubMed

    Iwahori, Jun'ichiro; Yamamoto, Akio; Suzuki, Hodaka; Yamamoto, Takehisa; Tsutsui, Toshiyuki; Motoyama, Keiko; Sawada, Mikiko; Matsushita, Tomoki; Hasegawa, Atsushi; Osaka, Ken; Toyofuku, Hajime; Kasuga, Fumiko

    2010-12-01

    The aim of this study was to evaluate the effects of implemented control measures to reduce illness induced by Vibrio parahaemolyticus (V. parahaemolyticus) in horse mackerel (Trachurus japonicus), a seafood that is commonly consumed raw in Japan. On the basis of currently available experimental and survey data, we constructed a quantitative risk model of V. parahaemolyticus in horse mackerel from harvest to consumption. In particular, the following factors were evaluated: bacterial growth at all stages, effects of washing the fish body and storage water, and bacterial transfer from the fish surface, gills, and intestine to fillets during preparation. New parameters of the beta-Poisson dose-response model were determined from all human feeding trials, some of which have been used for risk assessment by the U.S. Food and Drug Administration (USFDA). The probability of illness caused by V. parahaemolyticus was estimated using both the USFDA dose-response parameters and our parameters for each selected pathway of scenario alternatives: washing whole fish at landing, storage in contaminated water, high temperature during transportation, and washing fish during preparation. The last scenario (washing fish during preparation) was the most effective for reducing the risk of illness, by about a factor of 10 compared to no washing at this stage. Risk of illness increased by 50% with exposure to increased temperature during transportation, according to our assumptions of duration and temperature. The other two scenarios did not significantly affect risk. The choice of dose-response parameters was not critical for the evaluation of control measures.
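
    The beta-Poisson dose-response form referenced above is P(ill) = 1 - (1 + d/β)^(-α). The sketch below uses placeholder parameters chosen only to be order-of-magnitude plausible; they are not the parameters fitted in this study or by the USFDA.

        def beta_poisson(dose, alpha=0.2, beta=1.0e5):     # placeholder parameters, not fitted values
            return 1.0 - (1.0 + dose / beta) ** (-alpha)

        for d in (1e2, 1e4, 1e6):                          # ingested cells per serving (illustrative)
            print(f"dose {d:.0e}: P(illness) = {beta_poisson(d):.4f}")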

  9. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  10. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  11. Risk evaluation: A cost-oriented approach

    SciTech Connect

    Rogers, B.H.

    1998-02-03

    This method provides a structured and cost-oriented way to determine risks associated with the loss and destruction of industrial security interests consisting of material assets and human resources. Loss and destruction are assumed to be adversary-perpetrated, high-impact events in which the health and safety of people or high-value property is at risk. The concept provides a process for (1) assessing the effectiveness of an integrated protection system, which includes facility operations, safety, emergency, and security systems, and (2) qualitatively prioritizing the level of consequence relative to cost and subsequent risk. The method allows managers the flexibility to establish asset protection appropriate to programmatic requirements and priorities and to decide whether funding is appropriate. The evaluation objectives are to (1) provide a systematic, qualitative tabletop process for estimating the potential for an undesirable event and its impact, and (2) identify ineffective protection and cost-effective solutions.

  12. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial-statement-oriented evaluation results based on logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We find, surprisingly, that the opinions extracted from both posts and commentaries surpass the opinions of analysts in terms of credit risk prediction.
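
    A minimal version of the logit benchmark mentioned above can be written with statsmodels; the two financial-statement features and the synthetic defaults below are stand-ins, not the paper's data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 500
        X = np.column_stack([rng.normal(0.05, 0.10, n),    # return on assets (assumed feature)
                             rng.normal(1.50, 0.60, n)])   # leverage ratio (assumed feature)
        true_logit = -2.0 - 8.0 * X[:, 0] + 1.2 * X[:, 1]
        y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

        model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(model.params)                                # roughly recovers the planted coefficients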

  13. Modified risk evaluation method. Revision 1

    SciTech Connect

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection.

  14. Evaluating Risk Communication After the Fukushima Disaster Based on Nudge Theory.

    PubMed

    Murakami, Michio; Tsubokura, Masaharu

    2017-03-01

    Using nudge theory and some examples of risk communication that followed the Fukushima disaster, this article discusses the influences and justifications of risk communication, in addition to how risk communication systems are designed. To assist people in making decisions based on their own value systems, we provide three suggestions, keeping in mind that people can be influenced (ie, "nudged") depending on how risk communication takes place: (1) accumulate knowledge on the process of evaluating how the method of risk communication and a system's default design could impact people; (2) clarify the purpose and outcomes of risk communication; and (3) examine which forms of risk communication might be ethically unjustifiable. Quantitative studies on risk communication and collective narratives will provide some ideas for how to design better risk communication systems and to help people make decisions. Furthermore, we have shown examples of unjustifiable risk communication.

  15. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 °C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 °C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms, between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.

  16. Quantitative evaluation of hybridization and the impact on biodiversity conservation.

    PubMed

    van Wyk, Anna M; Dalton, Desiré L; Hoban, Sean; Bruford, Michael W; Russo, Isa-Rita M; Birss, Coral; Grobler, Paul; van Vuuren, Bettine Janse; Kotzé, Antoinette

    2017-01-01

    Anthropogenic hybridization is an increasing conservation threat worldwide. In South Africa, recent hybridization is threatening numerous ungulate taxa. For example, the genetic integrity of the near-threatened bontebok (Damaliscus pygargus pygargus) is threatened by hybridization with the more common blesbok (D. p. phillipsi). Identifying nonadmixed parental and admixed individuals is challenging based on the morphological traits alone; however, molecular analyses may allow for accurate detection. Once hybrids are identified, population simulation software may assist in determining the optimal conservation management strategy, although quantitative evaluation of hybrid management is rarely performed. In this study, our objectives were to describe species-wide and localized rates of hybridization in nearly 3,000 individuals based on 12 microsatellite loci, quantify the accuracy of hybrid assignment software (STRUCTURE and NEWHYBRIDS), and determine an optimal threshold of bontebok ancestry for management purposes. According to multiple methods, we identified 2,051 bontebok, 657 hybrids, and 29 blesbok. More than two-thirds of locations contained at least some hybrid individuals, with populations varying in the degree of introgression. HYBRIDLAB was used to simulate four generations of coexistence between bontebok and blesbok, and to optimize a threshold of ancestry, where most hybrids will be detected and removed, and the fewest nonadmixed bontebok individuals misclassified as hybrids. Overall, a threshold Q-value (admixture coefficient) of 0.90 would remove 94% of hybrid animals, while a threshold of 0.95 would remove 98% of hybrid animals but also 8% of nonadmixed bontebok. To this end, a threshold of 0.90 was identified as optimal and has since been implemented in formal policy by a provincial nature conservation agency. Due to widespread hybridization, effective conservation plans should be established and enforced to conserve native populations that are

  17. Quantitative imaging to evaluate malignant potential of IPMNs

    PubMed Central

    Hanania, Alexander N.; Bantis, Leonidas E.; Feng, Ziding; Wang, Huamin; Tamm, Eric P.; Katz, Matthew H.; Maitra, Anirban; Koay, Eugene J.

    2016-01-01

    Objective To investigate using quantitative imaging to assess the malignant potential of intraductal papillary mucinous neoplasms (IPMNs) in the pancreas. Background Pancreatic cysts are identified in over 2% of the population and a subset of these, including intraductal papillary mucinous neoplasms (IPMNs), represent pre-malignant lesions. Unfortunately, clinicians cannot accurately predict which of these lesions are likely to progress to pancreatic ductal adenocarcinoma (PDAC). Methods We investigated 360 imaging features within the domains of intensity, texture and shape using pancreatic protocol CT images in 53 patients diagnosed with IPMN (34 “high-grade” [HG] and 19 “low-grade” [LG]) who subsequently underwent surgical resection. We evaluated the performance of these features as well as the Fukuoka criteria for pancreatic cyst resection. Results In our cohort, the Fukuoka criteria had a false positive rate of 36%. We identified 14 imaging biomarkers within Gray-Level Co-Occurrence Matrix (GLCM) that predicted histopathological grade within cyst contours. The most predictive marker differentiated LG and HG lesions with an area under the curve (AUC) of 0.82 at a sensitivity of 85% and specificity of 68%. Using a cross-validated design, the best logistic regression yielded an AUC of 0.96 (σ = 0.05) at a sensitivity of 97% and specificity of 88%. Based on the principal component analysis, HG IPMNs demonstrated a pattern of separation from LG IPMNs. Conclusions HG IPMNs appear to have distinct imaging properties. Further validation of these findings may address a major clinical need in this population by identifying those most likely to benefit from surgical resection. PMID:27588410

  18. [Evaluating individual occupational risk in teachers].

    PubMed

    Stepanov, E G; Ishmukhametov, I B

    2012-01-01

    The authors analyzed the working conditions of comprehensive-school teachers on the basis of workplace assessment. Additional studies covered the opportunistic-pathogen content of classroom air. Supplementary medical examinations evaluated the teachers' health state. Individual occupational risk was calculated with consideration of actual working conditions and health state. A comprehensive-school teacher's work is characterized by constant or transitory exposure to a complex of occupational hazards, chiefly (according to the workplace assessment) increased work intensity, noise, and inadequate illumination. The ambient air of classrooms constantly contains high numbers of opportunistic pathogens, which could depress immune system parameters and cause more droplet infections. The individual occupational risk of teachers, calculated with consideration of working conditions and health state parameters, appears to be high and indicates a high possibility of damage to teachers' health at work. Recommendations cover the evaluation of biological factors within workplace assessment and obligatory preliminary (pre-employment) and periodic medical examinations for comprehensive-school teachers, as for workers exposed to occupational hazards.

  19. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to significant errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including a more representative analyzed sample, more effective recognition of chrysotile, and a lower cost. The DIATI LAA internal methodology for PCOM analysis is based on mild grinding of a rock sample, its subdivision into 5-6 grain-size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices, instead, cannot lean on any statistical distribution, because the most important aspect of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting ratio between the weight of the fibrous component and that of the granular one. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimate of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits. The error assessments must

  20. Quantitative Risks

    DTIC Science & Technology

    2015-02-24

    artifacts: the System Segment Specification (SSS), the Specification Tree (ST), the System Architecture (SA), Technical Review Checklists (TRC), and the... element, and/or by IMP entry. By including maturity advancement steps in the IMP, the cost and schedule of maturity advancement are tracked. The SSS... begins with the performance specification (P-Spec), a part of the RFP package. The SSS contains derived requirements for system segments of the

  1. Development of a quantitative microbial risk assessment for human salmonellosis through household consumption of fresh minced pork meat in Belgium.

    PubMed

    Bollaerts, Kaatje Els; Messens, Winy; Delhalle, Laurent; Aerts, Marc; Van der Stede, Yves; Dewulf, Jeroen; Quoilin, Sophie; Maes, Dominiek; Mintiens, Koen; Grijspeerdt, Koen

    2009-06-01

    A quantitative microbial risk assessment (QMRA) according to the Codex Alimentarius Principles is conducted to evaluate the risk of human salmonellosis through household consumption of fresh minced pork meat in Belgium. The quantitative exposure assessment is carried out by building a modular risk model, called the METZOON-model, which covers pork production from farm to fork. In the METZOON-model, the food production pathway is split up into six consecutive modules: (1) primary production, (2) transport and lairage, (3) slaughterhouse, (4) postprocessing, (5) distribution and storage, and (6) preparation and consumption. All the modules are developed to resemble the Belgian situation as closely as possible, making use of the available national data. Several statistical refinements and improved modeling techniques are proposed. The model produces highly realistic results. The baseline predicted number of annual salmonellosis cases is 20,513 (SD 9,061.45). The risk is estimated to be higher for the susceptible population (estimate 4.713 × 10^-5; SD 1.466 × 10^-5) than for the normal population (estimate 7.704 × 10^-6; SD 5.414 × 10^-6) and is mainly due to undercooking and, to a smaller extent, to cross-contamination in the kitchen via the cook's hands.
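
    As an illustration of how such a modular farm-to-fork model can be chained in a Monte Carlo framework, the following minimal Python sketch condenses the exposure modules into a single contamination-and-preparation step followed by a dose-response step. All distributions, parameter values, and the exponential dose-response are placeholders for illustration, not the values used in the Belgian study.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000  # Monte Carlo iterations (one hypothetical serving each)

    # Modules 1-5 condensed: log10 Salmonella concentration at the moment of preparation
    log_conc = rng.normal(loc=-2.0, scale=1.0, size=N)        # illustrative, log10 CFU/g

    # Module 6: preparation -- undercooking reduces but may not eliminate the dose
    serving_g = rng.triangular(50, 100, 200, size=N)          # serving size, g
    log_reduction = rng.triangular(3, 6, 8, size=N)           # cooking kill, log10
    dose = serving_g * 10 ** (log_conc - log_reduction)       # surviving CFU ingested

    # Hazard characterization: exponential dose-response (r is illustrative)
    r = 1e-3
    p_ill = 1 - np.exp(-r * dose)

    servings_per_year = 2_000_000_000  # hypothetical national consumption
    print("mean P(ill) per serving:", p_ill.mean())
    print("predicted annual cases:", p_ill.mean() * servings_per_year)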

  2. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
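
    A minimal sketch of the kind of GLCM 'correlation' computation described above, using scikit-image. The ROI segmentation and BMD calibration are assumed to have been done already; the gray-level quantization, offsets, and angles are illustrative choices, not the study's settings.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_correlation(roi_bmd, levels=32):
        """GLCM 'correlation' feature from a BMD-calibrated ROI (2-D array).

        Quantization to a small number of gray levels is a common, but here
        assumed, preprocessing choice; a non-constant ROI is assumed.
        """
        # Rescale BMD values to integer gray levels 0..levels-1
        lo, hi = roi_bmd.min(), roi_bmd.max()
        q = np.clip((roi_bmd - lo) / (hi - lo) * (levels - 1), 0, levels - 1)
        q = q.astype(np.uint8)
        glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                            levels=levels, symmetric=True, normed=True)
        return graycoprops(glcm, "correlation").mean()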

  3. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach provides a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data on loads, system response, and consequences, using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact of flood defences and improved warning communication schemes on risk reduction through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 % of the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods so that urban flood risk analyses can be replicated at regional and national scales.
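
    The expected annual affected population used above as a risk metric can be illustrated with a crude discrete approximation: summing, over representative flood events, the annual probability of each event times its affected population. The scenario table below is hypothetical, and a full analysis would integrate over the frequency-consequence curve derived from the event trees.

    # Hypothetical flood events: (annual probability of this representative event,
    # population affected if it occurs)
    scenarios = [
        (1 / 10, 500),      # frequent, shallow flooding
        (1 / 100, 4_000),
        (1 / 500, 12_000),
    ]

    # Expected annual affected population as a probability-weighted sum
    eaap = sum(p * n for p, n in scenarios)
    print(f"expected annual affected population: {eaap:.0f}")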

  4. [Evaluation of noise risk in roadmen].

    PubMed

    Giorgianni, C; Trimboli, K; Tanzariello, G; Fontana, G; Abbate, S; Callipari, S; Galtieri, G

    2007-01-01

    The occupational activity of roadmen is not well known. Risk evaluation identifies noise and vibration as the principal hazards. The aim of our study was to deepen knowledge of the occupational risk of this job. The study was conducted on a sample of 287 roadmen. We carried out an evaluation of the workers' noise exposure and a biological evaluation that included: a medical check-up, an otorhinolaryngological examination, an audiometric examination, and an impedance (tympanometric) examination. The audiometric examination was conducted in a silent booth after an acoustic rest of at least 16 h. The audiograms were classified according to the Merluzzi-Pira-Bosio method. The phonometric evaluations showed high noise exposure among the roadmen examined, with mean daily exposure level (Lep,d) > 90 dB. The biological evaluation, through the medical check-up, showed data similar to those of a general group homogeneous for age and sex. The audiometric evaluation showed noise-induced acoustic trauma in 60% of the sample. The impedance examination confirmed the absence of conductive (transmissive) damage in almost the whole sample. In conclusion, we can affirm that noise is a significant risk factor for roadmen, with evident damage to exposed workers.

  5. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    The utility of longitudinal non-axisymmetric flexural modes in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems, such as a partial-loading angle beam, an axisymmetric comb transducer, and an angle beam array, are studied. The Normal Mode Expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Numerical calculations show that an angular profile varies with frequency, mode, and propagation distance. Since an angular profile determines the energy distribution of the guided waves, it has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used to discuss their characteristics. Angular profiles have many applications in practical pipe testing. The procedure for building desired angular profiles, and for angular profile tuning, is discussed. This tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, the performance of a transducer array is discussed in terms of guided wave mode control ability and excitation sensitivity. With time-delay inputs, a transducer array's mode control ability and sensitivity are greatly improved. The algorithms for setting time delays are derived based on frequency, element spacing and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is
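
    One common delay law consistent with the description above (delays set from element spacing and phase velocity) is sketched below for a linear comb-type array. It is an illustrative simplification, not the dissertation's circumferential-array algorithm.

    import numpy as np

    def element_delays(n_elements, spacing_m, phase_velocity_mps):
        """Linear time-delay law for a comb-type array (illustrative).

        To reinforce a guided-wave mode whose phase velocity along the pipe
        axis is c_p, successive elements spaced d apart are fired with an
        incremental delay d / c_p so their wavefronts add in phase.
        """
        d, cp = spacing_m, phase_velocity_mps
        return np.arange(n_elements) * d / cp

    # Example: 8 elements, 10 mm pitch, targeting a 5 km/s phase-velocity mode
    print(element_delays(8, 0.010, 5000.0))  # delays in seconds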

  6. Veterinary drugs: disposition, biotransformation and risk evaluation.

    PubMed

    Fink-Gremmels, J; van Miert, A S

    1994-12-01

    Veterinary drugs may only be produced, distributed and administered after being licensed. This implies that, prior to marketing, a critical evaluation of the pharmaceutical quality, the clinical efficacy and the overall pharmacological and toxicological properties of the active substances will be performed by national and/or supranational authorities. However, despite a sophisticated, harmonized legal framework, a number of factors involved in residue formation and safety assessment remain unpredictable or dependent on the current 'state of the art' in the understanding of molecular pharmacology and toxicology. For example, drug disposition and residue formation in the target animal species may be influenced by a broad variety of physiological parameters including age, sex and diet, as well as by pathological conditions, especially the acute phase response to infection. These factors affect both drug disposition and metabolite formation. Furthermore, current thinking in toxicological risk assessment is influenced by recent developments in molecular toxicology and thus by an increased but still incomplete understanding of the interaction of a toxic compound with the living organism. Generally recognized principles in the evaluation of potential toxicants are applied in the recommendation of withdrawal times and the establishment of maximum residue limits (MRL values). Apart from toxicology-based assessment, increasing attention is directed to responses other than toxicological ones, especially the potential risk of effects of antimicrobial residues on the human gastrointestinal microflora. Thus, the methodology of risk assessment is discussed in the context of the recently established legal framework within the European Union.

  7. [Thermal comfort in perioperatory risk's evaluation].

    PubMed

    Masia, M D; Dettori, M; Liperi, G; Deriu, G M; Posadino, S; Maida, G; Mura, I

    2009-01-01

    Studies conducted to date on the microclimate of operating rooms have focused mainly on operators' thermal comfort, given that conditions of discomfort may compromise their working performance. In recent years, however, the anesthesiological community has drawn attention to the risks to patients caused by perioperative deviations from normothermia, underlining the need to orient studies toward identifying microclimate characteristics able to guarantee the thermal comfort of the patient as well. In light of these considerations, a study was conducted in the operating rooms of the university hospital and the local health unit (USL) n. 1 of Sassari, aimed, on the one hand, at determining the microclimate characteristics of the operating blocks and evaluating operators' and patients' thermal comfort and, on the other, at identifying, through a software simulation, microclimate conditions that ensure thermal comfort for both categories simultaneously. The results confirm the existence of a thermal "gap" between operators and patients, the latter constantly subjected to cold stress, sometimes very pronounced. We therefore underline the importance of the microclimate in operating rooms, because particular situations can affect perioperative risks. Moreover, it may be useful to integrate the risk classes of the American Society of Anesthesiologists (ASA) with a score attributed to the PMV/PPD variation, obtaining more realistic operative risk indicators.
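
    For reference, the PPD index mentioned above follows directly from PMV through the ISO 7730 relation, as in this short sketch (the example PMV values are hypothetical):

    import math

    def ppd_from_pmv(pmv):
        """Predicted Percentage Dissatisfied from Fanger's PMV (ISO 7730)."""
        return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

    # A hypothetical patient-side PMV of -1.5 ("cold stress") vs a neutral 0.0
    print(ppd_from_pmv(-1.5))  # ~51 % dissatisfied
    print(ppd_from_pmv(0.0))   # 5 %, the model's minimum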

  8. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

    A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10^5 CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues.
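
    A minimal sketch of the kind of temperature-dependent growth calculation underlying such growth modules, using the Ratkowsky square-root model; the parameters below are illustrative placeholders, not values fitted in the study.

    import numpy as np

    def specific_growth_rate(temp_c, b=0.02, t_min=7.0):
        """Ratkowsky square-root model: sqrt(mu) = b * (T - Tmin).

        b and Tmin are illustrative placeholders, not fitted S. aureus values.
        Returns mu in log10 CFU/ml per hour; zero below the growth minimum.
        """
        mu_sqrt = b * (np.asarray(temp_c, dtype=float) - t_min)
        return np.where(mu_sqrt > 0, mu_sqrt**2, 0.0)

    # Abuse-time check: hours at 25 C needed to climb 3 log10
    # (e.g., from 10^2 CFU/ml to the 10^5 CFU/ml level of concern)
    mu = specific_growth_rate(25.0)   # log10 per hour
    print(3.0 / mu, "hours")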

  9. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Product Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.
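
    As a minimal illustration of the benchmark dose methodology favored by many participants, the sketch below fits a quantal-linear dose-response model to hypothetical developmental-toxicity data and solves for the dose giving 10% extra risk. The data, model choice, and benchmark response are all assumptions made for illustration.

    import numpy as np
    from scipy.optimize import brentq, curve_fit

    # Hypothetical quantal data: dose, animals per group, affected
    dose = np.array([0.0, 10.0, 50.0, 150.0, 400.0])
    n    = np.array([50, 50, 50, 50, 50])
    aff  = np.array([2, 3, 6, 15, 34])

    def model(d, g, beta):
        """Quantal-linear model with background response g (illustrative choice)."""
        return g + (1 - g) * (1 - np.exp(-beta * d))

    p_obs = aff / n
    (g, beta), _ = curve_fit(model, dose, p_obs, p0=[0.05, 0.005],
                             bounds=([0, 0], [1, 1]))

    # Benchmark dose: extra risk (P(d) - P(0)) / (1 - P(0)) equals BMR = 10 %
    bmr = 0.10
    bmd = brentq(lambda d: (model(d, g, beta) - g) / (1 - g) - bmr, 1e-6, dose.max())
    print(f"BMD10 ~ {bmd:.1f} (same units as dose)")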

  10. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  11. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help in actively dealing with climate change and ensuring food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase (88.3%), the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, and the risk was 2.2%. For the maximum precipitation decrease (35.2%), the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, and the risk was 7.9%. For the comprehensive impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase, and the risk will grow accordingly.
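
    Since risk is defined above as the product of the degree of loss and its probability of occurrence, the quoted spring-wheat figures can be reproduced in a few lines (small differences, e.g. 2.3 vs. 2.2%, reflect rounding of the quoted inputs):

    # Risk = degree of loss x probability of occurrence (values in percent)
    cases = {
        "temperature":   (3.5, 64.6),
        "precipitation": (14.1, 56.1),
        "combined":      (17.6, 53.4),
    }
    for factor, (loss_pct, prob_pct) in cases.items():
        risk_pct = loss_pct * prob_pct / 100.0
        print(f"{factor:>13}: risk = {loss_pct} % x {prob_pct} % = {risk_pct:.1f} %")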

  12. Evaluating IPMN and pancreatic carcinoma utilizing quantitative histopathology.

    PubMed

    Glazer, Evan S; Zhang, Hao Helen; Hill, Kimberly A; Patel, Charmi; Kha, Stephanie T; Yozwiak, Michael L; Bartels, Hubert; Nafissi, Nellie N; Watkins, Joseph C; Alberts, David S; Krouse, Robert S

    2016-10-01

    Intraductal papillary mucinous neoplasms (IPMN) are pancreatic lesions with uncertain biologic behavior. This study sought objective, accurate prediction tools, through the use of quantitative histopathological signatures of nuclear images, for classifying lesions as chronic pancreatitis (CP), IPMN, or pancreatic carcinoma (PC). Forty-four pancreatic resection patients were retrospectively identified for this study (12 CP; 16 IPMN; 16 PC). Regularized multinomial regression quantitatively classified each specimen as CP, IPMN, or PC in an automated, blinded fashion. Classification certainty was determined by subtracting the smallest classification probability from the largest probability (of the three groups). The certainty function varied from 1.0 (perfectly classified) to 0.0 (random). From each lesion, 180 ± 22 nuclei were imaged. Overall classification accuracy was 89.6% with six unique nuclear features. No CP cases were misclassified, 1/16 IPMN cases were misclassified, and 4/16 PC cases were misclassified. Certainty function was 0.75 ± 0.16 for correctly classified lesions and 0.47 ± 0.10 for incorrectly classified lesions (P = 0.0005). Uncertainty was identified in four of the five misclassified lesions. Quantitative histopathology provides a robust, novel method to distinguish among CP, IPMN, and PC with a quantitative measure of uncertainty. This may be useful when there is uncertainty in diagnosis.
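
    A minimal sketch of the classification-certainty idea described above, using scikit-learn's regularized multinomial logistic regression on synthetic stand-in data; the real study used six selected nuclear features and its own regularization scheme.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Illustrative stand-in: X holds per-lesion nuclear-feature vectors,
    # y the known class (0 = CP, 1 = IPMN, 2 = PC); values are synthetic.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(44, 6))
    y = rng.integers(0, 3, size=44)

    # Regularized (L2) multinomial logistic regression
    clf = LogisticRegression(C=1.0).fit(X, y)

    proba = clf.predict_proba(X)
    # Certainty = largest minus smallest class probability:
    # 1.0 = perfectly classified, ~0 = essentially random
    certainty = proba.max(axis=1) - proba.min(axis=1)
    print(certainty[:5])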

  13. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: II. Risk characterization.

    PubMed

    Pouillot, Régis; Goulet, Véronique; Delignette-Muller, Marie Laure; Mahé, Aurélie; Cornu, Marie

    2009-06-01

    A model for the assessment of exposure to Listeria monocytogenes from cold-smoked salmon consumption in France was presented in the first of this pair of articles (Pouillot et al., 2007, Risk Analysis, 27:683-700). In the present study, the exposure model output was combined with an internationally accepted hazard characterization model, adapted to the French situation, to assess the risk of invasive listeriosis from cold-smoked salmon consumption in France in a second-order Monte Carlo simulation framework. The annual number of cases of invasive listeriosis due to cold-smoked salmon consumption in France is estimated to be 307, with a very large credible interval ([10; 12,453]), reflecting data uncertainty. This uncertainty is mainly associated with the dose-response model. Despite the significant uncertainty associated with the predictions, this model provides a scientific base for risk managers and food business operators to manage the risk linked to cold-smoked salmon contaminated with L. monocytogenes. Under the modeling assumptions, risk would be efficiently reduced through a decrease in the prevalence of L. monocytogenes or better control of the last steps of the cold chain (shorter and/or colder storage during the consumer step), whereas reduction of the initial contamination levels of the contaminated products and improvement in the first steps of the cold chain do not seem to be promising strategies. An attempt to apply the recent risk-based concept of FSO (food safety objective) on this example underlines the ambiguity in practical implementation of the risk management metrics and the need for further elaboration on these concepts.

  14. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible

  15. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
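
    A minimal sketch of the distribution-fitting-with-bootstrap workflow described above, on synthetic stand-in data; the gamma model and sample sizes are illustrative assumptions, not the survey's actual choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Hypothetical survey answers: reported storage times of a product (days)
    storage_days = rng.gamma(shape=2.0, scale=1.5, size=300)

    # Fit a candidate distribution, then bootstrap the parameter uncertainty
    n_boot = 1000
    params = np.array([
        stats.gamma.fit(rng.choice(storage_days, size=storage_days.size), floc=0)
        for _ in range(n_boot)
    ])  # columns: shape, loc (fixed at 0), scale

    shape_ci = np.percentile(params[:, 0], [2.5, 97.5])
    scale_ci = np.percentile(params[:, 2], [2.5, 97.5])
    print("shape 95% CI:", shape_ci, "scale 95% CI:", scale_ci)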

  16. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 lag time models in leafy greens, and validation of the importance of cross-contamination during the washing process.

  17. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
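
    The ranking step can be illustrated with a classic FMEA-style risk priority product; the factors, 1-10 scales, and failure modes below are assumptions for illustration, not the QFMEA model's actual scoring rules.

    # Illustrative FMEA-style ranking (scales and entries are hypothetical)
    risks = [
        # (failure mode, P(failure) 1-10, severity 1-10, detection difficulty 1-10)
        ("wellbore casing leak", 6, 8, 4),
        ("caprock fracture",     2, 9, 8),
        ("pipeline rupture",     3, 7, 2),
    ]

    ranked = sorted(risks, key=lambda r: r[1] * r[2] * r[3], reverse=True)
    for name, p, s, d in ranked:
        print(f"{name:22s} priority = {p * s * d}")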

  18. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs, and it is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina this disease is endemic, and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10^-6 and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (Pinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allows assessment of the impact of the different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of pork by-products.

  19. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  20. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  1. Designs for Risk Evaluation and Management

    SciTech Connect

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processer on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
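
    A minimal sketch of the simulated-annealing search described above: iteratively mutating a monitoring scheme and accepting worse schemes with a temperature-dependent probability. The detection-time table, cooling schedule, and mutation rule are illustrative assumptions, not DREAM's implementation.

    import math
    import random

    random.seed(7)

    # Hypothetical table: detection_time[k][s] is when candidate sensor location k
    # first detects leak scenario s (large value = effectively never)
    N_LOCS, N_SCEN, BUDGET = 40, 25, 5
    detection_time = [[random.uniform(1, 100) for _ in range(N_SCEN)]
                      for _ in range(N_LOCS)]

    def cost(scheme):
        """Mean time to first detection across scenarios for a sensor set."""
        return sum(min(detection_time[k][s] for k in scheme)
                   for s in range(N_SCEN)) / N_SCEN

    def mutate(scheme):
        """Swap one monitoring location for an unused one."""
        new = list(scheme)
        new[random.randrange(len(new))] = random.choice(
            [k for k in range(N_LOCS) if k not in scheme])
        return new

    scheme = random.sample(range(N_LOCS), BUDGET)
    temp = 10.0
    for step in range(5000):
        cand = mutate(scheme)
        delta = cost(cand) - cost(scheme)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            scheme = cand        # accept improvement, or worse move with prob e^(-d/T)
        temp *= 0.999            # geometric cooling

    print("best scheme:", sorted(scheme), "mean detection time:", round(cost(scheme), 2))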

  2. Evaluation of residue drum storage safety risks

    SciTech Connect

    Conner, W.V.

    1994-06-17

    A study was conducted to determine if any potential safety problems exist in the residue drum backlog at the Rocky Flats Plant. Plutonium residues stored in 55-gallon drums were packaged for short-term storage until the residues could be processed for plutonium recovery. These residues have now been determined by the Department of Energy to be waste materials, and the residues will remain in storage until plans for disposal of the material can be developed. The packaging configurations which were safe for short-term storage may not be safe for long-term storage. Interviews with Rocky Flats personnel involved with packaging the residues reveal that more than one packaging configuration was used for some of the residues. A tabulation of packaging configurations was developed based on the information obtained from the interviews. A number of potential safety problems were identified during this study, including hydrogen generation from some residues and residue packaging materials, contamination containment loss, metal residue packaging container corrosion, and pyrophoric plutonium compound formation. Risk factors were developed for evaluating the risk potential of the various residue categories, and the residues in storage at Rocky Flats were ranked by risk potential. Preliminary drum head space gas sampling studies have demonstrated the potential for formation of flammable hydrogen-oxygen mixtures in some residue drums.

  3. Evaluation of the "Respect Not Risk" Firearm Safety Lesson for 3rd-Graders

    ERIC Educational Resources Information Center

    Liller, Karen D.; Perrin, Karen; Nearns, Jodi; Pesce, Karen; Crane, Nancy B.; Gonzalez, Robin R.

    2003-01-01

    The purpose of this study was to evaluate the MORE HEALTH "Respect Not Risk" Firearm Safety Lesson for 3rd-graders in Pinellas County, Florida. Six schools representative of various socioeconomic levels were selected as the test sites. Qualitative and quantitative data were collected. A total of 433 matched pretests/posttests were used…

  4. A quantitative assessment of risks of heavy metal residues in laundered shop towels and their use by workers.

    PubMed

    Connor, Kevin; Magee, Brian

    2014-10-01

    This paper presents a risk assessment of exposure to metal residues in laundered shop towels by workers. The concentrations of 27 metals measured in a synthetic sweat leachate were used to estimate the releasable quantity of metals which could be transferred to workers' skin. Worker exposure was evaluated quantitatively with an exposure model that focused on towel-to-hand transfer and subsequent hand-to-food or -mouth transfers. The exposure model was based on conservative, but reasonable assumptions regarding towel use and default exposure factor values from the published literature or regulatory guidance. Transfer coefficients were derived from studies representative of the exposures to towel users. Contact frequencies were based on assumed high-end use of shop towels, but constrained by a theoretical maximum dermal loading. The risk estimates for workers developed for all metals were below applicable regulatory risk benchmarks. The risk assessment for lead utilized the Adult Lead Model and concluded that predicted lead intakes do not constitute a significant health hazard based on potential worker exposures. Uncertainties are discussed in relation to the overall confidence in the exposure estimates developed for each exposure pathway and the likelihood that the exposure model is under- or overestimating worker exposures and risk.
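
    A screening-level sketch of the towel-to-hand-to-mouth exposure pathway described above; all transfer fractions and usage rates are placeholders, since the study's coefficients came from its cited transfer literature.

    def average_daily_dose(c_releasable_ug_per_towel, towels_per_day,
                           transfer_to_hand=0.1, hand_to_mouth=0.05,
                           body_weight_kg=70.0):
        """Screening-level ADD (ug/kg-day) for towel -> hand -> mouth transfer.

        All transfer fractions here are illustrative placeholders, not the
        study's fitted transfer coefficients.
        """
        intake = (c_releasable_ug_per_towel * towels_per_day
                  * transfer_to_hand * hand_to_mouth)
        return intake / body_weight_kg

    # Example: 12 towels/day, each releasing 50 ug of a metal to synthetic sweat
    print(average_daily_dose(50.0, 12), "ug/kg-day")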

  5. Chairside quantitative immunochromatographic evaluation of salivary cotinine and its correlation with chronic periodontitis

    PubMed Central

    Surya, Chamarthi; Swamy, Devulapally Narasimha; Chakrapani, Swarna; Kumar, Surapaneni Sunil

    2012-01-01

    Background: Cigarette smoking is an established and modifiable risk factor for periodontitis. Periodontitis appears to be dose-dependent on smoking. The purpose of this study was to assess a reliable marker of tobacco smoke exposure (salivary cotinine) chairside and to confirm the quantitative association between smoking and chronic periodontitis. Materials and Methods: Saliva samples from 80 males, aged 30–60 years, with chronic periodontitis, were evaluated chairside using NicAlert™ cotinine test strips (NCTS). Patients were divided into two groups: A (cotinine negative) and B (cotinine positive). Plaque index (PI), gingival index (GI), gingival bleeding index (GBI), probing pocket depth (PPD), clinical attachment level (CAL), and gingival recession (GR) were compared between the two groups and among the subjects of group B. Results: Comparison showed that the severity of PPD (P<0.001), CAL (P<0.001), and GR (P<0.001) was greater in group B than in group A. Severity of all periodontal parameters increased with increased salivary cotinine among the subjects in group B. Conclusion: A direct quantitative association can be established between salivary cotinine and the severity of periodontitis. Immunochromatography-based cotinine test strips are a relatively easy method for chairside quantification of salivary cotinine. Immediate and personalized feedback from a chairside test can improve compliance and quit rates, and make reinforcing smoking cessation easier. PMID:23492903

  6. Comparison of Risk Predicted by Multiple Norovirus Dose-Response Models and Implications for Quantitative Microbial Risk Assessment.

    PubMed

    Van Abel, Nicole; Schoen, Mary E; Kissel, John C; Meschke, J Scott

    2016-06-10

    The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common, as there is a high frequency of outbreaks worldwide. A key component of QMRA is the dose-response analysis, which is the mathematical characterization of the association between dose and outcome. For norovirus, multiple dose-response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose-response models currently used in QMRA and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose-response models. The review found that the majority of published QMRAs of norovirus use the 1F1 hypergeometric dose-response model with α = 0.04, β = 0.055. This dose-response model predicted relatively high risk estimates compared with other dose-response models for doses in the range of 1-1,000 genomic equivalent copies. The difference in predicted risk among dose-response models was largest for small doses, which has implications for drinking water QMRAs, where the concentration of norovirus is low. Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose-response models in QMRA of norovirus. Finally, in the absence of one best norovirus dose-response model, multiple models should be used to provide a range of predicted outcomes for probability of infection.
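
    For reference, the 1F1 hypergeometric (exact beta-Poisson) model with the parameters quoted above can be evaluated with SciPy, as in this sketch:

    import numpy as np
    from scipy.special import hyp1f1

    alpha, beta = 0.04, 0.055  # parameters quoted above

    def p_infection(dose):
        """Exact beta-Poisson: P(inf) = 1 - 1F1(alpha, alpha + beta, -dose)."""
        return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

    for d in (1, 10, 100, 1000):  # genomic equivalent copies
        print(d, round(float(p_infection(d)), 3))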

  7. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first of a kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.

  8. Promising quantitative nondestructive evaluation techniques for composite materials

    NASA Technical Reports Server (NTRS)

    Williams, J. H., Jr.; Lee, S. S.

    1985-01-01

    Some recent results in the area of the ultrasonic, acoustic emission, thermographic, and acousto-ultrasonic NDE of composites are reviewed. In particular, attention is given to the progress in the use of ultrasonic attenuation, acoustic emission (parameter) delay, liquid-crystal thermography, and the stress wave factor in structural integrity monitoring of composite materials. The importance of NDE flaw significance characterizations is emphasized since such characterizations can directly indicate the appropriate NDE technique sensitivity requirements. The role of the NDE of flawed composites with and without overt defects in establishing quantitative accept/reject criteria for structural integrity assessment is discussed.

  9. Quantitative evaluation of the transplanted lin(-) hematopoietic cell migration kinetics.

    PubMed

    Kašėta, Vytautas; Vaitkuvienė, Aida; Liubavičiūtė, Aušra; Maciulevičienė, Rūta; Stirkė, Arūnas; Biziulevičienė, Genė

    2016-02-01

    Stem cells take part in organogenesis, cell maturation and injury repair. Migration is necessary for each of these functions to occur. The aim of this study was to investigate the kinetics of a transplanted hematopoietic lin(-) cell population (which consists mainly of stem and progenitor cells) in a BALB/c mouse contact hypersensitivity model and to quantify the migration to the site of inflammation in the affected foot and to other, healthy organs. Quantitative analysis was carried out with the real-time polymerase chain reaction method. Spleen, kidney, bone marrow, lung, liver, and damaged and healthy foot tissue samples were collected for analysis at different time points. The quantitative data normalization was performed according to the comparative quantification method. The analysis of foot samples shows significant migration of transplanted cells to the affected foot of recipient mice: the quantity was more than 1000 times higher than in the untreated foot. Due to the inflammation, the number of donor-origin cells migrating to the lungs, liver, spleen and bone marrow was found to be decreased. Our data show that transplanted cells migrated selectively into the inflamed areas of the foot edema. The inflammation also caused secondary migration from ectopic hematopoietic stem cell niches in the spleen, with re-homing from the spleen to the bone marrow.
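
    A minimal sketch of the comparative quantification (ΔΔCt) calculation that underlies this kind of real-time PCR analysis; the threshold-cycle values are hypothetical, and ~100 % amplification efficiency is assumed.

    def relative_quantity(ct_target, ct_reference, ct_target_cal, ct_reference_cal):
        """Comparative (delta-delta Ct) quantification: fold change = 2^-ddCt.

        ct_* are qPCR threshold cycles for a donor-specific target and a
        housekeeping reference gene, in the test sample and a calibrator
        sample; values below are hypothetical.
        """
        delta_ct_sample = ct_target - ct_reference
        delta_ct_cal = ct_target_cal - ct_reference_cal
        return 2.0 ** -(delta_ct_sample - delta_ct_cal)

    # Example: affected foot vs the untreated foot as calibrator
    print(relative_quantity(22.1, 18.0, 32.2, 18.1))  # ~1000-fold higher signal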

  10. Quantitative analytical method to evaluate the metabolism of vitamin D.

    PubMed

    Mena-Bravo, A; Ferreiro-Vera, C; Priego-Capote, F; Maestro, M A; Mouriño, A; Quesada-Gómez, J M; Luque de Castro, M D

    2015-03-10

    A method for the quantitative analysis of vitamin D (both D2 and D3) and its main metabolites - the monohydroxylated forms (25-hydroxyvitamin D2 and 25-hydroxyvitamin D3) and the dihydroxylated metabolites (1,25-dihydroxyvitamin D2, 1,25-dihydroxyvitamin D3 and 24,25-dihydroxyvitamin D3) - in human serum is reported here. The method is based on direct analysis of serum by an automated platform involving on-line coupling of a solid-phase extraction workstation to a liquid chromatograph-tandem mass spectrometer. Detection of the seven analytes was carried out in the selected reaction monitoring (SRM) mode, and quantitative analysis was supported by the use of stable isotopically labeled internal standards (SIL-ISs). The detection limits were between 0.3 and 75 pg/mL for the target compounds, while precision (expressed as relative standard deviation) was below 13.0% for between-day variability. The method was externally validated according to the vitamin D External Quality Assurance Scheme (DEQAS) through the analysis of ten serum samples provided by this organization. The analytical features of the method support its applicability in nutritional and clinical studies aimed at elucidating the role of vitamin D metabolism.

  11. Quantitative evaluation of bioorthogonal chemistries for surface functionalization of nanoparticles.

    PubMed

    Feldborg, Lise N; Jølck, Rasmus I; Andresen, Thomas L

    2012-12-19

    We present here a highly efficient and chemoselective liposome functionalization method based on oxime bond formation between a hydroxylamine and an aldehyde-modified lipid component. We have conducted a systematic and quantitative comparison of this new approach with other state-of-the-art conjugation reactions in the field. Targeted liposomes that recognize overexpressed receptors or antigens on diseased cells have great potential in therapeutic and diagnostic applications. However, chemical modifications of nanoparticle surfaces by postfunctionalization approaches are less effective than in solution and often not high-yielding. In addition, the conjugation efficiency is often challenging to characterize and therefore not addressed in many reports. We present here an investigation of PEGylated liposomes functionalized with a neuroendocrine tumor targeting peptide (TATE), synthesized with a variety of functionalities that have been used for surface conjugation of nanoparticles. The reaction kinetics and overall yield were quantified by HPLC. Reactions were conducted in solution as well as by postfunctionalization of liposomes in order to study the effects of steric hindrance and possible affinity between the peptide and the liposome surface. These studies demonstrate the importance of choosing the correct chemistry in order to obtain a quantitative surface functionalization of liposomes.

  12. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
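
    A minimal sketch of the two-stage (nested) Monte Carlo structure described above, separating uncertain parameters (outer loop) from population variability (inner loop); all distributions and values are illustrative, not the study's.

    import numpy as np

    rng = np.random.default_rng(3)
    N_UNCERTAIN, N_VARIABLE = 100, 1000  # outer / inner loop sizes

    risks = np.empty((N_UNCERTAIN, N_VARIABLE))
    for i in range(N_UNCERTAIN):
        # Outer loop: an uncertain parameter held fixed within one realization
        # (e.g., a cancer slope factor known only within bounds; illustrative)
        slope_factor = rng.lognormal(mean=np.log(1.5), sigma=0.5)

        # Inner loop: variability across the exposed population
        concentration = rng.lognormal(mean=np.log(5e-3), sigma=1.0, size=N_VARIABLE)
        intake_rate = rng.triangular(1.0, 2.0, 3.0, size=N_VARIABLE)  # L/day
        dose = concentration * intake_rate / 70.0  # mg/kg-day, 70 kg adult
        risks[i] = dose * slope_factor

    # Variability: percentiles within a row; uncertainty: spread across rows
    p95_per_realization = np.percentile(risks, 95, axis=1)
    print("95th-percentile risk, median over uncertainty:",
          np.median(p95_per_realization))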

  13. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10^-3 and 10^-4 pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates.

  14. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection.

  15. The Quantitative Science of Evaluating Imaging Evidence.

    PubMed

    Genders, Tessa S S; Ferket, Bart S; Hunink, M G Myriam

    2017-03-01

    Cardiovascular diagnostic imaging tests are increasingly used in everyday clinical practice, but are often imperfect, just like any other diagnostic test. The performance of a cardiovascular diagnostic imaging test is usually expressed in terms of sensitivity and specificity compared with the reference standard (gold standard) for diagnosing the disease. However, evidence-based application of a diagnostic test also requires knowledge about the pre-test probability of disease, the benefit of making a correct diagnosis, the harm caused by false-positive imaging test results, and potential adverse effects of performing the test itself. To assist in clinical decision making regarding appropriate use of cardiovascular diagnostic imaging tests, we reviewed quantitative concepts related to diagnostic performance (e.g., sensitivity, specificity, predictive values, likelihood ratios), as well as possible biases and solutions in diagnostic performance studies, Bayesian principles, and the threshold approach to decision making.
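
    A minimal sketch of the Bayesian updating these concepts support: converting a pre-test probability to a post-test probability via likelihood ratios. The example sensitivity, specificity, and pre-test probability are hypothetical.

    def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
        """Update disease probability with an imperfect imaging test (Bayes)."""
        if positive:
            lr = sensitivity / (1 - specificity)      # positive likelihood ratio
        else:
            lr = (1 - sensitivity) / specificity      # negative likelihood ratio
        pre_odds = pre_test_p / (1 - pre_test_p)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)

    # Example: 30 % pre-test probability, test with Se 0.85 / Sp 0.90
    print(post_test_probability(0.30, 0.85, 0.90, positive=True))   # ~0.78
    print(post_test_probability(0.30, 0.85, 0.90, positive=False))  # ~0.07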

  16. Quantitative evaluation of magnetic immunoassay with remanence measurement

    NASA Astrophysics Data System (ADS)

    Enpuku, K.; Soejima, K.; Nishimoto, T.; Kuma, H.; Hamasaki, N.; Tsukamoto, A.; Saitoh, K.; Kandori, A.

    2006-05-01

    Magnetic immunoassays utilizing magnetic markers and a high-Tc SQUID have been performed. The marker was designed to generate remanence, and its remanence field was measured with the SQUID. The SQUID system was developed to measure 12 samples in one measurement sequence. We first detected the antigen human IgE using an IgE standard solution, demonstrating detection of IgE down to 2 attomol. The binding process between IgE and the marker could be semi-quantitatively explained with a Langmuir-type adsorption model. We also measured IgE in human serums and demonstrated the usefulness of the present method for practical diagnosis.
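
    The Langmuir-type adsorption model invoked above has a one-line closed form, sketched here with illustrative parameters rather than the study's fitted values:

    def langmuir_bound(concentration, b_max, k_d):
        """Langmuir-type adsorption: bound = Bmax * C / (Kd + C).

        Bmax and Kd are illustrative fit parameters; the study inferred the
        analogous saturation behavior from remanence signal vs IgE amount.
        """
        return b_max * concentration / (k_d + concentration)

    # Signal rises ~linearly at low antigen amounts and saturates at high ones
    for c in (0.1, 1.0, 10.0, 100.0):  # arbitrary concentration units
        print(c, round(langmuir_bound(c, b_max=1.0, k_d=10.0), 3))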

  17. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We have imaged fixed individual sperm cells using IPM. The sample was then stained and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome using the stained BFM image, used it to define the corresponding area in the IPM image, and determined a quantitative threshold for evaluating the volume of the acrosome.

  18. Evaluation of Quantitative Environmental Stress Screening (ESS) Methods. Volume 1

    DTIC Science & Technology

    1991-11-01

    The objective of this study was to evaluate Environmental Stress Screening (ESS) techniques contained in DOD-HDBK-344 by applying the methodology to several electronic products during actual factory production. Validation of the techniques, the development of improved, simplified, and automated procedures, and subsequent revisions to the Handbook were the objectives of the evaluation. The Rome Laboratory has developed techniques which…

  19. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    PubMed

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

    This work concerns the quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in oil samples was 54.37 μg kg⁻¹. Low molecular weight PAH compounds were the predominant contamination. In particular, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg⁻¹, which was lower than the limits set by the European Union and China. A preliminary evaluation of human health risk assessment for PAHs was accomplished using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10⁻⁶, indicating a high potential carcinogenic risk for populations exposed through the diet.
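
    A generic version of the BaP toxic-equivalency ILCR calculation is sketched below. The oral slope factor of 7.3 (mg/kg/day)⁻¹ is the value widely used for BaP; the intake, body-weight and exposure-duration inputs are illustrative rather than the paper's cohort values.

        # Incremental lifetime cancer risk (ILCR) from dietary BaP-equivalent intake,
        # following the generic US EPA-style formula; all input values are illustrative.
        def ilcr(c_teq_ug_per_kg, intake_g_per_day, ef_days, ed_years, bw_kg, at_days, csf=7.3):
            """ILCR = (C * IR * EF * ED * CSF) / (BW * AT); CSF for BaP ~7.3 (mg/kg/day)^-1."""
            c_mg_per_g = c_teq_ug_per_kg * 1e-3 / 1000.0   # ug/kg of oil -> mg/g of oil
            daily_dose = c_mg_per_g * intake_g_per_day     # mg/day
            return daily_dose * ef_days * ed_years * csf / (bw_kg * at_days)

        # Adults: 25 g oil/day, BaP-TEQ of 2 ug/kg, 70-kg body weight, 70-year averaging time.
        print(ilcr(2.0, 25.0, 365, 30, 70.0, 70 * 365))   # ~2e-6, above the 1e-6 benchmark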

  20. Predictive value of quantitative dipyridamole-thallium scintigraphy in assessing cardiovascular risk after vascular surgery in diabetes mellitus

    SciTech Connect

    Lane, S.E.; Lewis, S.M.; Pippin, J.J.; Kosinski, E.J.; Campbell, D.; Nesto, R.W.; Hill, T.

    1989-12-01

    Cardiac complications represent a major risk to patients undergoing vascular surgery. Diabetic patients may be particularly prone to such complications due to the high incidence of concomitant coronary artery disease, the severity of which may be clinically unrecognized. Attempts to stratify groups by clinical criteria have been useful but lack the predictive value of currently used noninvasive techniques such as dipyridamole-thallium scintigraphy. One hundred one diabetic patients were evaluated with dipyridamole-thallium scintigraphy before undergoing vascular surgery. The incidence of thallium abnormalities was high (80%) and did not correlate with clinical markers of coronary disease. Even in a subgroup of patients with no overt clinical evidence of underlying heart disease, thallium abnormalities were present in 59%. Cardiovascular complications, however, occurred in only 11% of all patients. Statistically significant prediction of risk was not achieved with simple assessment of thallium results as normal or abnormal. Quantification of total number of reversible defects, as well as assessment of ischemia in the distribution of the left anterior descending coronary artery was required for optimum predictive accuracy. The prevalence of dipyridamole-thallium abnormalities in a diabetic population is much higher than that reported in nondiabetic patients and cannot be predicted by usual clinical indicators of heart disease. In addition, cardiovascular risk of vascular surgery can be optimally assessed by quantitative analysis of dipyridamole-thallium scintigraphy and identification of high- and low-risk subgroups.

  1. Quantitative microbial risk assessment to estimate health risks attributable to water supply: can the technique be applied in developing countries with limited data?

    PubMed

    Howard, Guy; Pedley, Steve; Tibatemwa, Sarah

    2006-03-01

    In the 3rd edition of its Guidelines for Drinking-Water Quality (2004) (GDWQ) the World Health Organization (WHO) promotes the use of risk assessment coupled with risk management for the control of water safety in drinking water supplies. Quantitative microbial risk assessment (QMRA) provides a tool for estimating the disease-burden from pathogenic microorganisms in water using information about the distribution and occurrence of the pathogen or an appropriate surrogate. This information may then be used to inform decisions about appropriate management of the water supply system. Although QMRA has been used to estimate disease burden from water supplies in developed countries, the method has not been evaluated in developing countries where relevant data may be scarce. In this paper, we describe a simplified risk assessment procedure to calculate the disease burden from three reference pathogens--pathogenic Escherichia coli, Cryptosporidium parvum and rotavirus--in water supplies in Kampala, Uganda. The study shows how QMRA can be used in countries with limited data, and that the outcome can provide valuable information for the management of water supplies.

  2. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  3. Quantitative assessment of infection risk from exposure to waterborne pathogens in urban floodwater.

    PubMed

    de Man, H; van den Berg, H H J L; Leenen, E J T M; Schijven, J F; Schets, F M; van der Vliet, J C; van Knapen, F; de Roda Husman, A M

    2014-01-01

    Flooding and heavy rainfall have been associated with waterborne infectious disease outbreaks; however, it is unclear to what extent they pose a risk for public health. Here, risks of infection from exposure to urban floodwater were assessed using quantitative microbial risk assessment (QMRA). To that aim, urban floodwaters were sampled in the Netherlands during 23 events in 2011 and 2012. The water contained Campylobacter jejuni (prevalence 61%, range 14->10³ MPN/l), Giardia spp. (35%, 0.1-142 cysts/l), Cryptosporidium (30%, 0.1-9.8 oocysts/l), noroviruses (29%, 10²-10⁴ pdu/l) and enteroviruses (35%, 10³-10⁴ pdu/l). Exposure data collected by questionnaire revealed that children swallowed 1.7 ml (mean, 95% confidence interval 0-4.6 ml) per exposure event and adults swallowed 0.016 ml (mean, 95% CI 0-0.068 ml) due to hand-mouth contact. The mean risk of infection per event for children, who were exposed to floodwater originating from combined sewers, storm sewers and rainfall-generated surface runoff, was 33%, 23% and 3.5%, respectively; for adults it was 3.9%, 0.58% and 0.039%. The annual risk of infection was calculated to compare flooding from different urban drainage systems. An exposure frequency of once every 10 years to flooding originating from combined sewers resulted in an annual risk of infection of 8%, which was equal to the infection risk from flooding originating from rainfall-generated surface runoff 2.3 times per year. However, these annual infection risks will increase with a higher frequency of urban flooding due to heavy rainfall, as foreseen in climate change projections.
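
    The per-event to annual conversion used in such comparisons is the standard independent-events formula; the sketch below reproduces the reported ~8% annual risk for surface-runoff flooding occurring 2.3 times per year.

        def annual_risk(per_event_risk, events_per_year):
            """Annual infection risk from independent exposure events."""
            return 1.0 - (1.0 - per_event_risk) ** events_per_year

        # Children exposed to rainfall-generated surface runoff 2.3 times per year.
        print(annual_risk(0.035, 2.3))   # ~0.079, i.e. about 8% per year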

  4. Packaging and transportation risk management and evaluation plan

    SciTech Connect

    Rhyne, W.R.

    1993-09-01

    Shipments of radioactive materials and hazardous chemicals at the Los Alamos National Laboratory (LANL) are governed by a variety of Federal and state regulations, industrial standards, and LANL processes and procedures. Good judgement is exercised in situations that are not covered by regulations. As a result, the safety record for transporting hazardous materials at LANL has been excellent. However, future decisions should be made such that the decision-making process produces a defensible record of the safety of onsite shipments. This report proposes the development of a risk management tool to meet this need. First, the application of quantitative risk analysis methodology to transportation is presented to provide a framework of understanding. Risk analysis definitions, the basic quantitative risk analysis procedure, quantitative methodologies, transportation data bases, and risk presentation techniques are described. Quantitative risk analysis is frequently complex; but simplified approaches can be used as a management tool to make good decisions. Second, a plan to apply the use of risk management principles to the selection of routes, special administrative controls, and containers for hazardous material transportation at LANL is provided. A risk management tool is proposed that can be used by MAT-2 without substantial support from specialized safety and risk analysis personnel, e.g., HS-3. A workbook approach is proposed that can be automated at a later date. The safety of some types of onsite shipments at LANL is not well documented. Documenting that shipments are safe, i.e., present acceptable risks, will likely require elaborate analyses that should be thoroughly reviewed by safety and risk professionals. These detailed analyses are used as benchmarks and as examples for the use of the proposed tool by MAT-2. Once the benchmarks are established, the workbook can be used by MAT-2 to quantify that safety goals are met by similar shipments.

  5. A quantitative evaluation of confidence measures for stereo vision.

    PubMed

    Hu, Xiaoyan; Mordohai, Philippos

    2012-11-01

    We present an extensive evaluation of 17 confidence measures for stereo matching that compares the most widely used measures as well as several novel techniques proposed here. We begin by categorizing these methods according to which aspects of stereo cost estimation they take into account and then assess their strengths and weaknesses. The evaluation is conducted using a winner-take-all framework on binocular and multibaseline datasets with ground truth. It measures the capability of each confidence method to rank depth estimates according to their likelihood for being correct, to detect occluded pixels, and to generate low-error depth maps by selecting among multiple hypotheses for each pixel. Our work was motivated by the observation that such an evaluation is missing from the rapidly maturing stereo literature and that our findings would be helpful to researchers in binocular and multiview stereo.
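
    One simple member of the confidence-measure family evaluated in such studies is a peak-ratio measure, which scores the margin between the best and second-best matching costs. A sketch of one common formulation (not necessarily the exact variant used in the paper):

        import numpy as np

        def peak_ratio_confidence(cost_volume):
            """Peak-ratio confidence for winner-take-all stereo matching.

            cost_volume: (H, W, D) array of matching costs per pixel and disparity.
            Returns the disparity map and a confidence in [0, 1): higher means the
            best cost is well separated from the second-best.
            """
            part = np.partition(cost_volume, 1, axis=2)
            c1, c2 = part[..., 0], part[..., 1]        # best and second-best costs
            disparity = np.argmin(cost_volume, axis=2)
            confidence = 1.0 - c1 / np.maximum(c2, 1e-9)
            return disparity, confidence

        costs = np.random.rand(4, 5, 16)               # toy cost volume
        disp, conf = peak_ratio_confidence(costs)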

  6. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118
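
    The underlying measurement rests on the optical-path relation Δφ = 2πΔn·t/λ, inverted for thickness t. A one-line inversion with an assumed refractive-index difference Δn between cell and medium:

        import numpy as np

        def phase_to_thickness(delta_phi, wavelength_nm=633.0, dn=0.05):
            """delta_phi = 2*pi*dn*t/lambda  =>  t = delta_phi*lambda/(2*pi*dn).

            dn is the cell/medium refractive-index difference (assumed value).
            """
            return delta_phi * wavelength_nm / (2.0 * np.pi * dn)   # thickness in nm

        print(phase_to_thickness(np.pi / 2))   # quarter-wave phase shift -> ~3165 nm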

  7. Quantitative evaluation of statistical inference in resting state functional MRI.

    PubMed

    Yang, Xue; Kang, Hakmook; Newton, Allen; Landman, Bennett A

    2012-01-01

    Modern statistical inference techniques may be able to improve the sensitivity and specificity of resting state functional MRI (rs-fMRI) connectivity analysis through more realistic characterization of distributional assumptions. In simulation, the advantages of such modern methods are readily demonstrable. However quantitative empirical validation remains elusive in vivo as the true connectivity patterns are unknown and noise/artifact distributions are challenging to characterize with high fidelity. Recent innovations in capturing finite sample behavior of asymptotically consistent estimators (i.e., SIMulation and EXtrapolation - SIMEX) have enabled direct estimation of bias given single datasets. Herein, we leverage the theoretical core of SIMEX to study the properties of inference methods in the face of diminishing data (in contrast to increasing noise). The stability of inference methods with respect to synthetic loss of empirical data (defined as resilience) is used to quantify the empirical performance of one inference method relative to another. We illustrate this new approach in a comparison of ordinary and robust inference methods with rs-fMRI.
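
    The resilience idea, i.e. the stability of an estimate under synthetic loss of empirical data, can be illustrated independently of fMRI. A toy sketch using a plain correlation estimator (the study itself concerns full inference methods, not Pearson correlation):

        import numpy as np

        def resilience_curve(x, y, fractions=(1.0, 0.8, 0.6, 0.4), n_rep=200, seed=0):
            """Stability of a correlation estimate under synthetic loss of samples."""
            rng = np.random.default_rng(seed)
            n = len(x)
            out = {}
            for f in fractions:
                k = max(3, int(f * n))
                est = [np.corrcoef(x[idx], y[idx])[0, 1]
                       for idx in (rng.choice(n, size=k, replace=False)
                                   for _ in range(n_rep))]
                out[f] = (np.mean(est), np.std(est))
            return out

        x = np.random.randn(200); y = 0.4 * x + np.random.randn(200)
        for f, (m, s) in resilience_curve(x, y).items():
            print(f"fraction {f:.1f}: corr {m:.3f} +/- {s:.3f}")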

  8. Preprocessing of Edge of Light images: towards a quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Liu, Zheng; Forsyth, David S.; Marincak, Anton

    2003-08-01

    A computer vision inspection system, named Edge of Light™ (EOL), was invented and developed at the Institute for Aerospace Research of the National Research Council Canada. One application of interest is the detection and quantitative measurement of "pillowing" caused by corrosion in the faying surfaces of aircraft fuselage joints. To quantify the hidden corrosion, one approach is to relate the average corrosion of a region to the peak-to-peak amplitude between two diagonally adjacent rivet centers. This raises the requirement of automatically locating the rivet centers, the first step of which is rivet edge detection. In this study, gradient-based edge detection, local energy based feature extraction, and an adaptive threshold method were employed to identify the edges of rivets, which facilitated the first step in the EOL quantification procedure. Furthermore, the brightness profile is processed by a derivative operation, which locates the pillowing along the scanning direction. The derivative curves provide an estimate of the inspected surface.
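
    A gradient-based edge map with a simple adaptive threshold, in the spirit of the rivet edge detection step, can be sketched as follows; the mean-plus-k-sigma threshold is an illustrative stand-in for the paper's adaptive method.

        import numpy as np
        from scipy import ndimage

        def rivet_edges(image, k=1.5):
            """Gradient-magnitude edge map with a simple adaptive threshold."""
            gx = ndimage.sobel(image, axis=1, output=float)
            gy = ndimage.sobel(image, axis=0, output=float)
            grad = np.hypot(gx, gy)
            threshold = grad.mean() + k * grad.std()   # adaptive to the scan statistics
            return grad > threshold

        img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0   # toy bright feature
        edges = rivet_edges(ndimage.gaussian_filter(img, 1.0))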

  9. Quantitative risk assessment of human salmonellosis in Canadian broiler chicken breast from retail to consumption.

    PubMed

    Smadi, Hanan; Sargeant, Jan M

    2013-02-01

    The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts which were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that concentration of Salmonella on chicken breasts at retail and food hygienic practices in private kitchens such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat along with inadequate cooking contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that responsibility for protection from Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.
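
    The retail-to-table bookkeeping of growth and inactivation can be sketched with log10 arithmetic; the growth rate, maximum density and D-value below are placeholders, not the model's Canadian inputs.

        def grow(log10_n, hours, rate_log10_per_h=0.05, log10_max=4.0):
            """Growth toward a maximum density (illustrative rate)."""
            return min(log10_n + rate_log10_per_h * hours, log10_max)

        def cook(log10_n, minutes, d_value_min=0.5):
            """Thermal inactivation: one log10 reduction per D-value minutes."""
            return log10_n - minutes / d_value_min

        # Salmonella per breast: retail -> home transport (2 h) -> cooking (3 min).
        level = grow(1.0, hours=2.0)
        level = cook(level, minutes=3.0)
        print(f"final contamination ~ 10^{level:.1f} CFU")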

  10. Quantitative Deficits of Preschool Children at Risk for Mathematical Learning Disability

    PubMed Central

    Chu, Felicia W.; vanMarle, Kristy; Geary, David C.

    2013-01-01

    The study tested the hypothesis that acuity of the potentially inherent approximate number system (ANS) contributes to risk of mathematical learning disability (MLD). Sixty-eight (35 boys) preschoolers at risk for school failure were assessed on a battery of quantitative tasks, and on intelligence, executive control, preliteracy skills, and parental education. Mathematics achievement scores at the end of 1 year of preschool indicated that 34 of these children were at high risk for MLD. Relative to the 34 typically achieving children, the at risk children were less accurate on the ANS task, and a one standard deviation deficit on this task resulted in a 2.4-fold increase in the odds of MLD status. The at risk children also had a poor understanding of ordinal relations, and had slower learning of Arabic numerals, number words, and their cardinal values. Poor performance on these tasks resulted in 3.6- to 4.5-fold increases in the odds of MLD status. The results provide some support for the ANS hypothesis but also suggest these deficits are not the primary source of poor mathematics learning. PMID:23720643

  11. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  12. Quantitative EEG during normal aging: association with the Alzheimer's disease genetic risk variant in PICALM gene.

    PubMed

    Ponomareva, Natalya V; Andreeva, Tatiana V; Protasova, Maria S; Shagam, Lef I; Malina, Daria D; Goltsov, Andrey Yu; Fokin, Vitaly F; Illarioshkin, Sergey N; Rogaev, Evgeny I

    2017-03-01

    Genome-wide association studies have identified novel risk variants for Alzheimer's disease (AD). Among these, a gene carrying one of the highest risks for AD is PICALM. The PICALM rs3851179 A allele is thought to have a protective effect, whereas the G allele appears to confer risk for AD. The influence of the PICALM genotype on brain function in nondemented subjects remains largely unknown. We examined the possible effect of the PICALM rs3851179 genotype on quantitative electroencephalography recording at rest in 137 nondemented volunteers (age range: 20-79 years) subdivided into cohorts of those younger than and those older than 50 years of age. The homozygous presence of the AD risk variant PICALM GG was associated with an increase in beta relative power, with the effect being more pronounced in the older cohort. Beta power elevation in resting-state electroencephalography has previously been linked to cortical disinhibition and hyperexcitability. The increase in beta relative power in the carriers of the AD risk PICALM GG genotype suggests changes in the cortical excitatory-inhibitory balance, which are heightened during normal aging.
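
    Relative beta power of the kind reported here is the integral of the power spectral density over the beta band divided by the total power. A sketch using a Welch periodogram; the 13-30 Hz band and 1-45 Hz normalization range are common choices assumed here.

        import numpy as np
        from scipy.signal import welch

        def relative_beta_power(eeg, fs=250.0, band=(13.0, 30.0), total=(1.0, 45.0)):
            """Relative power = beta-band power / total power, from a Welch periodogram."""
            freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
            in_band = (freqs >= band[0]) & (freqs <= band[1])
            in_total = (freqs >= total[0]) & (freqs <= total[1])
            return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd[in_total], freqs[in_total])

        # Synthetic 30-s trace: 10-Hz alpha plus weaker 20-Hz beta and noise.
        t = np.arange(0, 30, 1 / 250.0)
        eeg = (np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 20 * t)
               + 0.1 * np.random.randn(t.size))
        print(relative_beta_power(eeg))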

  14. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen for a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposure vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential costs of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum of the crash probability for all passing vehicles and the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
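
    The empirical Bayesian step combines a model-predicted crash frequency with the site's observed history, weighted by the overdispersion of the safety performance function (SPF). A standard-form sketch with toy numbers:

        def empirical_bayes_crashes(observed, predicted, overdispersion_k, years=1.0):
            """EB-expected crash frequency: weighted mix of SPF prediction and site history.

            The weight w -> 1 when the SPF is very reliable (small overdispersion k)
            and -> 0 for noisy predictions.
            """
            w = 1.0 / (1.0 + overdispersion_k * predicted * years)
            return w * predicted + (1.0 - w) * observed / years

        # Segment with 9 observed crashes in 3 years; SPF predicts 2.0/year, k = 0.4.
        print(empirical_bayes_crashes(observed=9, predicted=2.0,
                                      overdispersion_k=0.4, years=3.0))   # ~2.7/year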

  15. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...

  16. RISK MANAGEMENT EVALUATION FOR CONCENTRATED ANIMAL FEEDING OPERATIONS

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) developed a Risk Management Evaluation (RME) to provide information needed to help plan future research in the Laboratory dealing with the environmental impact of concentrated animal feeding operations (CAFOs). Agriculture...

  17. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  18. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    NASA Astrophysics Data System (ADS)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depend highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to long-term natural processes, sudden events and human impact (the macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements due to physical and chemical processes. Obviously, these factors cover different scales of the problem. The deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical variations of material properties and characteristics. Risk evaluation should therefore aim at revealing the specific active decay and damage mechanisms both at the mesoscale [type of decay and damage] and at the microscale [decay phenomenon mechanism]. A prerequisite for the identification and development of risk indicators is the existence of an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based account of the asset's vulnerability through the diagnosis of the conservation state of building materials and structure, through the identification of all potential hazards affecting them, and through mapping of possible alterations during the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help…

  19. Quantitative Evaluation of Papilledema from Stereoscopic Color Fundus Photographs

    PubMed Central

    Tang, Li; Kardon, Randy H.; Wang, Jui-Kai; Garvin, Mona K.; Lee, Kyungmoo; Abràmoff, Michael D.

    2012-01-01

    Purpose. To derive a computerized measurement of optic disc volume from digital stereoscopic fundus photographs for the purpose of diagnosing and managing papilledema. Methods. Twenty-nine pairs of stereoscopic fundus photographs and optic nerve head (ONH) centered spectral domain optical coherence tomography (SD-OCT) scans were obtained at the same visit in 15 patients with papilledema. Some patients were imaged at multiple visits in order to assess their changes. Three-dimensional shape of the ONH was estimated from stereo fundus photographs using an automated multi-scale stereo correspondence algorithm. We assessed the correlation of the stereo volume measurements with the SD-OCT volume measurements quantitatively, in terms of volume of retinal surface elevation above a reference plane and also to expert grading of papilledema from digital fundus photographs using the Frisén grading scale. Results. The volumetric measurements of retinal surface elevation estimated from stereo fundus photographs and OCT scans were positively correlated (r² = 0.60; P < 0.001) and were positively correlated with Frisén grade (Spearman correlation coefficient r = 0.59; P < 0.001). Conclusions. Retinal surface elevation among papilledema patients obtained from stereo fundus photographs compares favorably with that from OCT scans and with expert grading of papilledema severity. Stereoscopic color imaging of the ONH combined with a method of automated shape reconstruction is a low-cost alternative to SD-OCT scans that has potential for a more cost-effective diagnosis and management of papilledema in a telemedical setting. An automated three-dimensional image analysis method was validated that quantifies the retinal surface topography with an imaging modality that has lacked prior objective assessment. PMID:22661468

  20. Quantitative evaluation of the major determinants of human gait.

    PubMed

    Lin, Yi-Chung; Gfoehler, Margit; Pandy, Marcus G

    2014-04-11

    Accurate knowledge of the isolated contributions of joint movements to the three-dimensional displacement of the center of mass (COM) is fundamental for understanding the kinematics of normal walking and for improving the treatment of gait disabilities. Saunders et al. (1953) identified six kinematic mechanisms to explain the efficient progression of the whole-body COM in the sagittal, transverse, and coronal planes. These mechanisms, referred to as the major determinants of gait, were pelvic rotation, pelvic list, stance knee flexion, foot and knee mechanisms, and hip adduction. The aim of the present study was to quantitatively assess the contribution of each major gait determinant to the anteroposterior, vertical, and mediolateral displacements of the COM over one gait cycle. The contribution of each gait determinant was found by applying the concept of an 'influence coefficient', wherein the partial derivative of the COM displacement with respect to a prescribed determinant was calculated. The analysis was based on three-dimensional measurements of joint angular displacements obtained from 23 healthy young adults walking at slow, normal and fast speeds. We found that hip flexion, stance knee flexion, and ankle-foot interaction (comprised of ankle plantarflexion, toe flexion and the displacement of the center of pressure) are the major determinants of the displacements of the COM in the sagittal plane, while hip adduction and pelvic list contribute most significantly to the mediolateral displacement of the COM in the coronal plane. Pelvic rotation and pelvic list contribute little to the vertical displacement of the COM at all walking speeds. Pelvic tilt, hip rotation, subtalar inversion, and back extension, abduction and rotation make negligible contributions to the displacements of the COM in all three anatomical planes.
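
    An influence coefficient of this kind is a partial derivative, so it can be approximated by central differences on any kinematic model that maps joint angles to a COM position. The one-joint toy model below is purely illustrative:

        import numpy as np

        def influence_coefficient(com_model, angles, j, h=1e-4):
            """Central-difference estimate of d(COM)/d(angle_j) for one gait determinant.

            com_model: function mapping a joint-angle vector to a 3-vector COM
            position (a stand-in for a full kinematic model, not reproduced here).
            """
            up, dn = angles.copy(), angles.copy()
            up[j] += h; dn[j] -= h
            return (com_model(up) - com_model(dn)) / (2.0 * h)

        # Toy planar "model": COM height depends on stance-knee flexion (index 0).
        toy = lambda q: np.array([0.0, 0.0, 0.9 * np.cos(q[0])])
        print(influence_coefficient(toy, np.array([0.2, 0.0]), j=0))   # ~[0, 0, -0.9*sin(0.2)]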

  1. Skin moisturization by hydrogenated polyisobutene--quantitative and visual evaluation.

    PubMed

    Dayan, Nava; Sivalenka, Rajarajeswari; Chase, John

    2009-01-01

    Hydrogenated polyisobutene (HP) is used in topically applied cosmetic/personal care formulations as an emollient that leaves a pleasing skin feel when applied and rubbed in. This effect, although distinguishable to the user, is difficult to define and quantify. Recognizing that some of the physical properties of HP, such as film formation and wear resistance, may contribute to skin moisturization, we designed a short-term pilot study to follow changes in skin moisturization. Incorporating HP into an o/w emulsion at 8% yielded increased viscosity and reduced emulsion droplet size as compared to the emollient ester CCT (capric/caprylic triglyceride) or a control formulation. Quantitative data indicate that application of the o/w emulsion formulation containing either HP or CCT significantly elevated skin moisture content and thus reduced transepidermal water loss (TEWL) by a maximum of approximately 33% relative to the control formulation within 3 h, maintaining this up to 6 h. Visual observation of skin treated with the HP-containing formulation showed fine texture and clear contrast as compared to the control or the CCT formulation, confirming this effect. As a result of increased hydration, skin conductivity, as measured by corneometer values, was also elevated significantly, by about tenfold, as early as 20 min after HP or CCT application and was maintained throughout the test period. Throughout the test period the HP formulation was 5-10% more effective than the CCT formulation, both in reduction of TEWL and in increased skin conductivity. Thus, compared to the emollient ester (CCT), HP showed a unique capability for a long-lasting effect in retaining moisture and improving skin texture.

  2. Quantitative evaluation of material degradation by Barkhausen noise method

    SciTech Connect

    Yamaguchi, Atsunori; Maeda, Noriyoshi; Sugibayashi, Takuya

    1995-12-01

    Evaluating the remaining life of a nuclear power plant becomes inevitable when extending the plant operating period. This paper applies the magnetic method using Barkhausen noise (BHN) to detect degradation caused by fatigue and thermal aging. Low alloy steel (SA 508 cl.2) was fatigued at strain amplitudes of ±1% and ±0.4%, and duplex stainless steel (SCS14A) was heated at 400 °C for a long period (thermal aging). For the material degraded by thermal aging, BHN was measured and a good correlation between the magnetic properties and the absorption energy of the material was obtained. For the fatigued material, BHN was measured at each predetermined cycle, the effect of the stress or strain present in the material at the time of measurement was evaluated, and a good correlation between BHN and fatigue damage ratio was obtained.

  3. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodology of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
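
    The abstract does not spell out how the reliability impact indicator is formed; one natural reading, the fractional reduction of baseline reliability attributable to residual risks, is sketched below as an assumption rather than as the documented RRET formula.

        def reliability_impact(baseline_r, residual_risk_r):
            """Fractional reduction in mission reliability attributable to residual risks.

            baseline_r: system baseline reliability; residual_risk_r: reliability after
            folding identified residual risks into the fault tree (interpretation assumed).
            """
            return (baseline_r - residual_risk_r) / baseline_r

        print(reliability_impact(0.98, 0.95))   # ~3.1% reliability impact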

  4. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, varying greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation), and can be important for pre-surgical planning. We propose a height compass to evaluate the axial plane spatial distribution of compression injury (anterior, posterior, lateral, and central), and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings which are divided into 8 equal length arcs by rays which are subtended by 8 common central angles. The radius of each ring increases multiplicatively, with resultant structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against octants in neighboring vertebrae. The height compass shows intuitive display of the height distribution and can be used to easily identify the fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
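
    The height-compass geometry (a central node plus two octant bands whose radii grow multiplicatively) amounts to mapping a planar offset from the vertebral centroid to one of 17 cells. A sketch with assumed radii:

        import numpy as np

        def compass_cell(x, y, r1=5.0, growth=2.0):
            """Map an axial-plane offset from the vertebral centroid to 1 of 17 cells.

            Cell 0 is the central node; rings 1..2 each hold 8 octants, with ring
            radii increasing multiplicatively (r1, r1*growth), per the described
            geometry; the numeric radii here are assumed values.
            """
            r = np.hypot(x, y)
            if r < r1:
                return 0
            ring = 1 if r < r1 * growth else 2     # outermost band catches the rest
            octant = int(((np.arctan2(y, x) + 2 * np.pi) % (2 * np.pi)) // (np.pi / 4))
            return 1 + (ring - 1) * 8 + octant

        print(compass_cell(3.0, 1.0), compass_cell(7.0, -2.0), compass_cell(14.0, 5.0))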

  5. An evaluation of protein assays for quantitative determination of drugs.

    PubMed

    Williams, Katherine M; Arthur, Sarah J; Burrell, Gillian; Kelly, Fionnuala; Phillips, Darren W; Marshall, Thomas

    2003-07-31

    We have evaluated the response of six protein assays [the biuret, Lowry, bicinchoninic acid (BCA), Coomassie Brilliant Blue (CBB), Pyrogallol Red-Molybdate (PRM), and benzethonium chloride (BEC)] to 21 pharmaceutical drugs. The drugs evaluated were analgesics (acetaminophen, aspirin, codeine, methadone, morphine and pethidine), antibiotics (amoxicillin, ampicillin, gentamicin, neomycin, penicillin G and vancomycin), antipsychotics (chlorpromazine, fluphenazine, prochlorperazine, promazine and thioridazine) and water-soluble vitamins (ascorbic acid, niacinamide, pantothenic acid and pyridoxine). The biuret, Lowry and BCA assays responded strongly to most of the drugs tested. The PRM assay gave a sensitive response to the aminoglycoside antibiotics (gentamicin and neomycin) and the antipsychotic drugs. In contrast, the CBB assay showed little response to the aminoglycosides and gave a relatively poor response with the antipsychotics. The BEC assay did not respond significantly to the drugs tested. The response of the protein assays to the drugs was further evaluated by investigating the linearity of the response and the combined response of drug plus protein. The results are discussed with reference to drug interference in protein assays and the development of new methods for the quantification of drugs in protein-free solution.

  6. At-Risk Youth Appearance and Job Performance Evaluation

    ERIC Educational Resources Information Center

    Freeburg, Beth Winfrey; Workman, Jane E.

    2008-01-01

    The goal of this study was to identify the relationship of at-risk youth workplace appearance to other job performance criteria. Employers (n = 30; each employing from 1 to 17 youths) evaluated 178 at-risk high school youths who completed a paid summer employment experience. Appearance evaluations were significantly correlated with evaluations of…

  7. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
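
    A compromise graph of this kind can be evaluated with ordinary shortest-path machinery: the dominant attack path is the minimum-total-time route from attacker entry to target. A toy sketch in which the node names and times are invented:

        import networkx as nx

        # Compromise graph: nodes are attack stages, edge weights are expected
        # time-to-compromise in days for a given attacker skill level (toy numbers).
        G = nx.DiGraph()
        G.add_edge("internet", "dmz", weight=5.0)
        G.add_edge("dmz", "control_lan", weight=12.0)
        G.add_edge("internet", "vpn", weight=30.0)
        G.add_edge("vpn", "control_lan", weight=2.0)
        G.add_edge("control_lan", "plc", weight=4.0)

        ttc = nx.shortest_path_length(G, "internet", "plc", weight="weight")
        path = nx.shortest_path(G, "internet", "plc", weight="weight")
        print(f"dominant attack path {path}, expected time-to-compromise {ttc} days")
        # The risk reduction of a remedial action can then be expressed as the
        # relative increase in this dominant-path time after the fix.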

  8. Evaluation of digital dermoscopy in a pigmented lesion clinic: clinician versus computer assessment of malignancy risk.

    PubMed

    Boldrick, Jennifer C; Layton, Christle J; Nguyen, Josephine; Swetter, Susan M

    2007-03-01

    Digital dermoscopy systems employ computer-based algorithms to quantitate features of pigmented skin lesions (PSLs) and provide an assessment of malignancy risk. We evaluated interobserver concordance of PSL malignancy risk between a pigmented lesion specialist and an artificial neural network (ANN)-based automated digital dermoscopy system. While digital dermoscopy provides a reliable means of image capture, storage, and comparison of PSLs over time, the ANN algorithm requires further training and validation before the malignancy risk assessment feature can be widely used in clinical practice.

  9. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and −20 °C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  10. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural disaster impacts is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in the changing world over a multi-scale range. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to the analysis of social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Risk perception issues are described in the framework of risk analysis models. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As shown, risk perception essentially influences the damage function. Based on prospect theory and work on decision making under uncertainty, cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), which combines a risk perception function rp and a function of education and long-term experience c as α(t) → (c − rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security-related behavior of individuals and is the basis for prediction of socio-economic and…

  11. EVALUATING QUANTITATIVE FORMULAS FOR DOSE-RESPONSE ASSESSMENT OF CHEMICAL MIXTURES

    EPA Science Inventory

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment d...

  12. Risk Evaluation of Endocrine-Disrupting Chemicals

    PubMed Central

    Gioiosa, Laura; Palanza, Paola; vom Saal, Frederick S.

    2015-01-01

    We review here our studies on early exposure to low doses of the estrogenic endocrine-disrupting chemical bisphenol A (BPA) on behavior and metabolism in CD-1 mice. Mice were exposed in utero from gestation day (GD) 11 to delivery (prenatal exposure) or via maternal milk from birth to postnatal day 7 (postnatal exposure) to 10 µg/kg body weight/d of BPA or no BPA (controls). Bisphenol A exposure resulted in long-term disruption of sexually dimorphic behaviors. Females exposed to BPA pre- and postnatally showed increased anxiety and behavioral profiles similar to control males. We also evaluated metabolic effects in prenatally exposed adult male offspring of dams fed (from GD 9 to 18) with BPA at doses ranging from 5 to 50 000 µg/kg/d. The males showed an age-related significant change in a number of metabolic indexes ranging from food intake to glucose regulation at BPA doses below the no observed adverse effect level (5000 µg/kg/d). Consistent with prior findings, low but not high BPA doses produced significant effects for many outcomes. These findings provide further evidence of the potential risks that developmental exposure to low doses of the endocrine disrupter BPA may pose to human health, with fetuses and infants being highly vulnerable. PMID:26740806

  13. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

    Purpose: To evaluate the reticuloendothelial system (RES) function by real-time imaging of blood clearance as well as hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods: Kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after the injection of different doses of SPIO nanoparticles (0.5, 3 or 10 mg Fe/kg). The association parameter, Kin, dissociation parameter, Kout, and elimination constant, Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosome and liposome treatments were utilized to deplete macrophages and block the RES function in order to evaluate the capability of the kinetic parameters for investigating macrophage function and density. Results: The two-compartment model provided a good description for all data and showed a low sum squared residual for all mice (0.27±0.03). A lower Kin, a lower Kout and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout and a lower Ke were observed after liposome treatment in comparison to saline treatment (P<0.005). Conclusion: Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both a cell-number and receptor-function level. PMID:25090653
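
    A two-compartment model with association, dissociation and elimination constants can be integrated directly. The ODE form below is one plausible reading of the Kin/Kout/Ke description, with arbitrary rate values:

        import numpy as np
        from scipy.integrate import solve_ivp

        def two_compartment(t, y, k_in, k_out, k_e):
            """Blood (y[0]) and liver (y[1]) SPIO signal; model form assumed here."""
            blood, liver = y
            d_blood = -k_in * blood + k_out * liver - k_e * blood
            d_liver = k_in * blood - k_out * liver
            return [d_blood, d_liver]

        sol = solve_ivp(two_compartment, (0.0, 60.0), [1.0, 0.0],
                        args=(0.15, 0.01, 0.02), dense_output=True)
        t = np.linspace(0, 60, 7)
        print(np.round(sol.sol(t), 3))   # blood clearance and hepatic uptake curves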

  14. Distance estimation from acceleration for quantitative evaluation of Parkinson tremor.

    PubMed

    Jeon, Hyoseon; Kim, Sang Kyong; Jeon, BeomSeok; Park, Kwang Suk

    2011-01-01

    The purpose of this paper is to assess Parkinson tremor by estimating its actual distance amplitude. We propose a practical, useful and simple method for evaluating Parkinson tremor with a distance value. We measured the resting tremor of 7 Parkinson's disease (PD) patients with a triaxial accelerometer. The resting tremor of participants was rated on the Unified Parkinson's Disease Rating Scale (UPDRS) by a neurologist. First, we segmented a 7-second acceleration signal from the recorded data. To estimate the displacement of the tremor, we performed double integration of the acceleration. Prior to double integration, a moving-average method was used to reduce the error in the integration constant. After estimation of displacement, we calculated the tremor distance during 1 s from the segmented signal using the Euclidean distance. We evaluated the distance values against the UPDRS. The averaged moving distance during 1 second corresponding to UPDRS 1 was 11.52 mm, that of UPDRS 2 was 33.58 mm, and the tremor distance of UPDRS 3 was 382.22 mm. The estimated moving distance during 1 s was proportional to the clinical rating scale, the UPDRS.
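
    The processing chain (moving-average correction, double integration, then path length over one second) can be sketched as follows; the moving-average detrend stands in for the paper's integration-constant correction.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        def tremor_distance(acc, fs):
            """Path length (mm) travelled in 1 s, via double integration of acceleration.

            acc: (N, 3) tri-axial acceleration in m/s^2.
            """
            detrend = lambda v: v - np.convolve(v, np.ones(int(fs)) / fs, mode="same")
            vel = np.stack([detrend(cumulative_trapezoid(a, dx=1/fs, initial=0.0))
                            for a in acc.T], axis=1)
            pos = np.stack([detrend(cumulative_trapezoid(v, dx=1/fs, initial=0.0))
                            for v in vel.T], axis=1)
            step = np.linalg.norm(np.diff(pos[: int(fs)], axis=0), axis=1)   # first second
            return 1000.0 * step.sum()

        fs = 100.0
        t = np.arange(0, 7, 1 / fs)
        acc = np.stack([np.sin(2 * np.pi * 5 * t)] * 3, axis=1)   # 5-Hz synthetic tremor
        print(f"{tremor_distance(acc, fs):.1f} mm/s")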

  15. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme of the American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of visibility for test objects in mammographic accreditation phantom image in whether the phantom image meets the ACR's criteria in the evaluation test, although there is room left for improvement in the approach for fiber and mass objects.
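
    Classification by Mahalanobis distance scores how far a candidate object's features lie from a trained class distribution. A sketch with hypothetical two-dimensional features for the "fiber" class:

        import numpy as np

        def mahalanobis(x, mean, cov):
            """Distance of feature vector x from a trained object class (fiber or mass)."""
            diff = x - mean
            return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

        # Hypothetical 2-D features (area, mean contrast) learned for "fiber" objects.
        fiber_mean = np.array([120.0, 0.30])
        fiber_cov = np.array([[400.0, 0.5], [0.5, 0.01]])

        candidate = np.array([100.0, 0.28])
        print(mahalanobis(candidate, fiber_mean, fiber_cov))  # small -> classified as fiber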

  16. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
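
    The object representation described here maps naturally onto classes whose instances evaluate themselves recursively. A minimal sketch in Python rather than Flavors/LISP, assuming independent basic events:

        class Event:
            """Basic event with a failure probability."""
            def __init__(self, p):
                self.p = p
            def probability(self):
                return self.p

        class OrGate:
            """Occurs if any input occurs (assumes independent inputs)."""
            def __init__(self, *inputs):
                self.inputs = inputs
            def probability(self):
                q = 1.0
                for node in self.inputs:
                    q *= 1.0 - node.probability()
                return 1.0 - q

        class AndGate:
            """Occurs only if all inputs occur (assumes independent inputs)."""
            def __init__(self, *inputs):
                self.inputs = inputs
            def probability(self):
                p = 1.0
                for node in self.inputs:
                    p *= node.probability()
                return p

        # Top event: (pump A and pump B fail) or control failure.
        top = OrGate(AndGate(Event(0.01), Event(0.02)), Event(0.001))
        print(top.probability())   # ~0.0012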

  17. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  19. Gas turbine coatings eddy current quantitative and qualitative evaluation

    NASA Astrophysics Data System (ADS)

    Ribichini, Remo; Giolli, Carlo; Scrinzi, Erica

    2017-02-01

    Gas turbine blades (buckets) are among the most critical and expensive components of the engine. Buckets rely on protective coatings in order to withstand the harsh environment in which they operate. The thickness and the microstructure of coatings during the lifespan of a unit are fundamental to evaluate their fitness for service. A frequency scanning Eddy Current instrument can allow the measurement of the thickness and of physical properties of coatings in a Non-Destructive manner. The method employed relies on the acquisition of impedance spectra and on the inversion of the experimental data to derive the coating properties and structure using some assumptions. This article describes the experimental validation performed on several samples and real components in order to assess the performance of the instrument as a coating thickness gage. The application of the technique to support residual life assessment of serviced buckets is also presented.

  20. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log transformed analysis of variance were used as methods to evaluate zooplankton density data collected during five years at an electrical generation station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, (4) procedures for conducting the field monitoring program, and (5) a discussion of the consequences of violating statistical assumptions. Details for estimating sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.

  1. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  2. Quantitative evaluation of solar wind time-shifting methods

    NASA Astrophysics Data System (ADS)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
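
    For reference, the baseline against which MVAB-0 is compared is the flat time-shift, which simply convects the measured structure at the observed bulk speed along the Earth-Sun line. A sketch, with illustrative distances rather than the study's geometry:

    ```python
    def flat_time_shift(x_monitor_km: float, x_target_km: float, v_sw_km_s: float) -> float:
        """Propagation delay (s) assuming planar fronts normal to the Sun-Earth line."""
        return (x_monitor_km - x_target_km) / v_sw_km_s

    # e.g., an L1 monitor ~1.5e6 km upstream propagated to ~1e5 km upstream of Earth
    delay = flat_time_shift(1.5e6, 1.0e5, 400.0)
    print(f"flat delay: {delay / 60:.1f} minutes")  # ~58 minutes at 400 km/s
    ```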

  3. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of a point on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. When measuring structures with complicated profiles such as fillets, the interrogation window should be kept large and the overlap amount should also be large. The surface condition also affects the accuracy: a white-painted surface with small black particles is suitable for measurement.
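
    The cross-correlation tracking step described here can be sketched in a few lines: normalize an interrogation window from the reference image and search a small neighborhood of the deformed image for the best-matching offset. The window and search sizes, and the synthetic images, are arbitrary choices for illustration.

    ```python
    import numpy as np

    def track_window(ref_img, def_img, top, left, win=32, search=8):
        """Integer-pixel displacement of one interrogation window (dy, dx)."""
        tmpl = ref_img[top:top + win, left:left + win].astype(float)
        tmpl = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-12)
        best, best_dy, best_dx = -np.inf, 0, 0
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                patch = def_img[top + dy:top + dy + win,
                                left + dx:left + dx + win].astype(float)
                patch = (patch - patch.mean()) / (patch.std() + 1e-12)
                score = (tmpl * patch).mean()  # normalized cross-correlation
                if score > best:
                    best, best_dy, best_dx = score, dy, dx
        return best_dy, best_dx

    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    deformed = np.roll(ref, shift=(3, -2), axis=(0, 1))  # known rigid displacement
    print(track_window(ref, deformed, top=48, left=48))  # -> (3, -2)
    ```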

  4. A Quantitative Risk-Benefit Analysis of Prophylactic Surgery Prior to Extended-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Carroll, Danielle; Reyes, David; Kerstman, Eric; Walton, Marlei; Antonsen, Erik

    2017-01-01

    INTRODUCTION: Among otherwise healthy astronauts undertaking deep space missions, the risks for acute appendicitis (AA) and cholecystitis (AC) are not zero. If these conditions were to occur during spaceflight they may require surgery for definitive care. The proposed study quantifies and compares the risks of developing de novo AA and AC in-flight to the surgical risks of prophylactic laparoscopic appendectomy (LA) and cholecystectomy (LC) using NASA's Integrated Medical Model (IMM). METHODS: The IMM is a Monte Carlo simulation that forecasts medical events during spaceflight missions and estimates the impact of these medical events on crew health. In this study, four Design Reference Missions (DRMs) were created to assess the probability of an astronaut developing in-flight small-bowel obstruction (SBO) following prophylactic 1) LA, 2) LC, 3) LA and LC, or 4) neither surgery (SR# S-20160407-351). Model inputs were drawn from a large, population-based 2011 Swedish study that examined the incidence and risks of post-operative SBO over a 5-year follow-up period. The study group included 1,152 patients who underwent LA, and 16,371 who underwent LC. RESULTS: Preliminary results indicate that prophylactic LA may yield higher mission risks than the control DRM. Complete analyses are pending and will be subsequently available. DISCUSSION: The risk versus benefits of prophylactic surgery in astronauts to decrease the probability of acute surgical events during spaceflight has only been qualitatively examined in prior studies. Within the assumptions and limitations of the IMM, this work provides the first quantitative guidance that has previously been lacking to this important question for future deep space exploration missions.
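
    A toy version of the underlying comparison, assuming simple constant annual event rates (both invented) rather than the IMM's calibrated incidence models: simulate whether an in-flight surgical event occurs under the control case versus the post-prophylactic-surgery case with a residual small-bowel obstruction risk.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_trials = 200_000
    mission_years = 2.5

    p_appendicitis_per_year = 1.1e-3  # invented annual incidence, no surgery
    p_sbo_per_year_post_op = 0.8e-3   # invented annual post-appendectomy SBO rate

    p_control = 1 - (1 - p_appendicitis_per_year) ** mission_years
    p_prophyl = 1 - (1 - p_sbo_per_year_post_op) ** mission_years

    control = rng.random(n_trials) < p_control
    prophylactic = rng.random(n_trials) < p_prophyl

    print(f"P(event | no surgery)           = {control.mean():.4f}")
    print(f"P(event | prophylactic surgery) = {prophylactic.mean():.4f}")
    ```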

  5. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  6. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  7. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  8. GWAS implicates a role for quantitative immune traits and threshold effects in risk for human autoimmune disorders.

    PubMed

    Gregersen, Peter K; Diamond, Betty; Plenge, Robert M

    2012-10-01

    Genome wide association studies in human autoimmune disorders have provided a long list of alleles with rather modest degrees of risk. A large fraction of these associations are probably owing to either quantitative differences in gene expression or amino acid changes that regulate quantitative aspects of the immune response. While functional studies are still lacking for most of these associations, we present examples of autoimmune disease risk alleles that influence quantitative changes in lymphocyte activation, cytokine signaling and dendritic cell function. The analysis of immune quantitative traits associated with autoimmune loci is clearly going to be an important component of understanding the pathogenesis of autoimmunity. This will require both new and more efficient ways of characterizing the normal immune system, as well as large population resources in which genotype-phenotype correlations can be convincingly demonstrated. Future development of new therapies will depend on understanding the mechanistic underpinnings of immune regulation by these new risk loci.

  9. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, E.; Sitchler, A.; Maxwell, R. M.; McCray, J. E.

    2010-12-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding given greater toxicity of lead at lower doses than arsenic. It was also found that higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and
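
    The nested (two-stage) Monte Carlo structure mentioned here separates what we do not know (uncertainty) from what genuinely differs across people (variability): an outer loop samples uncertain parameters, and an inner loop samples variable ones, yielding a distribution over risk percentiles. A compact sketch with invented distributions and a toy dose-response slope:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_outer, n_inner = 200, 5_000

    risk_quantiles = []
    for _ in range(n_outer):
        # Uncertainty: imperfect knowledge of the dose-response slope factor
        slope = rng.lognormal(mean=np.log(1.5), sigma=0.3)  # (mg/kg-day)^-1
        # Variability: person-to-person metal concentration, intake, body weight
        conc = rng.lognormal(mean=np.log(5e-3), sigma=0.8, size=n_inner)  # mg/L
        intake = rng.normal(2.0, 0.4, size=n_inner).clip(min=0.1)         # L/day
        bw = rng.normal(70.0, 12.0, size=n_inner).clip(min=30.0)          # kg
        risk = slope * conc * intake / bw  # toy lifetime excess cancer risk
        risk_quantiles.append(np.quantile(risk, 0.95))

    q = np.array(risk_quantiles)
    print(f"95th-percentile risk: median {np.median(q):.2e}, "
          f"90% CI [{np.quantile(q, 0.05):.2e}, {np.quantile(q, 0.95):.2e}]")
    ```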

  10. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  11. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages.

    PubMed

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2013-12-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns.

  12. Quantitative evaluation of wrist posture and typing performance: A comparative study of 4 computer keyboards

    SciTech Connect

    Burastero, S.

    1994-05-01

    The present study focuses on an ergonomic evaluation of 4 computer keyboards, based on subjective analyses of operator comfort and on a quantitative analysis of typing performance and wrist posture during typing. The objectives of this study are (1) to quantify differences in the wrist posture and in typing performance when the four different keyboards are used, and (2) to analyze the subjective preferences of the subjects for alternative keyboards compared to the standard flat keyboard with respect to the quantitative measurements.

  13. Evaluating Melanoma Drug Response and Therapeutic Escape with Quantitative Proteomics*

    PubMed Central

    Rebecca, Vito W.; Wood, Elizabeth; Fedorenko, Inna V.; Paraiso, Kim H. T.; Haarberg, H. Eirik; Chen, Yi; Xiang, Yun; Sarnaik, Amod; Gibney, Geoffrey T.; Sondak, Vernon K.; Koomen, John M.; Smalley, Keiran S. M.

    2014-01-01

    The evolution of cancer therapy into complex regimens with multiple drugs requires novel approaches for the development and evaluation of companion biomarkers. Liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) is a versatile platform for biomarker measurement. In this study, we describe the development and use of the LC-MRM platform to study the adaptive signaling responses of melanoma cells to inhibitors of HSP90 (XL888) and MEK (AZD6244). XL888 had good anti-tumor activity against NRAS mutant melanoma cell lines as well as BRAF mutant cells with acquired resistance to BRAF inhibitors both in vitro and in vivo. LC-MRM analysis showed HSP90 inhibition to be associated with decreased expression of multiple receptor tyrosine kinases, modules in the PI3K/AKT/mammalian target of rapamycin pathway, and the MAPK/CDK4 signaling axis in NRAS mutant melanoma cell lines and the inhibition of PI3K/AKT signaling in BRAF mutant melanoma xenografts with acquired vemurafenib resistance. The LC-MRM approach targeting more than 80 cancer signaling proteins was highly sensitive and could be applied to fine needle aspirates from xenografts and clinical melanoma specimens (using 50 μg of total protein). We further showed MEK inhibition to be associated with signaling through the NFκB and WNT signaling pathways, as well as increased receptor tyrosine kinase expression and activation. Validation studies identified PDGF receptor β signaling as a potential escape mechanism from MEK inhibition, which could be overcome through combined use of AZD6244 and the PDGF receptor inhibitor, crenolanib. Together, our studies show LC-MRM to have unique value as a platform for the systems level understanding of the molecular mechanisms of drug response and therapeutic escape. This work provides the proof-of-principle for the future development of LC-MRM assays for monitoring drug responses in the clinic. PMID:24760959

  14. Hygienization by anaerobic digestion: comparison between evaluation by cultivation and quantitative real-time PCR.

    PubMed

    Lebuhn, M; Effenberger, M; Garcés, G; Gronauer, A; Wilderer, P A

    2005-01-01

    In order to assess hygienization by anaerobic digestion, a comparison between evaluation by cultivation and quantitative real-time PCR (qPCR) including optimized DNA extraction and quantification was carried out for samples from a full-scale fermenter cascade (F1, mesophilic; F2, thermophilic; F3, mesophilic). The system was highly effective in inactivating (pathogenic) viable microorganisms, except for spore-formers. Conventionally performed cultivation underestimated viable organisms particularly in F2 and F3 by a factor of at least 10 as shown by data from extended incubation times, probably due to the rise of sublethally injured (active but not cultivable) cells. Incubation should hence be extended adequately in incubation-based hygiene monitoring of stressed samples, in order to minimize contamination risks. Although results from qPCR and cultivation agreed for the equilibrated compartments, considerably higher qPCR values were obtained for the fermenters. The difference probably corresponded to DNA copies from decayed cells that had not yet been degraded by the residual microbial activity. An extrapolation from qPCR determination to the quantity of viable organisms is hence not justified for samples that had been exposed to lethal stress.

  15. EVALUATING TOOLS AND MODELS USED FOR QUANTITATIVE EXTRAPOLATION OF IN VITRO TO IN VIVO DATA FOR NEUROTOXICANTS*

    EPA Science Inventory

    There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...

  16. Interactive graphics for expressing health risks: development and qualitative evaluation.

    PubMed

    Ancker, Jessica S; Chan, Connie; Kukafka, Rita

    2009-01-01

    Recent findings suggest that interactive game-like graphics might be useful in communicating probabilities. We developed a prototype for a risk communication module, focusing on eliciting users' preferences for different interactive graphics and assessing usability and user interpretations. Feedback from five focus groups was used to design the graphics. The final version displayed a matrix of square buttons; clicking on any button allowed the user to see whether the stick figure underneath was affected by the health outcome. When participants used this interaction to learn about a risk, they expressed more emotional responses, both positive and negative, than when viewing any static graphic or numerical description of a risk. Their responses included relief about small risks and concern about large risks. The groups also commented on static graphics: arranging the figures affected by disease randomly throughout a group of figures made it more difficult to judge the proportion affected but often was described as more realistic. Interactive graphics appear to have potential for expressing risk magnitude as well as the feeling of risk. This affective impact could be useful in increasing perceived threat of high risks, calming fears about low risks, or comparing risks. Quantitative studies are planned to assess the effect on perceived risks and estimated risk magnitudes.

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for project entitled ``Instrumentation and Quantitative Methods of Evaluation.`` Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  18. Fatalities in high altitude mountaineering: a review of quantitative risk estimates.

    PubMed

    Weinbruch, Stephan; Nordby, Karl-Christian

    2013-12-01

    Quantitative estimates for mortality in high altitude mountaineering are reviewed. Special emphasis is placed on the heterogeneity of the risk estimates and on confounding. Crude estimates for mortality are on the order of 1/1000 to 40/1000 persons above base camp, for both expedition members and high altitude porters. High altitude porters have mostly a lower risk than expedition members (risk ratio for all Nepalese peaks requiring an expedition permit: 0.73; 95 % confidence interval 0.59-0.89). The summit bid is generally the most dangerous part of an expedition for members, whereas most high altitude porters die during route preparation. On 8000 m peaks, the mortality during descent from summit varies between 4/1000 and 134/1000 summiteers (members plus porters). The risk estimates are confounded by human and environmental factors. Information on confounding by gender and age is contradictory and requires further work. There are indications for safety segregation of men and women, with women being more risk averse than men. Citizenship appears to be a significant confounder. Prior high altitude mountaineering experience in Nepal has no protective effect. Commercial expeditions in the Nepalese Himalayas have a lower mortality than traditional expeditions, though after controlling for confounding, the difference is not statistically significant. The overall mortality is increasing with increasing peak altitude for expedition members but not for high altitude porters. In the Nepalese Himalayas and in Alaska, a significant decrease of mortality with calendar year was observed. A few suggestions for further work are made at the end of the article.

  19. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF RISK MANAGEMENT TECHNIQUES

    EPA Science Inventory

    Often, the performance of risk management techniques is evaluated by measuring the concentrations of the chemials of concern before and after risk management effoprts. However, using bioassays and chemical data provides a more robust understanding of the effectiveness of risk man...

  20. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    PubMed

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved by ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case-study, we are considering the occurrence of listeriosis in smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors of the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported result obtained for the same period and for the same population. Variability on the final risk estimate is determined by the variability regarding (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability regarding the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included the impact regarding the variability on the initial contamination level of L. monocytogenes is disappearing. Uncertainty determinants of the final risk indicated the need of gathering more information on the reference growth rate and the minimum
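
    One of the two sensitivity methods named here, Spearman rank correlation, can be illustrated directly: correlate each sampled input with the simulated output and rank the inputs by the magnitude of rho. The growth model below is a deliberately crude stand-in for the study's L. monocytogenes model, with all parameters invented.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(3)
    n = 20_000
    temp = rng.normal(7.0, 2.5, n).clip(0.0, 15.0)    # fridge temperature, deg C
    days = rng.uniform(1, 21, n)                      # storage time, days
    logN0 = rng.normal(-1.0, 1.0, n)                  # initial level, log10 CFU/g
    mu = 0.02 * np.maximum(temp - (-1.18), 0.0) ** 2  # toy square-root-type growth rate
    logN = np.minimum(logN0 + mu * days, 8.0)         # cap at stationary phase

    for name, x in [("temperature", temp), ("storage days", days), ("initial level", logN0)]:
        rho, _ = spearmanr(x, logN)
        print(f"Spearman rho({name}, final level) = {rho:+.2f}")
    ```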

  1. Quantitative microbial risk assessment models for consumption of raw vegetables irrigated with reclaimed water.

    PubMed

    Hamilton, Andrew J; Stagnitti, Frank; Premier, Robert; Boland, Anne-Maree; Hale, Glenn

    2006-05-01

    Quantitative microbial risk assessment models for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead irrigated with nondisinfected secondary treated reclaimed water were constructed. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since last irrigation event. The mean annual risk of infection was always less for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10(-3) to 10(-1) when reclaimed-water irrigation ceased 1 day before harvest and from 10(-9) to 10(-3) when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of < or =10(-4), i.e., one infection or less per 10,000 people per year, providing that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when this withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10(-4) standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered to be worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food
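
    The risk arithmetic behind such estimates combines exponential die-off on the crop, a dose-response model for a single exposure, and annualization over repeated servings. A worked sketch, with the decay rate, harvest-time dose, and dose-response parameter all chosen for illustration only:

    ```python
    import numpy as np

    r = 0.5        # hypothetical exponential dose-response parameter
    k = 0.69       # hypothetical viral die-off rate on the crop, per day
    dose0 = 1e-3   # hypothetical viruses ingested per serving at harvest

    for withholding_days in (1, 14):
        dose = dose0 * np.exp(-k * withholding_days)
        p_single = 1.0 - np.exp(-r * dose)           # per-serving infection risk
        p_annual = 1.0 - (1.0 - p_single) ** 150     # e.g., 150 servings per year
        print(f"{withholding_days:>2} d withholding: annual risk = {p_annual:.1e}")
    ```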

  2. Quantitative Microbial Risk Assessment Models for Consumption of Raw Vegetables Irrigated with Reclaimed Water

    PubMed Central

    Hamilton, Andrew J.; Stagnitti, Frank; Premier, Robert; Boland, Anne-Maree; Hale, Glenn

    2006-01-01

    Quantitative microbial risk assessment models for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead irrigated with nondisinfected secondary treated reclaimed water were constructed. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since last irrigation event. The mean annual risk of infection was always less for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10−3 to 10−1 when reclaimed-water irrigation ceased 1 day before harvest and from 10−9 to 10−3 when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of ≤10−4, i.e., one infection or less per 10,000 people per year, providing that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when this withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10−4 standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered to be worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food preparation

  3. Assessment of Semi-Quantitative Health Risks of Exposure to Harmful Chemical Agents in the Context of Carcinogenesis in the Latex Glove Manufacturing Industry.

    PubMed

    Yari, Saeed; Fallah Asadi, Ayda; Varmazyar, Sakineh

    2016-01-01

    Excessive exposure to chemicals in the workplace can cause poisoning and various diseases. Thus, to protect workers, it is necessary to examine people's exposure to chemicals and the risks posed by these materials. The purpose of this study is to evaluate the semi-quantitative health risks of exposure to harmful chemical agents in the context of carcinogenesis in a latex glove manufacturing industry. In this cross-sectional study, the semi-quantitative risk assessment method provided by the Department of Occupational Health of Singapore was used, and indices of LD50, carcinogenicity (ACGIH and IARC), and corrosivity were applied to calculate the hazard rating, with the largest index taken as the basis of risk. To calculate the exposure rating, two exposure index methods and the actual level of exposure were employed. After identifying the risks, groups H (high) and E (very high), classified as high-risk, were considered. Of the total of 271 risks, only 39 (15%) were at a high level (H) and 3% were very high (E). These risks involved only 7 materials, with only sulfuric acid placed in group E; the 6 other materials in group H were nitric acid (48.3%), chromic acid (6.9%), hydrochloric acid (10.3%), ammonia (3.4%), potassium hydroxide (20.7%) and chlorine (10.3%). Overall, the average hazard rating was estimated to be 4 and the average exposure rating to be 3.5. The health risks identified in this study show that the latex glove manufacturing industry carries a high level of risk because of carcinogens, acids, strong alkalis, and dangerous drugs. Given the average risk level, a safety-oriented design strategy for the latex glove production industry should be placed on the agenda.
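
    The hazard-rating/exposure-rating combination used by such semi-quantitative schemes is often expressed as the square root of their product; treat that exact rule, and the example ratings below, as assumptions for illustration rather than the method's official specification.

    ```python
    import math

    def risk_level(hazard_rating: int, exposure_rating: int) -> int:
        """Risk = sqrt(HR x ER), rounded; HR and ER each on a 1-5 scale (assumed)."""
        return round(math.sqrt(hazard_rating * exposure_rating))

    # e.g., a severe hazard (5) with frequent exposure (4)
    print(risk_level(5, 4))  # -> 4, i.e. a high-risk band on a 1-5 scale
    ```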

  4. Risk in Enterprise Cloud Computing: Re-Evaluated

    ERIC Educational Resources Information Center

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  5. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    fractured reservoir, fracture propagation, fault zones and their role in regard to fluid migration into shallow aquifers). A quantitative risk assessment which should be the main aim of future work in this field has much higher demands, especially on site specific data, as the estimation of statistical parameter uncertainty requires site specific parameter distributions. There is already ongoing research on risk assessment in related fields like CO2 sequestration. We therefore propose these methodologies to be transferred to risk estimation relating to the use of the hydraulic fracking method, be it for unconventional gas or enhanced geothermal energy production. The overall aim should be to set common and transparent standards for different uses of the subsurface and their involved risks and communicate those to policy makers and stake holders.

  6. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    PubMed

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidence collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10(-6) (95% CI=7.25×10(-8)-1.92×10(-5)). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI=0.042-0.97). The expected number of CBPP infected animals exported in any given year is 1.28 (95% CI=0.021-5.42). According to the risk estimate, an average of 2.73×10(6) animals (90% CI=10,674-5.9×10(6)) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI=0.24-23.18) for the first infected animal to be included in the consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk.
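
    The headline numbers can be sanity-checked with elementary formulas, using the paper's point estimates for the per-animal infection probability and the annual export volume. Because the study propagates full distributions rather than point estimates, its reported means differ somewhat from these back-of-the-envelope values.

    ```python
    p = 4.76e-6   # per-animal infection probability (paper's point estimate)
    N = 280_000   # maximum live cattle exported per year

    expected_infected = p * N              # mean infected animals per year
    p_at_least_one = 1 - (1 - p) ** N      # P(>=1 infected animal in a year)
    exports_to_first_case = 1 / p          # expected exports until first case

    print(f"expected infected per year:  {expected_infected:.2f}")
    print(f"P(at least one in a year):   {p_at_least_one:.2f}")
    print(f"exports until first case:    {exports_to_first_case:.2e}")
    ```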

  7. Evaluation of the field relevance of several injury risk functions.

    PubMed

    Prasad, Priya; Mertz, Harold J; Dalmotas, Danius J; Augenstein, Jeffrey S; Diggs, Kennerly

    2010-11-01

    An evaluation of the four injury risk curves proposed in the NHTSA NCAP for estimating the risk of AIS>= 3 injuries to the head, neck, chest and AIS>=2 injury to the Knee-Thigh-Hip (KTH) complex has been conducted. The predicted injury risk to the four body regions based on driver dummy responses in over 300 frontal NCAP tests were compared against those to drivers involved in real-world crashes of similar severity as represented in the NASS. The results of the study show that the predicted injury risks to the head and chest were slightly below those in NASS, and the predicted risk for the knee-thigh-hip complex was substantially below that observed in the NASS. The predicted risk for the neck by the Nij curve was greater than the observed risk in NASS by an order of magnitude due to the Nij risk curve predicting a non-zero risk when Nij = 0. An alternative and published Nte risk curve produced a risk estimate consistent with the NASS estimate of neck injury. Similarly, an alternative and published chest injury risk curve produced a risk estimate that was within the bounds of the NASS estimates. No published risk curve for femur compressive load could be found that would give risk estimates consistent with the range of the NASS estimates. Additional work on developing a femur compressive load risk curve is recommended.

  8. Evaluation of bone metabolism in newborn twins using quantitative ultrasound and biochemical parameters.

    PubMed

    Kara, Semra; Güzoğlu, Nilüfer; Göçer, Emine; Arıkan, Fatma Inci; Dilmen, Uğur; Dallar Bilge, Yıldız

    2016-03-01

    Metabolic bone disease (MBD) is one of the important complications of prematurity. Early and adequate nutritional interventions may reduce the incidence and potential complications of MBD. The present study aimed to evaluate bone metabolism in twins via biochemical parameters and quantitative ultrasound (QUS) and to compare the results between twin pairs. Moreover, twin infants were evaluated in terms of potential risk factors likely to have impact on MBD. Forty-three pairs of twins were included in the study. Serum calcium, phosphorus, magnesium, and alkaline phosphatase concentrations were assessed and bone mineral density was measured using QUS (speed of sound, SOS) at postnatal 30 d. Co-twin with the higher birth weight was assigned to Group 1 (n = 36) and the other twin was assigned to Group 2 (n = 36). Birth weight and head circumference were significantly higher in the infants of Group 1 compared with Group 2. No significant difference was found among the groups in terms of gender, history of resuscitation, length of stay in intensive care unit (ICU) or in the incubator, duration of total parenteral nutrition (TPN), type of nutrition, vitamin D use, biochemical parameters, and the SOS value. The factors likely to affect SOS, including type of pregnancy, maternal drug use, gender of infant, birth weight, head circumference at birth, gestational week, length of stay at the ICU, duration of TPN, type of nutrition, resuscitation, vitamin D use, and levels of calcium, phosphorus, magnesium, and alkaline phosphatase were entered into the model. The phosphorus level and the maternal drug use were found to be the factors that significantly reduced SOS, whereas pregnancy after assisted reproductive techniques was found to be a significant enhancing factor.

  9. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-01-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective supply chain risk assessment system. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a high-risk ingredient in the supply chain compared with that for oral use; to achieve safe application in the clinic, measures should be taken to control and minimize those risks.
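
    A minimal sketch of the AHP-fuzzy pipeline this record describes: derive criterion weights from a pairwise comparison matrix via its principal eigenvector, then combine them with a fuzzy evaluation matrix mapping criteria to risk grades. Both matrices below are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons of 3 supply chain risk criteria
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    w = w / w.sum()  # AHP weight vector (principal eigenvector, normalized)

    # Membership of each criterion in grades (low, medium, high); rows sum to 1
    R = np.array([[0.1, 0.3, 0.6],
                  [0.2, 0.5, 0.3],
                  [0.5, 0.4, 0.1]])

    B = w @ R  # fuzzy comprehensive evaluation vector
    print(f"weights: {np.round(w, 3)}, grade memberships: {np.round(B, 3)}")
    print("assessed grade:", ["low", "medium", "high"][int(np.argmax(B))])
    ```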

  10. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-04-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective supply chain risk assessment system. In this study, an AHP-fuzzy comprehensive evaluation model was developed based on the analytic hierarchy process and fuzzy mathematical theory, which quantitatively assessed the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a high-risk ingredient in the supply chain compared with that for oral use; to achieve safe application in the clinic, measures should be taken to control and minimize those risks.

  11. Farm to Fork Quantitative Risk Assessment of Listeria monocytogenes Contamination in Raw and Pasteurized Milk Cheese in Ireland.

    PubMed

    Tiwari, Uma; Cummins, Enda; Valero, Antonio; Walsh, Des; Dalmasso, Marion; Jordan, Kieran; Duffy, Geraldine

    2015-06-01

    The objective of this study was to model and quantify the level of Listeria monocytogenes in raw milk cheese (RMc) and pasteurized milk cheese (PMc) from farm to fork using a Bayesian inference approach combined with a quantitative risk assessment. The modeling approach included a prediction of contamination arising from the farm environment as well as from cross-contamination within the cheese-processing facility through storage and subsequent human exposure. The model predicted a high concentration of L. monocytogenes in contaminated RMc (mean 2.19 log10 CFU/g) compared to PMc (mean -1.73 log10 CFU/g). The mean probability of illness for low-risk (P1, LR) and high-risk (P2, HR, e.g., immunocompromised) adult Irish consumers following exposure to contaminated cheese was 7 × 10(-8) (P1) and 9 × 10(-4) (P2) for RMc and 7 × 10(-10) (P1) and 8 × 10(-6) (P2) for PMc, respectively. In addition, the model was used to evaluate performance objectives at various stages, namely, the cheese making and ripening stages, and to set a food safety objective at the time of consumption. A scenario analysis predicted various probabilities of L. monocytogenes contamination along the cheese-processing chain for both RMc and PMc. The sensitivity analysis showed that the critical factors for both cheeses were the serving size of the cheese and the storage time and temperature at the distribution stage. The developed model will allow food processors and policymakers to identify possible routes of contamination along the cheese-processing chain and to reduce the risk posed to human health.

  12. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. We here demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows a specification of an ALOP that, with defined confidence, corresponds to current industry practices.

  13. Dynamic quantitative echocardiographic evaluation of mitral regurgitation in the operating department.

    PubMed

    Gisbert, Alejandro; Soulière, Vicky; Denault, André Y; Bouchard, Denis; Couture, Pierre; Pellerin, Michel; Carrier, Michel; Levesque, Sylvie; Ducharme, Anique; Basmadjian, Arsène J

    2006-02-01

    Hemodynamic modifications induced by general anesthesia could lead to underestimation of mitral regurgitation (MR) severity in the operating department and potentially serious consequences. The intraoperative severity of MR was prospectively compared with the preoperative baseline evaluation using dynamic quantitative transesophageal echocardiography in 25 patients who were stable with MR 2/4 or greater undergoing coronary bypass, mitral valve operation, or both. Significant changes in the severity of MR using transesophageal echocardiographic criteria occurred after the induction of general anesthesia and with phenylephrine. Quantitative transesophageal echocardiographic evaluation of MR using effective orifice area and vena contracta, and the use of phenylephrine challenge, were useful to avoid underestimating MR severity in the operating department.

  14. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  15. Risk Quantified Structural Design and Evaluation

    DTIC Science & Technology

    2009-09-01

    Other industries, such as nuclear power and offshore oil, have developed a risk quantification process for deciding whether a given structure should be certified, offering alternatives to the current build-and-test paradigm used to establish the airworthiness of an aircraft.

  16. Quantitative evaluation on the performance and feature enhancement of stochastic resonance for bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Li, Guoying; Li, Jimeng; Wang, Shibin; Chen, Xuefeng

    2016-12-01

    Stochastic resonance (SR) has been widely applied in the field of weak signal detection by virtue of its characteristic of utilizing noise to amplify useful signal instead of eliminating noise in nonlinear dynamical systems. How to quantitatively evaluate the performance of SR, including the enhancement effect and the degree of waveform distortion, and how to accurately extract signal amplitude have become two important issues in the research on SR. In this paper, the signal-to-noise ratio (SNR) of the main component to the residual in the SR output is constructed to quantitatively measure the enhancement effect of the SR method. And two indices are constructed to quantitatively measure the degree of waveform distortion of the SR output, including the correlation coefficient between the main component in the SR output and the original signal, and the zero-crossing ratio. These quantitative indices are combined to provide a comprehensive quantitative index for adaptive parameter selection of the SR method, and eventually the adaptive SR method can be effective in enhancing the weak component hidden in the original signal. Fast Fourier Transform and Fourier Transform (FFT+FT) spectrum correction technology can extract the signal amplitude from the original signal and effectively reduce the difficulty of extracting signal amplitude from the distorted resonance output. The application in vibration analysis for bearing fault diagnosis verifies that the proposed quantitative evaluation method for adaptive SR can effectively detect weak fault feature of the vibration signal during the incipient stage of bearing fault.
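
    The three quantitative indices described, an SNR of the main component against the residual, a correlation coefficient with the original signal, and a zero-crossing ratio, can be computed as follows for a toy signal. The SR system itself is not simulated, and the clean reference is assumed known here, which it would not be in practice.

    ```python
    import numpy as np

    fs, f0, T = 1000.0, 5.0, 4.0
    t = np.arange(0.0, T, 1.0 / fs)
    clean = np.sin(2 * np.pi * f0 * t)
    rng = np.random.default_rng(5)
    output = clean + 0.3 * rng.normal(size=t.size)  # stand-in for an SR output

    # Main component: projection of the output onto the target-frequency sinusoids
    c, s = np.cos(2 * np.pi * f0 * t), np.sin(2 * np.pi * f0 * t)
    main = (output @ c) / (c @ c) * c + (output @ s) / (s @ s) * s
    residual = output - main

    snr_db = 10 * np.log10((main @ main) / (residual @ residual))
    corr = np.corrcoef(main, clean)[0, 1]

    def zero_crossings(x: np.ndarray) -> int:
        return int(np.sum(x[:-1] * x[1:] < 0))

    # Ratio near 1 indicates no spurious extra oscillations (low distortion)
    zc_ratio = zero_crossings(main) / max(zero_crossings(clean), 1)
    print(f"SNR = {snr_db:.1f} dB, corr = {corr:.3f}, zero-crossing ratio = {zc_ratio:.2f}")
    ```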

  17. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous reference genes of rice, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties from various geographic and phylogenic origins, and evaluated their quantification performance using quantitative real-time PCR and GeNorm analysis, in which a series of statistical calculations yields an "M value" that is negatively correlated with gene stability. Sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions contained detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, whereas no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detected in the cultivars with SNPs when the GOS9 and PLD quantitative PCR systems were used. Even though the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF quantitative PCR systems to be applicable for rice endogenous reference assays, with less variation among C(t) values, good reproducibility in quantitative assays, and low M values.
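
    A compact sketch of the GeNorm-style stability measure: for each candidate gene, M is the mean standard deviation of its log2 expression ratios against every other candidate, so lower M means more stable. The relative-quantity data below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    genes = ["SPS", "GOS9", "PLD", "ppi-PPF"]
    # Synthetic relative quantities (2**-deltaCq) across 12 samples; per-gene noise
    Q = 2.0 ** -rng.normal(0.0, [[0.2], [0.6], [0.5], [0.25]], size=(4, 12))

    def genorm_m(Q: np.ndarray) -> np.ndarray:
        """Mean pairwise variation (SD of log2 ratios) of each gene vs. the others."""
        logQ = np.log2(Q)
        n = Q.shape[0]
        return np.array([
            np.mean([np.std(logQ[j] - logQ[k], ddof=1) for k in range(n) if k != j])
            for j in range(n)
        ])

    for gene, m in zip(genes, genorm_m(Q)):
        print(f"{gene:8s} M = {m:.2f}")
    ```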

  18. Quantitative evaluation of regional vegetation ecological environment quality by using remotely sensed data over Qingjiang, Hubei

    NASA Astrophysics Data System (ADS)

    Wang, Cheng; Sun, Yan; Li, Lijun; Zhang, Qiuwen

    2007-11-01

    Vegetation cover is an important component of, and the best indicator for, the regional ecological environment. This paper adopts a new method that integrates remote sensing technology with a composite-index appraisal model based on multiple linear regression to quantitatively evaluate regional vegetation ecological environment quality (VEEQ). This method differs from traditional ecological environment research methods. It fully utilizes the advantages of quantitative remote sensing, directly extracting the key influencing factors of VEEQ, such as vegetation indices (RVI, NDVI, ARVI, TMG), humidity indices (NDMI, MI, TMW), and soil and landform indices (NDSI, TMB, GRABS), as evaluation parameters from Landsat 5/TM remotely sensed images, and then feeds these factors into the multiple linear regression evaluation model. Ultimately we obtain a VEEQ evaluation rank map of the experimental field, part of the Qingjiang region.
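
    The composite-index regression can be sketched as an ordinary least-squares fit from per-pixel spectral indices to field VEEQ scores. The index values and the "true" coefficients below are random placeholders, not values derived from the Qingjiang imagery.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 500  # per-pixel samples
    ndvi = rng.uniform(0.0, 0.9, n)
    ndmi = rng.uniform(-0.2, 0.6, n)
    ndsi = rng.uniform(-0.3, 0.4, n)
    # Synthetic "ground truth" VEEQ scores with noise
    veeq = 20 + 60 * ndvi + 25 * ndmi - 15 * ndsi + rng.normal(0, 3, n)

    X = np.column_stack([np.ones(n), ndvi, ndmi, ndsi])
    coef, *_ = np.linalg.lstsq(X, veeq, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((veeq - pred) ** 2) / np.sum((veeq - veeq.mean()) ** 2)
    print(f"coefficients [intercept, NDVI, NDMI, NDSI]: {np.round(coef, 1)}, R^2 = {r2:.3f}")
    ```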

  19. Quantitative experimental determination of primer-dimer formation risk by free-solution conjugate electrophoresis.

    PubMed

    Desmarais, Samantha M; Leitner, Thomas; Barron, Annelise E

    2012-02-01

    DNA barcodes are short, unique ssDNA primers that "mark" individual biomolecules. To gain a better understanding of the biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis approach are useful as quantitatively precise input data to parameterize computational models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates was designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced the electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer barcodes with fully or partially complementary sequences were annealed, and then separated by free-solution conjugate CE at different temperatures, to assess effects on primer-dimer formation. When fewer than 30 out of 30 base-pairs were bonded, dimerization was inversely correlated to temperature. Dimerization occurred when more than 15 consecutive base-pairs formed, yet non-consecutive base-pairs did not create stable dimers even when 20 out of 30 possible base-pairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments to establish a new mobility shift assay for detection of primer-dimer formation.

  20. Approaches for assessing risks to sensitive populations: Lessons learned from evaluating risks in the pediatric populations*

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a 'tool chest' of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of...

  1. Approaches for Assessing Risks to Sensitive Populations: Lessons Learned from Evaluating Risks in the Pediatric Population

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a "tool chest" of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of t...

  2. Evaluation of effect of different membership functions on risk assessment.

    PubMed

    Atalay, Kumru Didem; Can, Gülin Feryal; Eraslan, Ergün

    2017-03-23

    This study aims to define the relationship between risk degrees and risk indexes under different functional structures, on the assumption that risk degrees may not always have a linear relationship with risk indexes. To this end, risk indexes suitable for expert evaluation of working conditions are computed using three different membership functions: one linear and two nonlinear. Additionally, a new fuzzy risk assessment (RA) algorithm is developed using these three membership functions. With this new fuzzy RA algorithm, a more flexible and precise process becomes available, while information loss during the determination of the risk index of danger sources is prevented. As a result, the nonlinear increasing membership function is selected as the most suitable for expressing the relationship between risk degrees and risk indexes.
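
    The abstract does not give the actual membership functions, but the contrast it draws can be sketched with one linear and two nonlinear increasing functions mapping a normalized risk index onto a membership degree; the functional forms below are illustrative assumptions (Python):

        import numpy as np

        def linear_mf(x):
            """Linear increasing membership on [0, 1]."""
            return np.clip(x, 0.0, 1.0)

        def concave_mf(x):
            """Nonlinear increasing membership: rises quickly, then saturates."""
            return np.sqrt(np.clip(x, 0.0, 1.0))

        def convex_mf(x):
            """Nonlinear increasing membership: rises slowly, then accelerates."""
            return np.clip(x, 0.0, 1.0) ** 2

        risk_index = np.linspace(0.0, 1.0, 5)  # normalized risk indexes
        for mf in (linear_mf, concave_mf, convex_mf):
            print(f"{mf.__name__}: {np.round(mf(risk_index), 3)}")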

  3. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  4. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  5. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches to student evaluation in a pre-registration degree programme. As managers of the students' education, we were concerned that the quantitative method, which used lecturer criteria, might not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences, through semi-structured interviews. The results are then compared with the current quantitative measurement tools, which are designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert-scale measurements of teacher-devised criterion statements. The study yields a rich source of qualitative data that can be used to inform future curriculum development. However, complete validation of the current quantitative instruments was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given, and the options open to the management team with regard to future development of curriculum evaluation systems are discussed.

  6. Retrospective analysis of a listeria monocytogenes contamination episode in raw milk goat cheese using quantitative microbial risk assessment tools.

    PubMed

    Delhalle, L; Ellouze, M; Yde, M; Clinquart, A; Daube, G; Korsak, N

    2012-12-01

    In 2005, the Belgian authorities reported a Listeria monocytogenes contamination episode in cheese made from raw goat's milk. The presence of an asymptomatic shedder goat in the herd caused this contamination. On the basis of data collected at the time of the episode, a retrospective study was performed using an exposure assessment model covering the production chain from the milking of goats up to delivery of cheese to the market. Predictive microbiology models were used to simulate the growth of L. monocytogenes during cheese production in relation to temperature, pH, and water activity. The model showed significant growth of L. monocytogenes during chilling and storage of the milk collected the day before cheese production (median increase of 2.2 log CFU/ml) and during the addition of starter and rennet to the milk (median increase of 1.2 log CFU/ml). The L. monocytogenes concentration in the fresh unripened cheese was estimated at 3.8 log CFU/g (median). This result is consistent with the L. monocytogenes count in the fresh cheese (3.6 log CFU/g) reported during the contamination episode. A variance-based sensitivity analysis identified the most important factors affecting cheese contamination, and a scenario analysis then evaluated several options for risk mitigation. Thus, by using quantitative microbial risk assessment tools, this study provides reliable information to identify and control critical steps in a local production chain of cheese made from raw goat's milk.
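
    To make the predictive-microbiology step concrete, the sketch below uses a Ratkowsky square-root model to translate storage temperature into a growth rate and a log increase; the parameter values are illustrative assumptions, not the ones fitted in the study (Python):

        def sqrt_model_mu(temp_c, b=0.023, t_min=-2.0):
            """Ratkowsky square-root model: sqrt(mu) = b * (T - Tmin).
            b and Tmin are assumed illustrative values, with mu expressed in
            log10 CFU/ml per hour for simplicity."""
            return max(b * (temp_c - t_min), 0.0) ** 2

        def log_increase(temp_c, hours):
            """Log10 growth over a holding period at constant temperature."""
            return sqrt_model_mu(temp_c) * hours

        # Proper chilling at 4 degrees C vs. mild temperature abuse at 12 degrees C
        for temp in (4.0, 12.0):
            print(f"{temp} C for 18 h: +{log_increase(temp, 18):.2f} log10 CFU/ml")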

  7. Evaluating systems for oxygen service through the use of quantitative fault tree analysis

    NASA Astrophysics Data System (ADS)

    Santay, Anthony J.

    In the event of a process plant upset, systems not normally intended for use in oxygen service may be suddenly subject to an oxygen-enriched atmosphere. If the upset condition occurs frequently, a conservative approach would be to design all components as if they were normally in oxygen service. As an alternative, one could calculate the probability of the upset condition to quantitatively assess the risk and recommend corrective measures to further reduce the risk. Quantified fault tree techniques are used to determine a system's compatibility when exposed to oxygen in this manner.
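
    A minimal sketch of the fault tree quantification idea: with independent basic events, an AND gate multiplies probabilities and an OR gate complements the product of complements. The event structure and probabilities below are hypothetical, not taken from the paper (Python):

        from math import prod

        def p_and(probs):
            """AND gate: all independent basic events must occur."""
            return prod(probs)

        def p_or(probs):
            """OR gate: at least one independent basic event occurs."""
            return 1.0 - prod(1.0 - p for p in probs)

        # Hypothetical top event: an oxygen-enriched upset AND an ignition-prone
        # component, where the upset is valve failure OR operator error (per year).
        p_upset = p_or([1e-2, 5e-3])
        p_top = p_and([p_upset, 2e-3])
        print(f"annual top-event probability ~ {p_top:.2e}")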

  8. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication, and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and because of the interactions between different types of hazards. This lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, generation of risk curves, and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods, and flash floods. Uncertainties were expressed as minimum, average, and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average, and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  9. Quantitative risk assessment for lung cancer from exposure to metal ore dust

    SciTech Connect

    Fu, H.; Jing, X.; Yu, S.; Gu, X.; Wu, K.; Yang, J.; Qiu, S. )

    1992-09-01

    To quantitatively assess the lung cancer risk of metal miners, a historical cohort study was conducted. The cohort consisted of 1113 miners who were employed in underground work for at least 12 months between January 1, 1960 and December 12, 1974. From the records of dust concentrations, a cumulative dust dose was estimated for each miner in the cohort. There were 162 deaths in total, including 45 deaths from lung cancer, with an SMR of 2184. The SMR for lung cancer increased from 1019 for those with a cumulative dust dose of less than 500 mg-year to 2469 for those with a dose greater than 4500 mg-year. Furthermore, the risk in the highest category of combined cumulative dust dose and cigarette smoking was 46-fold greater than in the lowest category of dust dose and smoking. This study showed an exposure-response relationship between metal ore dust and lung cancer, and an interaction between smoking and metal ore dust exposure on lung cancer risk.
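
    For readers unfamiliar with the scale used above, an SMR on the conventional x100 scale is simply 100 times observed over expected deaths; the expected count below is back-calculated from the reported SMR purely for illustration (Python):

        def smr(observed, expected, scale=100.0):
            """Standardized mortality ratio on the x100 scale: values above 100
            indicate excess mortality relative to the reference population."""
            return scale * observed / expected

        # 45 observed lung cancer deaths against ~2.06 expected reproduces the
        # reported SMR of about 2184 (expected count assumed for illustration).
        print(round(smr(45, 2.06)))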

  10. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue for preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness for detecting and evaluating tumors has been studied. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  11. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  12. Toward Web-Site Quantitative Evaluation: Defining Quality Characteristics and Attributes.

    ERIC Educational Resources Information Center

    Olsina, L; Rossi, G.

    This paper identifies World Wide Web site characteristics and attributes and groups them in a hierarchy. The primary goal is to classify the elements that might be part of a quantitative evaluation and comparison process. In order to effectively select quality characteristics, different users' needs and behaviors are considered. Following an…

  13. Quantitative Skills as a Graduate Learning Outcome: Exploring Students' Evaluative Expertise

    ERIC Educational Resources Information Center

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2017-01-01

    In the biosciences, quantitative skills are an essential graduate learning outcome. Efforts to evidence student attainment at the whole of degree programme level are rare and making sense of such data is complex. We draw on assessment theories from Sadler (evaluative expertise) and Boud (sustainable assessment) to interpret final-year bioscience…

  14. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  15. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  16. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  17. Complementarity as a Program Evaluation Strategy: A Focus on Qualitative and Quantitative Methods.

    ERIC Educational Resources Information Center

    Lafleur, Clay

    Use of complementarity as a deliberate and necessary program evaluation strategy is discussed. Quantitative and qualitative approaches are viewed as complementary and can be integrated into a single study. The synergy that results from using complementary methods in a single study seems to enhance understanding and interpretation. A review of the…

  18. Evaluation of a quantitative phosphorus transport model for potential improvement of southern phosphorus indices

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Due to a shortage of available phosphorus (P) loss data sets, simulated data from a quantitative P transport model could be used to evaluate a P-index. However, the model would need to accurately predict the P loss data sets that are available. The objective of this study was to compare predictions ...

  19. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  20. Integrating Qualitative Methods in a Predominantly Quantitative Evaluation: A Case Study and Some Reflections.

    ERIC Educational Resources Information Center

    Mark, Melvin M.; Feller, Irwin; Button, Scott B.

    1997-01-01

    A review of qualitative methods used in a predominantly quantitative evaluation indicates a variety of roles for such a mixing of methods, including framing and revising research questions, assessing the validity of measures and adaptations to program implementation, and gauging the degree of uncertainty and generalizability of conclusions.…

  1. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR

    PubMed Central

    Abt, Melissa A.; Grek, Christina L.; Ghatnekar, Gautam S.; Yeh, Elizabeth S.

    2016-01-01

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer-associated death. Common sites of metastatic spread include lung, lymph node, brain, and bone. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue. PMID:26862835

  2. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR.

    PubMed

    Abt, Melissa A; Grek, Christina L; Ghatnekar, Gautam S; Yeh, Elizabeth S

    2016-01-29

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death. Common sites of metastatic spread include lung, lymph node, brain, and bone. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue.

  3. Quantitative ecological risk assessment of inhabitants exposed to polycyclic aromatic hydrocarbons in terrestrial soils of King George Island, Antarctica

    NASA Astrophysics Data System (ADS)

    Pongpiachan, S.; Hattayanone, M.; Pinyakong, O.; Viyakarn, V.; Chavanich, S. A.; Bo, C.; Khumsup, C.; Kittikoon, I.; Hirunyatrakul, P.

    2017-03-01

    This study conducts a quantitative ecological risk assessment of human exposure to polycyclic aromatic hydrocarbons (PAHs) in terrestrial soils of King George Island, Antarctica. Overall, the average PAH concentrations detected in King George terrestrial soils (KGS) were appreciably lower than those of world marine sediments (WMS) and world terrestrial soils (WTS), underlining the fact that Antarctica is one of the most pristine continents in the world. The total concentrations of twelve probable carcinogenic PAHs (ΣPAHs: the sum of Phe, An, Fluo, Pyr, B[a]A, Chry, B[b]F, B[k]F, B[a]P, Ind, D[a,h]A and B[g,h,i]P) were 3.21 ± 1.62 ng g(-1), 5749 ± 4576 ng g(-1), and 257,496 ± 291,268 ng g(-1) for KGS, WMS, and WTS, respectively. Although ΣPAHs in KGS is extremely low in comparison with the others, the percentage contribution of Phe is exceedingly high, at 50%. Assuming that incidental ingestion and dermal contact are the two major exposure pathways responsible for adverse human health effects, the cancer and non-cancer risks from environmental exposure to PAHs were carefully evaluated following the "Role of the Baseline Risk Assessment in Superfund Remedy Selection Decisions" memorandum provided by the US EPA. The logarithms of the cancer risk levels of PAH contents in KGS varied from -11.1 to -7.18 with an average of -7.96 ± 7.73, which is 1790 times and 80,176 times lower than those of WMS and WTS, respectively. All cancer risk levels of PAH concentrations observed in KGS are significantly (p < 0.001) lower than those of WMS and WTS. Despite the Comandante Ferraz Antarctic Station fire of February 25, 2012, both the cancer and non-cancer risks of environmental exposure to PAHs were found to be at an "acceptable level".

  4. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in the production of pork meat and products across MSs. For the case study MSs, the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings, given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10(4) CFU/g). It is therefore concluded that interventions should focus on decreasing the level of Salmonella in the feces of infected pigs, introducing a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or adding a control step to reduce the level of Salmonella on the carcass post-evisceration.

  5. Comparison of recreational health risks associated with surfing and swimming in dry weather and post-storm conditions at Southern California beaches using quantitative microbial risk assessment (QMRA).

    PubMed

    Tseng, Linda Y; Jiang, Sunny C

    2012-05-01

    Southern California is an increasingly urbanized hotspot for surfing, so it is of great interest to assess the human illness risks associated with this popular ocean recreational water sport from exposure to coastal waters contaminated with fecal bacteria. Quantitative microbial risk assessment was applied to eight popular Southern California beaches, using readily available enterococcus and fecal coliform data and dose-response models, to compare health risks associated with surfing during dry weather and storm conditions. The results showed that the gastrointestinal illness risk from surfing after storm events was elevated, with the probability of exceeding the US EPA health risk guideline up to 28% of the time. The surfing risk was also elevated in comparison with swimming at the same beach, owing to ingestion of a greater volume of water. The study suggests that refining the dose-response models, improving monitoring practices, and better surveillance of surfer behavior would improve the risk estimation.
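
    The dose-response step of such a QMRA is often an approximate beta-Poisson model, with dose taken as concentration times ingested volume. The sketch below uses that form; the concentration, ingestion volumes, and parameters (N50, alpha) are illustrative assumptions, not the study's values (Python):

        def beta_poisson_risk(dose, n50, alpha):
            """Approximate beta-Poisson dose-response model common in QMRA."""
            return 1.0 - (1.0 + (dose / n50) * (2.0 ** (1.0 / alpha) - 1.0)) ** (-alpha)

        # Dose = pathogen concentration (per ml) x ingested volume (ml); surfers
        # are assumed to ingest more water per event than swimmers.
        conc = 0.1  # hypothetical concentration
        for label, volume_ml in (("swimmer", 16.0), ("surfer", 100.0)):
            risk = beta_poisson_risk(conc * volume_ml, n50=1000.0, alpha=0.3)
            print(f"{label}: per-event illness risk ~ {risk:.4f}")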

  6. PURE: a web-based decision support system to evaluate pesticide environmental risk for sustainable pest management practices in California.

    PubMed

    Zhan, Yu; Zhang, Minghua

    2012-08-01

    Farmers, policy makers, and other stakeholders seek tools to quantitatively assess pesticide risks for mitigating pesticide impacts on ecosystem and human health. This paper presents the Pesticide Use Risk Evaluation (PURE) decision support system (DSS) for evaluating site-specific pesticide risks to surface water, groundwater, soil, and air at the pesticide active ingredient (AI), pesticide product, and field levels. The risk score is determined by the ratio of the predicted environmental concentration (PEC) to the toxicity value for selected endpoint organism(s), except that the risk score for air is calculated using the emission potential (EP), a pesticide product property used by the California Environmental Protection Agency (CEPA) to estimate potential volatile organic compound (VOC) emissions. Risk scores range from 0 to 100, where 0 represents negligible risk and 100 the highest risk. The procedure for calculating PEC in surface water was evaluated against monitoring data for 41 pesticide AIs, with a statistically significant correlation coefficient of r=0.82 (p<0.001). In addition, two almond fields in the Central Valley, California were evaluated for pesticide risks as a case study, where commonly acknowledged high-risk pesticides received high risk scores. Simazine, one of the most frequently detected pesticides in groundwater, was scored 74 (the moderate-high risk class) for groundwater; and chlorpyrifos, one of the pollutants frequently detected in surface water, was scored 100 (the high risk class) for surface water. To support quantitative pesticide risk assessment and the selection of reduced-risk pesticides, the PURE-DSS can assist growers, pesticide control advisers, and environmental protection organizations in mitigating the impacts of pesticide use on the environment.
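
    The scoring idea, a PEC-to-toxicity ratio mapped onto 0-100, can be sketched as follows. The abstract does not specify the mapping PURE actually uses, so the capped linear curve, cap value, and example numbers here are assumptions for illustration (Python):

        def risk_score(pec, toxicity_value, ratio_cap=10.0):
            """Map the exposure-to-toxicity ratio PEC/toxicity onto 0-100,
            where 0 is negligible risk and 100 the highest risk (a capped
            linear mapping, assumed purely for illustration)."""
            ratio = pec / toxicity_value
            return min(max(100.0 * ratio / ratio_cap, 0.0), 100.0)

        # Hypothetical: predicted surface-water concentration of 12 ug/L against
        # an aquatic endpoint toxicity value of 1.5 ug/L
        print(risk_score(pec=12.0, toxicity_value=1.5))  # -> 80.0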

  7. Quantitative evaluation of strategies for erosion control on a railway embankment batter

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Y.; Sibley, J.; Ashwath, N.

    2001-12-01

    Strategies for erosion control on a railway embankment batter (side slope) are quantitatively evaluated in this paper. The strategies were centred on control (a do-nothing treatment), grass seeding, gypsum application, jute mat (an erosion control blanket) placement, and planting hedgerows of Monto vetiver grass. Rainfall and runoff were monitored at 1 min intervals on 10 m wide embankment batter plots during 1998 and 1999. Total bedload and suspended sediment eroded from the plots were also measured, but only for a group of storm events within sampling intervals. It has been demonstrated that vetiver grass is not cost-effective in controlling erosion on railway batters within the Central Queensland region. Seeding alone could produce a 60% reduction in the erosion rate compared with the control treatment. Applying gypsum to the calcium-deficient soil before seeding yielded an additional 25% reduction in the erosion rate. This is the result, primarily, of 100% grass cover establishment within seven months of sowing. Therefore, for railway embankment batter erosion control, the emphasis needs to be on rapid establishment of 100% grass cover. For rapid establishment of grass cover, irrigation is necessary during the initial stages of growth, as the rainfall is unpredictable and the potential evaporation exceeds rainfall in the study region. The risk of seeds and fertilizers being washed out by short-duration, high-intensity rainfall events during the establishment phase may be reduced by the use of erosion control blankets on sections of the batters. Accidental burning of grasses on some plots caused serious erosion problems, resulting in very slow recovery of grass growth. It is therefore recommended that controlled burning of grasses on railway batters be avoided to protect batters from being exposed to severe erosion.

  8. Dynamic risk factors: the Kia Marama evaluation.

    PubMed

    Hudson, Stephen M; Wales, David S; Bakker, Leon; Ward, Tony

    2002-04-01

    Risk assessment is an essential part of clinical practice. Each of the three aspects of risk (static, stable, and acute dynamic) is important at various points of contact between the individual and the systems responsible for providing services. Dynamic factors, the typical treatment and supervision targets, have received less research attention than static factors. This paper examined the extent to which pretreatment, posttreatment, and change scores were associated with reoffending among men incarcerated for child molestation. The results were generally supportive of change in pro-offending attitudes as the key to not reoffending and suggested that the perspective-taking component of empathy and the use of fantasy may be important mechanisms. Affect scales generally failed to show any relationship with reoffending, apart from decreases in trait anger and suppressed anger. Moreover, these data suggest that we could improve our assessments and treatment through increased sensitivity to offense pathways.

  9. A framework for risk-benefit evaluations in biomedical research.

    PubMed

    Rid, Annette; Wendler, David

    2011-06-01

    Essentially all guidelines and regulations require that biomedical research studies have an acceptable risk-benefit profile. However, these documents offer little concrete guidance for implementing this requirement and determining when it is satisfied. As a result, those charged with risk-benefit evaluations currently assess the risk-benefit profile of biomedical research studies in unsystematic ways, raising concern that some research participants are not being protected from excessive risks and that some valuable studies involving acceptable risk are being rejected. The present paper aims to address this situation by delineating the first comprehensive framework, which is based on existing guidelines and regulations as well as the relevant literature, for risk-benefit evaluations in biomedical research.

  10. Quantitative performance evaluation of the EM algorithm applied to radiographic images

    NASA Astrophysics Data System (ADS)

    Brailean, James C.; Giger, Maryellen L.; Chen, Chin-Tu; Sullivan, Barry J.

    1991-07-01

    In this study, the authors quantitatively evaluate the performance of the Expectation Maximization (EM) algorithm as a restoration technique for radiographic images. The 'perceived' signal-to-noise ratio (SNR) of simple radiographic patterns processed by the EM algorithm is calculated on the basis of a statistical decision theory model that includes both the observer's visual response function and a noise component internal to the eye-brain system. The relative SNR (ratio of the processed SNR to the original SNR) is calculated and used as a metric to quantitatively compare the effects of the EM algorithm to two popular image enhancement techniques: contrast enhancement (windowing) and unsharp mask filtering.

  11. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into future research.

  12. Quantitative fuel motion determination with the CABRI fast neutron hodoscope; Evaluation methods and results

    SciTech Connect

    Baumung, K. ); Augier, G. )

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  13. Evaluation of volcanic risk management in Merapi and Bromo Volcanoes

    NASA Astrophysics Data System (ADS)

    Bachri, S.; Stöetter, J.; Sartohadi, J.; Setiawan, M. A.

    2012-04-01

    The Merapi (Central Java Province) and Bromo (East Java Province) volcanoes have human-environmental systems with unique characteristics, which has specific consequences for their risk management. Various efforts have been carried out by many parties (government institutions, scientists, and non-governmental organizations) to reduce the risk in these areas. However, most of these actions have likely been temporary and partial in purpose, leading to overlapping work and, ultimately, to a non-integrated scheme of volcanic risk management. This study therefore aims to identify and evaluate risk and disaster reduction actions at the Merapi and Bromo volcanoes. To achieve this aim, a thorough literature review was carried out to identify earlier studies in both areas. Afterward, the basic concept of the risk management cycle, consisting of risk assessment, risk reduction, event management, and regeneration, is used to map those earlier studies and the risk management actions already implemented at Merapi and Bromo. The results show that risk studies at Merapi have focused predominantly on the physical aspects of volcanic eruptions, i.e., models of lahar flows, hazard maps, and other geophysical modeling. Furthermore, after the 2006 eruption of Merapi, research on topics such as risk communication, social vulnerability, and cultural vulnerability has appeared on the social side of risk management research. Disaster risk management activities in the Bromo area, by contrast, have emphasized physical processes and historical-religious aspects. This overview of both study areas provides information on how risk studies have been used for managing volcanic disasters. It confirms that most earlier studies emphasize risk assessment and only a few consider the risk reduction phase. Further fieldwork in the near future will complete these findings and contribute to formulating integrated volcanic risk management cycles for both volcanoes.

  14. Trust-level risk evaluation and risk control guidance in the NHS East of England.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-08-01

    In recent years, the healthcare sector has adopted the use of operational risk assessment tools to help understand the systems issues that lead to patient safety incidents. But although these problem-focused tools have improved the ability of healthcare organizations to identify hazards, they have not translated into measurable improvements in patient safety. One possible reason for this is a lack of support for the solution-focused process of risk control. This article describes a content analysis of the risk management strategies, policies, and procedures at all acute (i.e., hospital), mental health, and ambulance trusts (health service organizations) in the East of England area of the British National Health Service. The primary goal was to determine what organizational-level guidance exists to support risk control practice. A secondary goal was to examine the risk evaluation guidance provided by these trusts. With regard to risk control, we found an almost complete lack of useful guidance to promote good practice. With regard to risk evaluation, the trusts relied exclusively on risk matrices. A number of weaknesses were found in the use of this tool, especially related to the guidance for scoring an event's likelihood. We make a number of recommendations to address these concerns. The guidance assessed provides insufficient support for risk control and risk evaluation. This may present a significant barrier to the success of risk management approaches in improving patient safety.

  15. Evaluation of a quantitative clinical method for assessment of sensory skin irritation.

    PubMed

    Robinson, M K; Perkins, M A

    2001-10-01

    Sensory skin irritation refers to the myriad symptomatic complaints (e.g., sting and burn) frequently associated with inflammatory skin conditions or skin intolerance to various chemicals or finished products. Sensory irritation is an important factor in consumer acceptance of the products that they buy and use; however, from a safety testing and risk assessment standpoint, it has been difficult to evaluate. Recently, methods have been developed to assess sensory irritation more quantitatively using a semantically labeled scale of sensation intensity, the labeled magnitude (LM) scale. Using this device, studies were conducted to determine whether test subjects' perceptions of recalled or imagined sensory responses (from a series of survey questions) were related to their actual sensory reactivity to chemical challenge. Subjects were presented with 15 skin sensation scenarios of varying intensities and asked to record their self-perceived recalled or imagined responses using the LM scale. Individual and mean responses to each of the 15 survey questions were compared within and across studies. Considerable variation was seen between subjects' responses to the questions, particularly for questions pertaining to stronger stimuli (e.g., scalding water or skin lacerations). There was also little consistency in the pattern of individual responses across the questions. However, among four different study populations, the group mean scores for each of the 15 survey questions showed a high degree of consistency. Also, in spite of the variability in perceived responses to the recalled/imagined skin sensations, statistically significant dose-response and time-response patterns were observed in chemical (lactic acid and capsaicin) challenge studies. In one capsaicin study, a direct relationship was observed, among 83% of the study subjects, between the mean recall intensity scores and actual responses to subsequent capsaicin challenge. This pattern was not seen in a lactic acid study.

  16. Cytochrome P450-mediated metabolic alterations in preeclampsia evaluated by quantitative steroid signatures.

    PubMed

    Moon, Ju-Yeon; Moon, Myeong Hee; Kim, Ki Tae; Jeong, Dae Hoon; Kim, Young Nam; Chung, Bong Chul; Choi, Man Ho

    2014-01-01

    Although potential risk factors for preeclampsia have been suggested, including placental and systemic inflammation, oxidative stress, and abnormal steroid metabolism during pregnancy, the pathogenesis of preeclampsia has not been fully elucidated, particularly with respect to steroid metabolism. The association between various cytochrome P450 (CYP)-mediated steroid metabolic markers and preeclampsia risk was therefore investigated. The serum levels of 54 CYP-mediated regioselective hydroxysteroids and their substrates were quantitatively evaluated in both pregnant women with preeclampsia (n=30; age, 30.8±4.5 years) and normotensive controls (n=30; age, 31.0±3.5 years), who were similar with respect to maternal age, gestational age, and body mass index. The levels of 6β-, 7α-, and 11β-hydroxymetabolites of androgens and corticoids were significantly increased in women with preeclampsia. In addition, the levels of oxysterols, including 7α-, 7β-, 4β-, 20α-, 24S-, and 27-hydroxycholesterol, were markedly higher, while the levels of 16α-OH-DHEA, 16α-OH-androstenedione, and cholesterol were significantly decreased in patients. The 6β-hydroxylation of androgens and corticoids by CYP3A4 (P<0.01), the activation of 20,22-desmolase (a cholesterol side-chain cleavage enzyme) by CYP11A1 (P<0.00001), and the multi-hydroxylation of cholesterol at C-4β, C-7α, C-7β, C-24S, C-27, and C-20α (P<0.0001) by catalytic or enzymatic reactions (e.g., CYP3A4, CYP7A1, CYP27A1, and CYP46A1) differed between preeclamptic women and control subjects. In particular, increased oxysterols (induction >2.0-fold) were positively correlated with preeclampsia. Our metabolic profiling suggests CYP-mediated alterations in steroid metabolism and hydroxylation in pregnancy-induced hypertension. These multiple markers could serve as background information for improved clinical diagnosis and management during pregnancy. This article is part of a Special Issue entitled "Pregnancy and

  17. Feasibility of an automated quantitative computed tomography angiography-derived risk score for risk stratification of patients with suspected coronary artery disease.

    PubMed

    de Graaf, Michiel A; Broersen, Alexander; Ahmed, Wehab; Kitslaar, Pieter H; Dijkstra, Jouke; Kroft, Lucia J; Delgado, Victoria; Bax, Jeroen J; Reiber, Johan H C; Scholte, Arthur J

    2014-06-15

    Coronary computed tomography angiography (CTA) has important prognostic value. Additionally, quantitative CTA (QCT) provides a more detailed and accurate assessment of coronary artery disease (CAD) on CTA. Potentially, a risk score incorporating all quantitative stenosis parameters allows accurate risk stratification. Therefore, the purpose of this study was to determine whether an automatic quantitative assessment of CAD using QCT, combined into a CTA risk score, allows risk stratification of patients. In 300 patients, QCT was performed to automatically detect and quantify all lesions in the coronary tree. Using QCT, a novel CTA risk score was calculated based on plaque extent, severity, composition, and location on a segment basis. During follow-up, the composite end point of all-cause mortality, revascularization, and nonfatal infarction was recorded. In total, 10% of patients experienced an event during a median follow-up of 2.14 years. The CTA risk score was significantly higher in patients with an event (12.5 [interquartile range 8.6 to 16.4] vs 1.7 [interquartile range 0 to 8.4], p <0.001). In 127 patients with obstructive CAD (≥50% stenosis), 27 events were recorded, all in patients with a high CTA risk score. In conclusion, the present study demonstrated that a fully automatic QCT analysis of CAD is feasible and can be applied for risk stratification of patients with suspected CAD. Furthermore, a novel CTA risk score incorporating the location, severity, and composition of coronary lesions was developed. This score may improve risk stratification but needs to be confirmed in larger studies.

  18. EVALUATING RISK IN OLDER ADULTS USING PHYSIOLOGICALLY BASED PHARMACOKINETIC MODELS

    EPA Science Inventory

    The rapid growth in the number of older Americans has many implications for public health, including the need to better understand the risks posed by environmental exposures to older adults. An important element for evaluating risk is the understanding of the doses of environment...

  19. Credit Risk Evaluation of Power Market Players with Random Forest

    NASA Astrophysics Data System (ADS)

    Umezawa, Yasushi; Mori, Hiroyuki

    A new method is proposed for credit risk evaluation in a power market. Credit risk evaluation measures the bankruptcy risk of a company. Power system liberalization has created a new environment that emphasizes profit maximization and risk minimization. Electricity transactions carry a high probability of risk between companies, so power market players are concerned with risk minimization. As a management strategy, a risk index is needed to evaluate the worth of a business partner. This paper proposes a new method for evaluating credit risk with Random Forest (RF), which performs ensemble learning over decision trees. RF is an efficient data mining technique for clustering data and extracting relationships between input and output data. In addition, a method of generating pseudo-measurements is proposed to improve the performance of RF. The proposed method is successfully applied to real financial data of energy utilities in the power market. A comparison is made between the proposed and conventional methods.
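
    A minimal sketch of the core classifier: a random forest fitted to company-level financial features, with the predicted default probability serving as the credit risk index. The features, labels, and hyperparameters below are synthetic assumptions, and the paper's pseudo-measurement generation step is not reproduced (Python):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)
        # Synthetic stand-ins for financial ratios of 500 energy utilities
        # (columns: liquidity, leverage, profitability, cash-flow coverage).
        X = rng.normal(size=(500, 4))
        y = (X[:, 1] - X[:, 0] + rng.normal(0.0, 0.5, 500) > 1.0).astype(int)  # 1 = default

        clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
        clf.fit(X, y)
        print("out-of-bag accuracy:", round(clf.oob_score_, 3))
        # The predicted default probability is used as the credit risk index.
        print("risk index of first utility:", clf.predict_proba(X[:1])[0, 1])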

  20. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g), and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed, namely (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative, and semi-quantitative analyses and nondetects of Listeria monocytogenes in smoked fish samples. The first of these case studies is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way to meta-analyze the many relevant yet diverse data sets available in the literature and in (inter)national surveillance or baseline survey reports, thereby increasing the information input of a risk assessment and, consequently, the correctness of its outcome.
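
    A simplified sketch of the MLE-plus-bootstrap idea: exact results enter the likelihood through the density and nondetects through the cumulative distribution (here a normal on log10 CFU/g). The data, limit of quantification, and the detect-only bootstrap are simplifying assumptions; the paper's non-parametric bootstrap would also resample the qualitative results (Python):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        # Hypothetical log10 CFU/g data: exact counts plus left-censored nondetects
        detects = np.array([1.3, 2.1, 1.8, 2.6, 1.1])
        n_nondetects, loq = 12, 1.0  # assumed limit of quantification

        def neg_log_lik(params, sample=detects):
            mu, sigma = params
            if sigma <= 0:
                return np.inf
            ll = norm.logpdf(sample, mu, sigma).sum()         # quantitative results
            ll += n_nondetects * norm.logcdf(loq, mu, sigma)  # censored results
            return -ll

        mle = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead").x
        print("MLE (mu, sigma):", np.round(mle, 2))

        # Bootstrap over the detects to express parameter uncertainty (simplified)
        rng = np.random.default_rng(1)
        boots = [minimize(neg_log_lik,
                          x0=mle,
                          args=(rng.choice(detects, detects.size, replace=True),),
                          method="Nelder-Mead").x
                 for _ in range(200)]
        print("bootstrap SDs:", np.round(np.std(boots, axis=0), 2))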

  1. Evaluating microcystin exposure risk through fish consumption.

    PubMed

    Poste, Amanda E; Hecky, Robert E; Guildford, Stephanie J

    2011-07-01

    Microcystin is a cyanobacterial hepatotoxin that is found worldwide, and poses a serious threat to the ecological communities in which it occurs as well as to those who rely on these waters for drinking, sanitation, or as a food source. Microcystin is known to accumulate in fish and other aquatic biota; however, the prevalence of microcystin in fish tissue and the human health risks posed by microcystin exposure through fish consumption remain poorly resolved. Here we show that microcystin is pervasive in water and fish from several tropical (Ugandan) and temperate (North American) lakes, including lakes that support some of the largest freshwater fisheries in the world. We establish that fish consumption can be an important and sometimes dominant route of microcystin exposure for humans, and can cause consumers to exceed recommended total daily intake guidelines for microcystin. These results highlight the importance of monitoring microcystin concentrations in fish, and the need to consider potential exposure to microcystin through fish consumption in order to adequately assess human exposure risk.

  2. Evaluating microcystin exposure risk through fish consumption

    PubMed Central

    Poste, Amanda E.; Hecky, Robert E.; Guildford, Stephanie J.

    2011-01-01

    Microcystin is a cyanobacterial hepatotoxin that is found worldwide, and poses a serious threat to the ecological communities in which it occurs as well as to those who rely on these waters for drinking, sanitation, or as a food source. Microcystin is known to accumulate in fish and other aquatic biota; however, the prevalence of microcystin in fish tissue and the human health risks posed by microcystin exposure through fish consumption remain poorly resolved. Here we show that microcystin is pervasive in water and fish from several tropical (Ugandan) and temperate (North American) lakes, including lakes that support some of the largest freshwater fisheries in the world. We establish that fish consumption can be an important and sometimes dominant route of microcystin exposure for humans, and can cause consumers to exceed recommended total daily intake guidelines for microcystin. These results highlight the importance of monitoring microcystin concentrations in fish, and the need to consider potential exposure to microcystin through fish consumption in order to adequately assess human exposure risk. PMID:21671629

  3. Quantitative Impact of Cardiovascular Risk Factors and Vascular Closure Devices on the Femoral Artery after Repeat Cardiac Catheterization

    PubMed Central

    Tiroch, Klaus A.; Matheny, Michael E.; Resnic, Frederic S.

    2010-01-01

    Background We evaluated the exact quantitative long-term impact of repeated catheterizations, vascular closure devices (VCDs) and cardiovascular risk factors on the femoral artery after cardiac catheterization. Methods A total of 2,102 available femoral angiograms from 827 consecutive patients were analyzed using caliper-based quantitative vascular analysis (QVA). These patients underwent coronary interventions between 01/2005-04/2007, and had at least one additional catheterization procedure through the ipsilateral femoral access site from 12/2001 until 01/2008. Multivariate analysis was performed to control for confounding variables. The primary outcome was change in artery size. Results The average punctured artery diameter was 6.5mm±2.1mm. The average time between first case and last follow-up was 349 days. There was no significant change of the punctured artery size over time after the index procedure (P=0.15) and no change associated with the use of VCDs (P=0.25) after multivariate analysis. Smaller arteries were associated with female gender (−1.22mm, P<0.0001), presence of angiographic peripheral vascular disease (PVD, −1.19mm, P<0.0001), and current (−0.48mm, P=0.001) or former (−0.23mm, P=0.01) smoking status, while previous statin therapy was associated with an increase in artery size (+0.47mm, P<0.0001). VCDs were used less often compared to manual compression in cases preceding the first detection of angiographic PVD (P<0.001). Conclusion VCDs are not associated with a change in the artery size or progression of PVD. Overall, there is no change in vessel size over time after repeat catheterizations, with a decrease in vessel size associated with current and former smoking, and an increase with previous statin therapy. PMID:20102878

  4. Agreement between quantitative microbial risk assessment and epidemiology at low doses during waterborne outbreaks of protozoan disease

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative microbial risk assessment (QMRA) is a valuable complement to epidemiology for understanding the health impacts of waterborne pathogens. The approach works by extrapolating available data in two ways. First, dose-response data are typically extrapolated from feeding studies, which use ...

  5. Application of quantitative stereology to the evaluation of enzyme-altered foci in rat liver.

    PubMed

    Campbell, H A; Pitot, H C; Potter, V R; Laishes, B A

    1982-02-01

    The mathematical science of quantitative stereology has established relationships for the quantitation of elements in three-dimensional space from observations on two-dimensional planes. This report describes the utilization and importance of such mathematical relationships for the quantitative analysis of focal hepatic lesions in terms relative to the volume of the liver. Three examples are used to demonstrate the utility of such calculations in the three-dimensional quantitation of hepatic focal lesions. The first is a computer-simulated experiment based on defined hypothetical situations. The simulations demonstrate the applicability of the computations described in this report to the evaluation of two-dimensional data from typical animal experiments. The other two examples are taken from actual experiments and involve the transplantation of hepatic cell populations into the livers of suitably prepared hosts and the quantitation of altered foci produced by initiation with diethylnitrosamine-partial hepatectomy followed by promotion with phenobarbital. The quantitation of altered foci by means of a two-dimensional analysis (simple enumeration of focal intersections per area of tissue section) is proportional to the quantitation of foci per volume of liver provided that the mean diameter of the foci for each treatment is sufficiently uniform, as exemplified in the text by the transplantation experiment. When such mean diameters are unequal, as in the diethylnitrosamine-phenobarbital experiment described herein, quantitation from three-dimensional analysis gives significantly different results compared with enumeration of focal intersections on two-dimensional areas. These studies clearly demonstrate that the frequency and size of foci intersections viewed on two-dimensional tissue sections do not necessarily reflect the number or size of foci in the three-dimensional tissue. Only by quantitating the number and size of the foci in relation to the three
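
    The simplest of the stereological relationships invoked above is the monodisperse-sphere case, where the number of profiles per unit section area equals the number of spheres per unit volume times their mean diameter, so N_V = N_A / D-bar. The sketch below applies it to made-up numbers; real foci are polydisperse, which is exactly why the fuller treatment in the paper is needed (Python):

        def foci_per_volume(n_per_area_mm2, mean_diameter_mm):
            """Classical relation for monodisperse spheres: N_V = N_A / D-bar.
            A section plane hits a sphere of diameter D whenever the sphere's
            center lies within D/2 of the plane, giving N_A = N_V * D."""
            return n_per_area_mm2 / mean_diameter_mm

        # Hypothetical: 0.04 focal transections per mm^2 of section, mean focus
        # diameter 0.5 mm -> 0.08 foci per mm^3 of liver
        print(foci_per_volume(0.04, 0.5), "foci per mm^3")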

  6. Investigation of the Genetic Association between Quantitative Measures of Psychosis and Schizophrenia: A Polygenic Risk Score Analysis

    PubMed Central

    Ripke, Stephan; Kahn, Rene S.; Ophoff, Roel A.

    2012-01-01

    The presence of subclinical levels of psychosis in the general population may imply that schizophrenia is the extreme expression of more or less continuously distributed traits in the population. In a previous study, we identified five quantitative measures of schizophrenia (positive, negative, disorganisation, mania, and depression scores). The aim of this study is to examine the association between a direct measure of genetic risk of schizophrenia and the five quantitative measures of psychosis. Estimates of the log of the odds ratios of case/control allelic association tests were obtained from the Psychiatric GWAS Consortium (PGC) (minus our sample), which included genome-wide genotype data of 8,690 schizophrenia cases and 11,831 controls. These data were used to calculate genetic risk scores in 314 schizophrenia cases and 148 controls from the Netherlands for whom genotype data and quantitative symptom scores were available. The genetic risk score of schizophrenia was significantly associated with case-control status (p<0.0001). In the case-control sample, the five psychosis dimensions were found to be significantly associated with genetic risk scores; the correlations ranged between .15 and .27 (all p<.001). However, these correlations were not significant in schizophrenia cases or controls separately. While this study confirms the presence of a genetic risk for schizophrenia as a categorical diagnostic trait, we did not find evidence for the genetic risk underlying quantitative schizophrenia symptom dimensions. This does not necessarily imply that a genetic basis is nonexistent, but does suggest that it is distinct from the polygenic risk score for schizophrenia. PMID:22761660
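
    A minimal sketch of the scoring step described above, assuming per-SNP log odds ratios from a discovery GWAS and 0/1/2 risk-allele dosages in a target sample; all arrays are simulated stand-ins, not PGC data.

```python
# Hedged polygenic risk score (PRS) sketch. Names (weights, dosages)
# and values are illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(0)
n_individuals, n_snps = 462, 1000          # 314 cases + 148 controls, as in the record
weights = rng.normal(0.0, 0.05, n_snps)    # log(OR) per risk allele from a discovery GWAS
dosages = rng.integers(0, 3, (n_individuals, n_snps))  # 0/1/2 risk-allele counts

# PRS for individual j: sum over SNPs of log(OR_i) * dosage_ij
prs = dosages @ weights

# Standardize so scores are comparable across target samples
prs_z = (prs - prs.mean()) / prs.std()
print(prs_z[:5])
```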

  7. Quantitative analysis of the layer separation risk in bilayer tablets using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro; Iwasaki, Norio; Terada, Katsuhide

    2013-08-16

    Layer separation is a critical defect in many bilayer tablets. Despite its importance for product quality, few studies have investigated its root cause. We evaluated bilayer tablets with varying layer separation tendencies using terahertz pulsed imaging (TPI) in comparison with other analytical methods such as tensile strength measurements, friability testing, scanning electron microscopy (SEM), and X-ray computed tomography (XRCT). The layer separation risk was determined by friability testing and shown to be correlated with the final compression pressure used for bilayer tablet fabrication. TPI could nondestructively detect cracks between the component layers that lead to layer separation. The adhesion integrity of the interface was quantified by the interface index, a unique value derived from the time-domain terahertz waveform. The interface index showed good correlation with the layer separation tendency and could distinguish interface quality among seven batches of bilayer tablets. In contrast, SEM and XRCT detected structural defects but could not distinguish batches with high or low layer separation risk. TPI revealed the relationship between compression pressure and interface quality. Thus, TPI can aid quality control by providing a precise estimate of the layer separation risk, supporting robust bilayer tablet development through a better understanding of layer separation.

  8. Assessing vertebral fracture risk on volumetric quantitative computed tomography by geometric characterization of trabecular bone structure

    NASA Astrophysics Data System (ADS)

    Checefsky, Walter A.; Abidin, Anas Z.; Nagarajan, Mahesh B.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2016-03-01

    The current clinical standard for measuring Bone Mineral Density (BMD) is dual X-ray absorptiometry; however, BMD derived from volumetric quantitative computed tomography has more recently been shown to demonstrate a high association with spinal fracture susceptibility. In this study, we propose a method of fracture risk assessment using structural properties of trabecular bone in spinal vertebrae. Experimental data were acquired via axial multi-detector CT (MDCT) from 12 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. Common image processing methods were used to annotate the trabecular compartment in the vertebral slices, creating a circular region of interest (ROI) that excluded cortical bone for each slice. The pixels inside the ROI were converted to values indicative of BMD. High-dimensional geometrical features were derived using the scaling index method (SIM) at different radii and scaling factors (SF). The mean BMD values within the ROI were then extracted and used in conjunction with a support vector machine to predict the failure load of the specimens. Prediction performance was measured using the root-mean-square error (RMSE) metric; SIM features combined with mean BMD (RMSE = 0.82 +/- 0.37) outperformed MDCT-measured mean BMD alone (RMSE = 1.11 +/- 0.33) (p < 10^-4). These results demonstrate that biomechanical strength prediction in vertebrae can be significantly improved through the use of SIM-derived texture features from trabecular bone.
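
    The prediction step lends itself to a short sketch: support vector regression on specimen features, scored by RMSE under leave-one-out cross-validation, which mirrors small-sample specimen studies. The sketch below uses synthetic mean-BMD features only (the study additionally used SIM-derived texture features); all names and values are illustrative.

```python
# Hedged sketch of failure-load prediction with support vector
# regression. Feature/target arrays are synthetic stand-ins, not the
# MDCT data from the paper.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
X = rng.uniform(80, 200, (12, 1))           # mean BMD per specimen (mg/cm^3), 12 specimens
y = 0.02 * X[:, 0] + rng.normal(0, 0.3, 12) # failure load (kN), synthetic relation

# Leave-one-out cross-validated predictions, one per specimen
pred = cross_val_predict(SVR(kernel="rbf", C=10.0), X, y, cv=LeaveOneOut())
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"RMSE = {rmse:.2f} kN")
```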

  9. Combinative Method Using Multi-components Quantitation and HPLC Fingerprint for Comprehensive Evaluation of Gentiana crassicaulis

    PubMed Central

    Song, Jiuhua; Chen, Fengzheng; Liu, Jiang; Zou, Yuanfeng; Luo, Yun; Yi, Xiaoyan; Meng, Jie; Chen, Xingfu

    2017-01-01

    Background: Gentiana crassicaulis is an important traditional Chinese herb. Like those of other herbs, its chemical compounds vary greatly with environmental and genetic factors; as a result, quality differs even among samples from the same region, so quality evaluation is necessary for its safe and effective use. In this study, a comprehensive method combining HPLC quantitative analysis and fingerprints was developed to evaluate the quality of Cujingqinjiao and to classify samples collected from Lijiang City of Yunnan Province. A total of 30 common peaks, including four identified peaks, were found and were used for further characterization and quality control of Cujingqinjiao. Twenty-one batches of samples from Lijiang City of Yunnan Province were evaluated by similarity analysis (SA), hierarchical cluster analysis (HCA), principal component analysis (PCA) and factor analysis (FA) according to the characteristics of the common peaks. Results: The obtained data showed good stability and repeatability of the chromatographic fingerprint; similarity values were all more than 0.90. This study demonstrated that a combination of chromatographic quantitative analysis and fingerprinting offered an efficient way to evaluate the quality consistency of Cujingqinjiao. Consistent results showed that samples from the same origin could be successfully classified into two groups. Conclusion: This study revealed that the combinative method was reliable, simple and sensitive for fingerprint analysis and, moreover, for quality control and pattern recognition of Cujingqinjiao. SUMMARY HPLC quantitative analysis and fingerprinting were developed to evaluate the quality of Gentiana crassicaulis. Similarity analysis, hierarchical cluster analysis, principal component analysis and factor analysis were employed to analyze the chromatographic dataset. The results of multi-components quantitation analysis, similarity analysis, hierarchical cluster analysis, principal

  10. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    PubMed Central

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-01-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazard identification for C. perfringens on cheese was conducted through a literature review, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed under simulated conditions of distribution and storage in the exposure assessment. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness from cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that probability of C. perfringens foodborne illness
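
    The risk characterization chain can be sketched directly from the distributions quoted in the abstract. The simulation below reproduces the −2.35 Log CFU/g mean initial level for natural cheese; the consumption distribution is treated as a zero-truncated normal, which is an assumption, and the final probability is only illustrative since the paper's full exposure model is not reproduced here.

```python
# Hedged Monte Carlo sketch of the risk characterization step for
# natural cheese: initial level (CFU/g) ~ Beta(1, 91) x Uniform(0, 2),
# exponential dose-response P(ill|dose) = 1 - exp(-r*dose) with
# r = 1.82e-11, and no growth during storage, per the abstract.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
r = 1.82e-11

level = rng.beta(1, 91, n) * rng.uniform(0, 2, n)   # CFU/g at consumption
print(f"mean initial level: {np.log10(level).mean():.2f} Log CFU/g")  # ~ -2.35

# Consumption per serving, using the quoted 12.40 +/- 19.43 g for natural
# cheese; truncation at zero is an assumption made for this sketch.
consumption = np.clip(rng.normal(12.40, 19.43, n), 0, None)
dose = level * consumption                           # CFU ingested
p_ill = 1.0 - np.exp(-r * dose)
# Illustrative only; the paper's full model yields lower estimates.
print(f"mean P(illness) per serving: {p_ill.mean():.2e}")
```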

  11. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses.

    PubMed

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-08-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazard identification for C. perfringens on cheese was conducted through a literature review, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10(-11)) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be -2.35 and -2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed under simulated conditions of distribution and storage in the exposure assessment. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness from cheese consumption per person per day for natural and processed cheeses were 9.57×10(-14) and 3.58×10(-14), respectively. These results indicate that probability of C. perfringens foodborne illness

  12. Presenting quantitative information about decision outcomes: a risk communication primer for patient decision aid developers

    PubMed Central

    2013-01-01

    Background Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients' risk perception and leads to better informed decision making. This paper summarises current "best practices" in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a "state of the art" summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined and a set of guiding principles and key messages derived from the results. Results The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid "1 in x" formats and variable denominators, should consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience. Conclusion A substantial and

  13. A Suite of Models to Support the Quantitative Assessment of Spread in Pest Risk Analysis

    PubMed Central

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J.; Baker, Richard H. A.; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice. PMID:23056174
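
    As a toy illustration of the simplest kind of model in such a suite, the sketch below couples logistic population growth to constant-rate radial range expansion on a yearly time step; the parameter values are invented, not the fitted Diabrotica values, and the structure is only in the spirit of the model family described.

```python
# Toy spread sketch: constant radial range expansion plus discrete
# logistic growth of relative density. All parameters are illustrative.
radius_km, speed_km_per_yr = 10.0, 60.0   # initial range and spread rate
occupancy, K, r = 0.01, 1.0, 1.2          # relative density, carrying capacity, growth

for year in range(1, 6):
    radius_km += speed_km_per_yr                        # range expansion
    occupancy += r * occupancy * (1.0 - occupancy / K)  # logistic growth
    area = 3.14159 * radius_km ** 2
    print(f"year {year}: radius {radius_km:.0f} km, "
          f"infested area ~ {area:,.0f} km^2, density {occupancy:.2f}")
```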

  14. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    PubMed

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.

  15. Risk evaluation and monitoring in multiple sclerosis therapeutics

    PubMed Central

    Wolinsky, Jerry S; Ashton, Raymond J; Hartung, Hans-Peter; Reingold, Stephen C

    2014-01-01

    Background: Risk for multiple sclerosis (MS) disease-modifying therapies (DMT) must be assessed on an ongoing basis. Early concerns regarding the first-approved DMTs for MS have been mitigated, but recently licensed therapies have been linked to possibly greater risks. Objectives: The objective of this review is to discuss risk assessment in MS therapeutics based on an international workshop and comprehensive literature search and recommend strategies for risk assessment/monitoring. Results: Assessment and perception of therapeutic risks vary between patients, doctors and regulators. Acceptability of risk depends on the magnitude of risk and the demonstrated clinical benefits of any agent. Safety signals must be distinguishable from chance occurrences in a clinical trial and in long-term use of medications. Post-marketing research is crucial for assessing longer-term safety in large patient cohorts. Reporting of adverse events is becoming more proactive, allowing more rapid identification of risks. Communication about therapeutic risks and their relationship to clinical benefit must involve patients in shared decision making. Conclusions: It is difficult to produce a general risk-assessment algorithm for all MS therapies. Specific algorithms are required for each DMT in every treated-patient population. New and evolving risks must be evaluated and communicated rapidly to allow patients and physicians to be well informed and able to share treatment decisions. PMID:24293456

  16. Primer for evaluating ecological risk at petroleum release sites.

    PubMed

    Claff, R

    1999-02-01

    Increasingly, risk-based approaches are being used to guide decision making at sites such as service stations and petroleum product terminals, where petroleum products have been inadvertently released to the soil. For example, the API Decision Support System software, DSS, evaluates site human health risk along six different routes of exposure. The American Society for Testing and Materials' Risk-Based Corrective Action (RBCA) standard, ASTM 1739, establishes a tiered framework for evaluating petroleum release sites on the basis of human health risk. Though much of the risk assessment focus has been on human health risk, regulatory agencies recognize that protection of human health may not fully protect the environment; and EPA has developed guidance on identifying ecological resources to be protected through risk-based decision making. Not every service station or petroleum product terminal site warrants a detailed ecological risk assessment. In some cases, a simple preliminary assessment will provide sufficient information for decision making. Accordingly, the American Petroleum Institute (API) is developing a primer for site managers, to assist them in conducting this preliminary assessment, and in deciding whether more detailed ecological risk assessments are warranted. The primer assists the site manager in identifying relevant ecological receptors and habitats, in identifying chemicals and exposure pathways of concern, in developing a conceptual model of the site to guide subsequent actions, and in identifying conditions that may warrant immediate response.

  17. The evaluation of volcanic risk in the Vesuvian area

    NASA Astrophysics Data System (ADS)

    Scandone, Roberto; Arganese, Giovanni; Galdi, Flavio

    1993-11-01

    Volcanic Risk has been defined as the product: R = Value × Vulnerability × Hazard, where value is the total amount of lives or properties at risk from a volcanic eruption, vulnerability is the percentage of value at risk for a given volcanic event, and hazard is the probability that a given area may be affected by a certain volcanic phenomenon. We used this definition to evaluate the risk of loss of human lives from volcanic eruptions of Vesuvius. Value has been determined based on the total number of inhabitants living in areas that could be affected by an eruption. Vulnerability is based on the relative probability of deaths as a result of different volcanic phenomena (tephra fall, pyroclastic flows, etc.). Hazard is evaluated based on the absolute probability of a given phenomenon in a certain area. This last parameter is the most difficult to evaluate. We subdivided the activity of Vesuvius that produces risk of loss of human lives into three classes of eruptions, based on the Volcanic Explosivity Index. We assume that the events of each class are distributed according to a Poissonian distribution (this is demonstrated for VEI = 3, and inferred for the other classes), so that we can evaluate the absolute probability of an eruption for each class within a given time span. We use a time window of 10 years and evaluate the probabilities of occurrence of at least one eruption for VEI = 3, 4, 5; the probabilities are, respectively: P3 = 0.09896, P4 = 0.01748, P5 = 0.00298. We have made a hazard evaluation for the entire Vesuvian area as well as an evaluation of Volcanic Risk. The obtained map shows that the areas with higher risk are on the southern side of Vesuvius, in the coastal region where each town is characterized by an average risk of ~1000 inhabitants/10 years. The risk regularly decreases with increasing distance from the volcano. The risk is mostly due to the events with VEI = 3 and 4, as the most destructive effects of VEI = 5 are counterbalanced
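
    The hazard and risk arithmetic quoted above can be reproduced directly. For Poisson-distributed eruptions, the probability of at least one event in t years is P = 1 − exp(−λt); the sketch back-solves the class rates from the quoted 10-year probabilities and applies the R = Value × Vulnerability × Hazard product to a hypothetical town.

```python
# Sketch of the hazard step; rates are derived from the abstract's
# quoted probabilities, and the risk example uses invented numbers.
import math

t = 10.0  # years
p_quoted = {3: 0.09896, 4: 0.01748, 5: 0.00298}

for vei, p in p_quoted.items():
    lam = -math.log(1.0 - p) / t            # eruptions per year
    print(f"VEI {vei}: lambda = {lam:.5f}/yr, "
          f"P(>=1 in 10 yr) = {1 - math.exp(-lam * t):.5f}")

# R = Value x Vulnerability x Hazard for a hypothetical town of 20,000
value, vulnerability = 20_000, 0.05          # illustrative numbers only
risk = value * vulnerability * p_quoted[3]
print(f"expected losses/10 yr from VEI 3: {risk:.0f} inhabitants")
```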

  18. Computerized Ultrasound Risk Evaluation (CURE): First Clinical Results

    NASA Astrophysics Data System (ADS)

    Duric, N.; Littrup, P.; Rama, O.; Holsapple, E.

    The Karmanos Cancer Institute has developed an ultrasound (US) tomography system, known as Computerized Ultrasound Risk Evaluation (CURE), for detecting and evaluating breast cancer, with the eventual goal of providing improved differentiation of benign masses from cancer. We report on our first clinical findings with CURE.

  19. Food and Drug Administration Evaluation and Cigarette Smoking Risk Perceptions

    ERIC Educational Resources Information Center

    Kaufman, Annette R.; Waters, Erika A.; Parascandola, Mark; Augustson, Erik M.; Bansal-Travers, Maansi; Hyland, Andrew; Cummings, K. Michael

    2011-01-01

    Objectives: To examine the relationship between a belief about Food and Drug Administration (FDA) safety evaluation of cigarettes and smoking risk perceptions. Methods: A nationally representative, random-digit-dialed telephone survey of 1046 adult current cigarette smokers. Results: Smokers reporting that the FDA does not evaluate cigarettes for…

  20. Evaluation of cluster recovery for small area relative risk models.

    PubMed

    Rotejanaprasert, Chawarat

    2014-12-01

    The analysis of disease risk is often considered via relative risk. The comparison of relative risk estimation methods with "true risk" scenarios has been considered on various occasions. However, there has been little examination of how well competing methods perform when the focus is clustering of risk. In this paper, a simulated evaluation of a range of potential spatial risk models is presented, together with a range of measures that can be used for (a) cluster goodness of fit and (b) cluster diagnostics. Results suggest that exceedance probability is a poor measure of hot-spot clustering because of model dependence, whereas residual-based methods are less model dependent and perform better. Local deviance information criterion measures perform well, but conditional predictive ordinate measures yield a high false positive rate.
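
    For readers unfamiliar with the exceedance-probability diagnostic criticized here, a minimal sketch: given posterior samples of each area's relative risk, an area is flagged as a hot spot when Pr(θ_i > 1 | data) passes a threshold. The samples below are simulated stand-ins for MCMC output, and the 0.95 decision rule is a common convention, not the paper's.

```python
# Minimal exceedance-probability sketch over synthetic posterior draws.
import numpy as np

rng = np.random.default_rng(7)
# Pretend posterior draws for 5 areas (rows: 4000 MCMC samples)
theta = rng.lognormal(mean=[0.0, 0.1, 0.5, -0.2, 0.8], sigma=0.3,
                      size=(4000, 5))

c = 1.0                                     # null relative risk
exceedance = (theta > c).mean(axis=0)       # Pr(theta_i > 1 | data)
hot = exceedance > 0.95                     # common decision rule
print(np.round(exceedance, 3), hot)
```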

  1. Evaluation of quantitative accuracy in CZT-based pre-clinical SPECT for various isotopes

    NASA Astrophysics Data System (ADS)

    Park, S.-J.; Yu, A. R.; Kim, Y.-s.; Kang, W.-S.; Jin, S. S.; Kim, J.-S.; Son, T. J.; Kim, H.-J.

    2015-05-01

    In vivo pre-clinical single-photon emission computed tomography (SPECT) is a valuable tool for functional small animal imaging, but several physical factors, such as scatter radiation, limit the quantitative accuracy of conventional scintillation crystal-based SPECT. Semiconductor detectors such as CZT overcome these deficiencies through superior energy resolution. To our knowledge, little scientific information exists regarding the accuracy of quantitative analysis in CZT-based pre-clinical SPECT systems for different isotopes. The aim of this study was to assess the quantitative accuracy of CZT-based pre-clinical SPECT for four isotopes: 201Tl, 99mTc, 123I, and 111In. The quantitative accuracy of the CZT-based Triumph X-SPECT (Gamma-Medica Ideas, Northridge, CA, U.S.A.) was compared with that of a conventional SPECT using GATE simulation. Quantitative errors due to attenuation and scatter effects were evaluated for all four isotopes with energy windows of 5%, 10%, and 20%. A spherical source containing the isotope was placed at the center of an air- or water-filled mouse-sized cylinder phantom. The CZT-based pre-clinical SPECT was more accurate than the conventional SPECT. For example, in the conventional SPECT with an energy window of 10%, scatter effects degraded quantitative accuracy by up to 11.52%, 5.10%, 2.88%, and 1.84% for 201Tl, 99mTc, 123I, and 111In, respectively. However, with the CZT-based pre-clinical SPECT, the degradations were only 9.67%, 5.45%, 2.36%, and 1.24% for 201Tl, 99mTc, 123I, and 111In, respectively. As the energy window was widened, the quantitative errors increased in both SPECT systems. Additionally, isotopes with lower-energy photon emissions had greater quantitative error. Our results demonstrate that the CZT-based pre-clinical SPECT had lower overall quantitative errors due to reduced scatter and high detection efficiency. Furthermore, the results of this systematic assessment quantifying the accuracy of these SPECT

  2. Quantitative evaluation of desertification extent based on geographic unit by remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Zhoulong; Wang, Dapeng; Zhang, Chunlai; Zhang, Anding

    2007-06-01

    The quantitative evaluation of desertification extent with remotely sensed imagery has been an active topic in remote sensing application research. The evaluation process should respect principles such as dominance and integration. Traditional methods evaluate desertification extent at the scale of discrete pixels, which fails to take into account the influence of adjacent pixels, produces noise in the evaluation result images, and yields one-sided results. If filters are used to reduce the noise, the evaluation results diverge from the true situation. Based on previous research and geographic science principles, this paper discusses a method for assessing desertification extent at the scale of the geographic unit, where the geographic unit is determined by a vegetation coverage index and spatial information. The test results show that this method provides a more accurate assessment of the ground situation, avoiding the limitations of traditional methods.

  3. Injury risk evaluation in sport climbing.

    PubMed

    Neuhof, A; Hennig, F F; Schöffl, I; Schöffl, V

    2011-10-01

    The aim of this study was to quantify and rate acute sport climbing injuries. Acute sport climbing injuries occurring from 2002 to 2006 were retrospectively assessed with a standardized web-based questionnaire. A total of 1,962 climbers reported 699 injuries, equivalent to 0.2 injuries per 1,000 h of sport participation. Most (74.4%) of the injuries were of minor severity, rated NACA I or NACA II. Injury distribution between the upper (42.6%) and lower extremities (41.3%) was similar, with ligament injuries, contusions and fractures being the most common injury types. Years of climbing experience (p<0.01), difficulty level (p<0.01), and climbing time per week during the summer (p<0.01) and winter (p<0.01) months were correlated with the injury rate. Age (p=0.034), years of climbing experience (p<0.01) and average climbing level (p<0.01) were correlated with injury severity as rated by NACA scores. The risk of acute injuries per 1,000 h of sport participation in sport climbing was lower than in previous studies on general rock climbing and higher than in studies on indoor climbing. In order to allow inter-study comparisons in future studies of climbing injuries, the use of a systematic and standardized scoring system (UIAA score) is essential.

  4. [Drifts and pernicious effects of the quantitative evaluation of research: the misuse of bibliometrics].

    PubMed

    Gingras, Yves

    2015-06-01

    The quantitative evaluation of scientific research relies increasingly on bibliometric indicators of publications and citations. We present the issues raised by the simplistic use of these methods and recall the dangers of using poorly built indicators and technically defective rankings that do not measure the dimensions they are supposed to measure, for example the quality of publications, laboratories or universities. We show that francophone journals are particularly likely to suffer from the misuse of overly simplistic bibliometric rankings of scientific journals.

  5. Evaluation of a quantitative plasma PCR plate assay for detecting cytomegalovirus infection in marrow transplant recipients.

    PubMed Central

    Gallez-Hawkins, G M; Tegtmeier, B R; ter Veer, A; Niland, J C; Forman, S J; Zaia, J A

    1997-01-01

    A plasma PCR test, using a nonradioactive PCR plate assay, was evaluated for detection of human cytomegalovirus reactivation. This assay was compared to Southern blotting and found to perform well. As a noncompetitive method of quantitation, it was similar to a competitive method for detecting the number of genome copies per milliliter of plasma in marrow transplant recipients. This is a technically simplified assay with potential for adaptation to automation. PMID:9041438

  6. Improved methods for modelling drinking water treatment in quantitative microbial risk assessment; a case study of Campylobacter reduction by filtration and ozonation.

    PubMed

    Smeets, P W M H; Dullemont, Y J; Van Gelder, P H A J M; Van Dijk, J C; Medema, G J

    2008-09-01

    Quantitative microbial risk assessment (QMRA) is increasingly applied to estimate drinking water safety. In QMRA the risk of infection is calculated from pathogen concentrations in drinking water, water consumption and dose-response relations. Pathogen concentrations in drinking water are generally low, and monitoring provides little information for QMRA. Therefore pathogen concentrations are monitored in the raw water, and the reduction of pathogens by treatment is modelled stochastically with Monte Carlo simulations. The method was tested in a case study with Campylobacter monitoring data for rapid sand filtration and ozonation processes. This study showed that the currently applied method did not predict the monitoring data used for validation; consequently, the risk of infection was overestimated by one order of magnitude. An improved method for model validation was developed. It combines non-parametric bootstrapping with statistical extrapolation to rare events. Evaluation of the treatment model was improved by presenting monitoring data and modelling results in CCDF graphs, which focus on the occurrence of rare events. Apart from calculating the yearly average risk of infection, the model results were presented as FN curves. This allowed for evaluation of both the distribution of risk and the uncertainty associated with the assessment.
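
    A minimal sketch of the non-parametric bootstrap component, assuming a vector of observed log10 reductions across a treatment step; the reduction values are invented for illustration, and tail probabilities are read off in CCDF style rather than reproducing the paper's Campylobacter data.

```python
# Hedged bootstrap sketch for treatment log-reduction uncertainty.
import numpy as np

rng = np.random.default_rng(3)
observed_log_reduction = np.array([2.1, 2.4, 1.8, 2.9, 2.2, 1.5,
                                   2.6, 2.0, 2.3, 1.9])  # per paired sample

n_boot = 10_000
boot_means = np.array([
    rng.choice(observed_log_reduction, size=observed_log_reduction.size,
               replace=True).mean()
    for _ in range(n_boot)
])

# CCDF-style tail readout: chance the mean reduction is below a poor level
for level in (1.8, 2.0, 2.2):
    print(f"P(mean log-reduction < {level}) = {(boot_means < level).mean():.4f}")
```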

  7. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  8. Murine model of disseminated fusariosis: evaluation of the fungal burden by traditional CFU and quantitative PCR.

    PubMed

    González, Gloria M; Márquez, Jazmín; Treviño-Rangel, Rogelio de J; Palma-Nicolás, José P; Garza-González, Elvira; Ceceñas, Luis A; Gerardo González, J

    2013-10-01

    Systemic disease is the most severe clinical form of fusariosis, and its treatment is challenging due to the refractory response to antifungals. Treatment of murine Fusarium solani infection has been described in models that employ CFU quantitation in organs as a parameter of therapeutic efficacy. However, CFU counts do not precisely reproduce the number of cells for filamentous fungi such as F. solani. In this study, we developed a murine model of disseminated fusariosis and compared the fungal burden with two methods: CFU and quantitative PCR. ICR and BALB/c mice received an intravenous injection of 1 × 10^7 conidia of F. solani per mouse. On days 2, 5, 7, and 9, mice from each strain were killed. The spleen and kidneys of each animal were removed and evaluated by qPCR and CFU determinations. Results from the CFU assay indicated that the spleen and kidneys had almost the same fungal burden in both BALB/c and ICR mice on each day of evaluation. In the qPCR assay, the spleen and kidney of each mouse strain showed increased fungal burden at each determination throughout the experiment. The fungal load determined by the qPCR assay was significantly greater than that determined from CFU measurements of tissue. qPCR could be considered as a tool for quantitative evaluation of fungal burden in experimental disseminated F. solani infection.
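
    For context, absolute quantification by qPCR is typically done against a standard curve, Cq = m·log10(q) + b, fitted to a dilution series and then inverted for unknown samples. The sketch below uses invented Cq values and sample names, not the study's F. solani data.

```python
# Hedged qPCR standard-curve sketch; all numbers are illustrative.
import numpy as np

# Dilution series: known log10 genome copies and measured Cq (synthetic)
log_q_std = np.array([6, 5, 4, 3, 2], dtype=float)
cq_std = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

m, b = np.polyfit(log_q_std, cq_std, 1)      # slope ~ -3.32 at 100% efficiency
efficiency = 10 ** (-1.0 / m) - 1.0
print(f"slope {m:.2f}, efficiency {efficiency:.1%}")

cq_unknown = np.array([23.4, 26.7])          # e.g., kidney, spleen (made up)
log_q = (cq_unknown - b) / m                 # invert the fitted curve
print("estimated log10 genome copies:", np.round(log_q, 2))
```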

  9. Seismic risk evaluation aided by IR thermography

    NASA Astrophysics Data System (ADS)

    Grinzato, E.; Cadelano, G.; Bison, P.; Petracca, A.

    2009-05-01

    Conservation of buildings in areas at seismic risk must take prevention into account. Safeguarding the architectonic heritage is an ambitious objective, but a priority for planning programmes at varying levels of decision making. Preservation and restoration activities must be optimized to cover the vast and widespread historical and architectonic heritage present in many countries. Masonry buildings require an adequate level of knowledge based on the importance of structural geometry, which may include the damage, details of construction and properties of materials. For identification and classification of masonry it is necessary to determine the shape, type and size of the elements, the texture, the size of mortar joints, and the assemblage. Recognition can be done through visual inspection of the wall surface which, where it is not visible, can be examined by removing a layer of plaster. Thermography is an excellent tool for a fast survey and the collection of vital information for this purpose, but it is extremely important to define a precise procedure in the development of more efficient monitoring tools. Thermography is a non-destructive method that can recognize structural damage below plaster, detecting discontinuities in masonry such as added storeys, cavities, filled openings, and repairs. Furthermore, fast identification of the subsurface state allows selection of areas where other methods, either more penetrating or partially destructive, should be applied. The paper reports experimental results achieved in the framework of the European project RECES Modiquus. The main aim of the project is to improve methods, techniques and instruments for facing antiseismic options. Both passive and active thermographic techniques have been applied under different weather conditions and time schemes. A dedicated algorithm has been developed to enhance the visibility of wall bonding.

  10. Risk assessment for transboundary rivers using fuzzy synthetic evaluation technique

    NASA Astrophysics Data System (ADS)

    Rai, Subash P.; Sharma, Nayan; Lohani, A. K.

    2014-11-01

    Large-scale urbanization has resulted in greater withdrawals of shared waters, and these withdrawals have been largely dependent on the hegemony of the riparians. The last few decades have seen the upward surge of many countries in terms of development as well as hegemony. Existing structures of established water-sharing frameworks typically evaluate only parameters related to historic water use, such as historic water demand and supply, contribution to flow, and hydrology. Water conflict and cooperation are affected by various issues related to development and hegemony. Characterization and quantification of development and hegemony parameters is a very complex process. This paper establishes a novel approach to predicting river basins at risk; the approach addresses the issue of water conflict and cooperation within a methodologically more rigorous predictive framework. Fuzzy synthetic evaluation is used in this paper to undertake the risk assessment of international transboundary rivers. The fuzzy domain of risk consists of two fuzzy sets, hegemony and development, indices of which are developed with the help of fuzzy synthetic evaluation techniques. A compositional rule base is then framed to ascertain the fuzzy risk. This fuzzy risk can be used to prioritize international river basins, helping to identify potentially high-risk basins. Risk identification of international river basins is not only scientifically valuable but also practically highly useful: identifying the basins that are likely to be particularly prone to conflict or cooperation is of high interest to policy makers.
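
    A toy sketch of fuzzy synthetic evaluation in this spirit: crisp hegemony and development indices are mapped to memberships in {low, medium, high}, combined with criterion weights, and the risk grade is read off by maximum membership. The membership breakpoints and weights are invented for illustration, not the paper's.

```python
# Hedged fuzzy synthetic evaluation sketch with invented parameters.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def memberships(x):
    # low / medium / high over a 0-1 indicator scale (assumed breakpoints)
    return np.array([tri(x, -0.01, 0.0, 0.5), tri(x, 0.0, 0.5, 1.0),
                     tri(x, 0.5, 1.0, 1.01)])

hegemony, development = 0.7, 0.4      # normalized indices for one basin
R = np.vstack([memberships(hegemony), memberships(development)])  # 2x3 matrix
w = np.array([0.6, 0.4])              # criterion weights (assumed)

risk_vector = w @ R                   # weighted composition, M(*, +) operator
grades = ["low", "medium", "high"]
print(risk_vector, "->", grades[int(np.argmax(risk_vector))])
```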

  11. Risk-Based Evaluation of Total Petroleum Hydrocarbons in Vapor Intrusion Studies

    PubMed Central

    Brewer, Roger; Nagashima, Josh; Kelley, Michael; Heskett, Marvin; Rigby, Mark

    2013-01-01

    This paper presents a quantitative method for the risk-based evaluation of Total Petroleum Hydrocarbons (TPH) in vapor intrusion investigations. Vapors from petroleum fuels are characterized by a complex mixture of aliphatic and, to a lesser extent, aromatic compounds. These compounds can be measured and described in terms of TPH carbon ranges. Toxicity factors published by USEPA and other parties allow development of risk-based air and soil vapor screening levels for each carbon range in the same manner as done for individual compounds such as benzene. The relative carbon-range makeup of petroleum vapors can be used to develop weighted, site-specific or generic screening levels for TPH. At some critical ratio of TPH to a targeted individual compound, the overwhelming proportion of TPH will drive vapor intrusion risk over the individual compound. This is particularly true for vapors associated with diesel and other middle distillate fuels, but can also be the case for low-benzene gasolines or even for high-benzene gasolines if an adequately conservative target risk is not applied to individually targeted chemicals. This necessitates a re-evaluation of the reliance on benzene and other individual compounds as a stand-alone tool to evaluate vapor intrusion risk associated with petroleum. PMID:23765191
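
    One common mixture rule for the weighting described here computes the TPH screening level as the reciprocal of the fraction-weighted reciprocals of the per-range levels, SL_TPH = 1 / Σ(f_i / SL_i). The carbon-range fractions and per-range screening levels below are placeholders, not values from the paper or from regulatory guidance.

```python
# Hedged sketch of a carbon-range-weighted TPH screening level.
carbon_ranges = {
    # (fraction of TPH vapor, screening level in ug/m^3) -- illustrative only
    "aliphatic C5-C8":  (0.55, 600.0),
    "aliphatic C9-C12": (0.35, 200.0),
    "aromatic C9-C10":  (0.10, 100.0),
}

sl_tph = 1.0 / sum(f / sl for f, sl in carbon_ranges.values())
print(f"weighted TPH screening level ~ {sl_tph:.0f} ug/m^3")
```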

  12. Risk-based evaluation of total petroleum hydrocarbons in vapor intrusion studies.

    PubMed

    Brewer, Roger; Nagashima, Josh; Kelley, Michael; Heskett, Marvin; Rigby, Mark

    2013-06-13

    This paper presents a quantitative method for the risk-based evaluation of Total Petroleum Hydrocarbons (TPH) in vapor intrusion investigations. Vapors from petroleum fuels are characterized by a complex mixture of aliphatic and, to a lesser extent, aromatic compounds. These compounds can be measured and described in terms of TPH carbon ranges. Toxicity factors published by USEPA and other parties allow development of risk-based air and soil vapor screening levels for each carbon range in the same manner as done for individual compounds such as benzene. The relative carbon-range makeup of petroleum vapors can be used to develop weighted, site-specific or generic screening levels for TPH. At some critical ratio of TPH to a targeted individual compound, the overwhelming proportion of TPH will drive vapor intrusion risk over the individual compound. This is particularly true for vapors associated with diesel and other middle distillate fuels, but can also be the case for low-benzene gasolines or even for high-benzene gasolines if an adequately conservative target risk is not applied to individually targeted chemicals. This necessitates a re-evaluation of the reliance on benzene and other individual compounds as a stand-alone tool to evaluate vapor intrusion risk associated with petroleum.

  13. An overview of BWR Mark-1 containment venting risk implications: An evaluation of potential Mark-1 containment improvements

    SciTech Connect

    Wagner, K.C.; Dallman, R.J.; Galyean, W.J.

    1989-06-01

    This report supplements containment venting risk evaluations performed for the Mark-I Containment Performance Improvement (CPI) Program. Quantitative evaluations using simplified containment event trees for station blackout sequences were performed to evaluate the potential risk reduction offered by containment venting, an improved automatic depressurization system with a dedicated power source, and an additional supply of water to either the containment sprays or the vessel with a dedicated power source. The risk calculations were based on the Draft NUREG-1150 results for Peach Bottom with selected enhancements. Several sensitivity studies were performed to investigate phenomenological, operational, and equipment performance uncertainties. Qualitative risk evaluations were provided for loss of long-term containment heat removal and anticipated transients without scram for the same set of improvements. A limited discussion is provided on the generic applicability of these results to other plants with Mark-I containments. 23 refs., 15 figs., 13 tabs.

  14. Development and evaluation of a genetic risk score for obesity.

    PubMed

    Belsky, Daniel W; Moffitt, Terrie E; Sugden, Karen; Williams, Benjamin; Houts, Renate; McCarthy, Jeanette; Caspi, Avshalom

    2013-01-01

    Multi-locus profiles of genetic risk, so-called "genetic risk scores," can be used to translate discoveries from genome-wide association studies into tools for population health research. We developed a genetic risk score (GRS) for obesity from the results of 16 published genome-wide association studies of obesity phenotypes in European-descent samples. We then evaluated this genetic risk score using data from the Atherosclerosis Risk in Communities (ARIC) cohort GWAS sample (N = 10,745, 55% female, 77% white, 23% African American). Our 32-locus GRS was a statistically significant predictor of body mass index (BMI) and obesity among ARIC whites [for BMI, r = 0.13, p<1 × 10^-30; for obesity, area under the receiver operating characteristic curve (AUC) = 0.57 (95% CI 0.55-0.58)]. The GRS predicted differences in obesity risk net of demographic, geographic, and socioeconomic information. The GRS performed less well among African Americans. The genetic risk score we derived from GWAS provides a molecular measurement of genetic predisposition to elevated BMI and obesity. [Supplemental materials are available for this article; go to the publisher's online edition of Biodemography and Social Biology for the following resource: Supplement to Development & Evaluation of a Genetic Risk Score for Obesity.]

  15. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    PubMed

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts.
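
    A minimal sketch of the effectiveness metric used in the trial, the longitudinal prevalence ratio: the ratio of the longitudinal (person-time) prevalence of reported diarrhea in the intervention arm to that in the control arm. The counts below are invented for illustration, not trial data.

```python
# Hedged longitudinal prevalence ratio (LPR) sketch with made-up counts.
ill_days_intervention, obs_days_intervention = 420, 10_000
ill_days_control, obs_days_control = 500, 10_000

lp_int = ill_days_intervention / obs_days_intervention  # longitudinal prevalence
lp_ctl = ill_days_control / obs_days_control
lpr = lp_int / lp_ctl
print(f"LPR = {lpr:.2f}  (< 1 favors the intervention)")
```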

  16. Quantitative trait loci that modulate trabecular bone's risk of failure during unloading and reloading.

    PubMed

    Ozcivici, Engin; Zhang, Weidong; Donahue, Leah Rae; Judex, Stefan

    2014-07-01

    Genetic makeup of an individual is a strong determinant of the morphologic and mechanical properties of bone. Here, in an effort to identify quantitative trait loci (QTLs) for changes in the simulated mechanical parameters of trabecular bone during altered mechanical demand, we subjected 352 second generation female adult (16 weeks old) BALBxC3H mice to 3 weeks of hindlimb unloading followed by 3 weeks of reambulation. Longitudinal in vivo microcomputed tomography (μCT) scans tracked trabecular changes in the distal femur. Tomographies were directly translated into finite element (FE) models and subjected to a uniaxial compression test. Apparent trabecular stiffness and components of the Von Mises (VM) stress distributions were computed for the distal metaphysis and associated with QTLs. At baseline, five QTLs explained 20% of the variation in trabecular peak stresses across the mouse population. During unloading, three QTLs accounted for 14% of the variability in peak stresses. During reambulation, one QTL accounted for 5% of the variability in peak stresses. QTLs were also identified for mechanically induced changes in stiffness, median stress values and skewness of stress distributions. There was little overlap between QTLs identified for baseline and QTLs for longitudinal changes in mechanical properties, suggesting that distinct genes may be responsible for the mechanical response of trabecular bone. Unloading related QTLs were also different from reambulation related QTLs. Further, QTLs identified here for mechanical properties differed from previously identified QTLs for trabecular morphology, perhaps revealing novel gene targets for reducing fracture risk in individuals exposed to unloading and for maximizing the recovery of trabecular bone's mechanical properties during reambulation.

  17. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas.

    PubMed

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. The ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or the objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed
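
    As one concrete example of a global fractal measure of nuclear distribution, the sketch below estimates a box-counting dimension from a binary map of cell-nucleus positions. The point pattern is synthetic; real use would start from segmented nuclei in a histology image, and this is only one of several complexity measures the authors may have used.

```python
# Hedged box-counting dimension sketch on a synthetic point pattern.
import numpy as np

rng = np.random.default_rng(5)
img = np.zeros((256, 256), dtype=bool)
pts = rng.integers(0, 256, (2000, 2))
img[pts[:, 0], pts[:, 1]] = True            # "nuclei" as occupied pixels

sizes = [2, 4, 8, 16, 32]                   # box sides; must divide 256
counts = []
for s in sizes:
    # count boxes of side s that contain at least one occupied pixel
    blocks = img.reshape(256 // s, s, 256 // s, s).any(axis=(1, 3))
    counts.append(blocks.sum())

# Slope of log N(s) vs log(1/s) estimates the box-counting dimension
slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
print(f"box-counting dimension ~ {slope:.2f}")
```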

  18. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. The ontological dichotomy implies the application of an objective, quantitative approach to the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into equivalence classes and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or the objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  19. Quantitative AOP-based predictions for two aromatase inhibitors evaluating the influence of bioaccumulation on prediction accuracy

    EPA Science Inventory

    The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining quantitative linkages between the molecular initiating event (MIE) and subsequent key events...

  20. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  1. CDER risk assessment exercise to evaluate potential risks from the use of nanomaterials in drug products.

    PubMed

    Cruz, Celia N; Tyner, Katherine M; Velazquez, Lydia; Hyams, Kenneth C; Jacobs, Abigail; Shaw, Arthur B; Jiang, Wenlei; Lionberger, Robert; Hinderling, Peter; Kong, Yoon; Brown, Paul C; Ghosh, Tapash; Strasinger, Caroline; Suarez-Sharp, Sandra; Henry, Don; Van Uitert, Maat; Sadrieh, Nakissa; Morefield, Elaine

    2013-07-01

    The Nanotechnology Risk Assessment Working Group in the Center for Drug Evaluation and Research (CDER) within the United States Food and Drug Administration was established to assess the possible impact of nanotechnology on drug products. The group is in the process of performing risk assessment and management exercises. The task of the working group is to identify areas where CDER may need to optimize its review practices and to develop standards to ensure review consistency for drug applications that may involve the application of nanotechnology. The working group already performed risk management exercises evaluating the potential risks from administering nanomaterial active pharmaceutical ingredients (API) or nanomaterial excipients by various routes of administration. This publication outlines the risk assessment and management process used by the working group, using nanomaterial API by the oral route of administration as an example.

  2. Evaluation of Enhanced Risk Monitors for Use on Advanced Reactors

    SciTech Connect

    Ramuhalli, Pradeep; Veeramany, Arun; Bonebrake, Christopher A.; Ivans, William J.; Coles, Garill A.; Hirt, Evelyn H.

    2016-09-26

    This study provides an overview of the methodology for integrating time-dependent failure probabilities into nuclear power reactor risk monitors. This prototypic enhanced risk monitor (ERM) methodology was evaluated using a hypothetical probabilistic risk assessment (PRA) model, generated using a simplified design of a liquid-metal-cooled advanced reactor (AR). Component failure data from industry compilations of failures of components similar to those in the simplified AR model were used to initialize the PRA model. Core damage frequency (CDF) over time was computed and analyzed. In addition, a study on alternative risk metrics for ARs was conducted. Risk metrics that quantify the normalized cost of repairs, replacements, or other operations and management (O&M) actions were defined and used, along with an economic model, to compute the likely economic risk of future actions such as deferred maintenance, based on the anticipated change in CDF due to current component condition and future anticipated degradation. Such integration of conventional-risk metrics with alternate-risk metrics provides a convenient mechanism for assessing the impact of O&M decisions on the safety and economics of the plant. It is expected that, when integrated with supervisory control algorithms, such integrated-risk monitors will provide a mechanism for real-time control decision-making that ensures safety margins are maintained while operating the plant in an economically viable manner.

  3. A Risk-Based Approach to Test and Evaluation

    DTIC Science & Technology

    2012-05-01

    is it to occur (probability, frequency), and what will be the outcome (consequences)? The SAPHIRE software tool also is introduced as a way to...develop those risk concepts dealing with event trees, fault trees, and desired end states. SAPHIRE is a probabilistic risk, and reliability assessment...software tool. SAPHIRE stands for Systems Analysis Programs for Hands-on Integrated Reliability Evaluations and was developed for the U.S. Nuclear

  4. Quantitative microbial risk assessment for Escherichia coli O157 on lettuce, based on survival data from controlled studies in a climate chamber.

    PubMed

    Ottoson, Jakob R; Nyberg, Karin; Lindqvist, Roland; Albihn, Ann

    2011-12-01

    The aims of the study were to determine the survival of Escherichia coli O157 on lettuce as a function of temperature and light intensity, and to use that information in a screening-level quantitative microbial risk assessment (QMRA) in order to evaluate risk-reducing strategies including irrigation water quality guidelines, rinsing, and holding time between last irrigation and harvest. Iceberg lettuce was grown in a climate chamber and inoculated with E. coli O157. Bacterial numbers were determined with the standard plate count method after inoculation and 1, 2, 4, and 7 day(s) postinoculation. The experiments were carried out at 11, 18, and 25°C in light intensities of 0, 400, and 600 mmol m^-2 s^-1. There was a significant effect of temperature and light intensity on survival, with fewer bacteria isolated from lettuce incubated at 25 and 18°C compared with 11°C (P < 0.0001), and in light intensities of 400 and 600 mmol m^-2 s^-1 compared with 0 mmol m^-2 s^-1 (P < 0.001). The average log reductions after 1, 2, 4, and 7 day(s) were 1.14, 1.71, 2.04, and 3.0, respectively. The QMRA compared the relative risk of lettuce consumption across 20 scenarios. A stricter water quality guideline gave a mean fivefold risk reduction. Holding times of 1, 2, 4, and 7 day(s) reduced the risk 3, 8, 8, and 18 times, respectively, compared with harvest on the same day as the last irrigation. Finally, rinsing lettuce for 15 s in cold tap water prior to consumption gave a sixfold risk reduction compared with eating unrinsed lettuce. Sensitivity analyses indicated that variation in bacterial inactivation had the most significant effect on the risk outcome. A QMRA determining the relative risks between scenarios reduces uncertainty and can provide risk managers with decision support.
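
    To make the scenario comparison concrete, the sketch below (Python) turns the risk-reduction factors reported in the abstract into relative risks against a baseline of same-day harvest of unrinsed lettuce under the baseline water guideline; combining the interventions multiplicatively is our assumption, not the authors' model.

        # Screening-level relative-risk sketch; factors are from the abstract,
        # their multiplicative combination is an assumption.
        HOLDING_RISK_REDUCTION = {0: 1, 1: 3, 2: 8, 4: 8, 7: 18}  # days -> fold
        RINSE_REDUCTION = 6     # 15 s cold tap-water rinse
        WATER_GUIDELINE = 5     # stricter irrigation-water quality guideline

        def relative_risk(holding_days=0, rinsed=False, strict_water=False):
            """Risk relative to same-day harvest, unrinsed, baseline water."""
            rr = 1.0 / HOLDING_RISK_REDUCTION[holding_days]
            if rinsed:
                rr /= RINSE_REDUCTION
            if strict_water:
                rr /= WATER_GUIDELINE
            return rr

        # 7-day holding plus rinsing: 18 * 6 = 108-fold lower relative risk
        print(relative_risk(holding_days=7, rinsed=True))  # ~0.0093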

  5. Multiple component quantitative analysis for the pattern recognition and quality evaluation of Kalopanacis Cortex using HPLC.

    PubMed

    Men, Chu Van; Jang, Yu Seon; Lee, Kwan Jun; Lee, Jae Hyun; Quang, Tran Hong; Long, Nguyen Van; Luong, Hoang Van; Kim, Young Ho; Kang, Jong Seong

    2011-12-01

    Quantitative and pattern-recognition analyses were conducted for the quality evaluation of Kalopanacis Cortex (KC) using HPLC. For the quantitative analysis, four bioactive compounds, liriodendrin, pinoresinol O-β-D-glucopyranoside, acanthoside B and kalopanaxin B, were determined. The analysis method was optimized and validated using an ODS column with a mobile phase of methanol and aqueous phosphoric acid. The validation gave acceptable linearities (r > 0.9995), recoveries (98.4% to 101.9%) and precisions (RSD < 2.20). The limits of detection of the compounds ranged from 0.4 to 0.9 μg/mL. Among the four compounds, liriodendrin was recommended as a marker compound for the quality control of KC. The pattern analysis was successfully carried out by analyzing thirty-two samples from four species, and the authentic KC samples were completely discriminated from other inauthentic species by linear discriminant analysis. The results indicated that the method was suitable for the quantitative analysis of liriodendrin and the quality evaluation of KC.

  6. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective.

  7. Evaluation of the predictability of real-time crash risk models.

    PubMed

    Xu, Chengcheng; Liu, Pan; Wang, Wei

    2016-09-01

    The primary objective of the present study was to investigate the predictability of crash risk models that were developed using high-resolution real-time traffic data. More specifically, the present study sought answers to the following questions: (a) how to evaluate the predictability of a real-time crash risk model; and (b) how to improve the predictability of a real-time crash risk model. The predictability is defined as the crash probability given the crash precursor identified by the crash risk model. An equation was derived based on Bayes' theorem for approximately estimating the predictability of crash risk models. The estimated predictability was then used to quantitatively evaluate the effects of the threshold of crash precursors, the matched and unmatched case-control designs, and the control-to-case ratio on the predictability of crash risk models. It was found that: (a) the predictability of a crash risk model can be measured as the product of the prior crash probability and the ratio between sensitivity and false alarm rate; (b) there is a trade-off between the predictability and sensitivity of a real-time crash risk model; (c) for a given level of sensitivity, the predictability of a crash risk model developed using an unmatched case-control sample is always better than that of a model developed using a matched case-control sample; and (d) when the control-to-case ratio is beyond 4:1, a further increase in the control-to-case ratio does not lead to clear improvements in predictability.
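
    The measure in finding (a) follows from Bayes' theorem; a minimal sketch (Python, illustrative numbers only) showing the exact posterior alongside the paper's small-prior approximation:

        def predictability_exact(prior, sensitivity, false_alarm_rate):
            """P(crash | precursor flagged), by Bayes' theorem."""
            hit = sensitivity * prior
            return hit / (hit + false_alarm_rate * (1.0 - prior))

        def predictability_approx(prior, sensitivity, false_alarm_rate):
            """Finding (a): prior crash probability times the ratio of
            sensitivity to false alarm rate; valid when the prior is small."""
            return prior * sensitivity / false_alarm_rate

        # Illustrative inputs: 1-in-10,000 prior, 60% sensitivity, 20% FAR
        print(predictability_exact(1e-4, 0.60, 0.20))   # ~3.0e-4
        print(predictability_approx(1e-4, 0.60, 0.20))  # 3.0e-4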

  8. Perception of risks from electromagnetic fields: A psychometric evaluation of a risk-communication approach

    SciTech Connect

    MacGregor, D.G.; Slovic, P. ); Morgan, M.G. )

    1994-10-01

    Potential health risks from exposure to power-frequency electromagnetic fields (EMF) have become an issue of significant public concern. This study evaluates a brochure designed to communicate EMF health risks from a scientific perspective. The study utilized a pretest-posttest design in which respondents judged various sources of EMF (and other) health and safety risks, both before reading the brochure and after. Respondents assessed risks on dimensions similar to those utilized in previous studies of risk perception. In addition, detailed ratings were made that probed respondents' beliefs about the possible causal effects of EMF exposure. The findings suggest that naive beliefs about the potential of EMF exposure to cause harm were highly influenced by specific content elements of the brochure. The implications for using risk-communication approaches based on communicating scientific uncertainty are discussed. 19 refs., 1 fig., 11 tabs.

  9. Conceptual Model of Offshore Wind Environmental Risk Evaluation System

    SciTech Connect

    Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.

    2010-06-01

    In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.

  10. Studying the effects of POs and MCs on the Salmonella ALOP with a quantitative risk assessment model for beef production.

    PubMed

    Tuominen, Pirkko; Ranta, Jukka; Maijala, Riitta

    2007-08-15

    The Finnish Salmonella Control Programme and the special guarantees (SG) of import concerning Salmonella in the beef production chain were examined within the risk analysis framework. The appropriate level of protection (ALOP de facto, since not referred to as ALOP in regulation), performance objectives (PO), and microbiological criteria (MC) were identified along the beef production chain. A quantitative microbiological risk assessment (QMRA) model using the Bayesian probabilistic method was developed for the beef chain to evaluate the capability of different POs to contribute to the ALOP. The influence of SGs was studied as an intervention protecting Finnish consumers. The QMRA made it possible to translate an ALOP without a stated food safety objective (FSO) into POs when implemented for both ready-to-eat (RTE) and non-RTE products. According to the results, the Finnish ALOP de facto for beef, beef preparations and products (10 human Salmonella cases/100,000) was reached in all of the years 1996-2004. However, if the prevalence at the slaughter, domestic cut beef, and retail levels were to increase to the level of the POs set (maximum 1%), the ALOP de facto would be exceeded by a factor of roughly two. On the other hand, the zero tolerance applied to MCs would keep the true Salmonella prevalence at production steps with POs clearly below 1%, and the ALOP would then be achievable. The influence of SGs on the total exposure was so small (on average 0.1% added to the total prevalence of beef-derived foods at retail) that their relevance may be doubted given the current volume and Salmonella prevalence of beef-derived imports. On the other hand, a change in the import profile could increase the protective effect of the SGs. Although practical follow-up has to be carried out as apparent prevalences, the objectives and criteria should be estimated as true prevalences and incidences with quantified uncertainties in order to achieve a sound, transparent, science-based understanding of

  11. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  12. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

    Controversies concerning the protective efficacy of BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of BCG programs. Therefore, to eliminate the current controversy, an unconditional requirement is to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: (1) data for evaluation have to be taken from epidemiological trials free from sampling error; (2) since the morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimations; (3) only unbiased point estimates (dependent variables) may be used in valid models for hypothesis tests; and (4) in case of a rejected null hypothesis, the ranked estimates of the compared groups must be evaluated in a multiple-comparison model in order to diminish the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.

  13. Evaluation of the Quantitative Prediction of a Trend Reversal on the Japanese Stock Market in 1999

    NASA Astrophysics Data System (ADS)

    Johansen, Anders; Sornette, Didier

    In January 1999, the authors published a quantitative prediction that the Nikkei index should recover from its 14-year low in January 1999 and reach ~20 500 a year later. The purpose of the present paper is to evaluate the performance of this specific prediction as well as the underlying model: the forecast, performed at a time when the Nikkei was at its lowest (as we can now judge in hindsight), correctly captured the change of trend as well as the quantitative evolution of the Nikkei index since its inception. As the change of trend from sluggish to recovery was estimated quite unlikely by many observers at that time, a Bayesian analysis shows that a skeptical (resp. neutral) Bayesian sees prior belief in our model amplified into a posterior belief 19 times larger (resp. reaching the 95% level).
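
    The Bayesian claim can be reproduced from posterior odds = Bayes factor x prior odds; a Bayes factor of about 19 is consistent with both figures quoted (a roughly 19-fold amplification of a small skeptical prior, and a 95% posterior from a neutral prior of 0.5). The sketch below is our illustration, not the authors' computation.

        def posterior(prior, bayes_factor):
            """Posterior probability from posterior odds = BF * prior odds."""
            odds = bayes_factor * prior / (1.0 - prior)
            return odds / (1.0 + odds)

        BF = 19.0                                # assumed Bayes factor
        skeptic, neutral = 1e-3, 0.5
        print(posterior(skeptic, BF) / skeptic)  # ~18.7, i.e. ~19x the prior
        print(posterior(neutral, BF))            # 0.95, the 95% level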

  14. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852
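
    For reference, the internal-calibrant qHNMR purity relation commonly used for such assays can be written as a one-line function; the symbol convention (I = signal integral, N = number of protons behind the signal, M = molar mass, m = weighed mass, P = purity) and the worked numbers below are ours, not from the paper.

        def qhnmr_purity(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_a, m_cal, P_cal):
            """Analyte purity (mass fraction) against an internal calibrant:
            P_a = (I_a/I_cal)(N_cal/N_a)(M_a/M_cal)(m_cal/m_a) * P_cal."""
            return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) \
                   * (m_cal / m_a) * P_cal

        # Hypothetical run: 1H analyte signal vs 3H calibrant signal
        print(qhnmr_purity(I_a=0.934, I_cal=1.000, N_a=1, N_cal=3,
                           M_a=300.4, M_cal=180.2, m_a=10.0, m_cal=2.1,
                           P_cal=0.999))  # ~0.980, i.e. 98.0% pure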

  15. Possible roles for quantitative risk assessment (QRA) in the field of health care and health care regulation

    NASA Astrophysics Data System (ADS)

    Garrick, B. John; Dykes, Andrew A.; Kaplan, Stan

    1995-10-01

    The discipline of risk assessment has developed primarily over the last two decades out of two principal public fears -- radiation and cancer. As a discipline, it has become an integral part of regulatory reform and management science. Three different cultures have been the primary practitioners of the risk analysis discipline: (1) engineers as applied to engineered systems, (2) health scientists primarily in relation to environmental impacts, and (3) social scientists with respect to public participation and societal threats. The engineers and physical scientists have been the most active in developing the discipline of probabilistic risk assessment -- or, as it is increasingly referred to, quantitative risk assessment (QRA). It is QRA that has been our field of interest for the past two decades. The purpose of this summary is to define QRA, suggest a possible role in the health care field, and to make some observations about QRA, health care, and government regulations.

  16. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    PubMed

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    The use of process simulators can contribute to quantitative risk assessment (QRA) by minimizing expert time and the large volume of data required, and is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation and QRA to investigate the acceptability of a new technology of a Methanol Production Plant in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of the consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and social risks. Compared to existing tolerance levels, the risks were considered tolerable in nominal conditions of operation of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant.

  17. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers--specific application to Listeria monocytogenes and ready-to-eat meat products.

    PubMed

    Mataragas, M; Zwietering, M H; Skandamis, P N; Drosinos, E H

    2010-07-31

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support the growth of pathogens (high initial pH=6.2-6.4 and water activity=0.98-0.99) and have a relatively long period of storage at chilled temperatures, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in case of post-process contamination, even with a low number of cells, the microorganism is able to reach unacceptable levels at the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk posed by the presence of L. monocytogenes in RTE meat products. This may help risk managers to make decisions and apply control measures with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk management strategies based on the results obtained from the QMRA model specifically developed for this pathogen/food product combination.
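
    The core computational step in such a shelf-life assessment is growth between contamination and consumption; a minimal sketch with a capped log-linear growth model, using placeholder kinetic values rather than the study's fitted parameters:

        def log10_count_at_consumption(log_n0, mu, t_days, log_nmax):
            """Contamination level (log10 CFU/g) after t_days of chilled
            storage; mu is the growth rate in log10 CFU/g per day, capped
            at the maximum population density log_nmax."""
            return min(log_n0 + mu * t_days, log_nmax)

        # Placeholder values: low post-process contamination, 60-day shelf life
        print(log10_count_at_consumption(log_n0=-1.0, mu=0.12,
                                         t_days=60, log_nmax=8.0))  # 6.2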

  18. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska.

    PubMed

    Harwell, Mark A; Gentile, John H; Johnson, Charles B; Garshelis, David L; Parker, Keith R

    2010-07-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001-2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI.
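
    Computationally, steps (d) and (e) reduce to drawing a large sample of simulated doses, extracting the upper quantile, and comparing it with a chronic toxicity reference value (TRV); a schematic version with a lognormal stand-in for the paper's exposure model and invented parameter values:

        import numpy as np

        rng = np.random.default_rng(seed=1)
        # Stand-in exposure model: lognormal daily PAH dose (mg/kg-day)
        doses = rng.lognormal(mean=-8.0, sigma=1.2, size=500_000)

        q999 = np.quantile(doses, 0.999)  # 99.9% most-exposed individuals
        TRV = 0.05                        # assumed chronic reference value
        hazard_quotient = q999 / TRV      # < 1: no health effect expected
        print(q999, hazard_quotient)      # ~0.014, ~0.27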

  19. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194

  20. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
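
    In an event-specific assay of this kind, the GMO amount follows from the ratio of event-specific to taxon-specific (endogenous reference gene) copy numbers divided by the conversion factor; a minimal sketch using the Cf of 1.24 reported above, with hypothetical copy numbers:

        def gmo_percent(event_copies, reference_copies, cf=1.24):
            """GMO content (%): (event/reference copy ratio) / Cf * 100."""
            return (event_copies / reference_copies) / cf * 100.0

        # Hypothetical qPCR result near the 0.5% limit of quantitation
        print(gmo_percent(event_copies=62, reference_copies=10_000))  # 0.5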

  1. Assembling of Fluid Filtration System for Quantitative Evaluation of Microleakage in Dental Materials

    PubMed Central

    Javidi, Maryam; Naghavi, Neda; Roohani, Ehsan

    2008-01-01

    INTRODUCTION: There are several methods for evaluating microleakage in dentistry, for example dye or bacterial leakage, electro-chemical methods, radioisotope labeling and fluid filtration. The purpose of this study was to assemble a fluid filtration system for the quantitative evaluation of microleakage in dental materials. MATERIALS AND METHODS: The roots were connected to a water-filled tube attached to a pressurized water supply. An air bubble was introduced into the water to measure endodontic leakage. A digital camera and professional software were utilized to record and measure the bubble displacement. RESULTS: Our system was constructed successfully and functioned correctly. CONCLUSION: In this pilot study we found this system efficient for the evaluation of microleakage of dental materials. PMID:24146673
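
    The quantity such a system reports is simply the fluid volume swept by the bubble per unit time in a capillary of known bore; a sketch with hypothetical dimensions (the abstract does not give the tube geometry):

        import math

        def microleakage_ul_per_min(bore_radius_mm, displacement_mm, minutes):
            """Flow rate from bubble travel: volume = cross-section x distance;
            1 mm^3 equals 1 microliter."""
            return math.pi * bore_radius_mm ** 2 * displacement_mm / minutes

        # Hypothetical reading: 0.55 mm bore radius, 2.4 mm travel in 8 min
        print(microleakage_ul_per_min(0.55, 2.4, 8.0))  # ~0.29 uL/min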

  2. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder.

    PubMed

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-18

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7-11 years (27 males, six females) and twenty-five adult participants aged 21-29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD tends to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD.

  3. Genetic algorithm based image binarization approach and its quantitative evaluation via pooling

    NASA Astrophysics Data System (ADS)

    Hu, Huijun; Liu, Ya; Liu, Maofu

    2015-12-01

    The binarized image is critical to image visual feature extraction, especially shape features, and image binarization approaches have attracted increasing attention in the past decades. In this paper, a genetic algorithm is applied to optimize the binarization threshold of strip steel defect images. In order to evaluate our genetic algorithm based image binarization approach quantitatively, we propose a novel pooling-based evaluation metric, motivated by the information retrieval community, to avoid the lack of ground-truth binary images. Experimental results show that our genetic algorithm based binarization approach is effective and efficient on strip steel defect images, and that our quantitative evaluation metric for image binarization via pooling is also feasible and practical.
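
    The abstract does not state the fitness function, so the sketch below uses Otsu's between-class variance as a stand-in objective; the GA machinery (truncation selection, midpoint crossover, jitter mutation) is likewise illustrative rather than the authors' design.

        import random

        def between_class_variance(hist, t):
            """Otsu's criterion for threshold t over a 256-bin histogram."""
            w0, w1 = sum(hist[:t]), sum(hist[t:])
            if w0 == 0 or w1 == 0:
                return 0.0
            mu0 = sum(i * h for i, h in enumerate(hist[:t])) / w0
            mu1 = sum(i * h for i, h in enumerate(hist[t:], start=t)) / w1
            return w0 * w1 * (mu0 - mu1) ** 2

        def ga_threshold(hist, pop_size=20, generations=50, p_mut=0.2):
            pop = [random.randint(1, 255) for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=lambda t: between_class_variance(hist, t),
                         reverse=True)
                parents = pop[:pop_size // 2]          # truncation selection
                children = []
                while len(parents) + len(children) < pop_size:
                    a, b = random.sample(parents, 2)
                    child = (a + b) // 2               # midpoint crossover
                    if random.random() < p_mut:        # jitter mutation
                        child = min(255, max(1, child + random.randint(-10, 10)))
                    children.append(child)
                pop = parents + children
            return max(pop, key=lambda t: between_class_variance(hist, t))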

  4. Clinical evaluator reliability for quantitative and manual muscle testing measures of strength in children.

    PubMed

    Escolar, D M; Henricson, E K; Mayhew, J; Florence, J; Leshner, R; Patel, K M; Clemens, P R

    2001-06-01

    Measurements of muscle strength in clinical trials of Duchenne muscular dystrophy have relied heavily on manual muscle testing (MMT). The high level of intra- and interrater variability of MMT compromises clinical study results. We compared the reliability of 12 clinical evaluators in performing MMT and quantitative muscle testing (QMT) on 12 children with muscular dystrophy. QMT was reliable, with an intraclass correlation coefficient (ICC) of >0.9 for biceps and grip strength, and >0.8 for quadriceps strength. Training of both subjects and evaluators was easily accomplished. MMT was not as reliable, and required repeated training of evaluators to bring all groups to an ICC >0.75 for shoulder abduction, elbow and hip flexion, knee extension, and ankle dorsiflexion. We conclude that QMT shows greater reliability and is easier to implement than MMT. Consequently, QMT will be a superior measure of strength for use in pediatric, neuromuscular, multicenter clinical trials.
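
    For reference, the reliability statistic can be computed directly from a subjects-by-raters matrix; a minimal sketch of the two-way random-effects, absolute-agreement, single-rater form ICC(2,1) (the abstract does not say which ICC variant was used, so this choice is an assumption):

        import numpy as np

        def icc_2_1(ratings):
            """ICC(2,1) from an (n subjects x k raters) matrix: two-way
            random effects, absolute agreement, single rater."""
            x = np.asarray(ratings, dtype=float)
            n, k = x.shape
            grand = x.mean()
            ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            resid = (x - x.mean(axis=1, keepdims=True)
                       - x.mean(axis=0, keepdims=True) + grand)
            ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e
                                    + k * (ms_c - ms_e) / n)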

  5. Quantitative risk assessment of listeriosis due to consumption of raw milk.

    PubMed

    Latorre, Alejandra A; Pradhan, Abani K; Van Kessel, Jo Ann S; Karns, Jeffrey S; Boor, Kathryn J; Rice, Daniel H; Mangione, Kurt J; Gröhn, Yrjo T; Schukken, Ynte H

    2011-08-01

    The objectives of this study were to estimate the risk of illness for raw milk consumers due to Listeria monocytogenes in raw milk sold by permitted dealers, and the risk for people on farms who consume raw milk. Three scenarios were evaluated for raw milk sold by dealers: raw milk purchased directly from bulk tanks, from on-farm stores, and from retail. To assess the effect of mandatory testing of raw milk by regulatory agencies, the number of listeriosis cases per year was compared where no raw milk testing was done, only a screening test to issue a permit was conducted, and routine testing was conducted and milk was recalled if it was L. monocytogenes positive. The median number of listeriosis cases associated with consumption of raw milk from bulk tanks, farm stores, and retail for an intermediate-age population was 6.6 × 10^-7, 3.8 × 10^-5, and 5.1 × 10^-5 cases per year, respectively. In populations with high susceptibility, the estimated median number of cases per year was 2.7 × 10^-7 (perinatal, i.e., pregnant women and their fetuses or newborns) and 1.4 × 10^-6 (elderly) for milk purchased from bulk tanks, 1.5 × 10^-5 (perinatal) and 7.8 × 10^-5 (elderly) for milk from farm stores, and 2.1 × 10^-5 (perinatal) and 1.0 × 10^-4 (elderly) for milk from retail. For raw milk consumed on farms, the median number of listeriosis cases was 1.4 × 10^-7 cases per year. A greater risk of listeriosis was associated with consumption of raw milk obtained from retail and farm stores as compared with milk obtained from bulk tanks. This was likely due to additional time-temperature combination steps in the retail and farm store models, which increased the chances for growth of L. monocytogenes in raw milk. A close relationship between the prevalence of L. monocytogenes in raw milk and the values of disease incidence was observed. Hence, a reduction in the number of cases per year in all populations was observed when a raw milk-testing program was in place

  6. Risk evaluation of liquefaction on the site of Damien (Haiti)

    NASA Astrophysics Data System (ADS)

    Jean, B. J.; Boisson, D.; Thimus, J.; Schroeder, C.

    2013-12-01

    Under the proposed relocation of all faculties to the campus of Damien, owned by Université d'Etat d'Haïti (UEH), the Unité de Recherche en Géotechnique (URGéo) of the Faculté des Sciences (FDS) of UEH conducted several operations whose objective was to evaluate the risk of liquefaction on this site. This abstract presents, in a comprehensive and coherent manner, the entire process of assessing the risk of liquefaction. This evaluation was conducted mainly from seismic techniques, laboratory tests and the response of a one-dimensional soil column. We then summarize the results of this evaluation across the various techniques through synthetic map interpretations of MASW 1D and H/V surveys, together with on-site measurements of the response to seismic loading from the SPT test applied to the evaluation of liquefaction potential.

  7. Quantitative risk assessment from farm to fork and beyond: a global Bayesian approach concerning food-borne diseases.

    PubMed

    Albert, Isabelle; Grenier, Emmanuel; Denis, Jean-Baptiste; Rousseau, Judith

    2008-04-01

    A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first by constructing a stochastic core model via a Bayesian network based on expert knowledge, and second, using the data available to improve this knowledge. Unlike the Monte Carlo simulation approach as commonly used in quantitative assessment of food-borne risks where data sets are used independently in each module, our consistent procedure incorporates information conveyed by data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, the expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the method proposed, which demonstrates its ability to deal with quite complex situations and provides a useful basis for further discussions with different experts in the food chain.

  8. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating, as estimated based on published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities by the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated based on the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for
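
    Per portion, the chain described above (raw bradyzoite load, processing reductions, dose-response) can be sketched as follows; the exponential dose-response form and all numeric values are placeholders, not the study's fitted mouse model.

        import math

        def p_infection(bradyzoites_raw, reduction_factors, r):
            """Infection probability from one processed portion: the raw
            load is divided by each processing reduction factor, then fed
            into an exponential dose-response P = 1 - exp(-r * dose)."""
            dose = bradyzoites_raw
            for rf in reduction_factors:  # e.g. salting, freezing, heating
                dose /= rf
            return 1.0 - math.exp(-r * dose)

        # Placeholder portion: 100 bradyzoites, heating gives a 100x reduction
        print(p_infection(100.0, reduction_factors=[1.0, 1.0, 100.0],
                          r=0.05))  # ~0.049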

  9. Quantitative Risk Assessment of CO2 Sequestration in a Commercial-Scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and economic opportunities of associated oil production. Probably the most significant source of uncertainty in CO2 storage forecasts is heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After performing stepwise regression analysis, a series of response surface models of the output variables at each step were constructed and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for the FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at the 95th percentile after 5 years are about 3.65 times, and 1
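
    The response-surface step amounts to regressing the simulator outputs on a polynomial basis in the four uncertain inputs, then sampling the cheap proxy to build the CDFs; a compact sketch (the full quadratic basis is our assumption, as the abstract states only that stepwise regression was used):

        import numpy as np

        def quadratic_basis(X):
            """Design matrix: intercept, linear, squared and pairwise terms."""
            n, d = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(d)]
            cols += [X[:, i] ** 2 for i in range(d)]
            cols += [X[:, i] * X[:, j]
                     for i in range(d) for j in range(i + 1, d)]
            return np.column_stack(cols)

        # Fit the proxy from the 25 runs (X_runs: 25 x 4; y_runs: 25,):
        #   beta, *_ = np.linalg.lstsq(quadratic_basis(X_runs), y_runs,
        #                              rcond=None)
        # Propagate Monte Carlo input samples X_mc through the proxy and
        # read the empirical CDF off np.sort(quadratic_basis(X_mc) @ beta).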

  10. A quantitative approach to evaluate image quality of whole slide imaging scanners

    PubMed Central

    Shrestha, Prarthana; Kneepkens, R.; Vrijnsen, J.; Vossen, D.; Abels, E.; Hulsken, B.

    2016-01-01

    Context: The quality of images produced by whole slide imaging (WSI) scanners has a direct influence on the readers’ performance and reliability of the clinical diagnosis. Therefore, WSI scanners should produce not only high quality but also consistent quality images. Aim: We aim to evaluate reproducibility of WSI scanners based on the quality of images produced over time and among multiple scanners. The evaluation is independent of content or context of test specimen. Methods: The ultimate judge of image quality is a pathologist; however, subjective evaluations are heavily influenced by the complexity of a case, and subtle variations introduced by a scanner can be easily overlooked. Therefore, we employed a quantitative image quality assessment method based on clinically relevant parameters, such as sharpness and brightness, acquired in a survey of pathologists. The acceptable level of quality per parameter was determined in a subjective study. The evaluation of scanner reproducibility was conducted with Philips Ultra-Fast Scanners. A set of 36 HercepTest™ slides was used in three sub-studies addressing variations due to systems and time, producing 8640 test images for evaluation. Results: The results showed that the majority of images in all the sub-studies are within the acceptable quality level; however, some scanners produce higher quality images more often than others. The results are independent of case types, and they match our perception of quality. Conclusion: The quantitative image quality assessment method was successfully applied in the HercepTest™ slides to evaluate WSI scanner reproducibility. The proposed method is generic and applicable to any other types of slide stains and scanners. PMID:28197359

  11. EQUIFAT: A novel scoring system for the semi-quantitative evaluation of regional adipose tissues in Equidae

    PubMed Central

    Morrison, Philippa K.; Harris, Patricia A.; Maltin, Charlotte A.; Grove-White, Dai; Argo, Caroline McG.

    2017-01-01

    Anatomically distinct adipose tissues represent variable risks to metabolic health in man and some other mammals. Quantitative imaging of internal adipose depots is problematic in large animals and associations between regional adiposity and health are poorly understood. This study aimed to develop and test a semi-quantitative system (EQUIFAT) which could be applied to regional adipose tissues. Anatomically-defined, photographic images of adipose depots (omental, mesenteric, epicardial, rump) were collected from 38 animals immediately post-mortem. Images were ranked and depot-specific descriptors were developed (1 = no fat visible; 5 = excessive fat present). Nuchal-crest and ventro-abdominal-retroperitoneal adipose depot depths (cm) were transformed to categorical 5 point scores. The repeatability and reliability of EQUIFAT was independently tested by 24 observers. When half scores were permitted, inter-observer agreement was substantial (average κw: mesenteric, 0.79; omental, 0.79; rump, 0.61) or moderate (average κw: epicardial, 0.60). Intra-observer repeatability was tested by 8 observers on 2 occasions. Kappa analysis indicated perfect (omental and mesenteric) and substantial agreement (epicardial and rump) between attempts. A further 207 animals were evaluated ante-mortem (age, height, breed-type, gender, body condition score [BCS]) and again immediately post-mortem (EQUIFAT scores, carcass weight). Multivariable, random effect linear regression models were fitted (breed as random effect; BCS as outcome variable). Only height, carcass weight, omental and retroperitoneal EQUIFAT scores remained as explanatory variables in the final model. The EQUIFAT scores developed here demonstrate clear functional differences between regional adipose depots and future studies could be directed towards describing associations between adiposity and disease risk in surgical and post-mortem situations. PMID:28296956

  12. Evaluating Immunogenicity Risk Due to Host Cell Protein Impurities in Antibody-Based Biotherapeutics.

    PubMed

    Jawa, Vibha; Joubert, Marisa K; Zhang, Qingchun; Deshpande, Meghana; Hapuarachchi, Suminda; Hall, Michael P; Flynn, Gregory C

    2016-11-01

    A potential risk factor for immunogenicity of a biotherapeutic is the low levels of host cell protein (HCP) impurities that remain in the product following the purification process. During process development, significant attention has been devoted to removing HCPs due to their potential safety risk. Samples from different purification steps of several monoclonal antibodies (mAbs) purified by one type of platform were evaluated for their residual Chinese Hamster Ovary (CHO) cell-derived HCP content. HCPs in both in-process (high levels of HCP) and highly purified (low levels of HCP) samples were identified and quantitated by proteomic analysis via mass spectrometry. The responses to HCPs were evaluated in an in vitro assay using PBMC from a population of healthy and disease state individuals. Results indicated that samples with up to 4000 ppm HCP content (levels 200 times greater than the drug substance) did not pose a higher immunogenicity risk than highly purified mAb samples. As an orthogonal method to predict immunogenicity risk, in silico algorithms that probe amino acid sequence for foreign epitope content were used to evaluate over 20 common HCPs (identified in the different mAb samples). Only a few HCPs were identified as high risk by the algorithms; however, the in vitro assay results indicated that the concentration of these HCPs from in-process biotherapeutic mAb samples was not sufficient to stimulate an immune response. This suggests that high levels of HCP in mAb biotherapeutics purified by this type of platform do not increase the potential risk of immunogenicity of these molecules. Insights from these studies can be applied to HCP control and risk assessment strategies.

  13. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical

  14. Toward objective and quantitative evaluation of imaging systems using images of phantoms.

    PubMed

    Gagne, Robert M; Gallas, Brandon D; Myers, Kyle J

    2006-01-01

    The use of imaging phantoms is a common method of evaluating image quality in the clinical setting. These evaluations rely on a subjective decision by a human observer with respect to the faintest detectable signal(s) in the image. Because of the variable and subjective nature of the human-observer scores, the evaluations manifest a lack of precision and a potential for bias. The advent of digital imaging systems with their inherent digital data provides the opportunity to use techniques that do not rely on human-observer decisions and thresholds. Using the digital data, signal-detection theory (SDT) provides the basis for more objective and quantitative evaluations which are independent of a human-observer decision threshold. In a SDT framework, the evaluation of imaging phantoms represents a "signal-known-exactly/background-known-exactly" ("SKE/BKE") detection task. In this study, we compute the performance of prewhitening and nonprewhitening model observers in terms of the observer signal-to-noise ratio (SNR) for these "SKE/BKE" tasks. We apply the evaluation methods to a number of imaging systems. For example, we use data from a laboratory implementation of digital radiography and from a full-field digital mammography system in a clinical setting. In addition, we make a comparison of our methods to human-observer scoring of a set of digital images of the CDMAM phantom available from the internet (EUREF-European Reference Organization). In the latter case, we show a significant increase in the precision of the quantitative methods versus the variability in the scores from human observers on the same set of images. As regards bias, the performance of a model observer estimated from a finite data set is known to be biased. In this study, we minimize the bias and estimate the variance of the observer SNR using statistical resampling techniques, namely, "bootstrapping" and "shuffling" of the data sets. Our methods provide objective and quantitative evaluation of
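
    For an SKE/BKE task the two model observers named above have closed-form SNRs in terms of the signal vector s and the noise covariance K; a minimal sketch of the standard formulas (the toy data shapes are illustrative):

        import numpy as np

        def pw_snr(s, K):
            """Prewhitening (ideal linear) observer: SNR^2 = s^T K^-1 s."""
            return float(np.sqrt(s @ np.linalg.solve(K, s)))

        def npw_snr(s, K):
            """Nonprewhitening observer (template = signal):
            SNR = (s^T s) / sqrt(s^T K s)."""
            return float((s @ s) / np.sqrt(s @ K @ s))

        # Toy example: 3-pixel signal in white noise of variance 0.25;
        # the two observers coincide when the noise is white.
        s = np.array([0.2, 0.5, 0.2])
        K = 0.25 * np.eye(3)
        print(pw_snr(s, K), npw_snr(s, K))  # ~1.149 for both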

  15. Experimental approaches for evaluating the invasion risk of biofuel crops

    NASA Astrophysics Data System (ADS)

    Flory, S. Luke; Lorentz, Kimberly A.; Gordon, Doria R.; Sollenberger, Lynn E.

    2012-12-01

    There is growing concern that non-native plants cultivated for bioenergy production might escape and result in harmful invasions in natural areas. Literature-derived assessment tools used to evaluate invasion risk are beneficial for screening, but cannot be used to assess novel cultivars or genotypes. Experimental approaches are needed to help quantify invasion risk but protocols for such tools are lacking. We review current methods for evaluating invasion risk and make recommendations for incremental tests from small-scale experiments to widespread, controlled introductions. First, local experiments should be performed to identify conditions that are favorable for germination, survival, and growth of candidate biofuel crops. Subsequently, experimental introductions in semi-natural areas can be used to assess factors important for establishment and performance such as disturbance, founder population size, and timing of introduction across variable habitats. Finally, to fully characterize invasion risk, experimental introductions should be conducted across the expected geographic range of cultivation over multiple years. Any field-based testing should be accompanied by safeguards and monitoring for early detection of spread. Despite the costs of conducting experimental tests of invasion risk, empirical screening will greatly improve our ability to determine if the benefits of a proposed biofuel species outweigh the projected risks of invasions.

  16. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  17. Quantitative evaluation of registration methods for atlas-based diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Wu, Xue; Eggebrecht, Adam T.; Culver, Joseph P.; Zhan, Yuxuan; Basevi, Hector; Dehghani, Hamid

    2013-06-01

    In Diffuse Optical Tomography (DOT), an atlas-based model can be used as an alternative to a subject-specific anatomical model for recovery of brain activity. The main step in generating an atlas-based subject model is registering the atlas model to the subject's head; the accuracy of the DOT reconstruction then relies on the accuracy of the registration method. In this work, 11 registration methods are quantitatively evaluated. Registration using the EEG 10/20 system with 19 landmarks and a non-iterative point-to-point algorithm provides approximately 1.4 mm surface error and is considered the most efficient of the methods evaluated.
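
    A non-iterative point-to-point fit of paired landmarks is commonly computed with the closed-form Kabsch/Procrustes solution. The sketch below shows that construction under the assumption of a rigid (rotation-plus-translation) transform; the landmark coordinates are made up and merely stand in for the 19 EEG 10/20 points.

        import numpy as np

        def rigid_landmark_fit(src, dst):
            # Least-squares rigid transform (Kabsch) mapping src onto dst;
            # src, dst are (n, 3) arrays of paired landmark coordinates.
            cs, cd = src.mean(axis=0), dst.mean(axis=0)
            H = (src - cs).T @ (dst - cd)            # 3x3 covariance
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = cd - R @ cs
            return R, t

        # 19 made-up paired landmarks (units: mm)
        rng = np.random.default_rng(1)
        atlas = rng.normal(0.0, 50.0, (19, 3))
        true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
        if np.linalg.det(true_R) < 0:                # keep a proper rotation
            true_R[:, 0] *= -1
        subject = atlas @ true_R.T + np.array([2.0, -1.0, 3.0])

        R, t = rigid_landmark_fit(atlas, subject)
        err = np.linalg.norm(atlas @ R.T + t - subject, axis=1)
        print("mean landmark error (mm):", err.mean())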

  18. Quantitative evaluation of annular bright-field phase images in STEM.

    PubMed

    Ishida, Takafumi; Kawasaki, Tadahiro; Tanji, Takayoshi; Ikuta, Takashi

    2015-04-01

    A phase reconstruction method based on multiple scanning transmission electron microscope (STEM) images was evaluated quantitatively using image simulations. The simulation results indicated that the phase shift caused by a single atom is proportional to the 0.6th power of the atomic number Z. For a thin SrTiO3 [001] crystal, the reconstructed phase at each atomic column increased with specimen thickness. STEM phase images can quantify the oxygen vacancy concentration if the thickness is less than several nanometers.

  19. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera is performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes, and one choroidal atrophy eye are examined. Among the normal eyes, refractive error is found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences are observed between the normal and glaucoma eyes in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  20. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back-propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real-world credit applications from the Australian credit approval dataset. A comparison of system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, under the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in the automatic processing of credit applications.
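
    A minimal sketch of this kind of classifier uses scikit-learn's back-propagation-trained multilayer perceptron. The data below are random placeholders shaped like the Australian credit data (around 690 applications with 14 attributes); the hidden-layer sizes and learning settings are illustrative, not the paper's seven schemes.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler

        # synthetic stand-in for the credit data: 14 features, approve/reject
        rng = np.random.default_rng(2)
        X = rng.normal(size=(690, 14))
        y = (X[:, :3].sum(axis=1) + rng.normal(0, 0.5, 690) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, random_state=0)
        scaler = StandardScaler().fit(X_tr)

        # one hidden layer; a two-hidden-layer variant would use
        # hidden_layer_sizes=(20, 10)
        clf = MLPClassifier(hidden_layer_sizes=(20,), activation='logistic',
                            solver='sgd', learning_rate_init=0.01,
                            max_iter=2000, random_state=0)
        clf.fit(scaler.transform(X_tr), y_tr)
        print("test accuracy:", clf.score(scaler.transform(X_te), y_te))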

  1. A probabilistic quantitative microbial risk assessment model of norovirus disease burden from wastewater irrigation of vegetables in Shepparton, Australia.

    PubMed

    Mok, Hoi-Fei; Barker, S Fiona; Hamilton, Andrew J

    2014-05-01

    Wastewater can be an important resource for water-scarce regions of the world, but a major barrier to its use is the associated health risk. Quantitative microbial risk assessment (QMRA) is a probabilistic modeling technique used to determine the health risks from wastewater reuse, but only a handful of QMRA studies have examined the norovirus health risks from consumption of vegetables irrigated with human wastewater, even though norovirus is among the most significant microbial causes of diarrheal disease worldwide, if not the most significant. Furthermore, the majority of these studies have focused only on risks from lettuce consumption. To address this knowledge gap for other vegetables, a QMRA model was constructed for agricultural wastewater irrigation in the regional city of Shepparton, Australia, using fecal shedding rates to estimate the norovirus concentration in raw sewage. The annual norovirus disease burden was estimated for the consumption of lettuce, broccoli, cabbage, Asian vegetables, and cucumber after irrigation with treated wastewater. Results indicate that the waste stabilization pond treatment did not achieve sufficient virus removal to meet the World Health Organization (WHO) threshold for an acceptable level of risk for wastewater reuse, but the addition of disinfection treatments gave acceptable results for consumption of cucumber and broccoli. This is the first QMRA study to incorporate virus accumulation from previous wastewater irrigation events.
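
    The probabilistic core of such a model reduces to a short Monte Carlo simulation. Every number below is an illustrative placeholder rather than a study input, and the approximate beta-Poisson dose-response merely stands in for whatever model the authors used.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000                                # Monte Carlo iterations

        c_raw = 10 ** rng.normal(4.0, 1.0, n)      # norovirus/L in raw sewage
        log10_removal = rng.uniform(2.0, 4.0, n)   # treatment performance
        ml_on_produce = rng.uniform(0.1, 1.0, n)   # water retained per serving

        dose = c_raw * 10.0 ** (-log10_removal) * ml_on_produce / 1000.0
        alpha, beta = 0.04, 0.055                  # illustrative parameters
        p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)
        p_annual = 1.0 - (1.0 - p_inf) ** 150      # ~150 servings per year
        # a disease burden would follow as
        # p_annual * P(illness | infection) * DALYs per case
        print("median annual infection risk:", np.median(p_annual))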

  2. A quantitative assessment of the risk for highly pathogenic avian influenza introduction into Spain via legal trade of live poultry.

    PubMed

    Sánchez-Vizcaíno, Fernando; Perez, Andrés; Lainez, Manuel; Sánchez-Vizcaíno, José Manuel

    2010-05-01

    Highly pathogenic avian influenza (HPAI) is considered one of the most important diseases of poultry. During the last 9 years, HPAI epidemics have been reported in Asia, the Americas, Africa, and 18 countries of the European Union (EU). For that reason, the risk for HPAI virus (HPAIV) introduction into Spain may have recently increased. Because of the EU free-trade policy, and because legal trade of live poultry has been considered an important route for HPAI spread in certain regions of the world, there are fears that Spain may become HPAIV-infected as a consequence of the legal introduction of live poultry. However, no quantitative assessment of the risk for HPAIV introduction into Spain, or into any other EU member state, via the trade of poultry has been published in the peer-reviewed literature. This article presents the results of the first quantitative assessment of the risk for HPAIV introduction into a free country via legal trade of live poultry, along with estimates of the geographical variation of the risk and of the relative contributions of exporting countries and susceptible poultry species to the risk. The annual mean risk for HPAI introduction into Spain was estimated to be as low as 1.36 × 10(-3), suggesting that under prevailing conditions, introduction of HPAIV into Spain through the trade of live poultry is unlikely to occur. Moreover, these results support the hypothesis that legal trade of live poultry does not pose a significant risk for the spread of HPAI into EU member states.

  3. Extensions and evaluations of a general quantitative theory of forest structure and dynamics

    PubMed Central

    Enquist, Brian J.; West, Geoffrey B.; Brown, James H.

    2009-01-01

    Here, we present the second part of a quantitative theory for the structure and dynamics of forests under demographic and resource steady state. The theory is based on individual-level allometric scaling relations for how trees use resources, fill space, and grow. These scale up to determine emergent properties of diverse forests, including size–frequency distributions, spacing relations, canopy configurations, mortality rates, population dynamics, successional dynamics, and resource flux rates. The theory uniquely makes quantitative predictions for both stand-level scaling exponents and normalizations. We evaluate these predictions by compiling and analyzing macroecological datasets from several tropical forests. The close match between theoretical predictions and data suggests that forests are organized by a set of very general scaling rules. Our mechanistic theory is based on allometric scaling relations, is complementary to “demographic theory,” but is fundamentally different in approach. It provides a quantitative baseline for understanding deviations from predictions due to other factors, including disturbance, variation in branching architecture, asymmetric competition, resource limitation, and other sources of mortality, which are not included in the deliberately simplified theory. The theory should apply to a wide range of forests despite large differences in abiotic environment, species diversity, and taxonomic and functional composition. PMID:19363161

  4. Qualitative and quantitative evaluation of in vivo SD-OCT measurement of rat brain

    PubMed Central

    Xie, Yijing; Harsan, Laura-Adela; Bienert, Thomas; Kirch, Robert D.; von Elverfeldt, Dominik; Hofmann, Ulrich G.

    2017-01-01

    OCT has been demonstrated as an efficient imaging modality in various biomedical and clinical applications. However, there is a missing link with respect to the source of contrast between OCT and other modern imaging modalities; no quantitative comparison between them has yet been demonstrated. We evaluated, to our knowledge for the first time, in vivo OCT measurements of rat brain obtained with our previously proposed forward imaging method, correlating OCT both qualitatively and quantitatively with the corresponding T1-weighted and T2-weighted magnetic resonance images, a fiber density map (FDM), and two types of histological staining (cresyl violet and acetylcholinesterase, AChE). Brain anatomical structures were identified and compared across the OCT, MRI, and histology modalities. Noticeable resemblances corresponding to certain anatomical structures were found between OCT and the other image profiles. Correlation was quantitatively assessed by estimating the correlation coefficient (R) and mutual information (MI). Results show that the 1-D OCT measurements, in terms of both the intensity profile and the estimated attenuation factor, do not have a strong linear correlation with the other image modalities, as indicated by the correlation coefficient estimates. However, the mutual information analysis demonstrates markedly high MI values between OCT and MRI signals. PMID:28270970
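
    The different behaviour of the two dependence measures is easy to reproduce: in the sketch below, two toy profiles are linked nonlinearly, so the correlation coefficient is small while the histogram-estimated mutual information remains high. The profile shapes and bin count are arbitrary choices, not values from the paper.

        import numpy as np

        def mutual_information(x, y, bins=32):
            # Histogram estimate of MI (in bits) between two 1-D profiles.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        # toy depth profiles standing in for OCT and MRI measurements
        rng = np.random.default_rng(4)
        oct_profile = rng.normal(size=500).cumsum()
        mri_profile = np.abs(oct_profile) + rng.normal(0, 0.5, 500)

        r = np.corrcoef(oct_profile, mri_profile)[0, 1]    # linear only
        mi = mutual_information(oct_profile, mri_profile)  # any dependence
        print("R = %.2f, MI = %.2f bits" % (r, mi))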

  5. Developing and Evaluating a Cardiovascular Risk Reduction Project.

    ERIC Educational Resources Information Center

    Brownson, Ross C.; Mayer, Jeffrey P.; Dusseault, Patricia; Dabney, Sue; Wright, Kathleen; Jackson-Thompson, Jeannette; Malone, Bernard; Goodman, Robert

    1997-01-01

    Describes the development and baseline evaluation data from the Ozark Heart Health Project, a community-based cardiovascular disease risk reduction program in rural Missouri that targeted smoking, physical inactivity, and poor diet. Several Ozark counties participated in either intervention or control groups, and researchers conducted surveillance…

  6. Rape Prevention with College Men: Evaluating Risk Status

    ERIC Educational Resources Information Center

    Stephens, Kari A.; George, William H.

    2009-01-01

    This study evaluates the effectiveness of a theoretically based rape prevention intervention with college men who were at high or low risk to perpetrate sexually coercive behavior. Participants (N = 146) are randomly assigned to the intervention or control group. Outcomes include rape myth acceptance, victim empathy, attraction to sexual…

  7. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  8. Reanalysis of the DEMS Nested Case-Control Study of Lung Cancer and Diesel Exhaust: Suitability for Quantitative Risk Assessment

    PubMed Central

    Crump, Kenny S; Van Landingham, Cynthia; Moolgavkar, Suresh H; McClellan, Roger

    2015-01-01

    The International Agency for Research on Cancer (IARC) in 2012 upgraded its hazard characterization of diesel engine exhaust (DEE) to “carcinogenic to humans.” The Diesel Exhaust in Miners Study (DEMS) cohort and nested case-control studies of lung cancer mortality in eight U.S. nonmetal mines were influential in IARC’s determination. We conducted a reanalysis of the DEMS case-control data to evaluate its suitability for quantitative risk assessment (QRA). Our reanalysis used conditional logistic regression and adjusted for cigarette smoking in a manner similar to the original DEMS analysis. However, we included additional estimates of DEE exposure and adjustment for radon exposure. In addition to applying three DEE exposure estimates developed by DEMS, we applied six alternative estimates. Without adjusting for radon, our results were similar to those in the original DEMS analysis: all but one of the nine DEE exposure estimates showed evidence of an association between DEE exposure and lung cancer mortality, with trend slopes differing only by about a factor of two. When exposure to radon was adjusted, the evidence for a DEE effect was greatly diminished, but was still present in some analyses that utilized the three original DEMS DEE exposure estimates. A DEE effect was not observed when the six alternative DEE exposure estimates were utilized and radon was adjusted. No consistent evidence of a DEE effect was found among miners who worked only underground. This article highlights some issues that should be addressed in any use of the DEMS data in developing a QRA for DEE. PMID:25857246

  9. An evaluation of prospective motion correction (PMC) for high resolution quantitative MRI

    PubMed Central

    Callaghan, Martina F.; Josephs, Oliver; Herbst, Michael; Zaitsev, Maxim; Todd, Nick; Weiskopf, Nikolaus

    2015-01-01

    Quantitative imaging aims to provide in vivo neuroimaging biomarkers with high research and diagnostic value that are sensitive to underlying tissue microstructure. In order to use these data to examine intra-cortical differences or to define boundaries between different myelo-architectural areas, high resolution data are required. The quality of such measurements is degraded in the presence of motion hindering insight into brain microstructure. Correction schemes are therefore vital for high resolution, whole brain coverage approaches that have long acquisition times and greater sensitivity to motion. Here we evaluate the use of prospective motion correction (PMC) via an optical tracking system to counter intra-scan motion in a high resolution (800 μm isotropic) multi-parameter mapping (MPM) protocol. Data were acquired on six volunteers using a 2 × 2 factorial design permuting the following conditions: PMC on/off and motion/no motion. In the presence of head motion, PMC-based motion correction considerably improved the quality of the maps as reflected by fewer visible artifacts and improved consistency. The precision of the maps, parameterized through the coefficient of variation in cortical sub-regions, showed improvements of 11–25% in the presence of deliberate head motion. Importantly, in the absence of motion the PMC system did not introduce extraneous artifacts into the quantitative maps. The PMC system based on optical tracking offers a robust approach to minimizing motion artifacts in quantitative anatomical imaging without extending scan times. Such a robust motion correction scheme is crucial in order to achieve the ultra-high resolution required of quantitative imaging for cutting edge in vivo histology applications. PMID:25859178

  10. Quantitative evaluation by measurement and modeling of the variations in dose distributions deposited in mobile targets.

    PubMed

    Ali, Imad; Alsbou, Nesreen; Taguenang, Jean-Michel; Ahmad, Salahuddin

    2017-03-03

    The objective of this study is to quantitatively evaluate, by measurement and modeling, the variations in dose distributions deposited in a mobile target. The effects of motion-induced variation in dose distribution on tumor dose coverage and sparing of normal tissues were investigated quantitatively. Dose distributions with motion artifacts were modeled for different motion patterns, including (a) motion with constant speed and (b) sinusoidal motion. The model predictions of dose distributions with motion artifacts were verified by measurement: dose distributions from various plans, including three-dimensional conformal and intensity-modulated fields, were measured with a multiple-diode-array detector (MapCHECK2) mounted on a mobile platform that moves with adjustable motion parameters. For each plan, the dose distributions were measured with MapCHECK2 using motion amplitudes from 0 to 25 mm. In addition, mathematical modeling was developed to predict the variations in dose distributions and their dependence on the motion parameters, which include amplitude, frequency, and phase for sinusoidal motions. The dose distributions varied with motion and depended on the motion pattern, particularly for sinusoidal motion, where the dose spread out along the direction of motion. Results showed that in the dose region between the isocenter and the 50% isodose line, the dose profile decreased with increasing motion amplitude. When the range of motion became larger than the field length along the direction of motion, the dose profiles changed overall, including the central-axis dose and the 50% isodose line. When the total dose was delivered over a time much longer than the period of the motion, variations in motion frequency and phase did not affect the dose profiles. The motion dose modeling developed in this study thus provides a quantitative characterization of the variation in dose distributions induced by motion.
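
    When the dose is delivered over many motion periods, the time-averaged profile for sinusoidal motion is well approximated by convolving the static profile with the displacement density of the motion, which is an arcsine distribution. The sketch below applies that idea to an invented 1-D profile; the field size, penumbra width, and amplitudes are assumptions, not the study's measurements.

        import numpy as np

        dx = 0.5                                   # mm, grid spacing
        x = np.arange(-60.0, 60.0, dx)
        # static profile: 50 mm flat field with smooth penumbra
        static = 0.5 * (np.tanh((x + 25) / 3) - np.tanh((x - 25) / 3))

        def blurred(dose, amplitude):
            # Convolve with the arcsine density of sinusoidal displacement:
            # p(u) = 1 / (pi * sqrt(A^2 - u^2)) for |u| < A.
            u = np.arange(-amplitude + dx / 2, amplitude, dx)
            pdf = 1.0 / (np.pi * np.sqrt(amplitude**2 - u**2))
            pdf /= pdf.sum()                       # discrete normalization
            return np.convolve(dose, pdf, mode='same')

        for a in (5.0, 10.0, 25.0):                # motion amplitudes, mm
            d = blurred(static, a)
            print("A = %4.1f mm -> central-axis dose %.3f"
                  % (a, d[len(x) // 2]))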

  11. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case comprises two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze quantitative image features of the metastatic tumors previously tracked by radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the changes in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree-based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a kappa coefficient of 0.493, significantly higher than RECIST prediction, which had an accuracy of 60% (17/30) and a kappa coefficient of 0.062. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
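
    A hedged sketch of the prediction step: a shallow decision tree over the three per-case feature changes, scored by cross-validated accuracy and the kappa coefficient. The 30 synthetic cases below are random stand-ins for the extracted CT features.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score
        from sklearn.model_selection import cross_val_predict
        from sklearn.tree import DecisionTreeClassifier

        # per-case changes in tumor volume, mean CT number, and density
        # variance between pre- and post-treatment scans (synthetic)
        rng = np.random.default_rng(5)
        X = rng.normal(size=(30, 3))
        pfs6 = (X[:, 0] + 0.5 * X[:, 1]
                + rng.normal(0, 0.7, 30) < 0).astype(int)

        tree = DecisionTreeClassifier(max_depth=2, random_state=0)
        pred = cross_val_predict(tree, X, pfs6, cv=5)  # held-out predictions
        acc = (pred == pfs6).mean()
        print("accuracy %.1f%%, kappa %.3f"
              % (100 * acc, cohen_kappa_score(pfs6, pred)))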

  12. [Evaluation of the embryotoxic risk of industrial chemicals in pregnancy].

    PubMed

    Spielmann, H

    1986-06-01

    For the first time, exposure levels during pregnancy have been evaluated for industrial chemicals in the German list of "Maximal occupational exposure limits and biological tolerance levels of occupational chemicals 1985" (MAK-Werte-Liste). According to this evaluation, only a single substance (methylmercury) is embryotoxic in man, a prenatal risk cannot be excluded for eight chemicals, and 18 chemicals are safe at occupational exposure limits (MAK-Werte). Furthermore, pregnant women should avoid exposure to any of the 112 carcinogenic chemicals on the list and to the 26 substances under evaluation for embryotoxic properties. Occupational chemicals are subdivided into four pregnancy risk groups and discussed with respect to prenatal counselling.

  13. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    PubMed

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of kinetic parameters such as displacement, velocity, and acceleration, the magnetization of the colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured using this method exhibited a better linear correlation with magnetothermal heating than those obtained using a vibrating sample magnetometer or a magnetic balance. This finding indicates that the method may be more suitable than the commonly used methods for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine.
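
    One simple force-balance reading of such a method: once the droplet moves quasi-steadily, the magnetic force m_total * dB/dz balances Stokes drag, so the total moment follows from the tracked velocity. The sketch below uses that balance with wholly assumed viscosity, droplet radius, field gradient, and trajectory; the paper's full computational analysis also uses displacement and acceleration.

        import numpy as np

        eta = 1.0e-3       # Pa*s, carrier-fluid viscosity (assumed)
        radius = 0.5e-3    # m, droplet radius (assumed)
        grad_B = 10.0      # T/m, field gradient at the droplet (assumed)

        # synthetic tracking data: position relaxing to constant velocity
        t = np.linspace(0.0, 2.0, 50)
        z = 0.8e-3 * (t + 0.2 * (1.0 - np.exp(-5.0 * t)))
        v = np.gradient(z, t)                      # velocity from kinematics

        # quasi-steady balance: m_total * grad_B = 6*pi*eta*R*v_terminal
        m_total = 6.0 * np.pi * eta * radius * v[-1] / grad_B
        print("estimated total moment (A*m^2):", m_total)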

  14. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  15. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans, one that enables fast assessment of coffee grade and can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near-infrared spectroscopy (NIRS) is proposed. NIRS is a green-chemistry, low-cost, fast-response technique that requires no sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors of around 5% show that NIRS can be a valuable analytical tool for coffee roasters, enabling simple, fast, and quantitative evaluation of green coffee quality.
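
    The chemometric step is standard enough to sketch: fit a partial least squares regression from spectra to the defect mass fraction and check it by cross-validation. The synthetic "spectra" below mix two invented absorption bands according to the response; the band positions, noise level, and component count are all assumptions.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(6)
        wl = np.linspace(1100.0, 2500.0, 300)             # wavelength, nm
        sound = np.exp(-((wl - 1900.0) / 150.0) ** 2)     # invented band
        defect = np.exp(-((wl - 1450.0) / 100.0) ** 2)    # invented band

        y = rng.uniform(0.0, 0.5, 60)                     # defect fraction
        X = (np.outer(1.0 - y, sound) + np.outer(y, defect)
             + rng.normal(0.0, 0.01, (60, 300)))          # noisy spectra

        pls = PLSRegression(n_components=5)
        y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
        print("cross-validated RMSE:", np.sqrt(np.mean((y_cv - y) ** 2)))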

  16. Quantitative evaluation of wall heat loads by lost fast ions in the Large Helical Device

    NASA Astrophysics Data System (ADS)

    Morimoto, Junki; Suzuki, Yasuhiro; Seki, Ryosuke

    2016-10-01

    In fusion plasmas, fast ions are produced by neutral beam injection (NBI), ion cyclotron heating (ICH), and fusion reactions. Some fast ions are lost from the plasma because of various drifts and instabilities. These lost fast ions may damage plasma-facing components such as divertors and diagnostic instruments in fusion reactors. Therefore, the wall heat loads caused by lost fast ions in the Large Helical Device (LHD) are under investigation. For this purpose, we have been developing a Monte Carlo code for the quantitative evaluation of wall heat loads that follows the guiding-center orbits of fast ions. Using this code, we investigate the wall heat loads and hitting points of lost fast ions produced by NBI in LHD. Magnetic field configurations, which depend on beta values, affect the orbits of fast ions and hence the wall heat loads. Therefore, the wall heat loads from fast ions are quantitatively evaluated in equilibrium magnetic fields that include finite-beta effects and magnetic islands. The differences in wall heat loads and particle deposition patterns between the vacuum field and various finite-beta equilibrium fields will be presented at the meeting.

  17. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase, three-dimensional active contour implemented within a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of the three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variations in user expertise, biased a priori information, and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter, and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed by comparison with manually labeled data, computing false positive and false negative voxel assignments for the three tissue classes. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework in performing the challenging task of automatically extracting brain tissue volume contours.

  18. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The T790M mutation in EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. To date, however, only qualitative detection (presence or absence) of T790M has been described. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from the 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations, whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluating the contribution of T790M to EGFR-TKI resistance.
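
    The quantification behind a T/A ratio rests on Poisson statistics over the dPCR partitions: if a fraction p of partitions is positive, the mean number of copies per partition is -ln(1 - p). The partition counts below are invented for illustration.

        import numpy as np

        def copies_per_partition(n_positive, n_total):
            # Poisson correction for multiple copies in one partition.
            return -np.log(1.0 - n_positive / n_total)

        # illustrative counts for the two assays on one sample
        lam_t790m = copies_per_partition(150, 20000)   # resistance allele
        lam_act = copies_per_partition(9000, 20000)    # activating allele
        print("T/A ratio: %.4f" % (lam_t790m / lam_act))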

  19. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal-control devices of a solar telescope. The internal seeing induced by its temperature rise will degrade imaging quality significantly. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics and geometric optics is proposed in this paper. First, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by computational fluid dynamics calculation. Second, the temperature field is transformed into a refractive-index field by the corresponding equations. Third, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive-index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analytical results show that the maximum acceptable temperature rise of the heat-stop is 5 K above the ambient air at any telescope pointing direction, under the condition that the root-mean-square of the wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, it is found that the magnitude of the wavefront aberration gradually increases with increasing heat-stop temperature rise for a given telescope pointing direction. Meanwhile, as the telescope pointing varies from the horizontal to the vertical direction, the magnitude of the wavefront aberration first decreases and then increases for the same heat-stop temperature rise.
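
    The temperature-to-refractive-index step can be sketched with the standard air refractivity relation n - 1 ≈ 77.6e-6 * P / T (P in hPa, T in kelvin), after which the optical path difference (OPD) is integrated along each ray; internal seeing is the ray-to-ray variation of such OPDs. The plume shape, path length, and pressure below are assumptions, not CFD output.

        import numpy as np

        def refractivity(T, P_hPa=1013.25):
            # approximate air refractivity: n - 1 ~ 77.6e-6 * P / T
            return 77.6e-6 * P_hPa / T

        T_amb = 293.0                                   # K, ambient air
        s = np.linspace(0.0, 0.5, 501)                  # m, path samples
        dT = 5.0 * np.exp(-((s - 0.25) / 0.05) ** 2)    # 5 K plume (assumed)

        dn = refractivity(T_amb + dT) - refractivity(T_amb)
        opd = abs(np.sum(dn) * (s[1] - s[0]))           # single-ray OPD, m
        # the RMS of (OPD - mean OPD) over many rays is what gets compared
        # with the 25 nm wavefront-error budget quoted in the abstract
        print("OPD along this ray: %.1f nm" % (opd * 1e9))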

  20. Quantitative evaluation of malignant gliomas damage induced by photoactivation of IR700 dye

    PubMed Central

    Sakuma, Morito; Kita, Sayaka; Higuchi, Hideo

    2016-01-01

    The processes involved in damage to malignant gliomas were quantitatively evaluated by microscopy. The near-infrared fluorescent dye IR700 conjugated to an anti-CD133 antibody (IR700-CD133) specifically targets malignant gliomas (U87MG) and stem cells (BT142) and is endocytosed into the cells. The gliomas are then photodamaged by the release of reactive oxygen species (ROS) and the heat induced by illumination of IR700 with a red laser, and the motility of vesicles within these cells is altered as a result of the cellular damage. To investigate these changes in motility, we developed a new method that measures fluctuations in the intensity of phase-contrast images obtained from small areas within cells. The intensity fluctuation in U87MG cells gradually decreased as cell damage progressed, whereas the fluctuation in BT142 cells increased. The endocytosed IR700 dye was co-localized in acidic organelles such as endosomes and lysosomes. The pH in U87MG cells, as monitored by a pH indicator, decreased and then gradually increased under illumination of IR700, while the pH in BT142 cells increased monotonically. In these experiments, the processes of cell damage were quantitatively evaluated in terms of the motility of vesicles and changes in pH. PMID:27877897

  1. Quantitative analysis and chromatographic fingerprinting for the quality evaluation of Scutellaria baicalensis Georgi using capillary electrophoresis.

    PubMed

    Yu, Ke; Gong, Yifei; Lin, Zhongying; Cheng, Yiyu

    2007-01-17

    Quantitative analysis and chromatographic fingerprinting for the quality evaluation of the Chinese herb Scutellaria baicalensis Georgi using capillary electrophoresis (CE) was developed. The separation was performed with a 50.0 cm (42.0 cm to the detector window) × 75 μm i.d. fused-silica capillary, and the CE fingerprint conditions were optimized using a combination of central composite design and multivariate analysis. The optimized buffer system, containing 15 mM borate, 40 mM phosphate, 15 mM SDS, 15% (v/v) acetonitrile, and 7.5% (v/v) 2-propanol, was employed for the method development, and baseline separation was achieved within 15 min. The determination of the major active components (baicalin, baicalein, and wogonin) was carried out using the optimized CE conditions. Good linear relationships were obtained over the investigated concentration ranges (R² values: 0.9997 for baicalin, 0.9992 for baicalein, and 0.9983 for wogonin). The average recoveries of these target components ranged between 96.1-105.6%, 98.6-105.2%, and 96.3-105.0%, respectively. CE fingerprints combined with quantitative analysis can be used for the quality evaluation of S. baicalensis.

  2. Quantitative evaluation of cervical cord compression by computed tomographic myelography in Thoroughbred foals

    PubMed Central

    YAMADA, Kazutaka; SATO, Fumio; HADA, Tetsuro; HORIUCHI, Noriyuki; IKEDA, Hiroki; NISHIHARA, Kahori; SASAKI, Naoki; KOBAYASHI, Yoshiyasu; NAMBO, Yasuo

    2016-01-01

    Five Thoroughbred foals (age, 8–33 weeks; median age, 31 weeks; weight, 122–270 kg; median weight, 249 kg) exhibiting ataxia with suspected cervical myelopathy (n=4) or limb malformation (n=1) were subjected to computed tomographic (CT) myelography. The areas of the subarachnoid space and cervical cord were measured on transverse CT images. The area of the cervical cord was divided by the area of the subarachnoid space, and the resulting stenosis ratios were quantitatively evaluated and compared against histopathological examination. Sites with a ratio above 52.8% could have been primary lesion sites according to the histopathological examination, although one site with a ratio of 54.1% was not a primary lesion site. Therefore, in this study, a ratio between 52.8% and 54.1% was suggested to be the borderline for physical compression that damages the cervical cord. Not all cervical vertebrae could be scanned in three of the five cases. Therefore, CT myelography is not a suitable method for locating the site of compression, but it can be used for quantitative evaluation of cervical stenosis diagnosed by conventional myelography. In conclusion, the stenosis ratios determined using CT myelography could be applicable for detecting primary lesion sites in the cervical cord. PMID:27974873

  3. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of deep brain stimulation (DBS) includes evaluation of its effect on rigidity, using a subjective semi-quantitative scale that depends on the examiner's perception and experience. A system was previously proposed to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, thereby supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band placed on the patient's hand, which communicated its measurements to a laptop. The laptop computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity-reduction model, regardless of the initial severity of the symptom. To enhance the performance of the previously presented system, we therefore aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by moving to in situ processing on a smartphone instead of a computer. The system has been shown to be reliable, with an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system suitable for intraoperative conditions during DBS, supporting the physician's decision-making when setting stimulation parameters.

  4. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images

    NASA Astrophysics Data System (ADS)

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-01

    Liver vessel segmentation in CTA images is a challenging task, especially for noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method for CTA images. We compare the effect of different diffusion techniques, i.e., Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch, and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi, and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity, and accuracy) are determined from manual landmarks placed inside and outside the vessels, followed by t-tests for statistical comparison on 51 clinical CTA images. The evaluation demonstrates that all the filters give liver vessel segmentation a significantly higher accuracy than no filtering (p < 0.05), with Hybrid Diffusion with Continuous Switch achieving the best performance. Compared with the diffusion filters, the vesselness filters have greater sensitivity but lower specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on a clinical dataset with a contrast-to-noise ratio as low as 3 dB. The results indicate that the pre-filtering step significantly improves liver vessel segmentation in 3D CTA images.
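
    Of the approaches named, Frangi vesselness is the easiest to demonstrate because scikit-image ships an implementation. The sketch below enhances a crude synthetic "vessel" and scores the result against the known mask; the image, scales, and threshold are invented, and the paper's landmark-based scoring is replaced here by a full mask for brevity.

        import numpy as np
        from skimage.filters import frangi

        # toy 2-D slice: a bright tubular structure in Gaussian noise
        rng = np.random.default_rng(7)
        img = rng.normal(0.0, 0.1, (128, 128))
        img[:, 60:64] += 1.0                       # crude "vessel"

        # vesselness enhancement (bright vessels -> black_ridges=False)
        v = frangi(img, sigmas=(1, 2, 3), black_ridges=False)

        truth = np.zeros_like(img, dtype=bool)
        truth[:, 60:64] = True
        pred = v > v.mean() + 2 * v.std()          # arbitrary threshold
        sens = (pred & truth).sum() / truth.sum()
        spec = (~pred & ~truth).sum() / (~truth).sum()
        print("sensitivity %.2f, specificity %.2f" % (sens, spec))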

  5. Quantitative evaluation of peptide-extraction methods by HPLC-triple-quad MS-MS.

    PubMed

    Du, Yan; Wu, Dapeng; Wu, Qian; Guan, Yafeng

    2015-02-01

    In this study, the efficiency of five peptide-extraction methods—acetonitrile (ACN) precipitation, ultrafiltration, C18 solid-phase extraction (SPE), dispersed SPE with mesoporous carbon CMK-3, and mesoporous silica MCM-41—was quantitatively investigated. With 28 tryptic peptides as target analytes, these methods were evaluated on the basis of recovery and reproducibility by using high-performance liquid chromatography-triple-quad tandem mass spectrometry in selected-reaction-monitoring mode. Because of the distinct extraction mechanisms of the methods, their preferences for extracting peptides of different properties were revealed to be quite different, usually depending on the pI values or hydrophobicity of peptides. When target peptides were spiked in bovine serum albumin (BSA) solution, the extraction efficiency of all the methods except ACN precipitation changed significantly. The binding of BSA with target peptides and nonspecific adsorption on adsorbents were believed to be the ways through which BSA affected the extraction behavior. When spiked in plasma, the performance of all five methods deteriorated substantially, with the number of peptides having recoveries exceeding 70% being 15 for ACN precipitation, and none for the other methods. Finally, the methods were evaluated in terms of the number of identified peptides for extraction of endogenous plasma peptides. Only ultrafiltration and CMK-3 dispersed SPE performed differently from the quantitative results with target peptides, and the wider distribution of the properties of endogenous peptides was believed to be the main reason.

  6. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia.

  7. Quantitative evaluation of stemflow flux during the rainfall-discharge process in a forested area

    NASA Astrophysics Data System (ADS)

    Ikawa, R.; Shimada, J.; Shimizu, T.

    2006-12-01

    Stemflow is very important as a point input of precipitation and tree solutes to the ground surface in a forest. However, its hydrological significance has often been neglected because its quantitative contribution per unit area is small compared with that of throughfall. In densely forested areas with relatively high rainfall, some recent studies point out that stemflow has a significant influence on runoff generation, soil erosion, groundwater recharge, soil solution chemistry, and the distribution of understory vegetation and epiphytes (Levia and Frost, 2003). Clear differences in isotopic composition and chemistry are known to exist among gross rainfall, throughfall, and stemflow, even within a single rainfall event. In order to evaluate the contribution of stemflow to infiltration into forest soil and groundwater, precise isotopic observation of rainfall and river discharge water during the rainfall-discharge process has been conducted in a densely forested headwater catchment of the Kahoku experimental forest (KHEW: 33°08'N, 133°43'E), Kyushu Island, Japan, since June 2004. Water samples of gross rainfall, throughfall, stemflow, and river water were collected every hour using an automatic water sampler. These samples were analyzed for deuterium and oxygen stable isotopes, inorganic water chemistry, and dissolved silica. To evaluate the stemflow contribution during the rainfall-discharge process, a catchment-scale tank model was constructed using stemflow and throughfall as inputs, and the isotopic fluctuation of river water during a rainfall event was calculated by this model and compared against the observed isotopic fluctuation in the river water. At the AGU Fall Meeting, we will explain in more detail the quantitative evaluation of the stemflow contribution during the rainfall-discharge process using the chemical and isotopic data and the tank model.

  8. A novel integrated approach to quantitatively evaluate the efficiency of extracellular polymeric substances (EPS) extraction process.

    PubMed

    Sun, Min; Li, Wen-Wei; Yu, Han-Qing; Harada, Hideki

    2012-12-01

    A novel integrated approach is developed to quantitatively evaluate the efficiency of extracellular polymeric substance (EPS) extraction after taking into account EPS yield, EPS damage, and cell lysis. This approach incorporates grey relational analysis and fuzzy logic analysis, with an evaluation procedure built on grey relational coefficient generation, membership function construction, and fuzzy rule description. The flocculation activity and DNA content of the EPS are chosen as the two evaluation responses. To verify the feasibility and effectiveness of this integrated approach, EPS from Bacillus megaterium TF10 were extracted using five different extraction methods, and their extraction efficiencies were evaluated as a real case study. Based on the evaluation results, the maximal extraction grades and corresponding optimal extraction times of the five extraction methods are ordered as EDTA, 10 h > formaldehyde + NaOH, 60 min > heating, 120 min > ultrasonication, 30 min > H₂SO₄, 30 min > control. The proposed approach offers an effective tool to select appropriate EPS extraction methods and determine the optimal extraction conditions.
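
    The grey relational step admits a compact sketch: each method's normalized responses are compared with an ideal reference sequence, and the standard coefficient (dmin + rho*dmax) / (di(k) + rho*dmax), with rho = 0.5, is averaged into a grade. The response values and reference below are invented for illustration, not the paper's data.

        import numpy as np

        def grey_relational_coeffs(reference, series, rho=0.5):
            # Grey relational coefficients of each row of `series`
            # against the reference sequence (inputs pre-normalized).
            delta = np.abs(series - reference)
            dmin, dmax = delta.min(), delta.max()
            return (dmin + rho * dmax) / (delta + rho * dmax)

        # toy responses per method: [flocculation activity, DNA content],
        # scaled to [0, 1]; the ideal is maximal yield, minimal lysis marker
        methods = np.array([[0.9, 0.1],
                            [0.7, 0.2],
                            [0.5, 0.6]])
        ideal = np.array([1.0, 0.0])
        xi = grey_relational_coeffs(ideal, methods)
        print("grey relational grades:", xi.mean(axis=1))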

  9. The evaluation of natural attenuation processes in ecological risk assessments

    SciTech Connect

    Swindoll, C.M.; Dziuk, L.J.

    1994-12-31

    The intent of the ecological risk assessment (ERA) process is to provide the scientific basis for making remediation decisions that are protective of the environment. In some instances, remedial actions may result in more damage to area flora and fauna than is warranted based on the long-term risk of the chemical stressor; this is particularly true of wetlands. To minimize the potential for putting the site's ecosystem at greater risk because of remediation, the ERA should include an evaluation of "no action" and "nondestructive" scenarios. An essential component of this evaluation is an assessment of the natural attenuation of the chemical stressor. Natural processes including biodegradation, hydrolysis, photodegradation, speciation, and complexation can be important to the mitigation of long-term ecological impact of chemical substances. The importance of natural processes for the attenuation of contaminants in aquifers is recognized by both the regulatory and scientific communities and has been adopted by several states as a viable remedial alternative. The potential for natural attenuation to reduce environmental risk is greater for surface environments than for the subsurface. The rationale, methodology, and tools available for evaluating natural attenuation in the context of the ERA process will be presented. Specific examples of implementing this approach at several industrial sites and benefits, including the effective utilization of limited regulatory and industrial resources, will be discussed.

  10. A quantitative release assessment for the noncommercial movement of companion animals: risk of rabies reintroduction to the United kingdom.

    PubMed

    Goddard, A D; Donaldson, N M; Horton, D L; Kosmider, R; Kelly, L A; Sayers, A R; Breed, A C; Freuling, C M; Müller, T; Shaw, S E; Hallgren, G; Fooks, A R; Snary, E L

    2012-10-01

    In 2004, the European Union (EU) implemented a pet movement policy (referred to here as the EUPMP) under EU regulation 998/2003. The United Kingdom (UK) was granted a temporary derogation from the policy until December 2011 and instead has in place its own Pet Movement Policy (Pet Travel Scheme (PETS)). A quantitative risk assessment (QRA) was developed to estimate the risk of rabies introduction to the UK under both schemes, to quantify any change in the risk of rabies introduction should the UK harmonize with the EU policy. Assuming 100% compliance with the regulations, moving to the EUPMP was predicted to increase the annual risk of rabies introduction to the UK by approximately 60-fold, from 7.79 × 10(-5) (5.90 × 10(-5), 1.06 × 10(-4)) under the current scheme to 4.79 × 10(-3) (4.05 × 10(-3), 5.65 × 10(-3)) under the EUPMP. This corresponds to a decrease from 13,272 (9,408, 16,940) to 211 (177, 247) years between rabies introductions. The risks associated with both schemes were predicted to increase when less than 100% compliance was assumed, with the current scheme of PETS and quarantine shown to be particularly sensitive to noncompliance. The results of this risk assessment, along with other evidence, formed a scientific evidence base to inform policy decisions with respect to companion animal movement.
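
    The release-assessment arithmetic behind such figures can be sketched as a Monte Carlo over per-animal introduction probabilities. Every distribution below is an illustrative placeholder; the published QRA derives its inputs per exporting country, species, and compliance scenario.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 100_000                                 # Monte Carlo iterations

        movements = rng.poisson(50_000, n)          # animals moved per year
        prevalence = rng.beta(1, 200_000, n)        # infection at origin
        p_slip = rng.beta(2, 100, n)                # passes controls undetected

        p_animal = prevalence * p_slip              # per-animal risk
        annual = 1.0 - (1.0 - p_animal) ** movements
        print("mean annual risk of introduction:", annual.mean())
        print("years between introductions:", 1.0 / annual.mean())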

  11. Quantitative microbial risk assessment related to urban wastewater and lagoon water reuse in Abidjan, Côte d'Ivoire.

    PubMed

    Yapo, R I; Koné, B; Bonfoh, B; Cissé, G; Zinsstag, J; Nguyen-Viet, H

    2014-06-01

    We assessed the infection risks related to the use of wastewater in Abidjan, Côte d'Ivoire, using quantitative microbial risk assessment (QMRA). Giardia lamblia and Escherichia coli were isolated and identified in wastewater samples from the canal and lagoon. The exposure assessment was conducted using a cross-sectional questionnaire survey of 150 individuals who were in contact with the wastewater during their daily activities of swimming, fishing, washing, and collecting materials for reuse. Risk was characterised using Monte Carlo simulation with 10,000 iterations. Results showed high contamination of the water by E. coli and G. lamblia (from 12.8 to 2.97 × 10(4) CFU/100 mL and from 0 to 18.5 cysts/L, respectively). Estimates of yearly average infection risks for E. coli (90.07-99.90%, assuming that 8% of E. coli were E. coli O157:H7) and G. lamblia (9.4-34.78%) were much higher than the acceptable risk (10(-4)). These results s