Science.gov

Sample records for quantitative risk evaluation

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues was discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  2. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life; in addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  3. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention.

    PubMed

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2010-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 × 10^-8 (95th percentile: 3.20 × 10^-7). However, illegal smuggling may expose Taiwan to a considerable risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may still be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
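    A stochastic risk model of this kind can be sketched as a Monte Carlo simulation that propagates uncertain event probabilities into a distribution of introduction risk, from which median and 95th-percentile values are read off as in the abstract. All distributions and parameter values below are hypothetical placeholders for illustration, not the estimates from the study:

```python
import random

def simulate_introduction_risk(n_iter=100_000, seed=42):
    """Monte Carlo sketch of a border-control risk model.

    Every parameter distribution here is a hypothetical illustration,
    not the Taiwan figures from the study.
    """
    rng = random.Random(seed)
    risks = []
    for _ in range(n_iter):
        # sample uncertain inputs for one iteration of the model
        p_infected = rng.betavariate(1, 10_000)        # prevalence in source population
        p_quarantine_detects = rng.uniform(0.90, 0.99)  # quarantine catches a clinical case
        p_vaccine_protects = rng.uniform(0.80, 0.95)    # valid vaccination prevents infection
        # risk that one imported animal is rabid AND slips through every barrier
        risk = p_infected * (1 - p_vaccine_protects) * (1 - p_quarantine_detects)
        risks.append(risk)
    risks.sort()
    return risks[len(risks) // 2], risks[int(len(risks) * 0.95)]  # median, 95th percentile

median_risk, p95_risk = simulate_introduction_risk()
```

    Reporting the median together with an upper percentile, as the study does, conveys both the central estimate and the tail of the risk distribution.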

  4. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  5. A quantitative approach for integrating multiple lines of evidence for the evaluation of environmental health risks

    PubMed Central

    Schleier III, Jerome J.; Marshall, Lucy A.; Davis, Ryan S.

    2015-01-01

    Decision analysis often considers multiple lines of evidence during the decision making process. Researchers and government agencies have advocated for quantitative weight-of-evidence approaches in which multiple lines of evidence can be considered when estimating risk. Therefore, we utilized Bayesian Markov chain Monte Carlo to integrate several human-health risk assessment, biomonitoring, and epidemiology studies that have been conducted for two common insecticides (malathion and permethrin) used for adult mosquito management, to generate an overall estimate of the risk quotient (RQ). The utility of Bayesian inference for risk management is that the estimated risk represents a probability distribution from which the probability of exceeding a threshold can be estimated. After all studies were incorporated, the mean RQ was 0.4386 (variance 0.0163) for malathion and 0.3281 (variance 0.0083) for permethrin. After taking into account all of the evidence available on the risks of ULV insecticides, the probability that malathion or permethrin would exceed a level of concern was less than 0.0001. Bayesian estimates can substantially improve decisions by allowing decision makers to estimate the probability that a risk will exceed a level of concern by considering seemingly disparate lines of evidence. PMID:25648367
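    The exceedance probability reported above can be roughly reproduced from the posterior summaries in the abstract. Approximating the posterior as a normal distribution is an assumption made here purely for illustration; the study worked with full MCMC posterior samples rather than a two-moment summary:

```python
import math

def prob_exceeds(mean, var, threshold=1.0):
    """P(RQ > threshold) under a normal approximation to the posterior.

    The normal approximation is an illustrative assumption; an RQ of 1.0
    is the conventional level of concern for a risk quotient.
    """
    sd = math.sqrt(var)
    z = (threshold - mean) / sd
    # upper-tail probability of a standard normal via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2))

p_malathion = prob_exceeds(0.4386, 0.0163)   # posterior mean/variance from the abstract
p_permethrin = prob_exceeds(0.3281, 0.0083)
```

    Both probabilities come out well below 0.0001, consistent with the abstract's conclusion.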

  6. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... HUMAN SERVICES Food and Drug Administration Use of Influenza Disease Models To Quantitatively Evaluate... public workshop entitled: ``Use of Influenza Disease Models to Quantitatively Evaluate the Benefits and... hypothetical influenza vaccine, and to seek from a range of experts, feedback on the current version of...

  7. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  8. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    PubMed

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Since safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally tend to rely heavily on their own past experience to make subjective decisions on risk assessment, without systematic decision making. Indeed, understanding the underlying principles of risk assessment is significant. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions towards it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of work, their recognition of the occurrence of different types of accidents, and the inter-relationships of these perceptions with safety performance in terms of accident rates. In Stage 2, the deficiencies of the current general practice of risk assessment are first identified. Based on the findings from Stage 1 and historical accident data from 15 large-scale construction projects averaged over three years, a risk evaluation model is developed quantitatively that prioritizes the risk levels of different trades of work and the types of site accident they give rise to from various accident causes. With the suggested systematic accident recording techniques, this model can be implemented in the construction industry at both the project level and the organizational level. The model (Q2REM) not only acts as a useful supplementary guideline for risk assessment by construction safety professionals, but also assists them in pinpointing the potential risks on site for construction workers under the respective trades of work through safety training and education. It, in turn, raises their awareness of safety risk. As the Q2REM can clearly show the potential accident causes leading to

  9. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications. PMID:24043593

  10. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost-benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost-benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162

  12. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change or other causes. Immediate measures such as strengthening of drainage capacity are needed to minimize the damage caused by more-frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of a drainage analysis with a design rainfall (e.g. the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even when the total rainfall amount is equal, so the flood risk may differ with the rainfall pattern. It is therefore important to assume various patterns of heavy rainfall for flood risk assessment. A rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method, called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal pattern. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale using the diurnal rainfall pattern generator. In addition, we also estimated flood damage with a focus on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for drainage analysis were generated. Here, 3-day rainfall amounts for nine different return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount by using the diurnal rainfall pattern generator. Finally, all data

  13. Development of quantitative risk acceptance criteria

    SciTech Connect

    Griesmeyer, J. M.; Okrent, D.

    1981-01-01

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed; they range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  14. Rat- and human-based risk estimates of lung cancer from occupational exposure to poorly-soluble particles: A quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Kuempel, E. D.; Smith, R. J.; Dankovic, D. A.; Stayner, L. T.

    2009-02-01

    In risk assessment there is a need for quantitative evaluation of the capability of animal models to predict disease risks in humans. In this paper, we compare the rat- and human-based excess risk estimates for lung cancer from working lifetime exposures to inhaled poorly-soluble particles. The particles evaluated include those for which long-term dose-response data are available in both species, i.e., coal dust, carbon black, titanium dioxide, silica, and diesel exhaust particulate. The excess risk estimates derived from the rat data were generally lower than those derived from the human studies, and none of the rat- and human-based risk estimates were significantly different (all p-values > 0.05). Residual uncertainty in whether the rat-based risk estimates would over- or under-predict the true excess risks of lung cancer from inhaled poorly-soluble particles in humans is due in part to the low power of the available human studies, limited particle size exposure data for humans, and ambiguity about the best animal models and extrapolation methods.

  15. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
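    The 5 x 5 risk matrix mentioned above is conventionally implemented as a product of ordinal probability and consequence scores mapped into risk bands. The band thresholds below follow a common convention and are not taken from the paper:

```python
def risk_score(probability, consequence):
    """Classic 5 x 5 risk matrix: both inputs scored 1 (low) to 5 (high)."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be between 1 and 5")
    return probability * consequence

def risk_band(score):
    # Band thresholds are a common convention, not values from the paper.
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"
```

    The point of the paper is precisely that decisions are often made before such scores can be assigned, so a matrix like this captures only the quantified end state of the team's understanding.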

  16. Screening Risk Evaluation methodology

    SciTech Connect

    Hopper, K.M.

    1994-06-01

    The Screening Risk Evaluation (SRE) Guidance document is a set of guidelines provided for the uniform implementation of SREs performed on D&D facilities. These guidelines are designed specifically for the completion of the second (semi-quantitative screening) phase of the D&D Risk-Based Process. The SRE Guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated from the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the risk to human health and the environment from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the risk of contaminant exposure to workers, occupants, and visitors in D&D facilities. The Future Release Index (FRI) calculates the risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risk to human health due to factors other than contaminants. The Criticality Index is treated as a modifying factor to the entire SRE, because criticality issues are strictly regulated by DOE. Screening risk results are tabulated in matrix form, and a Total Risk score is calculated (via a weighted equation) on which to base early-action recommendations. Other recommendations from the screening risk scores are made based either on individual index scores or on reweighted Total Risk calculations. All recommendations based on the SRE are made from a combination of screening risk scores, decision drivers, and other considerations, determined on a project-by-project basis. The SRE is the first and most important step in the overall D&D project-level decision-making process.

  17. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  18. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696
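    Quantitative risk modeling with Monte Carlo simulation, as described above, replaces fixed worst-case assumptions with sampled input distributions. The sketch below illustrates the idea for bioburden ingress during filling; the event probabilities and triangular distributions are hypothetical placeholders, not values from the paper:

```python
import random

def simulate_batch_contamination(n_batches=50_000, seed=1):
    """Monte Carlo sketch of bioburden-ingress risk during aseptic filling.

    Both ingress routes and all probability ranges are hypothetical
    illustrations of the technique, not data from the study.
    """
    rng = random.Random(seed)
    contaminated = 0
    for _ in range(n_batches):
        # sample uncertain inputs instead of fixing pessimistic point values
        p_glove_breach = rng.triangular(1e-5, 1e-3, 1e-4)
        p_airflow_upset = rng.triangular(1e-6, 1e-4, 1e-5)
        # probability that at least one ingress route occurs for this batch
        p_ingress = 1 - (1 - p_glove_breach) * (1 - p_airflow_upset)
        if rng.random() < p_ingress:
            contaminated += 1
    return contaminated / n_batches

risk_estimate = simulate_batch_contamination()
```

    Because each iteration draws its own input values, the output reflects the full spread of plausible conditions rather than a single subjective estimate.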

  19. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in

  20. [Quantitative evaluation of health risk associated with occupational inhalation exposure to vinyl chloride at production plants in Poland].

    PubMed

    Szymczak, W

    1997-01-01

    Vinyl chloride is classified by the IARC in Group 1 (human carcinogens). In Poland, occupational exposure to vinyl chloride is found among workers employed in many branches of industry, among others the vinyl chloride synthesis and polymerization industry as well as the plastics, footwear, rubber, pharmaceutical and metallurgical industries. Observed concentrations range from non-determinable levels to 90 mg/m3, against a MAC value of 5 mg/m3. Neoplasm of the liver is the major carcinogenic effect of vinyl chloride; hence, the health assessment focused on this critical risk. Four different linear dose-response models, developed by several authors and based on the results of different epidemiological studies, were used to characterise the extent of cancer risk depending on the level of vinyl chloride concentration. The estimated risk related to forty years of employment under exposure equal to the MAC value (5 mg/m3) fell within the range from 2.9 × 10^-4 to 2.6 × 10^-3. As these figures indicate, it did not exceed the acceptable level (10^-3). PMID:9273438
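    A linear dose-response model of the kind compared in the study scales excess risk with concentration and exposure duration. The slope (unit risk) below is a hypothetical value chosen only so that the result at the MAC falls inside the range reported in the abstract; it is not one of the four published models:

```python
def excess_lifetime_risk(concentration_mg_m3, unit_risk, working_years=40):
    """Linear (no-threshold) dose-response sketch.

    unit_risk is the excess cancer risk per mg/m3 for a full 40-year
    working lifetime; the value used below is hypothetical.
    """
    if concentration_mg_m3 < 0 or working_years < 0:
        raise ValueError("inputs must be non-negative")
    return unit_risk * concentration_mg_m3 * (working_years / 40)

# hypothetical slope of 1e-4 per mg/m3 over a 40-year working lifetime
risk_at_mac = excess_lifetime_risk(5.0, 1e-4)
```

    Comparing several such slopes, as the study does, shows how strongly the risk estimate depends on which epidemiological dataset underlies the model.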

  1. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify the points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainty in predicting the physical and economic performance of a site has hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimates of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best-estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
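    The risk definition quoted in the abstract (risk as the sum over exceedance-probability estimates of consequence times probability interval) can be written as a small helper. The example inputs are hypothetical LCOE deviations and probability-interval widths, not GT-Mod outputs:

```python
def total_risk(consequences, delta_probs):
    """Risk = sum of C * ΔP over all exceedance-probability estimates,
    following the definition quoted in the abstract.

    consequences: deviation of each LCOE estimate from the best-estimate LCOE
    delta_probs:  width of the exceedance-probability interval for each estimate
    """
    if len(consequences) != len(delta_probs):
        raise ValueError("consequence and probability lists must align")
    return sum(c * dp for c, dp in zip(consequences, delta_probs))

# hypothetical LCOE deviations ($/MWh) and exceedance-probability intervals
risk = total_risk([5.0, 12.0, 30.0], [0.5, 0.3, 0.2])
```

    Weighting each deviation by its probability interval means rare but large cost overruns and likely but small ones contribute on a common scale.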

  2. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL subjects without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject from the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
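The DEF described above, a weighted distance of a subject's eigencoordinates from the CTRL group mean along salient components, can be sketched as follows. The component weights and subject coordinates are invented for illustration; the paper derives them from an MRI appearance eigenspace:

```python
import math

# Hypothetical setup: each subject is projected onto 3 salient principal
# components; weights reflect each component's contribution to the
# separating hyperplane.
ctrl_mean = [0.0, 0.0, 0.0]
weights = [0.5, 0.3, 0.2]

def disease_evaluation_factor(coords):
    """Weighted Euclidean distance from the CTRL group mean."""
    return math.sqrt(sum(w * (c - m) ** 2
                         for w, c, m in zip(weights, coords, ctrl_mean)))

ctrl_subject = [0.1, -0.2, 0.05]   # near the CTRL mean: small DEF
ad_subject = [2.0, 1.5, -1.0]      # far from the CTRL mean: large DEF
```

A threshold on the DEF then plays the role of the linear discriminant used to separate the two groups.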

  3. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. PMID:26456933
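The EN 1276 pass criterion used above is simple arithmetic on viable counts; a minimal sketch (function names are my own):

```python
import math

def log10_reduction(initial_count, surviving_count):
    """Log10 reduction in viable bacterial count after antiseptic exposure."""
    return math.log10(initial_count / surviving_count)

def meets_en1276(initial_count, surviving_count):
    """EN 1276 requires at least a 5 log10 reduction within 5 minutes."""
    return log10_reduction(initial_count, surviving_count) >= 5.0
```

For example, a suspension of 10^8 CFU/ml reduced to 10^2 CFU/ml is a 6 log10 reduction and passes; reduction to 10^5 CFU/ml (3 logs) fails.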

  4. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, the QRA results must be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
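A minimal illustration of evidence-driven (Bayesian) updating, here with a conjugate Beta prior on a failure probability; the prior and the observed counts are hypothetical, not from the paper:

```python
# Bayesian updating with a conjugate Beta prior: the evidence (failures
# observed in a number of demands) moves the estimate, making the result
# "evidence dependent" rather than "personality dependent".

def beta_posterior(alpha, beta, failures, demands):
    """Posterior Beta parameters after observing `failures` in `demands`."""
    return alpha + failures, beta + (demands - failures)

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Vague prior Beta(1, 1); observe 2 failures in 100 demands.
a, b = beta_posterior(1.0, 1.0, 2, 100)
posterior_estimate = beta_mean(a, b)   # ~= 0.029
```

Two analysts starting from different vague priors converge toward the same posterior as the shared evidence accumulates, which is the "trustability" argument in miniature.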

  5. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), and 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
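A pure proportional (linear, no-threshold) scaling of the abstract's worker figure illustrates the kind of extrapolation involved. Real asbestos risk models additionally adjust for fiber type, age at exposure and latency, so this arithmetic is only indicative and does not reproduce the abstract's school estimates exactly:

```python
# Anchor: 82 lifetime excess cancers per 10,000 workers exposed to
# 0.5 f/ml for 40 yr (figures quoted in the abstract). Everything else
# is a simple proportional-scaling illustration.

WORKER_RISK = 82 / 10_000        # lifetime excess risk per person
WORKER_EXPOSURE = 0.5 * 40       # cumulative exposure, f/ml-years

def excess_risk(concentration_fml, years):
    """Lifetime excess risk scaled linearly with cumulative exposure."""
    return WORKER_RISK * (concentration_fml * years) / WORKER_EXPOSURE

# School scenario from the abstract: 0.001 f/ml for 6 school years.
school_risk_per_million = excess_risk(0.001, 6) * 1_000_000
```

The naive scaling lands at a few excess cancers per million, the same order of magnitude as the abstract's estimate of 5 per million.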

  6. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374

  7. Validation of biological markers for quantitative risk assessment.

    PubMed Central

    Schulte, P; Mazzuckelli, L F

    1991-01-01

    The evaluation of biological markers is recognized as necessary to the future of toxicology, epidemiology, and quantitative risk assessment. For biological markers to become widely accepted, their validity must be ascertained. This paper explores the range of considerations that compose the concept of validity as it applies to the evaluation of biological markers. Three broad categories of validity (measurement, internal study, and external) are discussed in the context of evaluating data for use in quantitative risk assessment. Particular attention is given to the importance of measurement validity in the consideration of whether to use biological markers in epidemiologic studies. The concepts developed in this presentation are applied to examples derived from the occupational environment. In the first example, measurement of bromine release as a marker of ethylene dibromide toxicity is shown to be of limited use in constructing an accurate quantitative assessment of the risk of developing cancer as a result of long-term, low-level exposure. This example is compared to data obtained from studies of ethylene oxide, in which hemoglobin alkylation is shown to be a valid marker of both exposure and effect. PMID:2050067

  8. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

    In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for the safety and reliability of their systems. The FMEAs were reviewed to identify critical items using a set of qualitative criteria. Hazards and critical items judged the worst by a qualitative method were to be either eliminated by a design change or controlled by the addition of a safeguard. Frequently, however, limitations of space, weight, technical feasibility, and cost left critical items and hazards that could be neither eliminated nor controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown, because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: 1. combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; 2. use of Bayesian analysis of system and component failure data; 3. evaluation and presentation of uncertainties in the risk estimates.
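Item 1 in the list rests on elementary fault-tree arithmetic; a minimal sketch with hypothetical probabilities, assuming independent events:

```python
# Fault-tree gate arithmetic of the kind WASH-1400-style QRA brought to
# aerospace: AND gates multiply independent failure probabilities, OR
# gates combine them via the complement of joint survival.

def and_gate(*probs):
    """All independent events must occur."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    """At least one independent event occurs."""
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical scenario: loss of vehicle if (engine fault AND backup
# fails) OR propellant leak.
p_scenario = or_gate(and_gate(1e-3, 1e-2), 5e-6)
```

Event trees chain the same arithmetic along accident sequences; Bayesian analysis (item 2) supplies the leaf probabilities from failure data.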

  9. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
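The core convolution of a fragility curve with a seismic hazard curve can be sketched as follows; the lognormal fragility parameters and the hazard bands are hypothetical, not those of the paper's test site:

```python
import math

def lognormal_cdf(x, median, beta):
    """Lognormal fragility: P(failure | ground-motion intensity x)."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

# Hypothetical discretized hazard curve for a tank: (PGA in g, annual
# probability of that intensity band), e.g. from a PSHA study.
hazard_bands = [(0.1, 1e-2), (0.3, 1e-3), (0.6, 1e-4)]

# Annual failure probability: sum over bands of
# P(failure | intensity) x P(intensity band per year).
annual_failure_prob = sum(p * lognormal_cdf(pga, median=0.5, beta=0.4)
                          for pga, p in hazard_bands)
```

The resulting failure probability then feeds the consequence analysis (release, fire, explosion) that produces the local risk contours.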

  10. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue for risk analysts.

  11. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. 
Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  12. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk and vulnerability. For a quantitative/semi-quantitative risk assessment for rock slides, a mathematical value for the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, and provides better decision support and easier communication for the decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, elements at risk identification, vulnerability assessment, risk computation and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing, or development of new, implementation methods depending on the type of landslide, data availability, investigation scale and nature of consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analysis of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with heights of 400-450 m and widths of 7-10 m. These slices threaten the settlements around Loen Valley and the tourists visiting the fjord during the summer season, as the released slides have the potential to create a tsunami. In the past, several rock slides were recorded from Mountain Ramnefjell between 1905 and 1950. 
Among them
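The generic quantitative core of such a procedure is often written as risk = hazard × vulnerability × elements at risk; a minimal sketch with invented numbers:

```python
# Semi-quantitative rock-slide risk sketch: hazard as an annual
# probability, vulnerability as the expected degree of loss (0-1), and
# elements at risk as the exposed value. All numbers are hypothetical.

def rock_slide_risk(hazard_per_year, vulnerability, elements_at_risk):
    """Expected annual loss, in the units of `elements_at_risk`."""
    return hazard_per_year * vulnerability * elements_at_risk

# E.g. a 1-in-500-year slide, 0.8 vulnerability, 200 exposed persons.
expected_annual_loss = rock_slide_risk(1 / 500, 0.8, 200)
```

The computed value can then be compared against acceptability criteria, or against the risk of other natural and human-made hazards, as the abstract describes.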

  13. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. A heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG, and hence the power plant, to stop operating. Furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk levels of the HRSG components were analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of standard API 581 thus places the existing equipment at medium risk; in fact, there are no critical problems in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk approach was evaluated with the aim of reducing risk by optimizing the risk assessment activities.
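A simplified risk-matrix lookup illustrates how probability and consequence categories combine into cells such as "4C"; the binning rule below is invented for illustration and is not API 581's actual table:

```python
# Semi-quantitative risk matrix sketch: a probability category (1-5) and
# a consequence category (A-E) map to a cell such as "4C", which is then
# binned into a risk level. The binning thresholds are hypothetical.

def risk_cell(prob_category, consequence_category):
    return f"{prob_category}{consequence_category}"

def risk_level(prob_category, consequence_category):
    score = prob_category + "ABCDE".index(consequence_category) + 1
    if score >= 8:
        return "high"
    if score >= 7:
        return "medium-high"
    if score >= 5:
        return "medium"
    return "low"

# Categories quoted in the abstract: HP superheater 4C, HP economizer 3C.
superheater = risk_level(4, "C")   # medium-high
economizer = risk_level(3, "C")    # medium
```

With these illustrative thresholds the 4C and 3C cells land on medium-high and medium risk, matching the levels reported in the abstract.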

  15. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high-profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water and the oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used, additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below the US EPA benchmark of 1 × 10⁻⁴ per year except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on the source water and the treatment processes used. PMID:19945145
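The structure of such a model (sampled raw-water concentration, treatment log-removal, a dose-response step, daily risks compounded over a year) can be sketched as follows; the exponential dose-response form and all parameter values are illustrative, not the paper's:

```python
import math
import random

random.seed(7)

R_DOSE_RESPONSE = 0.004   # hypothetical exponential dose-response parameter
DAILY_WATER_L = 1.0       # unboiled tap water consumed per day, litres

def annual_risk():
    # Sample uncertain inputs: raw-water oocyst concentration and the
    # log-removal achieved by treatment (both distributions hypothetical).
    raw_conc = random.lognormvariate(math.log(0.5), 1.0)  # oocysts/L
    log_removal = random.uniform(2.0, 4.0)
    dose = raw_conc * 10 ** (-log_removal) * DAILY_WATER_L
    # Exponential dose-response for daily infection probability.
    p_daily = 1.0 - math.exp(-R_DOSE_RESPONSE * dose)
    # Compound independent daily risks over a year.
    return 1.0 - (1.0 - p_daily) ** 365

risks = [annual_risk() for _ in range(5000)]
mean_annual_risk = sum(risks) / len(risks)
```

The simulated distribution of annual risk can then be compared against a benchmark such as 1 × 10⁻⁴ per year, scenario by scenario.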

  16. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative methods, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems for each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.

  17. [QUANTITATIVE DNA EVALUATION OF THE HIGH CARCINOGENIC RISK OF HUMAN PAPILLOMA VIRUSES AND HUMAN HERPES VIRUSES IN MALES WITH FERTILITY DISORDERS].

    PubMed

    Evdokimov, V V; Naumenko, V A; Tulenev, Yu A; Kurilo, L F; Kovalyk, V P; Sorokina, T M; Lebedeva, A L; Gomberg, M A; Kushch, A A

    2016-01-01

    Infertility is a pressing medical and social problem. In 50% of couples it is associated with the male factor, and in more than 50% of cases the etiology of the infertility remains insufficiently understood. The goal of this work was to study the prevalence and to perform quantitative analysis of the human herpes viruses (HHV) and high carcinogenic risk papilloma viruses (HR HPV) in males with infertility, as well as to assess the impact of these infections on sperm parameters. Ejaculate samples obtained from 196 males fell into three groups. Group 1 included men with infertility of unknown etiology (n = 112); group 2, patients whose female partners had a history of spontaneous abortion (n = 63); group 3 (control), healthy men (n = 21). HHV and HR HPV DNA in the ejaculates were detected in a total of 42/196 (21.4%) males: in 31 and 11 patients in groups 1 and 2, respectively (p > 0.05), and in none of the healthy males. HHV were detected in 24/42 males; HR HPV, in 18/42 (p > 0.05), without significant difference between the groups. Among HR HPV genotypes, those of clade A9 were the most frequent in the ejaculate (14/18, p = 0.04). Comparative analysis of the sperm parameters showed that in the ejaculates of the infected patients, sperm motility as well as the number of morphologically normal cells were significantly reduced compared with the healthy men. The quantification of the viral DNA revealed that in 31% of the male ejaculates the viral load was high: > 3 log10 per 100,000 cells. Conclusion. The detection of HHV and HR HPV in the ejaculate is associated with male infertility. Quantification of the viral DNA in the ejaculate is a useful indicator for monitoring viral infections in infertility and for the decision to start therapy. PMID:27451497

  18. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
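One of the candidate calculation engines mentioned, an attack tree, reduces to simple probability algebra over AND/OR nodes; the tree structure, probabilities, and loss value below are hypothetical:

```python
# Minimal attack-tree engine sketch: leaves carry attack-success
# probabilities, OR nodes model alternative paths, AND nodes model
# required sequences of steps (independence assumed throughout).

def evaluate(node):
    kind, children = node[0], node[1:]
    if kind == "leaf":
        return children[0]
    probs = [evaluate(c) for c in children]
    if kind == "and":
        p = 1.0
        for x in probs:
            p *= x
        return p
    # "or" node: complement of all alternatives failing.
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

LOSS = 1_000_000  # dollars, hypothetical consequence of a successful attack

tree = ("or",
        ("and", ("leaf", 0.3), ("leaf", 0.5)),   # phish, then pivot to HMI
        ("leaf", 0.05))                          # direct dial-up modem access
baseline_risk = evaluate(tree) * LOSS

# Mitigation: harden the modem path (0.05 -> 0.01); risk reduction is
# the drop in probability-weighted loss.
mitigated = ("or",
             ("and", ("leaf", 0.3), ("leaf", 0.5)),
             ("leaf", 0.01))
risk_reduction = baseline_risk - evaluate(mitigated) * LOSS
```

Comparing `risk_reduction` against a mitigation's cost is exactly the cost-benefit support that qualitative techniques fail to provide.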

  19. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  20. A Quantitative Framework to Evaluate Proarrhythmic Risk in a First-in-Human Study to Support Waiver of a Thorough QT Study.

    PubMed

    Nelson, C H; Wang, L; Fang, L; Weng, W; Cheng, F; Hepner, M; Lin, J; Garnett, C; Ramanathan, S

    2015-12-01

    The effects of GS-4997 (apoptosis signal-regulating kinase 1 inhibitor) on cardiac repolarization were evaluated using a systematic modeling approach in a first-in-human (FIH) study. High quality, intensive, time-matched 12-lead electrocardiograms (ECGs) were obtained in this placebo-controlled, single and multiple-ascending dose study in healthy subjects. Model development entailed linearity and hysteresis assessments; GS-4997/metabolite concentration vs. baseline-adjusted QTcF (ΔQTcF) relationships were determined using linear mixed effects models. Bootstrapping was used to obtain 90% confidence intervals (CIs) of predicted placebo-corrected ΔQTcF (ΔΔQTcF). The upper bound of 90% CI for predicted ΔΔQTcF was <10 msec at therapeutic and supratherapeutic GS-4997/metabolite levels, indicating the absence of a QT prolongation effect. Model performance/suitability was assessed using sensitivity/specificity analyses and diagnostic evaluations. This comprehensive methodology, supported by clinical pharmacology characteristics, was deemed adequate to assess the proarrhythmic risk of GS-4997/metabolite by the US Food and Drug Administration and European Medicines Agency resulting in a successful waiver from a dedicated thorough QT (TQT) study. PMID:26259519

  1. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time they were incorporated into the standard documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, estimating the safety level provided by the existing OLS standards. Moreover, the rationale used by the ICAO to establish the existing OLS standards is not readily available in the standard documents. Therefore this study attempts to collect actual flight path data, using information provided by air traffic control radars, and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. The developed methodology can be extended to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful for estimating safe and efficient standard dimensions of the OLS and assessing the risk level that objects pose to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  2. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
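The standard industry-concentration tool alluded to, applied at country rather than firm level, is typically the Herfindahl-Hirschman Index; a minimal sketch with invented market shares:

```python
# Herfindahl-Hirschman Index (HHI) over producing countries' market
# shares: sum of squared percentage shares, 10,000 for a single supplier.
# The share figures below are hypothetical illustrations.

def hhi(shares):
    """HHI on percentage shares (should sum to 100)."""
    return sum(s ** 2 for s in shares)

# An REE-like market dominated by one country vs. a diversified market.
concentrated = hhi([95, 3, 2])
diversified = hhi([25, 25, 25, 25])
```

Combining such a concentration score with a per-country political-risk rating (e.g. an OECD country-risk measure) yields the comparative supply-risk view the abstract describes.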

  3. Significance of quantitative enzyme-linked immunosorbent assay (ELISA) results in evaluation of three ELISAs and Western blot tests for detection of antibodies to human immunodeficiency virus in a high-risk population.

    PubMed Central

    Nishanian, P; Taylor, J M; Korns, E; Detels, R; Saah, A; Fahey, J L

    1987-01-01

    The characteristics of primary (first) tests with three enzyme-linked immunosorbent assay (ELISA) kits for human immunodeficiency virus (HIV) antibody were determined. The three ELISAs were performed on 3,229, 3,130, and 685 specimens from high-risk individuals using the Litton (LT; Litton Bionetics Laboratory Products, Charleston, S.C.), Dupont (DP; E. I. du Pont de Nemours & Co., Inc., Wilmington, Del.), and Genetic Systems (GS; Genetic Systems, Seattle, Wash.) kits, respectively. Evaluation was based on the distribution of quantitative test results (such as optical densities), a comparison with Western blot (WB) results, reproducibility of the tests, and identification of seroconverters. The performances of the GS and the DP kits were good by all four criteria and exceeded that of the LT kit. Primary ELISA-negative results were not always confirmed with repeat ELISA and by WB testing. The largest percentage of these unconfirmed negative test results came from samples with quantitative results in the fifth percentile nearest the cutoff. Thus, supplementary testing was indicated for samples with test results in this borderline negative range. Similarly, borderline positive primary ELISA results that were quantitatively nearest (fifth percentile) the cutoff value were more likely to be antibody negative on supplementary testing than samples with high antibody values. In this study, results of repeated tests by GS ELISA showed the least change from first test results. DP ELISA showed more unconfirmed primary positive test results, and LT ELISA showed more unconfirmed primary negative test results. Designation of a specimen with a single ELISA quantitative level near the cutoff value as positive or negative should be viewed with skepticism. A higher than normal proportion of specimens with high negative optical densities by GS ELISA (fifth percentile nearest the cutoff) and also negative by WB were found to be from individuals in the process of seroconversion. 
PMID

  4. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  5. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  6. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    PubMed

    Liu, Yao; Li, Jianying; Liu, Xiaoyong; Liu, Xudong; Khawar, Waqaar; Zhang, Xinyan; Wang, Fan; Chen, Xiaoxin; Sun, Zheng

    2015-01-01

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). Test outcome is reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), and "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage the "atypical" patients in order to diagnose OSCC early and prevent OSCC. In this study, we collected exfoliative cytology data, histopathology data, and clinical data of normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, Support Vector Machine (SVM) was found to be optimal with a high sensitivity (median>0.98) and specificity (median>0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve cost-effectiveness of clinical follow-up of OLK patients, and help design clinical chemoprevention trials for high-risk populations. PMID:25978541
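
    The OCRI described above maps a patient's cytology profile to a score between 0 and 1. As a rough illustration of the idea only (not the authors' SVM pipeline; the feature names and values below are hypothetical), such a score can be formed from a sample's relative distance to benign and malignant class centroids:

```python
import math

def centroid(rows):
    """Mean feature vector of a list of feature tuples."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def risk_index(x, benign_c, malignant_c):
    """Map a feature vector to a 0..1 score by relative distance to the
    benign vs. malignant class centroids (a stand-in for the paper's SVM)."""
    d_b = math.dist(x, benign_c)
    d_m = math.dist(x, malignant_c)
    return d_b / (d_b + d_m)  # 0 = at benign centroid, 1 = at malignant centroid

# Hypothetical cytology features (e.g. nuclear area, nuclear/cytoplasm ratio);
# these numbers are invented, not the study's data.
benign = [(10.0, 0.20), (11.0, 0.22), (9.5, 0.18)]
malignant = [(18.0, 0.55), (20.0, 0.60), (19.0, 0.58)]
b_c, m_c = centroid(benign), centroid(malignant)

ocri_low = risk_index((10.2, 0.21), b_c, m_c)   # sample near the benign class
ocri_high = risk_index((19.5, 0.57), b_c, m_c)  # sample near the malignant class
```

    A real implementation would instead train an SVM with resampling-based model selection on the EdTAR-reconstructed data, as the study describes.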

  7. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  8. Quantitative nondestructive evaluation: Requirements for tomorrow's reliability

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.

    1991-01-01

    Quantitative Nondestructive Evaluation (QNDE) is the technology of measurement, analysis, and prediction of the state of material/structural systems for safety, reliability, and mission assurance. QNDE has impact on everyday life from the cars we drive, the planes we fly, the buildings we work or live in, literally to the infrastructure of our world. Here, researchers highlight some of the new sciences and technologies that are part of a safer, cost effective tomorrow. Specific technologies that are discussed are thermal QNDE of aircraft structural integrity, ultrasonic QNDE for materials characterization, and technology spinoffs from aerospace to the medical sector. In each case, examples are given of how new requirements result in enabling measurement technologies, which in turn change the boundaries of design/practice.

  9. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  10. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  11. DIETARY RISK EVALUATION SYSTEM: DRES

    EPA Science Inventory

    The Dietary Risk Evaluation System (DRES) estimates exposure to pesticides in the diet by combining information concerning residues on raw agricultural commodities with information on consumption of those commodities. It then compares the estimated exposure level to a toxicologi...

  12. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  14. Molecular sensitivity threshold of wet mount and an immunochromatographic assay evaluated by quantitative real-time PCR for diagnosis of Trichomonas vaginalis infection in a low-risk population of childbearing women.

    PubMed

    Leli, Christian; Castronari, Roberto; Levorato, Lucia; Luciano, Eugenio; Pistoni, Eleonora; Perito, Stefano; Bozza, Silvia; Mencacci, Antonella

    2016-06-01

    Vaginal trichomoniasis is a sexually transmitted infection caused by Trichomonas vaginalis, a flagellated protozoan. Diagnosis of T. vaginalis infection is mainly performed by wet mount microscopy, with a sensitivity ranging from 38% to 82%, compared to culture, still considered the gold standard. Commercial immunochromatographic tests for monoclonal-antibody-based detection have been introduced as alternative methods for diagnosis of T. vaginalis infection and have been reported in some studies to be more sensitive than wet mount. Real-time PCR methods have been recently developed, with optimal sensitivity and specificity. The aim of this study was to evaluate whether there is a molecular sensitivity threshold for both wet mount and immunochromatographic assays. To this aim, a total of 1487 low-risk childbearing women (median age 32 years, interquartile range 27-37) were included in the study, and underwent vaginal swab for T. vaginalis detection by means of a quantitative real-time PCR assay, wet mount and an immunochromatographic test. Upon comparing the results, prevalence values observed were 1.3% for real-time PCR, 0.5% for microscopic examination, and 0.8% for the immunochromatographic test. Compared to real-time PCR, wet mount sensitivity was 40% (95% confidence interval 19.1% to 63.9%) and specificity was 100% (95% CI 99.7% to 100%). The sensitivity and specificity of the immunochromatographic assay were 57.9% (95% CI 33.5% to 79.8%) and 99.9% (95% CI 99.6% to 100%), respectively. Evaluation of the wet mount results and those of immunochromatographic assay detection in relation to the number of T. vaginalis DNA copies detected in vaginal samples showed that the lower identification threshold for both wet mount (chi-square 6.1; P = 0.016) and the immunochromatographic assay (chi-square 10.7; P = 0.002) was ≥100 copies of T. vaginalis DNA/5 mcl of eluted DNA. PMID:27367320
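
    The sensitivity and specificity figures above follow from a 2x2 comparison against the real-time PCR reference. A minimal sketch, using illustrative counts (8 of 20 PCR-positives detected, which reproduces the reported 40% wet mount sensitivity) and a Wilson score interval rather than whichever exact CI method the authors used:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))) / denom
    return center - half, center + half

# Illustrative 2x2 counts against the PCR reference (not the study's raw data):
tp, fn = 8, 12                   # wet mount finds 8 of 20 PCR-positives
sensitivity = tp / (tp + fn)     # 0.40, matching the reported 40%
lo, hi = wilson_ci(tp, tp + fn)  # Wilson interval; the paper's CI method may differ
```

    With small positive counts like these, different interval methods (Wilson, Clopper-Pearson) give noticeably different bounds, which is why the sketch does not try to reproduce the paper's exact 19.1%-63.9% interval.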

  15. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances for process equipment on the final risk results was also investigated. The main factors influencing the final risk values resulted from the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than an order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to containment loss. Failure of minor process equipment having a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes. PMID:18657068
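
    The seismic contribution described above is typically computed by combining an earthquake occurrence frequency with an equipment fragility curve. A minimal sketch using a lognormal fragility model; the medians, dispersions, and frequencies below are illustrative assumptions, not the paper's vulnerability models:

```python
import math

def lognormal_fragility(pga, median, beta):
    """P(damage | peak ground acceleration) from a lognormal fragility curve:
    Phi(ln(pga / median) / beta), with the normal CDF built from math.erf."""
    x = math.log(pga / median) / beta
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Hypothetical fragility parameters (in g): atmospheric tanks taken as more
# vulnerable to loss of containment than pressurized ones, as in the abstract.
pga = 0.3                                               # scenario shaking level
p_atm = lognormal_fragility(pga, median=0.4, beta=0.5)  # atmospheric tank
p_prs = lognormal_fragility(pga, median=0.8, beta=0.5)  # pressurized tank

# One seismic scenario's contribution to accident frequency (illustrative):
# annual exceedance frequency x P(damage) x P(ignition / escalation)
freq_scenario = 1e-3 * p_atm * 0.1
```

    Summing such contributions over shaking levels and equipment items, and adding them to the internal-failure scenarios, gives the combined risk the paper evaluates.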

  16. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level. PMID:26202064
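
    The quantitative step, estimating a probability of failure at a candidate design point, can be approximated with a plain Monte Carlo simulation. The toy dissolution model and all parameter values below are hypothetical stand-ins for a real formulation model, not the paper's ciprofloxacin data:

```python
import random

random.seed(1)

def dissolution(hardness, disintegrant_pct):
    """Toy process model (hypothetical): % drug dissolved at 30 min as a
    function of tablet hardness and disintegrant level, with batch noise."""
    return 100 - 8 * hardness + 15 * disintegrant_pct + random.gauss(0, 2)

def prob_failure(hardness, disintegrant_pct, spec=80.0, n=10_000):
    """Monte Carlo estimate of P(dissolution < spec) at one design point."""
    fails = sum(dissolution(hardness, disintegrant_pct) < spec for _ in range(n))
    return fails / n

# Probability of failure at two candidate operating points:
p_inside = prob_failure(hardness=1.0, disintegrant_pct=1.0)  # well inside the space
p_edge = prob_failure(hardness=3.0, disintegrant_pct=0.2)    # near the boundary
```

    Mapping `prob_failure` over a grid of parameter values traces out the boundary where the failure probability exceeds an acceptable level, which is one way to visualize a risk-based design space.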

  17. Production Risk Evaluation Program (PREP) - summary

    SciTech Connect

    Kjeldgaard, E.A.; Saloio, J.H.; Vannoni, M.G.

    1997-03-01

    Nuclear weapons have been produced in the US since the early 1950s by a network of contractor-operated Department of Energy (DOE) facilities collectively known as the Nuclear Weapon Complex (NWC). Recognizing that the failure of an essential process might stop weapon production for a substantial period of time, the DOE Albuquerque Operations office initiated the Production Risk Evaluation Program (PREP) at Sandia National Laboratories (SNL) to assess quantitatively the potential for serious disruptions in the NWC weapon production process. PREP was conducted from 1984-89. This document is an unclassified summary of the effort.

  18. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the quite unexplored problem of the wavefront's gradient estimation from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the very first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659
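
    A common Fourier-based algorithm for integrating a measured gradient field into a surface is Frankot-Chellappa least-squares integration; the abstract does not say which variant the authors use, so treat the sketch below (which assumes periodic boundaries) as illustrative of the class of method, not as their implementation:

```python
import numpy as np

def frankot_chellappa(p, q, dx=1.0, dy=1.0):
    """Least-squares (Frankot-Chellappa) integration of a gradient field
    p = dz/dx, q = dz/dy into a surface z, assuming periodic boundaries."""
    ny, nx = p.shape
    u = 2 * np.pi * np.fft.fftfreq(nx, d=dx)   # angular frequencies along x
    v = 2 * np.pi * np.fft.fftfreq(ny, d=dy)   # angular frequencies along y
    U, V = np.meshgrid(u, v)
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = U**2 + V**2
    denom[0, 0] = 1.0              # avoid 0/0 at DC; the offset is unrecoverable
    Z = (-1j * U * P - 1j * V * Q) / denom
    Z[0, 0] = 0.0                  # fix the arbitrary piston term to zero
    return np.real(np.fft.ifft2(Z))

# Round trip on a toy periodic surface with analytic derivatives:
n = 64
X, Y = np.meshgrid(np.arange(n), np.arange(n))
z_true = np.sin(2 * np.pi * X / n) + np.cos(4 * np.pi * Y / n)
p = (2 * np.pi / n) * np.cos(2 * np.pi * X / n)      # dz/dx
q = -(4 * np.pi / n) * np.sin(4 * np.pi * Y / n)     # dz/dy
z_rec = frankot_chellappa(p, q)
residual = np.max(np.abs((z_rec - z_rec.mean()) - (z_true - z_true.mean())))
```

    Because the mean (piston) term carries no gradient information, reconstructions are only defined up to a constant offset, which the round-trip check above removes before comparing.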

  19. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules-of-thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  20. A Method for Quantitatively Evaluating a University Library Collection

    ERIC Educational Resources Information Center

    Golden, Barbara

    1974-01-01

    The acquisitions department of the University of Nebraska at Omaha library conducted a quantitative evaluation of the library's book collection in relation to the course offerings of the university. (Author/LS)

  1. Quantitative Evaluation of Management Courses: Part 1

    ERIC Educational Resources Information Center

    Cunningham, Cyril

    1973-01-01

    The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)

  2. A Program to Evaluate Quantitative Analysis Unknowns

    ERIC Educational Resources Information Center

    Potter, Larry; Brown, Bruce

    1978-01-01

    Reports on a computer batch program that will not only perform routine grading using several grading algorithms, but will also calculate various statistical measures by which the class performance can be evaluated and cumulative data collected. (Author/CP)

  3. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available
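
    The low-dose risk estimates at issue are usually obtained by linear extrapolation from higher-dose epidemiology. A back-of-the-envelope sketch using a nominal lifetime risk coefficient of roughly 5% per sievert (a rounded population-average figure) and an assumed ~10 mSv scan dose; both numbers are illustrative, not clinical guidance:

```python
# Linear no-threshold (LNT) sketch: excess lifetime cancer risk ~ dose x coefficient.
# Both numbers below are rounded, illustrative values: a nominal ~5%/Sv
# population-average risk coefficient and a ~10 mSv abdominal CT dose.
RISK_PER_SV = 0.05        # nominal lifetime excess cancer risk per sievert
ct_dose_sv = 0.010        # effective dose of one scan, in sieverts (10 mSv)

excess_risk = ct_dose_sv * RISK_PER_SV  # ~5 excess cases per 10,000 scans
scans_per_case = 1 / excess_risk        # ~2,000 scans per excess cancer case
```

    The abstract's point is precisely that the validity of this multiplication at diagnostic doses is uncertain: without credible low-dose data, the coefficient itself is an extrapolation.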

  4. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety. PMID:20055976
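
    The argument above, that storage-time distributions shape the risk tail, can be illustrated with a toy simulation. All rates and distributions below are invented for illustration, not the paper's L. monocytogenes model, and a full discrete-event model would derive shelf times from ordering and queueing mechanisms rather than sampling them independently:

```python
import random

random.seed(7)

GROWTH_RATE = 0.02  # assumed log10 CFU/g growth per hour at storage temperature

def simulate_chain(n_packs=10_000):
    """Toy sketch of the logistic chain: each pack's retail and home storage
    times are random, and microbial counts grow with total residence time."""
    finals = []
    for _ in range(n_packs):
        initial_log = random.gauss(1.0, 0.5)      # log10 CFU/g at packing
        shelf_hours = random.expovariate(1 / 24)  # time on the retail shelf
        home_hours = random.uniform(0, 48)        # consumer storage time
        finals.append(initial_log + GROWTH_RATE * (shelf_hours + home_hours))
    return finals

logs = simulate_chain()
mean_log = sum(logs) / len(logs)
p99 = sorted(logs)[int(0.99 * len(logs))]  # the tail that drives the risk estimate
```

    Replacing the independent `shelf_hours` draw with a FIFO queue driven by an ordering policy is exactly where discrete-event modeling changes the tail, which is the paper's point.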

  5. Evaluation (not validation) of quantitative models.

    PubMed Central

    Oreskes, N

    1998-01-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  7. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  8. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  9. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures, and at the time of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64×10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. Mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean values for the probability of illness per person per day were higher in processed cheese (mean: 2.24×10⁻⁹; maximum: 7.97×10⁻⁶) than in natural cheese (mean: 7.84×10⁻¹⁰; maximum: 2.32×10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the developed stochastic risk assessment model in this study can be useful in establishing microbial criteria for S. aureus in cheese. PMID:26162789
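
    The exponential dose-response model quoted above can be embedded in a small Monte Carlo simulation in place of @RISK. The parameter r is taken from the abstract; the dose-generating assumptions (serving size, contamination level) are illustrative only, not the study's exposure model:

```python
import math
import random

random.seed(3)

R = 7.64e-8  # exponential dose-response parameter from the abstract

def p_illness(dose):
    """Exponential dose-response model: P = 1 - exp(-R * N)."""
    return 1 - math.exp(-R * dose)

def sample_dose():
    """One random dose (CFU ingested); serving size and contamination level
    below are illustrative assumptions, not the study's exposure assessment."""
    serving_g = random.uniform(20, 50)          # grams of cheese per serving
    log_cfu_per_g = random.gauss(0.0, 0.5)      # log10 CFU/g at consumption
    return serving_g * 10 ** log_cfu_per_g

risks = [p_illness(sample_dose()) for _ in range(100_000)]
mean_risk = sum(risks) / len(risks)             # probability of illness per serving
```

    Because R is tiny, P is nearly linear in dose at realistic exposures, so the mean risk is driven almost entirely by the mean ingested dose.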

  10. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

Tablets are the most common solid oral dosage form produced by the pharmaceutical industry. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defects would greatly aid quality control. Here, terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate the TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived parameter terahertz electric field peak strength (TEFPS) provided the spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and predicting the location of the defects. TPI also revealed the relationship between surface strength and blending conditions, and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. PMID:24300215

  11. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to produce a probability distribution of estimated risk using random values of the exposure factors and toxicity values in an exposure scenario. The simulation involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples are drawn from the assigned joint distribution and the corresponding outputs calculated. Methods of statistical inference are then used to estimate, from the output sample, parameters of the output distribution, such as percentiles and the expected value.
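The sample-and-infer loop described above can be sketched with the standard library. The distributions, parameter values, and slope factor below are stand-ins chosen for illustration, not the OU-1 inputs.

```python
import random

random.seed(42)  # reproducible sketch

def one_trial():
    """Draw one set of exposure factors and return the resulting risk estimate.
    All distributions and parameters here are hypothetical assumptions."""
    concentration = random.lognormvariate(0.0, 0.5)        # mg/L, assumed
    intake_rate   = random.uniform(1.0, 2.0)               # L/day, assumed
    body_weight   = max(random.normalvariate(70, 10), 40)  # kg, truncated
    slope_factor  = 1.0e-3                                 # (mg/kg-day)^-1, assumed
    dose = concentration * intake_rate / body_weight       # mg/kg-day
    return dose * slope_factor

# Output sample and statistical inference on its percentiles:
risks = sorted(one_trial() for _ in range(10_000))
median = risks[len(risks) // 2]
p95 = risks[int(0.95 * len(risks))]
print(f"median risk: {median:.2e}, 95th percentile: {p95:.2e}")
```

Comparing the RME point estimate against the simulated 95th percentile is exactly the kind of check the abstract describes.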

  12. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10⁻² per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10⁻² pppy when the wastewater contains ≤10⁶ Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10⁻² pppy when the wastewater contains ≤10⁵ E. coli per 100 ml; however, the wastewater quality should be ≤10⁴ E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10⁴ E. coli per 100 ml results in a rotavirus infection risk of approximately 10⁻² pppy; however, based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278

  13. Evaluating microbiological risks of biosolids land application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The transmission of pathogens by land application of untreated human and animal wastes has been known for more than 100 years. The QMRA (quantitative microbial risk assessment) process involves four basic steps: pathogen identification, exposure assessment, dose-response and risk characterization. ...

  14. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  15. Quantitative and Public Perception of Landslide Risk in Badulla, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Gunasekera, R.; Bandara, R. M. S.; Mallawatantri, A.; Saito, K.

    2009-04-01

Landslides are often triggered by intense precipitation and are exacerbated by increased urbanisation and human activity. There is a significant risk of large-scale landslides in Sri Lanka, and when they do occur they have the potential to devastate property, lives and livelihoods. There are several high landslide-risk areas in seven districts (Nuwara Eliya, Badulla, Ratnapura, Kegalle, Kandy, Matale and Kalutara) in Sri Lanka. These are also some of the poorest areas in the country, and consequently the recovery process after catastrophic landslides becomes more problematic. Landslide risk management is therefore an important concern in poverty reduction strategies. We focused on the district of Badulla, Sri Lanka to evaluate (a) a quantitative scientific analysis of landslide risk and (b) the qualitative public perception of landslides in the area. Combining high-resolution hazard and susceptibility data, we quantified the risk of landslides in the area. We also evaluated the public perception of landslides in the area using participatory GIS techniques, complemented by the use of LandScan data. The framework of the LandScan methodology is based on second-order administrative population data from the census: each 30 arc-second cell within an administrative unit receives a probability coefficient based on slope, proximity to roads and land cover. Providing this information from these complementary methods to regional planners helps strengthen disaster risk reduction options and improve sustainable land use practices through enhanced public participation in decision making and governance processes.

  16. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway and dose-response relationships by combining hydrodynamic modelling with Quantitative Microbial Risk Assessment (QMRA). Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used for simulating the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA and the public health risk was estimated as probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at Kalvøya-small and Kalvøya-big beaches, supporting the advice of avoiding swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355

  17. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  18. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

When faced with choosing between alternatives, people tend to use a number of criteria (often subjective rather than objective) to decide which alternative is best for them in their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. Such a method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called the 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.

  19. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  20. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of earthquake-triggered scenarios to the individual and societal risk indexes. The application of the methodology to several case studies showed that scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units. PMID:17276591

  1. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolic processes. These electrophiles can react directly with DNA and are considered ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxicity tests have been described to identify, monitor and characterize human exposure to such agents. When the responses of such tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic mutagenic potency of electrophiles does not differ much for the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether an agent is genotoxic, but also to take into account the mechanism of its interaction with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible: agents that form cross-links with DNA, and mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer deaths in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  2. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index and the effective thickness of the coal seam. Making full use of logging data and laboratory analyses of coal cores, the logging evaluation methods for the five parameters are discussed in detail, and a comprehensive evaluation model of the CBM reservoir is established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than in the central and northern regions. The actual development of CBM shows that the region with a good reservoir has high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir effectively.

  3. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

Currently no U.S. federal guideline is available for assessing the risk of illness from sand at recreational sites. The objectives of this study were to compute a reference-level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach a level of risk of illness equivalent to that set by the U.S. EPA for marine water exposure (1.9 × 10⁻²), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10⁶ CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures, as well as in identifying acceptable levels of risk for skin infections associated with sand exposure. PMID:22296573

  4. Approach for evaluating inundation risks in urban drainage systems.

    PubMed

    Zhu, Zhihua; Chen, Zhihe; Chen, Xiaohong; He, Peiying

    2016-05-15

    Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers. Hence, inundation evaluation is becoming increasingly important around the world. This comprehensive assessment involves numerous indices in urban catchments, but the high-dimensional and non-linear relationship between the indices and the risk presents an enormous challenge for accurate evaluation. Therefore, an approach is hereby proposed to qualitatively and quantitatively evaluate inundation risks in urban drainage systems based on a storm water management model, the projection pursuit method, the ordinary kriging method and the K-means clustering method. This approach is tested using a residential district in Guangzhou, China. Seven evaluation indices were selected and twenty rainfall-runoff events were used to calibrate and validate the parameters of the rainfall-runoff model. The inundation risks in the study area drainage system were evaluated under different rainfall scenarios. The following conclusions are reached. (1) The proposed approach, without subjective factors, can identify the main driving factors, i.e., inundation duration, largest water flow and total flood amount in this study area. (2) The inundation risk of each manhole can be qualitatively analyzed and quantitatively calculated. There are 1, 8, 11, 14, 21, and 21 manholes at risk under the return periods of 1-year, 5-years, 10-years, 20-years, 50-years and 100-years, respectively. (3) The areas of levels III, IV and V increase with increasing rainfall return period based on analyzing the inundation risks for a variety of characteristics. (4) The relationships between rainfall intensity and inundation-affected areas are revealed by a logarithmic model. This study proposes a novel and successful approach to assessing risk in urban drainage systems and provides guidance for improving urban drainage systems and inundation preparedness. PMID:26897578

  5. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and in the consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to the fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict the fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. The study predicts a mean fire fatality risk for typical CNG buses of approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers, and a mean of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that the fire fatality risk of CNG buses is about 2.5 times that of diesel buses overall, with bus passengers at greater risk by over two orders of magnitude. The study estimates a mean fire risk frequency of 2.2 × 10⁻⁵ fatalities/bus per year, with 5% and 95% uncertainty bounds of 9.1 × 10⁻⁶ and 4.0 × 10⁻⁵, respectively. The risk result was found to be affected most by the failure rates of pressure relief valves, CNG cylinders, and fuel piping. PMID:15876211
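The reported risk ratios can be checked directly from the per-mile figures quoted in the abstract:

```python
# Fatalities per 100-million vehicle-miles, as reported in the abstract:
cng_all, diesel_all = 0.23, 0.091     # all people involved
cng_pax, diesel_pax = 0.16, 0.0007    # bus passengers only

print(f"all-persons ratio: {cng_all / diesel_all:.1f}x")  # about 2.5x
print(f"passenger ratio:   {cng_pax / diesel_pax:.0f}x")  # >2 orders of magnitude
```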

  6. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

We propose a quantitative evaluation method for skin barrier function using an optical coherence microscopy (OCM) system based on the coherency of near-infrared light. There are many skin problems, such as itching and irritation, and it is recognized that these are caused by impairment of the skin barrier function, which prevents damage from various external stimuli and loss of water. The common strategy for evaluating skin barrier function is to observe the skin surface and ask patients about their skin condition; such judgements are subjective and influenced by differences in the examiners' experience. Microscopy has been used to observe the inner structure of the skin in detail, but in vitro measurements of this kind require tissue sampling. An objective, quantitative evaluation method is therefore needed, one that is non-invasive, non-destructive and able to track changes over time; in vivo measurements are thus crucial for evaluating skin barrier function. In this study, we evaluate changes in the structure of the stratum corneum, which is important for skin barrier function, by comparing water-penetrated skin with normal skin using the OCM system. The proposed method obtains in vivo 3D images of the inner structure of body tissue non-invasively and non-destructively. We formulate the changes in skin ultrastructure after water penetration. Finally, we evaluate the performance limits of the OCM system in this work in order to discuss how to improve it.

  7. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is part of the NASA team conducting the QRA study; MSFC's responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and limitations of the MSFC QRA approach and of QRA technology in general.

  8. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no expected sensitisation induction level (NESIL), (b) incorporation of sensitisation assessment factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of the consumer exposure level (CEL). Based on these elements an acceptable exposure level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to the risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit-area concentrations under relevant consumer use conditions, referred to as the measured exposure level (MEL). For that reason a direct comparison between the NESIL and the MEL is possible as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement over a hazard-based classification of hair dye ingredients. PMID:23069142
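The QRA arithmetic described above (AEL = NESIL divided by the product of the SAFs, then compared against the exposure level) is simple enough to sketch directly. The NESIL, SAF and exposure values below are hypothetical placeholders, not the values for p-phenylenediamine or resorcinol.

```python
def acceptable_exposure_level(nesil, *safs):
    """AEL = NESIL / (product of the sensitisation assessment factors),
    e.g. inter-individual variability, product matrix, use pattern."""
    ael = nesil
    for saf in safs:
        ael /= saf
    return ael

# Hypothetical inputs in µg/cm² (illustration only):
nesil = 250.0
ael = acceptable_exposure_level(nesil, 10, 3.16, 3.16)  # three assumed SAFs
mel = 1.5  # assumed measured exposure level under consumer use conditions

print(f"AEL = {ael:.2f} µg/cm²; exposure acceptable: {mel <= ael}")
```

The simplification the abstract proposes amounts to replacing the estimated CEL in this comparison with a directly measured MEL.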

  9. INTEGRATED QUANTITATIVE CANCER RISK ASSESSMENT OF INORGANIC ARSENIC

    EPA Science Inventory

    This paper attempts to make an integrated risk assessment of arsenic, using data on humans exposed to arsenic via inhalation and ingestion. he data useful for making an integrated analysis and data gaps are discussed. rsenic provides a rare opportunity to compare the cancer risk ...

  10. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that span many critical properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  11. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  12. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  13. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  14. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  15. Risk assessment technique for evaluating research laboratories

    SciTech Connect

    Bolander, T.W.; Meale, B.M.; Eide, S.A.

    1992-01-01

    A technique has been developed to evaluate research laboratories according to risk, where risk is defined as the product of frequency and consequence. This technique was used to evaluate several laboratories at the Idaho National Engineering Laboratory under the direction of the Department of Energy, Idaho Field Office to assist in the risk management of the Science and Technology Department laboratories. With this technique, laboratories can be compared according to risk, and management can use the results to make cost effective decisions associated with the operation of the facility.

  17. Factors Distinguishing between Achievers and At Risk Students: A Qualitative and Quantitative Synthesis

    ERIC Educational Resources Information Center

    Eiselen, R.; Geyser, H.

    2003-01-01

    The purpose of this article is to identify factors that distinguish between Achievers and At Risk Students in Accounting 1A, and to explore how qualitative and quantitative research methods complement each other. Differences between the two groups were explored from both a quantitative and a qualitative perspective, focusing on study habits,…

  18. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    NASA Astrophysics Data System (ADS)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of the effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we decide the effect of each arc using its contribution degree and sum all the effects. Through application to practical models, it is confirmed that there are no differences, at the 5% risk level, between results obtained from quantitative relations and results obtained by the proposed method.
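
    The propagation-and-combination scheme described above can be sketched as a small Monte Carlo loop. The two-factor model, the uniform landmark ranges, and the contribution degrees below are invented for illustration only:

```python
import random

random.seed(0)  # reproducible illustration

def simulate(trials=10_000):
    """Propagate uncertain factor effects to a destination node and
    combine them as a contribution-degree-weighted sum."""
    contribution = {"demand": 0.6, "cost": 0.4}  # assumed arc weights
    total = 0.0
    for _ in range(trials):
        # Each factor's effect is drawn from its range between landmarks,
        # modeled here as a uniform interval.
        effects = {"demand": random.uniform(0.0, 1.0),
                   "cost": random.uniform(-1.0, 0.0)}
        total += sum(contribution[k] * effects[k] for k in effects)
    return total / trials

mean_effect = simulate()  # close to 0.6*0.5 + 0.4*(-0.5) = 0.1
```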

  19. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on composite movement performance, helping the clinician decide training foci, and (c) the evaluation runs in real-time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. Creating such an evaluation is difficult because of the sparsity of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized movement and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results: the evaluation results are highly correlated with the therapist's observations.
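
    The composite score y = Σ_i w_i φ_i(x_i) can be sketched in a few lines. The attributes, the Gaussian form assumed for φ_i, and the weights below are illustrative only; in the paper, φ_i is estimated from distributions of idealized and deviated movement, and w_i comes from a modified RankSVM:

```python
import math

def phi(x, ideal_mean, ideal_sd):
    """Normalize a raw kinematic attribute to [0, 1]: 1 = ideal.
    A Gaussian-shaped normalization is assumed for illustration."""
    z = (x - ideal_mean) / ideal_sd
    return math.exp(-0.5 * z * z)

# Hypothetical attributes with (ideal mean, spread) and therapist weights.
weights = {"trajectory_error": 0.5, "speed": 0.3, "smoothness": 0.2}
ideal = {"trajectory_error": (0.0, 2.0), "speed": (0.5, 0.1),
         "smoothness": (1.0, 0.2)}
observed = {"trajectory_error": 1.0, "speed": 0.45, "smoothness": 0.9}

# Composite movement quality score y = sum_i w_i * phi_i(x_i).
score = sum(w * phi(observed[a], *ideal[a]) for a, w in weights.items())
```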

  20. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several-hour) quantitative prediction of precipitation based on meteorological radar data are intensely studied topics. The Korean Peninsula has a narrow land area and complex, mountainous topography, so its rainfall systems change frequently. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) are crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF of the Korea Meteorological Administration (KMA), a first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system agrees better with the observed rain rate than the fixed Z-R relation does, and additional bias correction of RAR yields slightly better results. A correlation coefficient of R2 = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. RAR estimates will be available for hydrological applications such as water budgeting. The VSRF (Very Short Range Forecast) performs better than MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) within the first 40 minutes, but MAPLE performs better thereafter; for hourly forecasts, MAPLE performs better than the VSRF. Radar-based QPE and QPF are thus most meaningful for nowcasting (1-2 hours), whereas model forecasts longer than 3 hours are especially meaningful for applications such as water management.
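
    The fixed Z-R baseline that the adjusted RAR system improves on can be sketched directly. The sketch assumes the standard Marshall-Palmer coefficients Z = 200·R^1.6; the dBZ inputs are made-up samples, not KMA radar data:

```python
def rain_rate(dbz, a=200.0, b=1.6):
    """Invert the Z-R relation Z = a * R**b for rain rate R (mm/h)."""
    z = 10.0 ** (dbz / 10.0)  # dBZ -> linear reflectivity factor
    return (z / a) ** (1.0 / b)

# Light, moderate, and heavy precipitation echoes (illustrative dBZ).
rates = [rain_rate(d) for d in (20.0, 35.0, 50.0)]
```

    Adjusting this relation in real time against rain gauge (AWS) observations, as RAR does, corrects the bias that a single fixed (a, b) pair cannot capture.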

  1. Cumulative Aggregate Risk Evaluation Software

    EPA Science Inventory

    CARES is a state-of-the-art software program designed to conduct complex exposure and risk assessments for pesticides, such as the assessments required under the 1996 Food Quality Protection Act (FQPA). CARES was originally developed under the auspices of CropLife America (CLA),...

  2. Risk effectiveness evaluation of surveillance testing

    SciTech Connect

    Martorell, S.; Kim, I.S.; Samanta, P.K.; Vesely, W.E.

    1992-07-20

    In nuclear power plants, surveillance tests are required to detect failures in standby safety system components as a means of assuring their availability in case of an accident. However, the performance of surveillance tests at power may have an adverse impact on safety, as evidenced by plant operating experience. The risk associated with a test includes two different aspects: (1) a positive aspect, i.e., the risk contribution detected by the test, which results from the detection of failures that occur between tests, and (2) a negative aspect, i.e., the risk contribution caused by the test, which includes failures and degradations that are caused by the test or are related to its performance. In terms of these two risk contributions, the risk effectiveness of a test can be defined simply: a test is risk effective if the risk contribution detected by the test is greater than the risk contribution caused by the test; otherwise it is risk ineffective. The methodology presentation will focus on two important kinds of negative test risk impacts, namely the risk impacts of test-caused transients and equipment wear-out. The evaluation results of the risk effectiveness of the test will be presented in the full paper along with the risk assessment methodology and the insights from the sensitivity analysis. These constitute the core of NUREG/CR-5775.
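
    The criterion above is a direct comparison of two risk contributions. A sketch with invented numbers (not values from NUREG/CR-5775):

```python
def risk_effective(detected, caused):
    """A test is risk effective if the risk contribution it detects
    (failures occurring between tests) exceeds the risk contribution
    it causes (test-caused transients and equipment wear-out)."""
    return detected > caused

# Hypothetical surveillance test of a standby pump (per-year risk).
detected_risk = 2.0e-6  # from between-test failures the test catches
caused_risk = 5.0e-7    # from test-caused transients plus wear-out
effective = risk_effective(detected_risk, caused_risk)
```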

  3. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our ability to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  4. The Nuclear Renaissance — Implications on Quantitative Nondestructive Evaluations

    NASA Astrophysics Data System (ADS)

    Matzie, Regis A.

    2007-03-01

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States, and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had actually turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Such features as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  5. Assessment Tools for the Evaluation of Risk

    EPA Science Inventory

    ASTER (Assessment Tools for the Evaluation of Risk) was developed by the U.S. EPA Mid-Continent Ecology Division, Duluth, MN to assist regulators in performing ecological risk assessments. ASTER is an integration of the ECOTOXicology Database (ECOTOX;...

  EVALUATION AND EFFECTIVE RISK COMMUNICATION WORKSHOP PROCEEDINGS

    EPA Science Inventory

    To explore a number of questions in the area of risk communications, the Interagency Task Force on Environmental Cancer and Heart and Lung Disease held a Workshop on Evaluation and Effective Risk Communication which brought together experts from academia, government agencies, an...

  6. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue. PMID:14678210
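
    The comparison described above divides a NOAEL by the product of SUFs to obtain the sensitization reference dose, against which product exposure is compared. The NOAEL and factor values below are hypothetical placeholders; only the 20-fold mucosal factor echoes the range proposed in the abstract:

```python
def reference_dose(noael, inter_individual=10.0, matrix=1.0, use=1.0,
                   mucosal=1.0):
    """Sensitization reference dose: NOAEL divided by the product of
    sensitization uncertainty factors (SUFs). All default factor
    values are illustrative assumptions."""
    return noael / (inter_individual * matrix * use * mucosal)

noael = 500.0  # hypothetical induction NOAEL, ug/cm2

skin_rfd = reference_dose(noael)                   # keratinized skin
mucosal_rfd = reference_dose(noael, mucosal=20.0)  # non-keratinized mucosa
```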

  7. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
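
    A QMRA of this kind chains an inhaled-dose estimate to a dose-response model. The sketch below assumes an exponential model P = 1 - exp(-r·dose); the concentrations, exposure times, breathing rate, and r value are invented placeholders, not the study's data:

```python
import math

def infection_risk(conc_per_m3, hours, breathing_m3_per_h=1.2, r=0.4):
    """P(infection) = 1 - exp(-r * dose), with dose = concentration
    * breathing rate * exposure time. Parameters are illustrative."""
    dose = conc_per_m3 * breathing_m3_per_h * hours
    return 1.0 - math.exp(-r * dose)

# Short, highly contaminated exposure vs. a long shift at low levels.
toilet_risk = infection_risk(conc_per_m3=50.0, hours=0.25)
wwtp_risk = infection_risk(conc_per_m3=0.5, hours=8.0)
```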

  9. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    The amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, the fibroglandular tissue, and the enhanced fibroglandular tissue. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number to segment the fibroglandular tissue within the segmented whole-breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissue in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissue, and enhanced fibroglandular tissue, FGT and BPE were computed automatically. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentations of the whole breast, fibroglandular tissue, and enhanced fibroglandular tissue, achieving accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
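
    Once the three segmentations exist, the two indices reduce to voxel ratios. This sketch assumes the common definitions (FGT as a percentage of breast volume; BPE as the percentage of FGT that enhances); the voxel counts are invented and the paper's exact definitions may differ:

```python
def fgt_percent(fgt_voxels, breast_voxels):
    """Fibroglandular tissue as a percentage of whole-breast volume."""
    return 100.0 * fgt_voxels / breast_voxels

def bpe_percent(enhanced_voxels, fgt_voxels):
    """Background parenchymal enhancement: percentage of FGT voxels
    that enhance in the pre/post-contrast subtraction images."""
    return 100.0 * enhanced_voxels / fgt_voxels

fgt = fgt_percent(fgt_voxels=150_000, breast_voxels=600_000)
bpe = bpe_percent(enhanced_voxels=30_000, fgt_voxels=150_000)
```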

  10. Risk evaluation of medical and industrial radiation devices

    SciTech Connect

    Jones, E.D.; Cunningham, R.E.; Rathbun, P.A.

    1994-03-01

    In 1991, the NRC Division of Industrial and Medical Nuclear Safety began a program to evaluate the use of probabilistic risk assessment (PRA) in regulating medical devices. This program represents an initial step in an overall plan to evaluate the use of PRA in regulating the use of nuclear by-product materials. The NRC envisioned that the use of risk analysis techniques could assist staff in ensuring that the regulatory approach was standardized, understandable, and effective. Traditional methods of assessing risk in nuclear power plants may be inappropriate for assessing the use of by-product devices. The approaches used in assessing nuclear reactor risks are equipment-oriented; secondary attention is paid to the human component, for the most part after critical system failure events have been identified. This paper describes the risk methodology developed by Lawrence Livermore National Laboratory (LLNL), initially intended to assess risks associated with the use of the Gamma Knife, a gamma stereotactic radiosurgical device. For relatively new medical devices such as the Gamma Knife, the challenge is to perform a risk analysis with very little quantitative data but with an important human-factor component. The method described provides a basic approach for identifying the most likely risk contributors and evaluating their relative importance. The risk analysis approach developed for the Gamma Knife and described in this paper should be applicable to a broader class of devices in which human interaction with the device is a prominent factor. In this sense, the method could be a prototypical model for nuclear medical or industrial device risk analysis.

  11. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  12. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. Geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of heavy metals. Potential ecological risk index (RI) demonstrated that overall risks caused by heavy metals were obviously lowered from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal had no or low risk to the environments after liquefaction. In short, the pollution hazards of heavy metals in LR were markedly mitigated. PMID:21940164
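
    The two indices named above have standard forms: Müller's geo-accumulation index Igeo = log2(Cn / (1.5·Bn)) and Håkanson's potential ecological risk index RI = Σ Tr_i · Cn_i / Bn_i. The concentrations and background values below are invented (mg/kg), not the study's measurements; the toxic-response factors are Håkanson's published values:

```python
import math

def igeo(measured, background):
    """Geo-accumulation index; the 1.5 factor absorbs background variation."""
    return math.log2(measured / (1.5 * background))

def ri(measured, background, toxic_response):
    """Potential ecological risk index: sum of Tr * contamination factor."""
    return sum(toxic_response[m] * measured[m] / background[m]
               for m in measured)

background = {"Pb": 20.0, "Cd": 0.1, "Zn": 70.0}
toxicity = {"Pb": 5, "Cd": 30, "Zn": 1}          # Hakanson Tr factors
sludge = {"Pb": 120.0, "Cd": 3.0, "Zn": 900.0}   # hypothetical SS
residue = {"Pb": 25.0, "Cd": 0.2, "Zn": 100.0}   # hypothetical LR

ri_sludge = ri(sludge, background, toxicity)
ri_residue = ri(residue, background, toxicity)
```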

  13. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  14. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  15. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  16. Predicting pathogen risks to aid beach management: the real value of quantitative microbial risk assessment (QMRA).

    PubMed

    Ashbolt, Nicholas J; Schoen, Mary E; Soller, Jeffrey A; Roser, David J

    2010-09-01

    There has been an ongoing dilemma for agencies that set criteria for safe recreational waters in how to provide for a seasonal assessment of a beach site versus guidance for day-to-day management. Typically an overall 'safe' criterion level is derived from epidemiologic studies of sewage-impacted beaches. The decision criterion is based on a percentile value for a single sample or a moving median of a limited number (e.g. five per month) of routine samples, which are reported at least the day after recreator exposure has occurred. The focus of this paper is how to better undertake day-to-day recreational site monitoring and management. Internationally, good examples exist where predictive empirical regression models (based on rainfall, wind speed/direction, etc.) may provide an estimate of the target faecal indicator density for the day of exposure. However, at recreational swimming sites largely impacted by non-sewage sources of faecal indicators, there is concern that the indicator-illness associations derived from studies at sewage-impacted beaches may be inappropriate. Furthermore, some recent epidemiologic evidence supports the relationship to gastrointestinal (GI) illness with qPCR-derived measures of Bacteroidales/Bacteroides spp. as well as more traditional faecal indicators, but we understand less about the environmental fate of these molecular targets and their relationship to bather risk. Modelling pathogens and indicators within a quantitative microbial risk assessment framework is suggested as a way to explore the large diversity of scenarios for faecal contamination and hydrologic events, such as from waterfowl, agricultural animals, resuspended sediments and from the bathers themselves. Examples are provided that suggest that more site-specific targets derived by QMRA could provide insight, directly translatable to management actions. PMID:20638095

  17. The Neglected Side of the Coin: Quantitative Benefit-risk Analyses in Medical Imaging.

    PubMed

    Zanzonico, Pat B

    2016-03-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on actual mortality or morbidity data if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the "linear no-threshold" (LNT) dose-response model. PMID:26808890
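
    The juxtaposition the paper calls for can be sketched with the LNT model itself. The 5%/Sv nominal lifetime-mortality coefficient is the ICRP order-of-magnitude value; the CT dose and the benefit figure below are hypothetical placeholders, not data from the paper:

```python
LNT_RISK_PER_SV = 0.05  # nominal lifetime cancer mortality per sievert

def lnt_risk(effective_dose_msv):
    """Theoretical radiogenic mortality risk under the LNT model."""
    return (effective_dose_msv / 1000.0) * LNT_RISK_PER_SV

ct_dose_msv = 8.0                      # order of magnitude for abdominal CT
theoretical_risk = lnt_risk(ct_dose_msv)
assumed_benefit = 1.0e-2               # hypothetical mortality reduction
benefit_risk_ratio = assumed_benefit / theoretical_risk
```

    Expressing both sides numerically, rather than only the risk, is precisely the rebalancing the abstract argues for.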

  18. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  19. Noninvasive Risk Stratification of Lung Adenocarcinoma using Quantitative Computed Tomography

    PubMed Central

    Raghunath, Sushravya; Maldonado, Fabien; Rajagopalan, Srinivasan; Karwoski, Ronald A.; DePew, Zackary S.; Bartholmai, Brian J.; Peikert, Tobias; Robb, Richard A.

    2014-01-01

    Introduction: Lung cancer remains the leading cause of cancer-related deaths in the US and worldwide. Adenocarcinoma is the most common type of lung cancer and encompasses lesions with widely variable clinical outcomes. In the absence of noninvasive risk stratification, individualized patient management remains challenging. Consequently, a subgroup of pulmonary nodules of the lung adenocarcinoma spectrum is likely treated more aggressively than necessary. Methods: Consecutive patients with surgically resected pulmonary nodules of the lung adenocarcinoma spectrum (lesion size ≤ 3 cm, 2006–2009) and available pre-surgical high-resolution computed tomography (HRCT) imaging were identified at Mayo Clinic Rochester. All cases were classified using an unbiased Computer-Aided Nodule Assessment and Risk Yield (CANARY) approach based on the quantification of pre-surgical HRCT characteristics. CANARY-based classification was independently correlated with postsurgical progression-free survival. Results: CANARY analysis of 264 consecutive patients identified three distinct subgroups. Independent comparisons of 5-year disease-free survival (DFS) between these subgroups demonstrated statistically significant differences: 100%, 72.7% and 51.4%, respectively (p = 0.0005). Conclusions: Noninvasive CANARY-based risk stratification identifies subgroups of patients with pulmonary nodules of the adenocarcinoma spectrum characterized by distinct clinical outcomes. This technique may ultimately improve the current expert-opinion-based approach to the management of these lesions by facilitating individualized patient management. PMID:25170645

  1. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252
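
    The EIQ Field Use Rating analyzed above is the EIQ multiplied by the fraction of active ingredient and the application rate. The herbicide values below are invented to illustrate the paper's rate-dominance finding; real EIQ values come from the published Cornell tables:

```python
def field_use_rating(eiq, ai_fraction, rate_lb_per_acre):
    """Field Use EIQ = EIQ * fraction active ingredient * use rate."""
    return eiq * ai_fraction * rate_lb_per_acre

herbicides = {  # hypothetical products
    "A": {"eiq": 20.0, "ai": 0.5, "rate": 1.0},
    "B": {"eiq": 15.0, "ai": 0.4, "rate": 2.0},
    "C": {"eiq": 30.0, "ai": 0.9, "rate": 0.05},  # low-rate product
}
ratings = {name: field_use_rating(h["eiq"], h["ai"], h["rate"])
           for name, h in herbicides.items()}
```

    Note that product C has the highest EIQ yet the lowest Field Use Rating purely because of its low use rate, which is the correlation-with-rate problem the abstract quantifies.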

  2. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
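
    One simple stability criterion of the kind used above can be sketched from variation in raw Ct values (the approach behind BestKeeper). The Ct values below are invented, not the study's measurements:

```python
import statistics

ct_values = {  # hypothetical qRT-PCR Ct values across samples
    "RPL-13A": [20.1, 20.3, 20.2, 20.0, 20.2],
    "ACTB":    [18.5, 21.2, 19.8, 22.0, 17.9],
}

def ct_sd(cts):
    """Lower standard deviation of Ct across conditions = more stable
    expression, hence a better normalization reference."""
    return statistics.stdev(cts)

most_stable = min(ct_values, key=lambda gene: ct_sd(ct_values[gene]))
```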

  3. Comparison of methods for quantitative evaluation of endoscopic distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Castro, Kurt; Desai, Viraj N.; Cheng, Wei-Chung; Pfefer, Joshua

    2015-03-01

    Endoscopy is a well-established paradigm in medical imaging, and emerging endoscopic technologies such as high-resolution, capsule and disposable endoscopes promise significant improvements in effectiveness, as well as patient safety and acceptance of endoscopy. However, the field lacks practical standardized test methods to evaluate key optical performance characteristics (OPCs), in particular the geometric distortion caused by fisheye lens effects in clinical endoscopic systems. As a result, it has been difficult to evaluate an endoscope's image quality or assess its changes over time. The goal of this work was to identify optimal techniques for objective, quantitative characterization of distortion that are effective and not burdensome. Specifically, distortion measurements from a commercially available distortion evaluation/correction software package were compared with a custom algorithm based on a local magnification (ML) approach. Measurements were performed using a clinical gastroscope to image square grid targets. Recorded images were analyzed with the ML approach and with the commercial software, and the results were used to obtain corrected images; the corrected images from the two approaches were then compared. The study showed that the ML method could assess distortion patterns more accurately than the commercial software. Overall, the development of standardized test methods for characterizing distortion and other OPCs will facilitate development, clinical translation, manufacturing quality and assurance of performance during clinical use of endoscopic technologies.
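
    A local-magnification (ML) measure can be sketched on a synthetic fisheye mapping: compare the local spacing of imaged grid lines at some radius with the spacing at the image center. The barrel-distortion model and its coefficient below are invented stand-ins for real gastroscope grid images:

```python
def barrel(r, k=-0.3):
    """Synthetic fisheye mapping r' = r * (1 + k * r**2), r normalized."""
    return r * (1.0 + k * r * r)

def local_magnification(r, dr=1e-4):
    """ML at radius r: numerical derivative of the mapping, i.e. local
    image spacing of grid lines relative to object spacing."""
    return (barrel(r + dr) - barrel(r - dr)) / (2.0 * dr)

center_ml = local_magnification(0.0)
edge_ml = local_magnification(0.8)
# Percent distortion: departure of local from central magnification.
distortion_pct = 100.0 * (edge_ml / center_ml - 1.0)
```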

  4. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  5. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provided useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  6. D & D screening risk evaluation guidance

    SciTech Connect

    Robers, S.K.; Golden, K.M.; Wollert, D.A.

    1995-09-01

    The Screening Risk Evaluation (SRE) guidance document is a set of guidelines provided for the uniform implementation of SREs performed on decontamination and decommissioning (D&D) facilities. Although this method has been developed for D&D facilities, it can be used for transition (EM-60) facilities as well. The SRE guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated from the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the current risk to human health and the environment, exterior to the building, from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the current risk to workers, occupants and visitors inside contaminated D&D facilities due to contaminant exposure. The Future Release Index (FRI) calculates the hypothetical risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risks to human health due to factors other than that of contaminants. Criticality is approached as a modifying factor to the entire SRE, because criticality issues are strictly regulated by DOE. Screening risk results will be tabulated in matrix form, and Total Risk will be calculated (via a weighted equation) to produce a score on which to base early action recommendations. Other recommendations from the screening risk scores will be made based either on individual index scores or on reweighted Total Risk calculations. All recommendations based on the SRE will be made based on a combination of screening risk scores, decision drivers, and other considerations, as determined on a project-by-project basis.

  7. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079

  8. Evaluation of a Virucidal Quantitative Carrier Test for Surface Disinfectants

    PubMed Central

    Rabenau, Holger F.; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079

  9. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  10. Quantitative relations between risk, return and firm size

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Stanley, H. E.

    2009-03-01

    We analyze, for a large set of stocks comprising four financial indices, the annual logarithmic growth rate R and the firm size, quantified by the market capitalization MC. For the Nasdaq Composite and the New York Stock Exchange Composite we find that the probability density functions of growth rates are Laplace ones in the broad central region, where the standard deviation σ(R), as a measure of risk, decreases with MC as a power law σ(R) ~ (MC)^(-β). For both the Nasdaq Composite and the S&P 500, we find that the average growth rate ⟨R⟩ decreases faster than σ(R) with MC, implying that the return-to-risk ratio ⟨R⟩/σ(R) also decreases with MC. For the S&P 500, ⟨R⟩ and ⟨R⟩/σ(R) also follow power laws. For a 20-year time horizon, for the Nasdaq Composite we find that σ(R) vs. MC exhibits a functional form called a volatility smile, while for the NYSE Composite, we find power law stability between σ(R) and MC.
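The scaling exponent β in a relation of the form σ(R) ~ (MC)^(-β) is typically estimated as minus the slope of a log-log regression. A minimal sketch on synthetic data (the generating β and noise level are assumptions, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic firms obeying sigma(R) ~ MC^(-beta) with beta = 0.3 plus noise.
beta_true = 0.3
mc = 10 ** rng.uniform(6, 11, size=500)                    # market caps
sigma = mc ** (-beta_true) * np.exp(rng.normal(0, 0.05, size=500))

# Estimate beta as minus the slope of log(sigma) vs log(MC).
slope, intercept = np.polyfit(np.log(mc), np.log(sigma), 1)
beta_hat = -slope
print(round(beta_hat, 3))
```

On real index data one would restrict the fit to the central region where the Laplace/power-law behavior holds.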

  11. Quantitative evaluations of ankle spasticity and stiffness in neurological disorders using manual spasticity evaluator

    PubMed Central

    Peng, Qiyu; Park, Hyung-Soon; Shah, Parag; Wilson, Nicole; Ren, Yupeng; Wu, Yi-Ning; Liu, Jie; Gaebler-Spira, Deborah J.; Zhang, Li-Qun

    2013-01-01

    Spasticity and contracture are major sources of disability in people with neurological impairments that have been evaluated using various instruments: the Modified Ashworth Scale, tendon reflex scale, pendulum test, mechanical perturbations, and passive joint range of motion (ROM). These measures generally are either convenient to use in clinics but not quantitative or they are quantitative but difficult to use conveniently in clinics. We have developed a manual spasticity evaluator (MSE) to evaluate spasticity/contracture quantitatively and conveniently, with ankle ROM and stiffness measured at a controlled low velocity and joint resistance and Tardieu catch angle measured at several higher velocities. We found that the Tardieu catch angle was linearly related to the velocity, indicating that increased resistance at higher velocities was felt at further stiffer positions and, thus, that the velocity dependence of spasticity may also be position-dependent. This finding indicates the need to control velocity in spasticity evaluation, which is achieved with the MSE. Quantitative measurements of spasticity, stiffness, and ROM can lead to more accurate characterizations of pathological conditions and outcome evaluations of interventions, potentially contributing to better healthcare services for patients with neurological disorders such as cerebral palsy, spinal cord injury, traumatic brain injury, and stroke. PMID:21674395

  12. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued initiating from cut slopes along the railway and road alignment. The landslides were grouped into three magnitude classes based on the landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from the traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to the local business, and loss of revenue to the railway department. 
The results indicate that the total loss, including both direct and indirect loss, from 1 to 50 years return period, varies from US$ 90 840 to US$
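The Gumbel step above can be sketched as a quantile lookup: for a chosen return period T, the expected landslide count corresponds to non-exceedance probability 1 - 1/T. The location/scale parameters below are invented stand-ins for a fit to annual counts, not the study's values.

```python
import math

mu, beta = 12.0, 4.0  # assumed Gumbel location/scale from fitting annual counts

def gumbel_quantile(T):
    """Count with return period T years: solve F(x) = exp(-exp(-(x-mu)/beta)) = 1 - 1/T."""
    p = 1.0 - 1.0 / T
    return mu - beta * math.log(-math.log(p))

hazard = {T: gumbel_quantile(T) for T in (3, 5, 15, 25, 50)}
print(hazard)  # counts grow with return period
```

Combining such counts with a per-magnitude-class probability (from the frequency-volume distribution) yields the 18 hazard scenarios the abstract describes.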

  13. Quantitative risk assessment of FMD virus transmission via water.

    PubMed

    Schijven, Jack; Rijs, Gerard B J; de Roda Husman, Ana Maria

    2005-02-01

    Foot-and-mouth disease (FMD) is a viral disease of domesticated and wild cloven-hoofed animals. FMD virus is known to spread by direct contact between infected and susceptible animals, by animal products such as meat and milk, by the airborne route, by mechanical transfer on people, wild animals and birds, and by vehicles. During the outbreak of 2001 in the Netherlands, milk from dairy cattle was illegally discharged into the sewerage as a consequence of transport prohibition. This may lead to contaminated discharges of biologically treated and raw sewage in surface water that is given to cattle to drink. The objective of the present study was to assess the probability of infecting dairy cows that were drinking FMD virus contaminated surface water due to illegal discharges of contaminated milk. The following data were collected from the literature: FMD virus inactivation in aqueous environments, FMD virus concentrations in milk, dilution in sewage water, virus removal by sewage treatment, dilution in surface water, water consumption of cows, size of a herd in a meadow, and dose-response data for ingested FMD virus by cattle. In the case of 1.6 × 10^2 FMD virus per milliliter in milk and discharge of treated sewage in surface water, the probability of infecting a herd of cows was estimated to be 3.3 × 10^-7 to 8.5 × 10^-5, dependent on dilution in the receiving surface water. In the case of discharge of raw sewage, all probabilities of infection were 100 times higher. In the case of little dilution in small rivers, the high level of 8.5 × 10^-3 is reached. For 10^4 times higher FMD virus concentrations in milk, the probabilities of infecting a herd of cows are high in the case of discharge of treated sewage (3.3 × 10^-3 to 5.7 × 10^-1) and very high in the case of discharge of raw sewage (0.28-1.0). 
It can be concluded that illegal and uncontrolled discharges of contaminated milk into the sewerage system may lead to high risks to other cattle farms at 6-50 km
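The exposure chain in such an assessment multiplies concentrations through dilution and treatment steps, then applies a dose-response model per animal and scales to the herd. Apart from the milk concentration quoted in the abstract, every parameter below is a hypothetical placeholder, and the exponential dose-response form is a common modeling choice, not necessarily the one the authors used.

```python
import math

c_milk = 1.6e2          # FMD virus per mL in milk (value from the abstract)
dilution_sewage = 1e3   # assumed dilution factor in sewage
log10_removal = 2.0     # assumed log10 removal by sewage treatment
dilution_river = 1e2    # assumed dilution in receiving surface water
water_intake_ml = 50e3  # assumed daily drinking water per cow (mL)
r = 1e-5                # assumed exponential dose-response parameter
herd_size = 50          # assumed herd size in a meadow

c_water = c_milk / dilution_sewage / 10 ** log10_removal / dilution_river
dose = c_water * water_intake_ml                  # virus ingested per cow per day
p_cow = 1.0 - math.exp(-r * dose)                 # per-cow infection probability
p_herd = 1.0 - (1.0 - p_cow) ** herd_size         # at least one cow infected
print(p_cow, p_herd)
```

Varying the river dilution factor over its plausible range reproduces the kind of probability interval the abstract reports.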

  14. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  15. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  16. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.

  17. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and introducing water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, and an increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites to manage water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating the risk by quantifying implications for production and hence economic viability. PMID:21797262

  18. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that only mechanical allodynia tests (DMA1, DMA2, and WUR) were significantly higher and pain threshold tests to heat stimulation (HPT) were significantly lower in the affected side, compared with the contralateral side, in AO patients; however, for MDT, MPT, PPT, CDT, and WDT, the results were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests. PMID:25627886

  19. Preoperative Evaluation: Estimation of Pulmonary Risk.

    PubMed

    Lakshminarasimhachar, Anand; Smetana, Gerald W

    2016-03-01

    Postoperative pulmonary complications (PPCs) are common after major non-thoracic surgery and associated with significant morbidity and high cost of care. A number of risk factors are strong predictors of PPCs. The overall goal of the preoperative pulmonary evaluation is to identify these potential, patient and procedure-related risks and optimize the health of the patients before surgery. A thorough clinical examination supported by appropriate laboratory tests will help guide the clinician to provide optimal perioperative care. PMID:26927740

  20. Abandoned metal mine stability risk evaluation.

    PubMed

    Bétournay, Marc C

    2009-10-01

    The abandoned mine legacy is critical in many countries around the world, where mine cave-ins and surface subsidence disruptions are perpetual risks that can affect the population, infrastructure, historical legacies, land use, and the environment. This article establishes abandoned metal mine failure risk evaluation approaches and quantification techniques based on the Canadian mining experience. These utilize clear geomechanics considerations such as failure mechanisms, which are dependent on well-defined rock mass parameters. Quantified risk is computed using probability of failure (probabilistics using limit-equilibrium factors of safety or applicable numerical modeling factor of safety quantifications) times a consequence impact value. Semi-quantified risk can be based on failure-case-study-based empirical data used in calculating probability of failure, and personal experience can provide qualified hazard and impact consequence assessments. The article provides outlines for land use and selection of remediation measures based on risk. PMID:19645755
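The quantified-risk formula described above (probability of failure times a consequence impact value) can be sketched as a simple ranking exercise. The opening names and numbers below are hypothetical; the probability and impact scales are assumptions for illustration.

```python
def quantified_risk(p_failure, consequence):
    """Risk = probability of failure (0-1) x consequence impact value."""
    return p_failure * consequence

# Hypothetical openings at an abandoned metal mine site:
# (probability of failure from a limit-equilibrium or numerical factor of
#  safety analysis, impact value on an assumed 0-100 consequence scale)
openings = {
    "crown pillar A": (0.15, 80),
    "stope B":        (0.40, 20),
    "shaft collar C": (0.05, 95),
}

ranked = sorted(openings, key=lambda k: quantified_risk(*openings[k]), reverse=True)
print(ranked)  # highest-risk opening first
```

Such a ranking is what drives the selection of remediation measures described in the article.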

  1. A risk methodology to evaluate sensitivity of plant risk to human errors

    SciTech Connect

    Samanta, P.; Wong, S.; Higgins, J.; Haber, S.; Luckas, W.

    1988-01-01

    This paper presents an evaluation of the sensitivity of plant risk parameters, namely the core melt frequency and the accident sequence frequencies, to the human errors involved in various aspects of nuclear power plant operations. Results are provided using the Oconee-3 Probabilistic Risk Assessment model as an example application of the risk methodology described herein. Sensitivity analyses in probabilistic risk assessment (PRA) involve three areas: (1) a determination of the set of input parameters; in this case, various categories of human errors signifying aspects of plant operation, (2) the range over which the input parameters vary, and (3) an assessment of the sensitivity of the plant risk parameters to the input parameters which, in this case, consist of all postulated human errors, or categories of human errors. The methodology presents a categorization scheme where human errors are categorized in terms of types of activity, location, personnel involved, etc., to relate the significance of sensitivity of risk parameters to specific aspects of human performance in the nuclear plant. Ranges of variability for human errors have been developed considering the various known causes of uncertainty in human error probability estimates in PRAs. The sensitivity of the risk parameters is assessed using the event/fault tree methodology of the PRA. The results of the risk-based sensitivity evaluation using the Oconee-3 PRA as an example show the quantitative impact on the plant risk level due to variations in human error probabilities. The relative effects of various human error categories and human error types within the categories are also presented to identify and characterize significant human errors for effective risk management in nuclear power plant operational activities. 8 refs., 10 figs., 4 tabs.
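The sensitivity idea can be illustrated with a toy fault-tree fragment: propagate a human error probability (HEP) over its uncertainty range and watch the core melt frequency respond. All numbers and the sequence structure below are invented for illustration, not taken from the Oconee-3 PRA.

```python
init_freq = 1e-2      # initiating events per reactor-year (assumed)
p_hardware = 1e-3     # hardware failure probability (assumed)

def core_melt_frequency(hep):
    # Toy sequence: initiator AND (hardware failure OR operator error),
    # with the OR combined via inclusion-exclusion.
    return init_freq * (p_hardware + hep - p_hardware * hep)

# Assumed HEP uncertainty range (low, base, high estimates):
sensitivities = [core_melt_frequency(h) for h in (1e-4, 1e-3, 1e-1)]
print(sensitivities)  # core melt frequency rises with the assumed HEP
```

Repeating this over each human error category gives the kind of relative-effect comparison the abstract describes.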

  2. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. PMID:25542704
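The building blocks of such comparisons are the partition coefficient K (solute concentration ratio between the two liquid phases) and the pairwise separation factor (ratio of two K values). The analyte names, K values, and the sweet-spot window below are hypothetical placeholders, not data from the study.

```python
# Hypothetical partition coefficients for three analytes in one solvent system:
k_values = {"quercetin": 0.8, "naringenin": 1.6, "caffeine": 0.1}

def separation_factor(k1, k2):
    """Pairwise selectivity: ratio of the larger K to the smaller (>= 1)."""
    hi, lo = max(k1, k2), min(k1, k2)
    return hi / lo

# Assumed "sweet spot" window of well-resolved K values (0.4 <= K <= 2.5
# is a commonly cited convention in CS work):
sweet_spot = [name for name, k in k_values.items() if 0.4 <= k <= 2.5]
alpha = separation_factor(k_values["quercetin"], k_values["naringenin"])
print(sweet_spot, alpha)
```

Counting sweet-spot analytes (Nsw) and tabulating pairwise factors across a panel like the GUESSmix is what allows the intra- and inter-family selectivity comparisons the abstract describes.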

  3. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of the microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two tools were under
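    The genome-length bias the authors describe is easy to reproduce: equal copy numbers of genomes of different lengths yield read counts proportional to length, not abundance. A minimal sketch of their benchmark construction, using toy stand-in sequences rather than the real bacterial genomes:

```python
import random

def shred(genome: str, mean_len: int = 150, spread: int = 50) -> list:
    """Break one genome copy into consecutive reads of roughly mean_len bp."""
    reads, i = [], 0
    while i < len(genome):
        n = random.randint(mean_len - spread, mean_len + spread)
        reads.append(genome[i:i + n])
        i += n
    return reads

random.seed(42)
# Toy stand-ins for three genomes of different lengths, equal copy number.
genomes = {"taxonA": "A" * 60_000, "taxonB": "C" * 120_000, "taxonC": "G" * 240_000}
reads = {taxon: len(shred(seq)) for taxon, seq in genomes.items()}
# Raw read counts track genome length, so a tool that reports raw read
# fractions without a length correction will over-count taxonC fourfold.
```

    An unbiased taxon counter must divide per-taxon read counts by genome length (or use single-copy marker genes, as AMPHORA2 does) before reporting proportions.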

  4. Quantitative evaluation of atherosclerotic plaque phantom by near-infrared multispectral imaging with three wavelengths

    NASA Astrophysics Data System (ADS)

    Nagao, Ryo; Ishii, Katsunori; Awazu, Kunio

    2014-03-01

    Atherosclerosis is a primary cause of critical ischemic disease. The risk of a critical event depends on the lipid content of unstable plaque. The near-infrared (NIR) range is effective for diagnosis of atherosclerotic plaque because of the absorption peaks of lipid. NIR multispectral imaging (NIR-MSI) is suitable for the evaluation of plaque because it can quickly provide spectroscopic information and spatial images with a simple measurement system. The purpose of this study is to quantitatively evaluate the lipid concentrations in plaque phantoms with a NIR-MSI system. A NIR-MSI system was constructed with a supercontinuum light source, a grating spectrometer, and an MCT camera. Plaque phantoms with different concentrations of lipid were prepared by mixing bovine fat and a biological soft tissue model to mimic the different stages of unstable plaque. We evaluated the phantoms with the NIR-MSI system at three wavelengths in the band at 1200 nm. Multispectral images were processed by the spectral angle mapper method. As a result, the lipid areas of the phantoms were effectively highlighted by using three wavelengths. In addition, the concentrations of the lipid areas were classified according to the similarity between measured spectra and a reference spectrum. These results suggest the possibility of image enhancement and quantitative evaluation of lipid in unstable plaque with NIR-MSI.
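    The spectral angle mapper step can be stated compactly: compute the angle between each measured pixel spectrum and a reference spectrum, with smaller angles indicating higher similarity. A minimal sketch for three-wavelength spectra (all spectral values below are invented, not the paper's measurements):

```python
import numpy as np

def spectral_angle(measured, reference) -> float:
    """Spectral angle mapper: angle in radians between two spectra."""
    m = np.asarray(measured, dtype=float)
    r = np.asarray(reference, dtype=float)
    cos_theta = np.dot(m, r) / (np.linalg.norm(m) * np.linalg.norm(r))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Reference lipid spectrum at the three wavelengths (invented values).
lipid_ref = [0.82, 0.95, 0.70]
pixel = [0.80, 0.93, 0.72]          # one measured pixel spectrum
angle = spectral_angle(pixel, lipid_ref)
# Classify the pixel as lipid if its angle falls below a chosen threshold.
is_lipid = angle < 0.1
```

    Because the angle is insensitive to overall brightness, the classification depends on spectral shape rather than illumination level, which is the usual motivation for SAM.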

  5. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high-risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336

  7. Risk evaluation mitigation strategies: the evolution of risk management policy.

    PubMed

    Hollingsworth, Kristen; Toscani, Michael

    2013-04-01

    The United States Food and Drug Administration (FDA) has the primary regulatory responsibility to ensure that medications are safe and effective both prior to drug approval and while the medication is being actively marketed by manufacturers. The responsibility for safe medications prior to marketing was signed into law in 1938 under the Federal Food, Drug, and Cosmetic Act; however, a significant risk management evolution has taken place since 1938. Additional federal rules, entitled the Food and Drug Administration Amendments Act, were established in 2007 and extended the government's oversight through the addition of a Risk Evaluation and Mitigation Strategy (REMS) for certain drugs. REMS is a mandated strategy to manage a known or potentially serious risk associated with a medication or biological product. Reasons for this extension of oversight were driven primarily by the FDA's movement to ensure that patients and providers are better informed of drug therapies and their specific benefits and risks prior to initiation. This article provides an historical perspective of the evolution of medication risk management policy and includes a review of REMS programs, an assessment of the positive and negative aspects of REMS, and provides suggestions for planning and measuring outcomes. In particular, this publication presents an overview of the evolution of the REMS program and its implications. PMID:23113627

  8. A quantitative evaluation of models for Aegean crustal deformation

    NASA Astrophysics Data System (ADS)

    Nyst, M.; Thatcher, W.

    2003-04-01

    Modeling studies of eastern Mediterranean tectonics show that Aegean deformation is mainly determined by WSW directed expulsion of Anatolia and SW directed extension due to roll-back of African lithosphere along the Hellenic trench. How motion is transferred across the Aegean remains a subject of debate. The two most widely used hypotheses for Aegean tectonics assert fundamentally different mechanisms. The first model describes deformation as a result of opposing rotations of two rigid microplates separated by a zone of extension. In the second model most motion is accommodated by shear on a series of dextral faults and extension on graben systems. These models make different quantitative predictions for the crustal deformation field that can be tested by a new, spatially dense GPS velocity data set. To convert the GPS data into crustal deformation parameters we use different methods to model complementary aspects of crustal deformation. We parameterize the main fault and plate boundary structures of both models and produce representations for the crustal deformation field that range from purely rigid rotations of microplates, via interacting, elastically deforming blocks separated by crustal faults to a continuous velocity gradient field. Critical evaluation of these models indicates strengths and limitations of each and suggests new measurements for further refining understanding of present-day Aegean tectonics.
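    The rigid-microplate end-member can be tested against GPS directly: in a flat-Earth approximation, a block rotating at angular rate ω about a pole predicts a site velocity v = ω × (r_site − r_pole), and residuals against observed velocities score the model. A hedged 2-D sketch (coordinates and rotation rate below are illustrative, not fitted Aegean values):

```python
import numpy as np

def rigid_rotation_velocity(site_xy, pole_xy, omega):
    """Predicted horizontal velocity of a site on a rigid block rotating
    counter-clockwise at rate omega about a pole (flat-Earth 2-D):
    v = omega x (r_site - r_pole)."""
    dx = site_xy[0] - pole_xy[0]
    dy = site_xy[1] - pole_xy[1]
    return np.array([-omega * dy, omega * dx])

# Illustrative numbers only: a site 500 km east, 200 km north of the pole.
site = (500.0, 200.0)   # km from an arbitrary origin
pole = (0.0, 0.0)
omega = 1e-8            # rad/yr (illustrative)
v_pred = rigid_rotation_velocity(site, pole, omega)
# Misfit between v_pred and the GPS velocity at each site discriminates
# rigid-rotation models from distributed-shear models.
```

    The competing fault-and-graben model instead predicts velocity steps across faults; comparing both predictions to the same dense velocity field is what the critical evaluation in the abstract amounts to.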

  9. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical cardiac CT imaging. Different image-reconstruction algorithms that attempt to achieve this goal are available on current commercial CT systems. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merit. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance on cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at the default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for cardiac CT systems. To simulate heart motion, a moving coronary-type phantom synchronized with an ECG signal was used. Plaques of three different percentages embedded in a 3 mm vessel phantom were imaged multiple times under motion-free, 60 bpm, and 80 bpm heart rates. Static (motion-free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to that of FBP images in higher heart-rate scans.
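    The figure of merit is simple to state: for repeated estimates of plaque percentage from independent ROIs, the ensemble mean square error against the known phantom value combines the bias and the variance of the estimator. A minimal sketch with invented estimates (not the study's data):

```python
import numpy as np

def ensemble_mse(estimates, truth: float) -> float:
    """Ensemble mean square error of an estimator against a known truth."""
    e = np.asarray(estimates, dtype=float)
    return float(np.mean((e - truth) ** 2))

# Invented plaque-percentage estimates from independent ROIs of one scan.
true_plaque = 50.0
estimates = [48.0, 52.0, 50.0, 49.0, 51.0]
emse = ensemble_mse(estimates, true_plaque)

# EMSE decomposes exactly into squared bias plus variance of the estimates.
bias_sq = (np.mean(estimates) - true_plaque) ** 2
variance = np.var(estimates)
```

    A lower EMSE for SSF-reconstructed ROIs than for FBP ROIs at the same heart rate is what "superior quality" means under this task-based criterion.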

  10. [Effects of calcinogenic plants--qualitative and quantitative evaluation].

    PubMed

    Mello, J R; Habermehl, G G

    1998-01-01

    Different research methods have demonstrated the presence of variable quantities of vitamin D, as well as its metabolites, in calcinogenic plants. Most of the experiments indicated that the active component is most probably the metabolite 1,25(OH)2D3 bound as a glycoside. In this study, the presence of substances with vitamin D-like activity in the calcinogenic plants Solanum malacoxylon, Cestrum diurnum, Trisetum flavescens and Nierembergia veitchii was evaluated by administering different extracts of these plants orally to rachitic chicks within the research model "Strontium added Alimentation". After the oral administration of the extracts, the serum was analysed to determine the levels of calcium, phosphorus and alkaline phosphatase. The results obtained with chicks demonstrated the presence of substances with vitamin D-like activity in all 4 plants. Solanum malacoxylon and Cestrum diurnum contained hydrosoluble substances with elevated activity, indicated by significantly high levels of calcium and phosphorus combined with a reduced activity of alkaline phosphatase. This indicated the presence of 1,25(OH)2D3 in both plants. The hydrosoluble character of the active substance in both plants is most probably explained by the metabolite 1,25(OH)2D3 being bound as a glycoside at position O-25 of the molecule. Nierembergia veitchii and Trisetum flavescens contained only minor concentrations of hydrosoluble substances. The results for the 4 analysed plants were evaluated quantitatively as follows: Solanum malacoxylon, 82,800 IU of vitamin D/kg; Cestrum diurnum, 63,200 IU of vitamin D/kg; Nierembergia veitchii, 16,400 IU/kg; and Trisetum flavescens, 12,000 IU of vitamin D/kg. All concentrations are calcinogenic. PMID:9499629

  11. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge; presently, there is an overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and the entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
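    The entropy weight step can be made concrete: for a decision matrix of samples × evaluation indices, a criterion whose values barely vary across samples carries little information and receives a small weight. A minimal sketch with an invented inspection matrix (the index values are illustrative, not the paper's data):

```python
import numpy as np

def entropy_weights(X) -> np.ndarray:
    """Objective criterion weights from information entropy.
    Rows are samples, columns are positive-valued evaluation indices."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                      # share of each sample per criterion
    logs = np.log(np.where(P > 0, P, 1.0))     # log(1) = 0 handles zero shares
    E = -(P * logs).sum(axis=0) / np.log(n)    # normalised entropy per criterion
    d = 1.0 - E                                # divergence: 0 for a uniform column
    return d / d.sum()

# Invented inspection scores for four bogie samples on three indices.
X = [[0.80, 3.0, 1.0],
     [0.80, 1.0, 2.0],
     [0.80, 2.0, 4.0],
     [0.80, 2.0, 8.0]]
w = entropy_weights(X)   # first column is constant, so its weight is ~0
```

    The resulting weights then scale the extension-theory dependence degrees when aggregating index scores into an overall risk grade.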

  13. Evaluating Potential Health Risks in Relocatable Classrooms.

    ERIC Educational Resources Information Center

    Katchen, Mark; LaPierre, Adrienne; Charlin, Cary; Brucker, Barry; Ferguson, Paul

    2001-01-01

    Only limited data exist describing potential exposures to chemical and biological agents when using portable classrooms or outlining how to assess and reduce associated health risks. Evaluating indoor air quality involves examining ventilation rates, volatile organic compounds, and microbiologicals. Open communication among key stakeholders is…

  14. Evaluation of health risks for contaminated aquifers.

    PubMed Central

    Piver, W T; Jacobs, T L; Medina, M A

    1997-01-01

    This review focuses on progress in the development of transport models for heterogeneous contaminated aquifers, the use of predicted contaminant concentrations in groundwater for risk assessment for heterogeneous human populations, and the evaluation of aquifer remediation technologies. Major limitations and areas for continuing research for all methods presented in this review are identified. Images Figure 2. PMID:9114282

  15. Using quantitative risk information in decisions about statins: a qualitative study in a community setting

    PubMed Central

    Polak, Louisa; Green, Judith

    2015-01-01

    Background A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. Aim To understand the role of quantitative risk information in patients’ accounts of decisions about taking statins. Design and setting This was a qualitative study, with participants recruited and interviewed in community settings. Method Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Results Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as ‘necessary’ either to treat test results, or because of personalised, unequivocal advice from a doctor. Conclusion This study’s findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. PMID:25824187

  16. The value of quantitative patient preferences in regulatory benefit-risk assessment

    PubMed Central

    Egbrink, Mart oude; IJzerman, Maarten

    2014-01-01

    Quantitative patient preferences are a method to involve patients in regulatory benefit-risk assessment. Assuming preferences can be elicited, their use may offer multiple advantages. However, legal, methodological and procedural issues imply that preferences are currently at most part of the solution for how best to involve patients in regulatory decision making. Progress has recently been made on these issues.

  17. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  18. QUANTITATIVE ASSESSMENT OF CANCER RISK FROM EXPOSURE TO DIESEL ENGINE EMISSIONS

    EPA Science Inventory

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-huma...

  19. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false (Item 305) Quantitative and qualitative disclosures about market risk. 229.305 Section 229.305 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION STANDARD INSTRUCTIONS FOR FILING FORMS UNDER SECURITIES ACT OF 1933, SECURITIES EXCHANGE ACT OF 1934 AND...

  20. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ...The Food and Drug Administration (FDA) is announcing the availability of a draft report entitled ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review'' (literature review report). A literature review was conducted to address a requirement of the Patient Protection and Affordable Care Act (Affordable Care Act). FDA is publishing the literature review......

  1. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  2. Use of global sensitivity analysis in quantitative microbial risk assessment: application to the evaluation of a biological time temperature integrator as a quality and safety indicator for cold smoked salmon.

    PubMed

    Ellouze, M; Gauchi, J-P; Augustin, J-C

    2011-06-01

    The aim of this study was to apply a global sensitivity analysis (SA) method to model simplification and to evaluate (eO)®, a biological Time Temperature Integrator (TTI), as a quality and safety indicator for cold smoked salmon (CSS). Models were thus developed to predict the evolution of Listeria monocytogenes and the indigenous food flora in CSS and to predict the TTI endpoint. A global SA was then applied to the three models to identify the least important factors and simplify the models accordingly. Results showed that the subset of the most important factors of the three models was mainly composed of the durations and temperatures of two chill chain links outside the control of the manufacturers: the domestic refrigerator and the retail/cabinet links. Then, the simplified versions of the three models were run with 10(4) time-temperature profiles representing the variability associated with the microbial behavior, the TTI evolution and the French chill chain characteristics. The results were used to assess the distributions of the microbial contaminations obtained at the TTI endpoint and at the end of the simulated profiles, and showed that, in the case of poor storage conditions, TTI use could reduce the number of unacceptable foods by 50%. PMID:21511136
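    Global sensitivity analysis of this kind ranks input factors by the share of output variance each explains; factors with negligible first-order indices can be fixed at nominal values, which is the model-simplification step above. A self-contained sketch on a toy linear model (not the authors' microbial growth models), estimating first-order Sobol indices by binning:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
X = rng.uniform(size=(N, 3))         # three uncertain factors on [0, 1]
coef = np.array([4.0, 2.0, 0.5])     # toy model: Y = 4*X1 + 2*X2 + 0.5*X3
Y = X @ coef

def first_order_index(xi, y, bins=50):
    """First-order Sobol index Var(E[Y|Xi]) / Var(Y), estimated by
    binning Xi and taking the variance of the conditional means."""
    edges = np.linspace(0.0, 1.0, bins + 1)[1:-1]
    idx = np.digitize(xi, edges)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

S = [first_order_index(X[:, i], Y) for i in range(3)]
# For an independent linear model the exact indices are coef_i^2 / sum(coef^2),
# so X3 (~1% of the output variance) is a candidate for fixing at a nominal value.
```

    In the study itself the dominant factors turned out to be the domestic and retail chill-chain links; everything with a small index could be dropped before the 10(4)-profile simulations.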

  3. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited.

  4. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late-stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments; and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relating to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events.
Comparable benefits

  5. Aging rat vestibular ganglion: I. Quantitative light microscopic evaluation.

    PubMed

    Alidina, A; Lyon, M J

    1990-01-01

    This study was undertaken to quantify age-related changes in the rat vestibular ganglion. Cell number, diameter, and proximal-distal distribution based on size were evaluated. Serial 5-micron plastic sections of the vestibular ganglion from 15 female Wistar rats were examined. Rats were divided into three age groups: young (Y, 3 to 5 months, n = 5), old (O, 24 to 26 months, n = 3), and very old (VO, 28 to 31 months, n = 7). Quantitative analysis indicated no significant differences (P less than .05) in the estimated number of ganglion cells (mean: Y = 1,690, O = 2,257, VO = 1,678), ganglion cell profile diameters (mean: Y = 22.5 microns, n = 2,886; O = 23.7 microns, n = 2,313; VO = 22.8 microns, n = 4,061), or proximal-distal localization (proximal: 22.3 microns, 24.4 microns, 22.7 microns; middle: 22.6 microns, 23.1 microns, 22.4 microns; distal: 23.3 microns, 23.4 microns, 23.7 microns; Y, O, and VO, respectively). When pooled, the old animals tended to have slightly larger cell profiles than the other groups. We noted a dramatic age-related increase of aging pigment within the ganglion cell profiles, making the old and very old animals easily distinguishable from the young. In most of the cell profiles, the aging pigment was more or less uniformly distributed throughout the cytoplasm. However, in some, aging pigment was accumulated at one pole of the cell profile. While no typical degenerating cellular profiles were found in any of the sections, several of the ganglion cell profiles from the old animals revealed dense cytoplasm, possibly indicating an early stage of degeneration. PMID:2382785

  6. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 degrees C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 degrees C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms, between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
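    The core of the ratio-based clutter rejection is that a multiplicative emissivity pattern common to both bands divides out of the two-band ratio, leaving the temperature structure. A minimal numerical sketch on synthetic images, using a toy radiometric model in which the short band scales as T squared and the long band as T (real radiometry would use Planck-weighted band radiances):

```python
import numpy as np

def dbir_ratio(band_short, band_long):
    """Ratio of co-registered short- and long-band images. An emissivity
    pattern shared multiplicatively by both bands cancels out."""
    return np.asarray(band_short, dtype=float) / np.asarray(band_long, dtype=float)

rng = np.random.default_rng(0)
thermal = np.full((64, 64), 100.0)
thermal[20:30, 20:30] += 5.0                   # warm patch over a subsurface flaw
emissivity = 0.9 + 0.1 * rng.random((64, 64))  # surface-roughness clutter

band_5 = thermal**2 * emissivity   # toy: short band more temperature-sensitive
band_10 = thermal * emissivity
ratio = dbir_ratio(band_5, band_10)
# In the single-band images the flaw is buried in emissivity noise; in the
# ratio image the emissivity cancels and (in this toy model) ratio == thermal.
```

    The same cancellation argument is why the technique assumes gray-body behaviour, i.e. emissivity roughly equal in the two bands.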

  7. Quantitative comparison between crowd models for evacuation planning and evaluation

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.

    2014-02-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models, or to compare models with real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found the social force model agrees best with this real data.
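    The pipeline needs, for each observable, a matrix of statistical distances between the models' output distributions; DISTATIS then combines those matrices into a compromise. A minimal sketch of the first step using the two-sample Kolmogorov-Smirnov statistic on invented evacuation-time samples (the distributions below are placeholders, not the paper's simulation output):

```python
import numpy as np

def ks_distance(a, b) -> float:
    """Two-sample Kolmogorov-Smirnov statistic: max gap between the ECDFs."""
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    grid = np.concatenate([a, b])
    Fa = np.searchsorted(a, grid, side="right") / len(a)
    Fb = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.abs(Fa - Fb).max())

rng = np.random.default_rng(3)
# Invented evacuation-time samples (seconds) from three crowd models.
samples = {
    "lattice_gas": rng.normal(95, 10, 500),
    "social_force": rng.normal(110, 12, 500),
    "rvo2": rng.normal(97, 10, 500),
}
names = list(samples)
D = np.array([[ks_distance(samples[p], samples[q]) for q in names] for p in names])
# D is the distance matrix for one observable; one such matrix per
# observable is the input that a DISTATIS-style compromise combines.
```

    Any distribution distance (Kolmogorov-Smirnov, Wasserstein, energy distance) can fill this role; the KS statistic is shown here only because it is simple and assumption-free.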

  8. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the content of asbestos in rock matrices is a complex operation which is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile and a lower cost. The DIATI LAA internal methodology for the analysis in PCOM is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids with known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. In fact, for airborne filters a binomial distribution (Poisson), which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices instead cannot rely on any statistical distribution because the most important object of the analysis is the size of the asbestiform fibers and bundles of fibers observed, and the resulting relationship between the weight of the fibrous component and that of the granular one. The error evaluation generally provided by public and private institutions varies between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total content of asbestos, especially for values close to the legal limits.
The error assessments must

  9. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10⁻⁴). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
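    The GLCM 'correlation' feature that performed best above is the standard Haralick statistic. A minimal pure-Python sketch of it, for one pixel offset, might look like this (the actual study computed it on calibrated BMD images and fed it to a GRBF network; this is only the feature itself):

    ```python
    from collections import Counter

    def glcm_correlation(image, dx=1, dy=0):
        """Haralick 'correlation' of the normalized gray-level
        co-occurrence matrix for a single pixel offset (dx, dy).

        image: 2-D list of integer gray levels.
        """
        rows, cols = len(image), len(image[0])
        pairs = Counter()
        for r in range(rows):
            for c in range(cols):
                r2, c2 = r + dy, c + dx
                if 0 <= r2 < rows and 0 <= c2 < cols:
                    pairs[(image[r][c], image[r2][c2])] += 1
        total = sum(pairs.values())
        p = {ij: n / total for ij, n in pairs.items()}

        # Marginal means, variances, and the covariance over gray levels.
        mu_i = sum(i * w for (i, _), w in p.items())
        mu_j = sum(j * w for (_, j), w in p.items())
        var_i = sum((i - mu_i) ** 2 * w for (i, _), w in p.items())
        var_j = sum((j - mu_j) ** 2 * w for (_, j), w in p.items())
        cov = sum((i - mu_i) * (j - mu_j) * w for (i, j), w in p.items())
        return cov / (var_i ** 0.5 * var_j ** 0.5)
    ```

    A horizontally uniform image gives correlation 1 for a horizontal offset, while a checkerboard gives -1, which is why the feature captures directional regularity of trabecular structure.
    
    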

  10. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 % of the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods so that urban flood risk analyses can be replicated at regional and national scales.
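    The annualization step behind a metric like "expected annual affected population" can be sketched by integrating consequences over the exceedance-probability curve of flood events. The event list and the trapezoidal rule below are illustrative assumptions, not the authors' exact event-tree formulation:

    ```python
    def expected_annual_affected(events):
        """events: list of (annual_exceedance_prob, affected_population)
        pairs, ordered from frequent/small floods to rare/large ones.
        Integrates probability slices times mean consequence (trapezoid
        on the exceedance curve), a common way to annualize event risk."""
        probs = [p for p, _ in events]
        cons = [c for _, c in events]
        total = 0.0
        for k in range(len(events) - 1):
            dp = probs[k] - probs[k + 1]          # probability slice
            total += dp * (cons[k] + cons[k + 1]) / 2.0
        total += probs[-1] * cons[-1]             # rarest-event tail
        return total
    ```

    With two events, a 1-in-10-year flood affecting 100 people and a 1-in-100-year flood affecting 1000, the sketch yields 0.09 × 550 + 0.01 × 1000 = 59.5 people per year; risk-reduction measures enter by shrinking the consequence column.
    
    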

  11. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, transmitted through social media, can accurately predict enterprises' future credit risk. As benchmarks, we use evaluation results oriented to financial statements, based on logit and probit approaches. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated. Surprisingly, we find that the opinions extracted from both posts and commentaries surpass those of analysts in terms of credit risk prediction. PMID:26739372

  12. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and number of floors

  13. Longitudinal flexural mode utility in quantitative guided wave evaluation

    NASA Astrophysics Data System (ADS)

    Li, Jian

    2001-07-01

    The utility of longitudinal non-axisymmetric flexural modes in quantitative guided wave evaluation is examined for pipe and tube inspection. Attention is focused on hollow cylinders. Several source loading problems, such as a partial-loading angle beam, an axisymmetric comb transducer and an angle beam array, are studied. The Normal Mode Expansion method is employed to simulate the generated guided wave fields. For non-axisymmetric sources, an important angular profile feature is studied. Numerical calculations show that an angular profile varies with frequency, mode and propagation distance. Since an angular profile determines the energy distribution of the guided waves, it has a great impact on the pipe inspection capability of guided waves. The simulation of non-axisymmetric angular profiles generated by partial loading is verified by experiments. An angular profile is the superposition of harmonic axisymmetric and non-axisymmetric modes with various phase velocities. A simpler equation is derived to calculate the phase velocities of the non-axisymmetric guided waves and is used to discuss their characteristics. Angular profiles have many applications in practical pipe testing. The procedure for building desired angular profiles, and for angular profile tuning, is discussed. This angular profile tuning process is implemented by a phased transducer array and a special computational algorithm. Since a transducer array plays a critical role in guided wave inspection, the performance of a transducer array is discussed in terms of guided wave mode control ability and excitation sensitivity. With time-delay inputs, a transducer array is greatly improved in its mode control ability and sensitivity. The algorithms for setting time delays are derived based on frequency, element spacing and phase velocity. With the help of the conclusions drawn on non-axisymmetric guided waves, a phased circumferential partial-loading array is

  14. Quantitative evaluation of stiffness of commercial suture materials.

    PubMed

    Chu, C C; Kizil, Z

    1989-03-01

    The bending stiffness of 22 commercial suture materials of varying size, chemical structure and physical form was quantitatively evaluated using a stiffness tester (Taber V-5, model 150B, Teledyne). The commercial sutures were Chromic catgut; Dexon (polyglycolic acid); Vicryl (polyglactin 910); PDS (polydioxanone); Maxon (polyglycolide-trimethylene carbonate); Silk (coated with silicone); Mersilene (polyester fiber); Tycron (polyester fiber); Ethibond (polyethylene terephthalate coated with polybutylene); Nurolon (nylon 66); Surgilon (nylon 66 coated with silicone); Ethilon (coated nylon 66); Prolene (polypropylene); Dermalene (polyethylene); and Gore-tex (polytetrafluoroethylene). These include natural and synthetic, absorbable and nonabsorbable, and monofilament and multifilament sutures. All of these sutures were size 2-0, but Prolene sutures with sizes ranging from 1-0 to 9-0 were also tested to determine the effect of suture size on stiffness. The data showed a wide range of bending stiffness among the 22 commercial sutures. The most flexible 2-0 suture was Gore-tex, followed by Dexon, Silk, Surgilon, Vicryl (uncoated), Tycron, Nurolon, Mersilene, Ethibond, Maxon, PDS, Ethilon, Prolene, Chromic catgut, coated Vicryl, and lastly, Dermalene. The large porous volume inherent in Gore-tex monofilament suture was the reason for its lowest flexural stiffness. Sutures with a braided structure were generally more flexible than those with a monofilament structure, irrespective of the chemical constituents. Coated sutures had significantly higher stiffness than the corresponding uncoated ones. This is particularly true when polymers rather than wax were used as the coating material. This increase in stiffness is attributable to the loss of mobility under bending force in the fibers and yarns that make up the sutures. An increase in the size of the suture significantly increased the stiffness, and the magnitude of increase

  15. Quantitative Evaluation of Atherosclerotic Plaque Using Ultrasound Tissue Characterization.

    NASA Astrophysics Data System (ADS)

    Yigiter, Ersin

    Evaluation of therapeutic methods directed toward interrupting and/or delaying atherogenesis is impeded by the lack of a reliable, non-invasive means for monitoring progression or regression of disease. The ability to characterize the predominant component of plaque may be very valuable in the study of this disease's natural history. The earlier the lesion, the more likely is lipid to be the predominant component. Progression of plaque is usually by way of overgrowth of fibrous tissues around the fatty pool. Calcification is usually a feature of the older or complicated lesion. To explore the feasibility of using ultrasound to characterize plaque we have conducted measurements of the acoustical properties of various atherosclerotic lesions found in freshly excised samples of human abdominal aorta. Our objective has been to determine whether or not the acoustical properties of plaque correlate with the type and/or chemical composition of plaque and, if so, to define a measurement scheme which could be done in vivo and non-invasively. Our current database consists of individual tissue samples from some 200 different aortas. Since each aorta yields between 10 and 30 tissue samples for study, we have data on some 4,468 different lesions or samples. Measurements of the acoustical properties of plaque were found to correlate well with the chemical composition of plaque. In short, measurements of impedance and attenuation seem sufficient to classify plaque as to type and to composition. Based on the in-vitro studies, the parameter of attenuation was selected as a means of classifying the plaque. For these measurements, an intravascular ultrasound scanner was modified according to our specifications. Signal processing algorithms were developed which would analyze the complex ultrasound waveforms and estimate tissue properties such as attenuation. Various methods were tried to estimate the attenuation from the pulse-echo backscattered signal. 
Best results were obtained by

  16. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), the mathematical equations expose data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications, aimed especially at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to selecting the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer-phase food survey and parameter estimates for food handling and consumption practices in the Netherlands, including variation over individuals and uncertainty estimates. PMID:27357043
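    The bootstrap treatment of parameter uncertainty described above can be sketched with a percentile bootstrap. Here the fitted parameter is the mean of an exponential distribution (its maximum-likelihood estimate is the sample mean), standing in for one of the survey's storage-time fits; the data and settings are illustrative:

    ```python
    import random
    import statistics

    def bootstrap_mean_ci(data, n_boot=2000, alpha=0.05, seed=1):
        """Point estimate and percentile-bootstrap (1 - alpha) confidence
        interval for the mean, via resampling with replacement."""
        rng = random.Random(seed)
        estimates = sorted(
            statistics.fmean(rng.choices(data, k=len(data)))
            for _ in range(n_boot)
        )
        lo = estimates[int(n_boot * alpha / 2)]
        hi = estimates[int(n_boot * (1 - alpha / 2)) - 1]
        return statistics.fmean(data), (lo, hi)
    ```

    In a QMRA, the interval (rather than the point estimate alone) is what propagates consumer-handling uncertainty into the final risk estimate.
    
    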

  17. Quantitative, Notional, and Comprehensive Evaluations of Spontaneous Engaged Speech

    ERIC Educational Resources Information Center

    Molholt, Garry; Cabrera, Maria Jose; Kumar, V. K.; Thompsen, Philip

    2011-01-01

    This study provides specific evidence regarding the extent to which quantitative measures, common sense notional measures, and comprehensive measures adequately characterize spontaneous, although engaged, speech. As such, the study contributes to the growing body of literature describing the current limits of automatic systems for evaluating…

  18. QUANTITATIVE GENETIC ACTIVITY GRAPHICAL PROFILES FOR USE IN CHEMICAL EVALUATION

    EPA Science Inventory

    A graphic approach termed a Genetic Activity Profile (GAP) has been developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each...

  19. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak, in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, the importance of lag time in E. coli O157:H7 growth models for leafy greens, and validation of the importance of cross-contamination during the washing process. PMID:21549039
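    A toy version of the growth-and-exposure step in a model like this can be sketched as a Monte Carlo over servings. The distributions for storage time and growth rate below are placeholders chosen for illustration, not the model's actual @RISK inputs:

    ```python
    import random

    def simulate_serving_counts(n_sims=10000, start_log=-1.0,
                                serving_g=30.0, prevalence=0.001,
                                max_growth_log_per_day=1.0, seed=7):
        """Toy Monte Carlo: each contaminated serving starts at
        start_log (log CFU/g) and grows for a random abuse time at a
        random fraction of the optimal 1 log CFU/day rate.  Returns a
        list of CFU-per-serving counts (0 for uncontaminated servings)."""
        rng = random.Random(seed)
        counts = []
        for _ in range(n_sims):
            if rng.random() >= prevalence:
                counts.append(0.0)                   # uncontaminated
                continue
            days = rng.uniform(0.0, 3.0)             # assumed storage time
            rate = rng.uniform(0.0, max_growth_log_per_day)
            log_cfu_per_g = start_log + days * rate
            counts.append(serving_g * 10 ** log_cfu_per_g)
        return counts
    ```

    The fraction of non-zero entries approximates the 0.1% prevalence, and the spread of the non-zero counts shows how temperature abuse alone can move a serving across several orders of magnitude.
    
    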

  20. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: II. Risk characterization.

    PubMed

    Pouillot, Régis; Goulet, Véronique; Delignette-Muller, Marie Laure; Mahé, Aurélie; Cornu, Marie

    2009-06-01

    A model for the assessment of exposure to Listeria monocytogenes from cold-smoked salmon consumption in France was presented in the first of this pair of articles (Pouillot et al., 2007, Risk Analysis, 27:683-700). In the present study, the exposure model output was combined with an internationally accepted hazard characterization model, adapted to the French situation, to assess the risk of invasive listeriosis from cold-smoked salmon consumption in France in a second-order Monte Carlo simulation framework. The annual number of cases of invasive listeriosis due to cold-smoked salmon consumption in France is estimated to be 307, with a very large credible interval ([10; 12,453]), reflecting data uncertainty. This uncertainty is mainly associated with the dose-response model. Despite the significant uncertainty associated with the predictions, this model provides a scientific base for risk managers and food business operators to manage the risk linked to cold-smoked salmon contaminated with L. monocytogenes. Under the modeling assumptions, risk would be efficiently reduced through a decrease in the prevalence of L. monocytogenes or better control of the last steps of the cold chain (shorter and/or colder storage during the consumer step), whereas reduction of the initial contamination levels of the contaminated products and improvement in the first steps of the cold chain do not seem to be promising strategies. An attempt to apply the recent risk-based concept of FSO (food safety objective) on this example underlines the ambiguity in practical implementation of the risk management metrics and the need for further elaboration on these concepts. PMID:19220799
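    The second-order Monte Carlo framework mentioned above, which separates serving-to-serving variability from uncertainty about the dose-response model, can be sketched as two nested sampling loops. Every distribution below is a placeholder for illustration, not the authors' fitted model:

    ```python
    import math
    import random

    def second_order_cases(n_unc=200, n_var=500,
                           servings_per_year=1e6, seed=3):
        """Outer loop: sample the uncertain exponential dose-response
        parameter r.  Inner loop: sample the variable ingested dose.
        Returns one predicted annual case count per uncertainty draw;
        the spread across draws is the (wide) credible interval."""
        rng = random.Random(seed)
        cases = []
        for _ in range(n_unc):
            r = 10 ** rng.uniform(-14.0, -10.0)    # uncertainty about r
            p_ill = 0.0
            for _ in range(n_var):
                dose = 10 ** rng.uniform(2.0, 9.0)  # variability (CFU)
                p_ill += 1.0 - math.exp(-r * dose)  # exponential model
            cases.append(servings_per_year * p_ill / n_var)
        return sorted(cases)
    ```

    Taking, say, the 2.5th and 97.5th percentiles of the returned list gives a credible interval on the annual case count, mirroring how the reported [10; 12,453] interval reflects dose-response uncertainty rather than variability alone.
    
    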

  1. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. 
Comprehensive, quantitative risk assessments
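    The ranking step that a QFMEA automates can be sketched with the classic FMEA risk priority number. The ordinal 1-10 scores and the failure-mode names below are invented for illustration; the actual model also folds in costs and fatality potential:

    ```python
    def rank_failure_modes(modes):
        """modes: dict mapping failure-mode name to a tuple
        (probability, severity, detection_difficulty), each scored on
        an ordinal 1-10 scale.  The risk priority number (RPN) is their
        product; higher RPN means higher priority for resolution."""
        rpn = {name: p * s * d for name, (p, s, d) in modes.items()}
        return sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical CO2 sequestration failure modes (scores invented):
    modes = {
        "wellbore leak":     (3, 9, 6),
        "pipeline rupture":  (2, 8, 3),
        "caprock fracture":  (1, 10, 8),
    }
    ranking = rank_failure_modes(modes)
    ```

    A low-probability mode can still rank high when its severity and detection difficulty are large, which is the point of combining the three scores multiplicatively rather than prioritizing on probability alone.
    
    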

  2. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic, and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). The model allowed us to assess the impact of different factors influencing the risk of acquiring trichinellosis, and may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. PMID:26227185

  3. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and pollen and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (via what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
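    One plausible reading of this proposal, with hypothetical names and a simple ratio-based scaling rule (not the authors' exact formula), is to fold an attractiveness factor into the exposure term of a tier-I risk quotient:

    ```python
    def adjusted_risk_quotient(exposure_dose_ug, toxic_endpoint_ug,
                               crop_nectar_sugar, reference_nectar_sugar):
        """Tier-I style risk quotient RQ = exposure / toxicity, with the
        exposure term scaled by a crop attractiveness factor taken here
        as the ratio of the crop's nectar sugar concentration to that of
        a reference highly attractive crop, capped at 1.  All parameter
        names and the scaling rule are illustrative assumptions."""
        af = min(crop_nectar_sugar / reference_nectar_sugar, 1.0)
        return af * exposure_dose_ug / toxic_endpoint_ug
    ```

    Under this sketch, a crop half as sugar-rich as the reference halves the quotient, so an unadjusted RQ that exceeds a level of concern may fall below it once crop-specific attractiveness is considered.
    
    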

  4. Evaluating risk propensity using an objective instrument.

    PubMed

    Sueiro Abad, Manuel J; Sánchez-Iglesias, Ivan; Moncayo de Tella, Alejandra

    2011-05-01

    Risk propensity is the stable tendency to choose options with a lower probability of success, but greater rewards. Its evaluation has been approached from various perspectives: from self-report questionnaires to objective tests. Self-report questionnaires have often been criticized due to interference from voluntary and involuntary biases, in addition to their lack of predictive value. Objective tests, on the other hand, require resources that make them difficult to administer to large samples. This paper presents an easy-to-administer, 30-item risk propensity test. Each item is itself an objective test describing a hypothetical situation in which the subject must choose between three options, each with a different gain function but equivalent in expected value. To assess its psychometric fit, the questionnaire was administered to 222 subjects, and we performed a test of its reliability as well as exploratory factor analysis. The results supported a three-factor model of risk (Sports and Gambling, Long-term Plans, and Loss Management). After making the necessary adjustments and incorporating a global factor of risk propensity, confirmatory factor analysis was done, revealing that the data exhibited adequate goodness of fit. PMID:21568196

  5. Quantitative risk assessment of Listeriosis due to consumption of raw milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objectives of this study were to estimate the risk of illnesses for raw milk consumers due to L. monocytogenes contamination in raw milk sold by permitted raw milk dealers, and the risk of listeriosis for people on farms who consume raw milk. Three scenarios were evaluated for raw milk sold by ...

  6. A quantitative assessment of risks of heavy metal residues in laundered shop towels and their use by workers.

    PubMed

    Connor, Kevin; Magee, Brian

    2014-10-01

    This paper presents a risk assessment of worker exposure to metal residues in laundered shop towels. The concentrations of 27 metals measured in a synthetic sweat leachate were used to estimate the releasable quantity of metals that could be transferred to workers' skin. Worker exposure was evaluated quantitatively with an exposure model focused on towel-to-hand transfer and subsequent hand-to-food or hand-to-mouth transfers. The exposure model was based on conservative but reasonable assumptions regarding towel use, and on default exposure factor values from the published literature or regulatory guidance. Transfer coefficients were derived from studies representative of the exposures of towel users. Contact frequencies were based on assumed high-end use of shop towels, but constrained by a theoretical maximum dermal loading. The risk estimates developed for all metals were below applicable regulatory risk benchmarks. The risk assessment for lead utilized the Adult Lead Model and concluded that predicted lead intakes do not constitute a significant health hazard based on potential worker exposures. Uncertainties are discussed in relation to the overall confidence in the exposure estimates developed for each exposure pathway and the likelihood that the exposure model under- or overestimates worker exposures and risk. PMID:24973502
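    The towel-to-hand-to-mouth exposure chain described can be sketched as a simple multiplicative model. The parameter names and example values are illustrative assumptions, not the study's actual transfer coefficients or contact frequencies:

    ```python
    def average_daily_dose(releasable_ug_per_towel, towel_to_hand_tc,
                           hand_to_mouth_tc, events_per_day,
                           body_weight_kg=80.0):
        """Generic multiplicative exposure chain:
        towel residue -> hand loading -> ingested intake.
        Returns an average daily dose in ug per kg body weight per day."""
        hand_loading = releasable_ug_per_towel * towel_to_hand_tc
        intake_ug_per_day = hand_loading * hand_to_mouth_tc * events_per_day
        return intake_ug_per_day / body_weight_kg
    ```

    For example, 100 µg of releasable metal per towel, a 10% towel-to-hand transfer, a 50% hand-to-mouth transfer and 4 contact events per day gives 0.25 µg/kg-day for an 80 kg worker; the dose can then be compared against a reference dose to form a hazard quotient.
    
    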

  7. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method that depicts the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual- or multiple-isotope studies can be performed, using half-lives and energies to discriminate between isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for the determination of kinetics.

  8. [Quantitative evaluation of the nitroblue tetrazolium reduction test].

    PubMed

    Vagner, V K; Nasonkin, O S; Boriskina, N D

    1989-01-01

    The results of the NBT test were assessed by the visual cytochemical method and by the quantitative spectrophotometry technique developed by the authors for the NBT test. The results demonstrate the higher sensitivity and informative value of the new method, particularly when neutrophil tetrazolium activity is high; this recommends the spectrophotometric variant of the NBT test for wide clinical application in studies of the functional and metabolic activity of blood leukocytes. PMID:2483198

  9. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  10. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible

  11. Quantitative ultrasound does not identify patients with an inflammatory disease at risk of vertebral deformities

    PubMed Central

    Heijckmann, A Caroline; Dumitrescu, Bianca; Nieuwenhuijzen Kruseman, Arie C; Geusens, Piet; Wolffenbuttel, Bruce HR; De Vries, Jolanda; Drent, Marjolein; Huijberts, Maya SP

    2008-01-01

    Background Previous studies from our group have shown that a high prevalence of vertebral deformities suggestive of fracture can be found in patients with an inflammatory disease, despite a near normal bone mineral density (BMD). As quantitative ultrasound (QUS) of the heel can be used for refined assessment of bone strength, we evaluated whether QUS can be used to identify subjects with an inflammatory disease with an increased chance of having a vertebral fracture. Methods 246 patients (mean age: 44 ± 12.4 years) with an inflammatory disease (sarcoidosis or inflammatory bowel disease (IBD)) were studied. QUS of the heel and BMD of the hip (by dual X-ray absorptiometry (DXA)) were measured. Furthermore, lateral single energy densitometry of the spine was done for assessment of vertebral deformities. Logistic regression analysis was performed to assess the strength of association between the prevalence of a vertebral deformity and BMD and QUS parameters, adjusted for gender and age. Results Vertebral deformities (ratio of <0.80) were found in 72 vertebrae of 54 subjects (22%). In contrast to the QUS parameters BUA (broadband ultrasound attenuation) and SOS (speed of sound), the T-score of QUS and the T-scores of the femoral neck and trochanter (DXA) were lower in the group of patients with vertebral deformities. Logistic regression analysis showed that the vertebral deformity risk increases by about 60 to 90% per 1 SD reduction of BMD (T-score) determined with DXA but not with QUS. Conclusion Our findings imply that QUS measurements of the calcaneus in patients with an inflammatory condition, such as sarcoidosis and IBD, are likely of limited value to identify patients with a vertebral fracture. PMID:18492278

  12. Evaluation of the "Respect Not Risk" Firearm Safety Lesson for 3rd-Graders

    ERIC Educational Resources Information Center

    Liller, Karen D.; Perrin, Karen; Nearns, Jodi; Pesce, Karen; Crane, Nancy B.; Gonzalez, Robin R.

    2003-01-01

    The purpose of this study was to evaluate the MORE HEALTH "Respect Not Risk" Firearm Safety Lesson for 3rd-graders in Pinellas County, Florida. Six schools representative of various socioeconomic levels were selected as the test sites. Qualitative and quantitative data were collected. A total of 433 matched pretests/posttests were used to…

  13. EVALUATION OF PHYSIOLOGY COMPUTER MODELS, AND THE FEASIBILITY OF THEIR USE IN RISK ASSESSMENT.

    EPA Science Inventory

    This project will evaluate the current state of quantitative models that simulate physiological processes, and how these models might be used in conjunction with the current use of PBPK and BBDR models in risk assessment. The work will include a literature search to identify...

  14. Evaluation of residue drum storage safety risks

    SciTech Connect

    Conner, W.V.

    1994-06-17

    A study was conducted to determine if any potential safety problems exist in the residue drum backlog at the Rocky Flats Plant. Plutonium residues stored in 55-gallon drums were packaged for short-term storage until the residues could be processed for plutonium recovery. These residues have now been determined by the Department of Energy to be waste materials, and the residues will remain in storage until plans for disposal of the material can be developed. The packaging configurations which were safe for short-term storage may not be safe for long-term storage. Interviews with Rocky Flats personnel involved with packaging the residues reveal that more than one packaging configuration was used for some of the residues. A tabulation of packaging configurations was developed based on the information obtained from the interviews. A number of potential safety problems were identified during this study, including hydrogen generation from some residues and residue packaging materials, contamination containment loss, metal residue packaging container corrosion, and pyrophoric plutonium compound formation. Risk factors were developed for evaluating the risk potential of the various residue categories, and the residues in storage at Rocky Flats were ranked by risk potential. Preliminary drum head space gas sampling studies have demonstrated the potential for formation of flammable hydrogen-oxygen mixtures in some residue drums.

  15. Designs for Risk Evaluation and Management

    SciTech Connect

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
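
    The simulated annealing loop described above can be sketched compactly. The snippet below is a minimal illustration, not DREAM itself: the candidate locations, per-scenario detection times, mutation rule, and cooling schedule are all hypothetical stand-ins.

```python
import math
import random

def expected_ttd(scheme, detection_times):
    """Mean time to first detection across leak scenarios.
    detection_times[s][loc] = time when scenario s first becomes
    detectable at candidate location loc."""
    return sum(min(times[loc] for loc in scheme)
               for times in detection_times) / len(detection_times)

def anneal(candidates, detection_times, k, steps=2000, t0=10.0, seed=0):
    """Simulated annealing over k-sensor monitoring schemes:
    iteratively mutate one monitored location and accept worse
    schemes with a temperature-dependent Metropolis probability."""
    rng = random.Random(seed)
    scheme = rng.sample(candidates, k)
    best, cost = list(scheme), expected_ttd(scheme, detection_times)
    best_cost = cost
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9  # linear cooling schedule
        new = list(scheme)
        new[rng.randrange(k)] = rng.choice(
            [c for c in candidates if c not in scheme])
        new_cost = expected_ttd(new, detection_times)
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / temp):
            scheme, cost = new, new_cost
            if cost < best_cost:
                best, best_cost = list(scheme), cost
    return best, best_cost
```

    For example, with two leak scenarios that location 0 detects at t=1 while the other locations detect only at t=10, a one-sensor scheme converges to location 0.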

  16. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
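
    The two-stage (nested) Monte Carlo structure described here can be sketched as follows. All distributions and parameter values (slope factor, concentration, intake, body weight) are illustrative assumptions, not the study's: the outer loop samples uncertain parameters, the inner loop samples variability across exposed individuals.

```python
import random
import statistics

def nested_mc(n_outer=200, n_inner=500, seed=1):
    """Nested Monte Carlo: outer draws = uncertain parameters (here,
    the mean aquifer arsenic concentration), inner draws =
    inter-individual variability (intake, body weight). Returns the
    per-outer-draw 95th-percentile individual cancer risks, so
    uncertainty can be summarized separately from variability."""
    rng = random.Random(seed)
    slope_factor = 1.5  # hypothetical cancer slope factor, (mg/kg-day)^-1
    p95s = []
    for _ in range(n_outer):
        mean_conc = rng.lognormvariate(-4.0, 0.5)  # uncertain mean concentration (mg/L)
        risks = []
        for _ in range(n_inner):
            intake = rng.lognormvariate(0.7, 0.3)          # variable intake (L/day)
            body_weight = max(rng.gauss(70.0, 10.0), 1.0)  # variable body weight (kg)
            dose = mean_conc * intake / body_weight        # chronic dose (mg/kg-day)
            risks.append(slope_factor * dose)
        risks.sort()
        p95s.append(risks[int(0.95 * n_inner)])
    return p95s

# e.g. the median (over uncertainty) of the 95th percentile
# (over variability): statistics.median(nested_mc())
```

    Summarizing the returned list with a median and spread separates "how uncertain are we" from "how variable are individuals", which is the point of the nested design.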

  17. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10^-3 and 10^-4 per person per year (pppy) when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates. PMID:23508143
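
    The shape of such a simulation can be illustrated in a few lines. The approximate beta-Poisson dose-response with rotavirus parameters is standard in the QMRA literature, but the lognormal fit, the pathogen-to-E. coli ratio, and the water retained per serving below are hypothetical values, not the study's.

```python
import math
import random

def beta_poisson(dose, alpha=0.253, n50=6.17):
    """Approximate beta-Poisson dose-response; alpha and N50 are
    rotavirus values commonly used in the QMRA literature."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** -alpha

def annual_risk(n_trials=10000, exposures=52, seed=42):
    """Monte Carlo sketch: lognormal E. coli in irrigation water, an
    assumed pathogen:indicator ratio, assumed water retention on the
    crop, and per-event risks compounded over yearly servings."""
    rng = random.Random(seed)
    ratio = 1e-5         # assumed rotavirus : E. coli ratio
    retained_ml = 10.0   # assumed irrigation water retained per serving
    annual = []
    for _ in range(n_trials):
        ecoli_per_100ml = rng.lognormvariate(math.log(5e3), 1.0)
        dose = (ecoli_per_100ml / 100.0) * retained_ml * ratio
        p_event = beta_poisson(dose)
        annual.append(1.0 - (1.0 - p_event) ** exposures)
    annual.sort()
    return annual[len(annual) // 2], annual[int(0.95 * len(annual))]
```

    The median and 95th-percentile annual risks returned here play the same role as the study's median and 95th-percentile estimates.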

  18. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    PubMed

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops is a central priority of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens (including Escherichia coli O157:H7) on both the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance be provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health. PMID:27065414

  19. Evaluation of reference genes for quantitative RT-PCR in Lolium perenne

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative real-time RT-PCR provides an important tool for analyzing gene expression if proper internal standards are used. The aim of this study was to identify and evaluate reference genes for use in real-time quantitative RT-PCR in perennial ryegrass (Lolium perenne L.) during plant developmen...

  20. Anesthesia and the quantitative evaluation of neurovascular coupling

    PubMed Central

    Masamoto, Kazuto; Kanno, Iwao

    2012-01-01

    Anesthesia has broad actions that include changing neuronal excitability, vascular reactivity, and other baseline physiologies and eventually modifies the neurovascular coupling relationship. Here, we review the effects of anesthesia on the spatial propagation, temporal dynamics, and quantitative relationship between the neural and vascular responses to cortical stimulation. Previous studies have shown that the onset latency of evoked cerebral blood flow (CBF) changes is relatively consistent across anesthesia conditions compared with variations in the time-to-peak. This finding indicates that the mechanism of vasodilation onset is less dependent on anesthesia interference, while vasodilation dynamics are subject to this interference. The quantitative coupling relationship is largely influenced by the type and dosage of anesthesia, including the actions on neural processing, vasoactive signal transmission, and vascular reactivity. The effects of anesthesia on the spatial gap between the neural and vascular response regions are not fully understood and require further attention to elucidate the mechanism of vascular control of CBF supply to the underlying focal and surrounding neural activity. The in-depth understanding of the anesthesia actions on neurovascular elements allows for better decision-making regarding the anesthetics used in specific models for neurovascular experiments and may also help elucidate the signal source issues in hemodynamic-based neuroimaging techniques. PMID:22510601

  1. Benchmarking on the evaluation of major accident-related risk assessment.

    PubMed

    Fabbri, Luciano; Contini, Sergio

    2009-03-15

    This paper summarises the main results of the European project BEQUAR (Benchmarking Exercise in Quantitative Area Risk Assessment in Central and Eastern European Countries). This project is among the first attempts to explore how independent evaluations of the same risk study associated with a certain chemical establishment could differ from each other, and the consequent effects on the resulting area risk estimate. The exercise specifically aimed at exploring the manner and degree to which independent experts may disagree on the interpretation of quantitative risk assessments for the same entity. The project first compared the results of a number of independent expert evaluations of a quantitative risk assessment study for the same reference chemical establishment. This effort was then followed by a study of the impact of the different interpretations on the estimate of the overall risk in the area concerned. In order to improve the inter-comparability of the results, this exercise was conducted using a single tool for area risk assessment based on the ARIPAR methodology. The results of this study are expected to contribute to an improved understanding of the inspection criteria and practices used by the different national authorities responsible for the implementation of the Seveso II Directive in their countries. The activity was funded under the Enlargement and Integration Action of the Joint Research Centre (JRC), which aims at providing scientific and technological support for promoting integration of the New Member States and assisting the Candidate Countries on their way towards accession to the European Union. PMID:18657363

  2. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  3. Risk Evaluation, Driving, and Adolescents: A Typology.

    ERIC Educational Resources Information Center

    Harre, Niki

    2000-01-01

    Presents a typology outlining five psychological risk states that may be experienced by adolescent drivers. Identifies the habitually cautious driving and active risk avoidance states as desirable from a traffic safety viewpoint. Identifies reduced risk perception, acceptance of risk at a cost, and risk seeking states as undesirable. Examines…

  4. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM. Subsequently, the sample was stained and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome in the stained BFM image, used it to define the corresponding area in the IPM image, and determined a quantitative threshold for evaluating the volume of the acrosome.

  5. A New Simple Interferometer for Obtaining Quantitatively Evaluable Flow Patterns

    NASA Technical Reports Server (NTRS)

    Erdmann, S F

    1953-01-01

    The method described in the present report makes it possible to obtain interferometer records with the aid of any of the available Schlieren optics by the addition of very simple expedients; such records fundamentally need not be inferior to those obtained by other methods, such as the Mach-Zehnder interferometer. The method is based on the fundamental concept of the phase-contrast process developed by Zernike, but has in principle been enlarged to such an extent that it practically represents an independent interference method for general applications. Moreover, the method offers the possibility, if necessary, of superposing an apparent wedge field on the density field to be measured. The theory is explained on a purely physical basis and illustrated and proved by experimental data. A number of typical cases are cited and some quantitative results are reported.

  6. Quantitative Percussion Diagnostics For Evaluating Bond Integrity Between Composite Laminates

    NASA Astrophysics Data System (ADS)

    Poveromo, Scott Leonard

    Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex shaped surfaces. To overcome current NDT limitations, a new technology was utilized based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber reinforced composite materials. Experimental results indicate that this technology is capable of detecting 'kiss' bonds (very low adhesive shear strength), caused by the application of release agents on the bonding surfaces, between flat composite laminates bonded together with epoxy adhesive. Specifically, the local value of the loss coefficient determined from quantitative percussion testing was found to be significantly greater for a release coated panel compared to that for a well bonded sample. Also, the local value of the probe force or force returned to the probe after impact was observed to be lower for the release coated panels. The increase in loss coefficient and decrease in probe force are thought to be due to greater internal friction during the percussion event for poorly bonded specimens. NDT standards were also fabricated by varying the cure parameters of an epoxy film adhesive. Results from QPD for the variable cure NDT standards and lap shear strength measurements taken of mechanical test specimens were compared and analyzed. Finally, experimental results have been compared to a finite element analysis to understand the visco-elastic behavior of the laminates during percussion testing. This comparison shows how a lower quality bond leads to a reduction in the percussion force by biasing strain in the percussion tested side of the panel.

  7. The Children, Youth, and Families at Risk (CYFAR) Evaluation Collaboration.

    ERIC Educational Resources Information Center

    Marek, Lydia I.; Byrne, Richard A. W.; Marczak, Mary S.; Betts, Sherry C.; Mancini, Jay A.

    1999-01-01

    The Cooperative Extension Service's Children, Youth, and Families at Risk initiative is being assessed by the Evaluation Collaboration's three projects: state-strengthening evaluation project (resources to help states evaluate community programs); NetCon (evaluation of electronic and other networks); and National Youth at Risk Sustainability Study…

  8. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  9. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process. PMID:21197601

  10. Designs for Risk Evaluation and Management

    Energy Science and Technology Software Center (ESTSC)

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.

  11. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  12. A lighting metric for quantitative evaluation of accent lighting systems

    NASA Astrophysics Data System (ADS)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and for subject lighting on stage and in film and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid-state lighting. In this work, we propose an easy-to-apply quantitative measure of a scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the viewer. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
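
    As a rough illustration of such an information-theoretic metric (a sketch, not the authors' implementation), one can quantize an image's RGB values and compute the Shannon entropy of the resulting color histogram:

```python
import math
from collections import Counter

def color_entropy(pixels, bins=8):
    """Shannon entropy (in bits) of a quantized color distribution:
    each RGB channel is quantized into `bins` levels and the entropy
    of the joint color histogram is returned. Higher entropy means
    more distinguishable colors are visible from the viewpoint."""
    step = 256 // bins
    hist = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())
```

    A uniformly lit single-color scene scores 0 bits, while an image split evenly between two quantized colors scores 1 bit; an optimizer would then adjust luminaire position, orientation, and spectrum to raise this score.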

  13. Range sensors on marble surfaces: quantitative evaluation of artifacts

    NASA Astrophysics Data System (ADS)

    Guidi, Gabriele; Remondino, Fabio; Russo, Michele; Spinetti, Alessandro

    2009-08-01

    While 3D imaging systems are widely available and used, clear statements about the possible influence of material properties on the acquired geometrical data are still rather few. A material very often used in Cultural Heritage is marble, which is known to produce geometrical errors with range sensor technologies; the magnitude of these errors reported in the literature seems to vary considerably between studies. In this article, a thorough investigation with different types of active range sensors used on four types of marble surfaces has been performed. Two triangulation-based active sensors, employing laser stripe and white light pattern projection respectively, and one PW-TOF laser scanner were used in the experimentation. The analysis gave rather different results for the two categories of instruments. A negligible light penetration emerged from the triangulation-based equipment (below 50 microns with the laser stripe and even less with the pattern projection device), while with the TOF system the penetration was two orders of magnitude larger, quantitatively evidencing a source of systematic errors that any surveyor engaged in 3D scanning of Cultural Heritage sites and objects should take into account and correct.

  14. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  15. Predictive Heterosis in Multibreed Evaluations Using Quantitative and Molecular Approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Heterosis is the extra genetic boost in performance obtained by crossing two cattle breeds. It is an important tool for increasing the efficiency of beef production. It is also important to adjust data used to calculate genetic evaluations for differences in heterosis. Good estimates of heterosis...

  16. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased by only about 3% to 30%, depending on target and attacker skill level.
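
    For point estimates, a compromise graph reduces to a shortest-path problem: nodes are attack stages, edge weights are expected times-to-compromise, and the fastest expected attack is the minimum-weight path. A minimal sketch (with invented stages and times, not the paper's data), using Dijkstra's algorithm:

```python
import heapq

def min_time_to_compromise(graph, start, target):
    """Dijkstra over a compromise graph: graph maps each attack
    stage to a list of (next_stage, expected_days) edges. Returns
    the minimum expected time to reach the target stage, or
    infinity if no attack path exists."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")
```

    Re-running the search after raising the weights of remediated edges gives the relative increase in time-to-compromise, i.e., the estimated risk reduction.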

  17. A quantitative evaluation of the AVITEWRITE model of handwriting learning.

    PubMed

    Paine, R W; Grossberg, S; Van Gemmert, A W A

    2004-12-01

    Much sensory-motor behavior develops through imitation, as during the learning of handwriting by children. Such complex sequential acts are broken down into distinct motor control synergies, or muscle groups, whose activities overlap in time to generate continuous, curved movements that obey an inverse relation between curvature and speed. The adaptive vector integration to endpoint handwriting (AVITEWRITE) model of Grossberg and Paine (2000) [A neural model of corticocerebellar interactions during attentive imitation and predictive learning of sequential handwriting movements. Neural Networks, 13, 999-1046] addressed how such complex movements may be learned through attentive imitation. The model suggested how parietal and motor cortical mechanisms, such as difference vector encoding, interact with adaptively-timed, predictive cerebellar learning during movement imitation and predictive performance. Key psychophysical and neural data about learning to make curved movements were simulated, including a decrease in writing time as learning progresses; generation of unimodal, bell-shaped velocity profiles for each movement synergy; size scaling with isochrony, and speed scaling with preservation of the letter shape and the shapes of the velocity profiles; an inverse relation between curvature and tangential velocity; and a two-thirds power law relation between angular velocity and curvature. However, the model learned from letter trajectories of only one subject, and only qualitative kinematic comparisons were made with previously published human data. The present work describes a quantitative test of AVITEWRITE through direct comparison of a corpus of human handwriting data with the model's performance when it learns by tracing the human trajectories. The results show that model performance was variable across the subjects, with an average correlation between the model and human data of 0.89+/-0.10. The present data from simulations using the AVITEWRITE model

  18. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural-disaster impacts is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in the changing world across multiple scales. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective, because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Risk perception issues were described within the framework of risk analysis models. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As was shown, risk perception essentially influences the damage function. Based on prospect theory and on decision making under uncertainty, cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), which is a system of a risk perception function rp and a function of education and long-term experience c, as α(t) → (c - rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
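The superposition above can be sketched numerically. This is a toy illustration only, not the authors' implementation: the grid size, the functional forms of c(t) and rp(t), the clipping of α into [0, 1], and the constant δ are all assumptions made for this sketch.

```python
import numpy as np

# Toy sketch of P = delta * sum_{x,y} f(x,y) * p(x,y | alpha(t)),
# where alpha(t) = c(t) - rp(t) is the awareness function.
def awareness(t, c_rate=0.02, rp0=0.5, decay=0.1):
    """alpha(t) = c(t) - rp(t): education/experience trend minus risk perception."""
    c = c_rate * t                   # slowly accumulating education and experience
    rp = rp0 * np.exp(-decay * t)    # perception relaxing over time (assumed form)
    return c - rp

def total_risk(f, p, t, delta=1.0):
    """P = delta * sum over the grid of f(x,y) * p(x,y | alpha(t))."""
    alpha = np.clip(awareness(t), 0.0, 1.0)
    p_mod = p * (1.0 - alpha)        # assumed: higher awareness reduces damage
    return delta * np.sum(f * p_mod)

rng = np.random.default_rng(0)
f = rng.random((10, 10))
f /= f.sum()                         # normalized hazard distribution
p = rng.random((10, 10))             # damage fractions in [0, 1)
print(total_risk(f, p, t=0.0) > total_risk(f, p, t=20.0))  # awareness lowers risk
```

With these assumed forms, awareness grows with accumulated education and the effective damage, and hence total risk, falls over time.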

  19. Photoacoustic microscopy for quantitative evaluation of angiogenesis inhibitor

    NASA Astrophysics Data System (ADS)

    Chen, Sung-Liang; Burnett, Joseph; Sun, Duxin; Xie, Zhixing; Wang, Xueding

    2014-03-01

    We present photoacoustic microscopy (PAM) for the evaluation of angiogenesis inhibitors in a chick embryo model. Microvasculature in the chorioallantoic membrane (CAM) of the chick embryos was imaged by PAM, and optical microscopy (OM) images of the same set of CAMs were also acquired for comparison, to validate the results from PAM. The angiogenesis inhibitor Sunitinib, applied to the CAM at different concentrations, produced changes in microvascular density that were quantified by both PAM and OM imaging. PAM and OM imaging showed similar changes in microvascular density in response to the angiogenesis inhibitor at different doses, demonstrating that PAM has the potential to provide objective evaluation of anti-angiogenesis medication. In addition, PAM offers three-dimensional and functional imaging capabilities that OM lacks, so the emerging PAM technique may offer unique information on the efficacy of angiogenesis inhibitors and could benefit applications related to anti-angiogenesis treatments.

  20. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, and vary greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide the choice of treatment modality (vertebroplasty versus surgical fixation) and can be important for pre-surgical planning. We propose a height compass to evaluate the axial-plane spatial distribution of compression injury (anterior, posterior, lateral, and central) and to distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation, and compression fracture evaluation. A height compass is computed for each vertebra, in which the vertebral body is partitioned in the axial plane into 17 cells arranged about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings, divided into 8 equal-length arcs by rays at 8 common central angles. The radius of each ring increases multiplicatively, yielding a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against the corresponding octants in neighboring vertebrae. The height compass gives an intuitive display of the height distribution and can be used to easily identify fractured regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
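The 17-cell partition described above (a central node plus two octant bands) can be sketched as a point-to-cell mapping. The ring radii used here, given relative to the outer radius, are illustrative assumptions, not the paper's values.

```python
import numpy as np

def height_compass_cell(x, y, r1=0.33, r2=0.66):
    """Map an axial-plane point (in units of the outer radius, origin at the
    vertebral body centroid) to a cell id: 0 = central node, 1-8 = inner
    band octants, 9-16 = outer band octants.  Radii r1, r2 are illustrative."""
    r = np.hypot(x, y)
    if r <= r1:
        return 0
    octant = int(np.degrees(np.arctan2(y, x)) % 360 // 45)  # 0..7, 45-degree arcs
    band = 0 if r <= r2 else 1
    return 1 + 8 * band + octant

print(height_compass_cell(0.0, 0.0), height_compass_cell(0.9, 0.0))  # → 0 9
```

Mean endplate heights per cell could then be compared across neighboring vertebrae to flag anterior, posterior, lateral, or central compression.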

  1. Quantitative evaluation of phonetograms in the case of functional dysphonia.

    PubMed

    Airainer, R; Klingholz, F

    1993-06-01

    According to the laryngeal clinical findings, values on a scale were assigned to vocally trained and vocally untrained persons suffering from different types of functional dysphonia. The different types of dysphonia, from manifest hypofunctional to extreme hyperfunctional dysphonia, were classified by means of this scale. In addition, the subjects' phonetograms were measured and approximated by three ellipses, which made it possible to define phonetogram parameters. Selected phonetogram parameters were combined into linear combinations for the purpose of phonetographic evaluation. The linear combinations were intended to bring the phonetographic and clinical evaluations into correspondence as accurately as possible. Different linear combinations were necessary for male and female singers and nonsingers. Based on the reclassification of 71 patients and the new classification of 89 patients, it was possible to grade the types of functional dysphonia by means of computer-aided phonetogram evaluation with a clinically acceptable error rate. This method proved to be an important supplement to the conventional diagnostics of functional dysphonia. PMID:8353627

  2. How does the general public evaluate risk information? The impact of associations with other risks.

    PubMed

    Visschers, Vivianne H M; Meertens, Ree M; Passchier, Wim F; Devries, Nanne K

    2007-06-01

    There is a considerable body of knowledge about the way people perceive risks using heuristics and qualitative characteristics, and about how risk information should be communicated to the public. However, little is known about the way people use the perception of known risks (associated risks) to judge an unknown risk. In a first, qualitative study, six different risks were discussed in in-depth interviews and focus group interviews. The interviews showed that risk associations played a prominent role in forming risk perceptions. Associated risks were often mentioned spontaneously. Second, a survey study was conducted to confirm the importance of risk associations quantitatively. This study investigated whether people related unknown risks to known risks. This was indeed confirmed. Furthermore, some insight was gained into how and why people form risk associations. Results showed that the semantic category of the unknown risks was more important in forming associations than the perceived level of risk or specific risk characteristics. These findings were in line with the semantic network theory. Based on these two studies, we recommend using the mental models approach in developing new risk communications. PMID:17640218

  3. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
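The abstract does not give the RRET formula, so the following is a loudly hypothetical sketch of one way such an indicator could be computed: treat the residual-risk reliability as a series term and report the fractional reduction in baseline reliability.

```python
def reliability_impact_indicator(r_baseline, r_residual):
    """Hypothetical sketch (NOT the documented RRET formula): fold the
    residual-risk reliability into the baseline as a series element and
    return the fractional reduction in system reliability."""
    r_with_risk = r_baseline * r_residual   # assumed series combination
    return (r_baseline - r_with_risk) / r_baseline

# e.g. baseline reliability 0.98, residual-risk reliability 0.995
print(round(reliability_impact_indicator(0.98, 0.995), 4))  # → 0.005
```

Under the series assumption the indicator reduces to 1 - r_residual, i.e. the residual risks alone set the reliability penalty; the actual RRET derivation may differ.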

  4. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    NASA Astrophysics Data System (ADS)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depend highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to long-term natural processes, sudden events, and human impact (the macroscale of the heritage asset) but is also a function of the degradation processes within materials and structural elements driven by physical and chemical processes. These factors clearly cover different scales of the problem. Deteriorating processes in materials may be triggered by external influences or caused by internal chemical and/or physical changes in material properties and characteristics. Risk evaluation should therefore aim to reveal the specific active decay and damage mechanisms at both the mesoscale [type of decay and damage] and microscale [decay phenomenon mechanism] levels. A prerequisite for identifying and developing risk indicators is an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through the diagnosis of the conservation state of building materials and structure, through the identification of all potential hazards affecting them, and through the mapping of possible alterations during the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help

  5. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  6. RISK MANAGEMENT EVALUATION FOR CONCENTRATED ANIMAL FEEDING OPERATIONS

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) developed a Risk Management Evaluation (RME) to provide information needed to help plan future research in the Laboratory dealing with the environmental impact of concentrated animal feeding operations (CAFOs). Agriculture...

  7. Consistencies and inconsistencies underlying the quantitative assessment of leukemia risk from benzene exposure

    SciTech Connect

    Lamm, S.H.; Walters, A.S.; Wilson, R.; Byrd, D.M.; Grunwald, H.

    1989-07-01

    This paper examines recent risk assessments for benzene and observes a number of inconsistencies within the study and consistencies between studies that should affect the quantitative determination of the risk from benzene exposure. Comparisons across studies show that only acute myeloid leukemia (AML) is found to be consistently in excess with significant benzene exposure. The data from the Pliofilm study that forms the basis of most quantitative assessments reveal that all the AML cases came from only one of the three studied plants and that all the benzene exposure data came from the other plants. Hematological data from the 1940s from the plant from which almost all of the industrial hygiene exposure data come do not correlate well with the originally published exposure estimates but do correlate well with an alternative set of exposure estimates that are much greater than those originally published. Temporal relationships within the study are not consistent with those of other studies. The dose-response relationship is strongly nonlinear. Other data suggest that the leukemogenic effect of benzene is nonlinear and may derive from a threshold toxicity.

  8. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  10. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  11. Quantitative Evaluation of the Reticuloendothelial System Function with Dynamic MRI

    PubMed Central

    Liu, Ting; Choi, Hoon; Zhou, Rong; Chen, I-Wei

    2014-01-01

    Purpose To evaluate reticuloendothelial system (RES) function by real-time imaging of blood clearance and hepatic uptake of superparamagnetic iron oxide nanoparticles (SPIO) using dynamic magnetic resonance imaging (MRI) with two-compartment pharmacokinetic modeling. Materials and Methods Kinetics of blood clearance and hepatic accumulation were recorded in young adult male 01b74 athymic nude mice by dynamic T2*-weighted MRI after the injection of different doses of SPIO nanoparticles (0.5, 3 or 10 mg Fe/kg). The association parameter Kin, dissociation parameter Kout, and elimination constant Ke, derived from the dynamic data with a two-compartment model, were used to describe active binding to Kupffer cells and extrahepatic clearance. Clodrosomes and liposomes were utilized to deplete macrophages and to block RES function in order to evaluate the ability of the kinetic parameters to probe macrophage function and density. Results The two-compartment model provided a good description of all data and showed a low sum of squared residuals for all mice (0.27±0.03). A lower Kin, a lower Kout and a lower Ke were found after clodrosome treatment, whereas a lower Kin, a higher Kout and a lower Ke were observed after liposome treatment, in comparison to saline treatment (P<0.005). Conclusion Dynamic SPIO-enhanced MR imaging with two-compartment modeling can provide information on RES function at both the cell-number and receptor-function level. PMID:25090653
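The Kin/Kout/Ke parameters suggest a two-compartment system of rate equations (blood and Kupffer-cell-bound tracer). Below is a minimal forward-Euler sketch; the exact equation structure and all rate values are assumptions for illustration, not the paper's fitted model.

```python
import numpy as np

def two_compartment(kin, kout, ke, c0=1.0, t_end=60.0, dt=0.01):
    """Assumed structure: Kin moves tracer from blood to Kupffer-cell binding,
    Kout releases it back to blood, Ke eliminates it extrahepatically.
    Integrated with forward Euler; returns blood and liver curves."""
    n = int(t_end / dt)
    blood, liver = np.empty(n), np.empty(n)
    cb, cl = c0, 0.0                         # initial bolus entirely in blood
    for i in range(n):
        blood[i], liver[i] = cb, cl
        dcb = (-kin * cb + kout * cl - ke * cb) * dt
        dcl = (kin * cb - kout * cl) * dt
        cb, cl = cb + dcb, cl + dcl
    return blood, liver

# Illustrative rate constants (per unit time), not the published estimates.
blood, liver = two_compartment(kin=0.2, kout=0.01, ke=0.05)
print(blood[-1] < blood[0])  # blood signal clears over time
```

Fitting such curves to the dynamic T2* signal is what yields the Kin, Kout, and Ke estimates reported in the abstract.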

  12. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of the region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fibers, masses, and specks were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images when judging whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
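Classification by Mahalanobis distance, as used above for fiber and mass objects, can be sketched as follows. The two features and the training values are hypothetical stand-ins; the abstract does not specify which image features the authors extracted.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Distance of feature vector x from a class with the given mean/covariance."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# Hypothetical 2-feature training sets (e.g. contrast, elongation) for
# fiber vs. mass objects -- illustrative values only.
fibers = np.array([[0.80, 4.0], [0.70, 3.5], [0.90, 4.5], [0.75, 3.8]])
masses = np.array([[0.60, 1.2], [0.50, 1.0], [0.70, 1.4], [0.65, 1.1]])

def classify(x):
    """Assign x to the class whose distribution it is closest to."""
    d_fiber = mahalanobis(x, fibers.mean(axis=0), np.cov(fibers.T))
    d_mass = mahalanobis(x, masses.mean(axis=0), np.cov(masses.T))
    return "fiber" if d_fiber < d_mass else "mass"

print(classify(np.array([0.85, 4.2])))  # → fiber
```

Unlike Euclidean distance, the Mahalanobis form accounts for the spread and correlation of each class's features, which matters when the two feature scales differ as much as they do here.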

  13. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
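The original environment used LISP Flavors on the TI Explorer; the same object-oriented idea, i.e. gates and basic events as objects that evaluate themselves, can be sketched in Python. This sketch assumes independent basic events, which a production fault tree tool need not assume.

```python
class Event:
    """Basic event holding its failure probability."""
    def __init__(self, name, p):
        self.name, self.p = name, p
    def probability(self):
        return self.p

class Gate(Event):
    """AND/OR gate over independent child events."""
    def __init__(self, name, kind, children):
        self.name, self.kind, self.children = name, kind, children
    def probability(self):
        ps = [child.probability() for child in self.children]
        if self.kind == "AND":              # all children must fail
            out = 1.0
            for p in ps:
                out *= p
            return out
        out = 1.0                           # OR: 1 - prod(1 - p_i)
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

# Hypothetical tree: a redundant pump pair in parallel with a controller.
top = Gate("system-failure", "OR", [
    Gate("redundant-pair", "AND", [Event("pump-A", 0.01), Event("pump-B", 0.01)]),
    Event("controller", 0.001),
])
print(round(top.probability(), 7))  # → 0.0010999
```

Because each node stores its own data and evaluates itself recursively, intermediate results and reliability data stay attached to the tree structure, which is the storage-and-retrieval benefit the abstract describes.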

  14. GWAS implicates a role for quantitative immune traits and threshold effects in risk for human autoimmune disorders

    PubMed Central

    Gregersen, Peter K.; Diamond, Betty; Plenge, Robert M.

    2016-01-01

    Genome-wide association studies in human autoimmune disorders have provided a long list of alleles with rather modest degrees of risk. A large fraction of these associations are likely due to either quantitative differences in gene expression or amino acid changes that regulate quantitative aspects of the immune response. While functional studies are still lacking for most of these associations, we present examples of autoimmune disease risk alleles that influence quantitative changes in lymphocyte activation, cytokine signaling and dendritic cell function. The analysis of immune quantitative traits associated with autoimmune loci is clearly going to be an important component of understanding the pathogenesis of autoimmunity. This will require both new and more efficient ways of characterizing the normal immune system, as well as large population resources with which genotype-phenotype correlations can be convincingly demonstrated. Future development of new therapies will depend on understanding the mechanistic underpinnings of immune regulation by these new risk loci. PMID:23026397

  15. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, E.; Sitchler, A.; Maxwell, R. M.; McCray, J. E.

    2010-12-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters, and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. The approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding given the greater toxicity of lead at lower doses than arsenic. It was also found that higher background groundwater gradients yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and
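The two-stage (nested) Monte Carlo separation of uncertainty from variability can be sketched as follows: the outer loop samples uncertain parameters (fixed but unknown, e.g. a mean leaked-metal concentration), the inner loop samples variable ones (differing across individuals, e.g. water intake). All distributions, the dose model, and the slope factor are illustrative placeholders, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(1)

def nested_mc(n_outer=200, n_inner=500, slope=1.5e-3):
    """Outer loop: uncertainty; inner loop: inter-individual variability."""
    risks = np.empty((n_outer, n_inner))
    for i in range(n_outer):
        mean_conc = rng.lognormal(mean=-2.0, sigma=0.5)              # uncertain
        intake = rng.lognormal(mean=0.7, sigma=0.3, size=n_inner)    # variable
        dose = mean_conc * intake                                    # toy dose model
        risks[i] = slope * dose                                      # linear response
    # Each outer draw yields one population risk distribution; summarize the
    # 95th-percentile individual risk, then its spread across uncertainty.
    p95 = np.percentile(risks, 95, axis=1)
    return p95.mean(), np.percentile(p95, [5, 95])

mean_p95, (lo, hi) = nested_mc()
print(0.0 < lo < hi)
```

The payoff of nesting is exactly this final line: a confidence band on a population percentile, rather than a single pooled risk number that conflates what is unknown with what genuinely varies.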

  16. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  17. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. PMID:25951756

  18. Improved field experimental designs and quantitative evaluation of aquatic ecosystems

    SciTech Connect

    McKenzie, D.H.; Thomas, J.M.

    1984-05-01

    The paired-station concept and a log-transformed analysis of variance were used to evaluate zooplankton density data collected over five years at an electrical generating station on Lake Michigan. To discuss the example and the field design necessary for a valid statistical analysis, considerable background is provided on the questions of selecting (1) sampling station pairs, (2) experimentwise error rates for multi-species analyses, (3) levels of Type I and II error rates, and (4) procedures for conducting the field monitoring program, together with (5) a discussion of the consequences of violating statistical assumptions. Details for estimating the sample sizes necessary to detect changes of a specified magnitude are included. Both statistical and biological problems with monitoring programs (as now conducted) are addressed; serial correlation of successive observations in the time series obtained was identified as one principal statistical difficulty. The procedure reduces this problem to a level where statistical methods can be used confidently. 27 references, 4 figures, 2 tables.
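The sample-size estimation mentioned above can be illustrated with the textbook normal-approximation formula for a two-sample comparison of means; the paper's actual procedure (log-transformed ANOVA with adjustments for serial correlation) is more involved, so this is a first-cut sketch only.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Samples per station needed to detect a mean difference `delta`
    (two-sided test at level alpha, given within-group SD `sd`), using
    the standard normal-approximation formula n = 2((z_a + z_b) sd / delta)^2."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = z.inv_cdf(power)           # power quantile
    return ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Detecting a half-standard-deviation change with 80% power:
print(n_per_group(delta=0.5, sd=1.0))  # → 63
```

Serial correlation inflates the effective variance, so the counts from this formula should be treated as lower bounds for a monitoring time series.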

  19. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of points on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size and the overlap amount used in the correlation method affect the accuracy and resolution. For measurements on structures with complicated profiles, such as fillets, the interrogation window should remain large and the overlap amount should also be large. The surface condition also affects the accuracy. A white-painted surface with small black particles is suitable for measurement.
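Displacement tracking by the cross-correlation method can be sketched as a search for the integer-pixel shift that maximizes normalized cross-correlation between a reference window and the deformed image. Window and search sizes here are illustrative; a real implementation would add sub-pixel interpolation before differentiating displacements into strain.

```python
import numpy as np

def track_displacement(ref, cur, y, x, win=16, search=4):
    """Return the (dy, dx) shift of a win x win window centred at (y, x)
    that maximizes normalized cross-correlation between ref and cur."""
    h = win // 2
    tpl = ref[y - h:y + h, x - h:x + h].astype(float)
    tpl -= tpl.mean()
    best, best_dy, best_dx = -2.0, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = cur[y + dy - h:y + dy + h, x + dx - h:x + dx + h].astype(float)
            cand = cand - cand.mean()
            denom = np.sqrt((tpl ** 2).sum() * (cand ** 2).sum())
            score = (tpl * cand).sum() / denom if denom else -2.0
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

rng = np.random.default_rng(2)
img = rng.random((64, 64))                               # speckle-like surface
shifted = np.roll(np.roll(img, 2, axis=0), 1, axis=1)    # known (2, 1) shift
print(track_displacement(img, shifted, 32, 32))  # → (2, 1)
```

The speckle-like random texture plays the role of the black particles on the white-painted surface: without such texture the correlation peak is too flat to locate reliably.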

  20. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    NASA Technical Reports Server (NTRS)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size-detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results; predictions from analytic models based on finite element computer analysis, however, do not agree with the data in certain features. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  1. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories taken and reconciled, in order, by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class, and were assigned a criticality index score; a discrepancy rating system measured them systematically. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant was 8.59 (1,314 total) at admission and 9.41 (1,374 total) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect medications. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); the average criticality index reduction was 79.0%. Estimated cost savings from prevented adverse drug events (pADEs) were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  2. What are the chances? Evaluating risk and benefit information in consumer health materials

    PubMed Central

    Burkell, Jacquelyn

    2004-01-01

    Much consumer health information addresses disease risk or treatment risks and benefits, answering questions such as “How effective is this treatment?” or “What is the likelihood that this test will give a false positive result?” Insofar as it addresses outcome likelihood, this information is essentially quantitative in nature. This matters because quantitative information tends to be difficult to understand and therefore inaccessible to consumers. Information professionals typically examine reading level to determine the accessibility of consumer health information, but this measure does not adequately reflect the difficulty of quantitative information, including materials addressing risk and benefit. As a result, different methods must be used to evaluate this type of consumer health material. There are no standard guidelines or assessment tools for this task, but research in cognitive psychology provides insight into the best ways to present risk and benefit information to promote understanding and minimize interpretation bias. This paper offers an interdisciplinary bridge that brings these results to the attention of information professionals, who can then use them to evaluate consumer health materials addressing risks and benefits. PMID:15098049

  3. Risk Evaluation of Endocrine-Disrupting Chemicals

    PubMed Central

    Gioiosa, Laura; Palanza, Paola; vom Saal, Frederick S.

    2015-01-01

    We review here our studies on early exposure to low doses of the estrogenic endocrine-disrupting chemical bisphenol A (BPA) on behavior and metabolism in CD-1 mice. Mice were exposed in utero from gestation day (GD) 11 to delivery (prenatal exposure) or via maternal milk from birth to postnatal day 7 (postnatal exposure) to 10 µg/kg body weight/d of BPA or no BPA (controls). Bisphenol A exposure resulted in long-term disruption of sexually dimorphic behaviors. Females exposed to BPA pre- and postnatally showed increased anxiety and behavioral profiles similar to control males. We also evaluated metabolic effects in prenatally exposed adult male offspring of dams fed (from GD 9 to 18) with BPA at doses ranging from 5 to 50 000 µg/kg/d. The males showed an age-related significant change in a number of metabolic indexes ranging from food intake to glucose regulation at BPA doses below the no observed adverse effect level (5000 µg/kg/d). Consistent with prior findings, low but not high BPA doses produced significant effects for many outcomes. These findings provide further evidence of the potential risks that developmental exposure to low doses of the endocrine disrupter BPA may pose to human health, with fetuses and infants being highly vulnerable. PMID:26740806

  4. EVALUATING TOOLS AND MODELS USED FOR QUANTITATIVE EXTRAPOLATION OF IN VITRO TO IN VIVO DATA FOR NEUROTOXICANTS*

    EPA Science Inventory

    There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...

  5. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  6. Quantitative evaluation of proximal contacts in posterior composite restorations. Part I. Methodology.

    PubMed

    Wang, J C; Hong, J M

    1989-07-01

    An in vivo method for quantitatively measuring intertooth distance before and after placement of a Class 2 composite resin restoration has been developed. A Kaman Sciences KD-2611 non-contact displacement measuring system with a 1U unshielded sensor, based on the variable resistance of eddy currents, was used for the intraoral measurements. Quantitative evaluation of proximal wear can therefore be made preoperatively, postoperatively, and at subsequent recall intervals for posterior composite resin restorations. PMID:2810447

  7. Assessment of Semi-Quantitative Health Risks of Exposure to Harmful Chemical Agents in the Context of Carcinogenesis in the Latex Glove Manufacturing Industry.

    PubMed

    Yari, Saeed; Fallah Asadi, Ayda; Varmazyar, Sakineh

    2016-01-01

    Excessive exposure to chemicals in the workplace can cause poisoning and various diseases. To protect workers, it is therefore necessary to examine people's exposure to chemicals and the risks these materials pose. The purpose of this study is to evaluate the semi-quantitative health risks of exposure to harmful chemical agents, in the context of carcinogenesis, in a latex glove manufacturing industry. In this cross-sectional study, the semi-quantitative risk assessment method provided by the Department of Occupational Health of Singapore was used; indexes of LD50, carcinogenicity (ACGIH and IARC), and corrosiveness were applied to calculate the hazard rate, with the largest index taken as the basis of risk. To calculate the exposure rate, two methods were employed: the exposure index and the actual level of exposure. After risks were identified, groups H (high) and E (very high), classified as high risk, were considered. Of the 271 risks in total, only 39 (15%) were at a high risk level and 3% were very high (E). These risks involved only seven materials: only sulfuric acid was placed in group E, and the six other materials in group H were nitric acid (48.3%), chromic acid (6.9%), hydrochloric acid (10.3%), ammonia (3.4%), potassium hydroxide (20.7%) and chlorine (10.3%). Overall, the average hazard rate was estimated at 4 and the average exposure rate at 3.5. The health risks identified in this study show that the latex glove manufacturing industry carries a high level of risk because of carcinogens, acids, strong alkalis, and dangerous drugs. Given the average risk level, a safety-by-design strategy for the latex glove production industry should be placed on the agenda. PMID:27165227
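The hazard-rate/exposure-rate combination used in such semi-quantitative schemes is commonly computed as the square root of the product of the two indexes, each on a 1-5 scale. A minimal sketch under that assumption; the band labels are assumed for illustration and are not taken from the study:

```python
import math

def risk_rating(hazard_rate, exposure_rate):
    """Semi-quantitative risk rating: sqrt(HR x ER), with both
    the hazard rate and the exposure rate on a 1-5 scale."""
    return math.sqrt(hazard_rate * exposure_rate)

# Hypothetical band labels, chosen to mirror the abstract's H/E notation
BANDS = {1: "negligible (N)", 2: "low (L)", 3: "medium (M)",
         4: "high (H)", 5: "very high (E)"}

r = risk_rating(4, 3.5)              # the study's average HR and ER
print(round(r, 2), BANDS[round(r)])  # -> 3.74 high (H)
```

With the study's average hazard rate of 4 and exposure rate of 3.5, the rating rounds to 4, i.e. the "high" band, consistent with the abstract's conclusion that the industry carries a high level of risk.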

  8. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  9. Quantitative Microbial Risk Assessment for Campylobacter spp. on Ham in Korea

    PubMed Central

    2015-01-01

    The objective of this study was to evaluate the risk of illness from Campylobacter spp. on ham. To identify the hazards of Campylobacter spp. on ham, the general characteristics of and microbial criteria for Campylobacter spp., as well as campylobacteriosis outbreaks, were investigated. In the exposure assessment, the prevalence of Campylobacter spp. on ham was evaluated, and probabilistic distributions were prepared for the temperature of ham surfaces in retail markets and home refrigerators. In addition, raw data from the Korea National Health and Nutrition Examination Survey (KNHNES) 2012 were used to estimate the consumption amount and frequency of ham. In the hazard characterization, the Beta-Poisson model for Campylobacter spp. infection was used. For risk characterization, a simulation model was developed using the collected data, and the risk of Campylobacter spp. on ham was estimated with @RISK. The Campylobacter spp. cell counts on ham samples were below the detection limit (<0.70 Log CFU/g). The daily consumption of ham was 23.93 g per person, and the consumption frequency was 11.57%. The simulated mean initial contamination level of Campylobacter spp. on ham was −3.95 Log CFU/g, and the mean probability of illness per person per day from ham consumption was 2.20×10⁻¹². The risk of foodborne illness from Campylobacter spp. is therefore considered low. Furthermore, these results indicate that this microbial risk assessment of Campylobacter spp. should be useful in providing scientific evidence for setting criteria for Campylobacter spp. PMID:26761897
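The approximate Beta-Poisson dose-response model referenced in the hazard characterization has a simple closed form, P(infection) = 1 − (1 + dose/β)^(−α). A sketch with illustrative parameter values often quoted for Campylobacter, not necessarily those fitted in this study:

```python
def beta_poisson(dose, alpha=0.145, beta=7.59):
    """Approximate Beta-Poisson dose-response model:
    P(infection) = 1 - (1 + dose/beta) ** (-alpha).
    alpha and beta here are illustrative, not the study's fit."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

for dose in (1, 10, 100):  # ingested cfu
    print(dose, round(beta_poisson(dose), 3))
```

The model is sub-linear in dose: risk rises quickly at low doses and saturates toward 1, which is why even very low simulated contamination levels translate into a small but non-zero daily risk.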

  10. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  11. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    PubMed

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidence collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10⁻⁶ (95% CI: 7.25×10⁻⁸ to 1.92×10⁻⁵). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI: 0.042-0.97). The expected number of CBPP-infected animals exported in any given year is 1.28 (95% CI: 0.021-5.42). According to the risk estimate, an average of 2.73×10⁶ animals (90% CI: 10,674 to 5.9×10⁶) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI: 0.24-23.18) for the first infected animal to be included in a consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk. PMID:26427634
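The headline figures can be sanity-checked with simple point-estimate arithmetic, assuming independence between exported animals. Note that the study's Monte Carlo simulation propagates full uncertainty distributions through the scenario tree, so its reported mean annual probability (0.53) is lower than this naive point estimate:

```python
# Point estimates taken from the abstract
p_animal = 4.76e-6      # probability a single exported animal is infected
n_exported = 280_000    # maximum annual exports from the DFZ

# Expected number of infected animals per year, and the probability that
# at least one infected animal is exported, assuming independence
expected = p_animal * n_exported
p_at_least_one = 1.0 - (1.0 - p_animal) ** n_exported
print(round(expected, 2), round(p_at_least_one, 2))
```

The point estimate gives roughly 1.33 expected infected animals per year, close to the simulated mean of 1.28; the residual gap, and the lower simulated entry probability, reflect the skewed uncertainty distributions the simulation samples from.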

  12. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    fractured reservoir, fracture propagation, fault zones and their role in fluid migration into shallow aquifers). A quantitative risk assessment, which should be the main aim of future work in this field, places much higher demands on site-specific data, since estimating statistical parameter uncertainty requires site-specific parameter distributions. Research on risk assessment is already ongoing in related fields such as CO2 sequestration. We therefore propose that these methodologies be transferred to risk estimation for hydraulic fracturing, whether for unconventional gas or enhanced geothermal energy production. The overall aim should be to set common, transparent standards for different uses of the subsurface and their associated risks, and to communicate those standards to policy makers and stakeholders.

  13. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

    A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine whether current leak detection methodologies, excluding statistical leak detection, were sufficient based on the QRA results; if not, appropriate LDC for the leak detection system would need to be established. The UK PARLOC database was used to calculate pipeline failure rates, and the MMS software POSVCM was used for oil spill simulations. QRA results revealed that installing a statistically based leak detection system (LDS) can significantly reduce the time to leak detection, thereby mitigating the consequences of leakage. A sound LDC was defined based on the QRA results and comments from various LDS vendors, to help the emergency response team (ERT) quickly identify and locate leakage and employ the most effective measures to contain damage.

  14. Quantitative risk estimation for a Legionella pneumophila infection due to whirlpool use.

    PubMed

    Bouwknegt, Martijn; Schijven, Jack F; Schalk, Johanna A C; de Roda Husman, Ana Maria

    2013-07-01

    Quantitative microbiological risk assessment was used to quantify the risk associated with exposure to Legionella pneumophila in a whirlpool. Conceptually, air bubbles ascend to the surface, intercepting Legionella from the traversed water. At the surface each bubble bursts into predominantly noninhalable jet drops and inhalable film drops. Assuming that film drops carry half of the intercepted Legionella, totals of 4 (95% interval: 1-9) and 4.5×10⁴ (4.4×10⁴ to 4.7×10⁴) cfu/min were estimated to be aerosolized at concentrations of 1 and 1,000 Legionella per liter, respectively. Using a guinea pig dose-response model to represent humans, infection risks for active whirlpool use with 100 cfu/L water for 15 minutes were 0.29 (∼0.11-0.48) for susceptible males and 0.22 (∼0.06-0.42) for susceptible females. A L. pneumophila concentration of ≥1,000 cfu/L water was estimated to nearly always cause an infection (mean: 0.95; 95% interval: 0.9 to ∼1). Estimated infection risks were time dependent, ranging from 0.02 (0-0.11) for 1-minute exposures to 0.93 (0.86-0.97) for 2-hour exposures at a L. pneumophila concentration of 100 cfu/L water. Pool water in Dutch bathing establishments should contain <100 cfu Legionella/L water. This study suggests that stricter provisions might be required to assure adequate public health protection. PMID:23078231
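The reported time dependence is broadly consistent with an exponential dose-response model in which the inhaled dose scales linearly with exposure time. A sketch under that assumption, with a hypothetical rate constant tuned so the 1-minute risk at 100 cfu/L matches the abstract's 0.02 (the longer-exposure outputs only approximate the abstract's figures):

```python
import math

def infection_risk(minutes, k=0.0202):
    """Exponential dose-response, P = 1 - exp(-k * t), assuming the
    inhaled dose grows linearly with exposure time t. The rate k is a
    hypothetical constant chosen to reproduce the ~0.02 risk reported
    for a 1-minute exposure at 100 cfu/L; it is not a fitted parameter."""
    return 1.0 - math.exp(-k * minutes)

for t in (1, 15, 120):  # exposure time in minutes
    print(t, round(infection_risk(t), 2))
```

This yields about 0.02 at 1 minute, 0.26 at 15 minutes, and 0.91 at 2 hours, close to the abstract's 0.02, 0.29, and 0.93; the remaining gap is expected, since the study models aerosolization, deposition, and sex-specific susceptibility explicitly.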

  15. Risk in Enterprise Cloud Computing: Re-Evaluated

    ERIC Educational Resources Information Center

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  16. Evaluation of bone metabolism in newborn twins using quantitative ultrasound and biochemical parameters.

    PubMed

    Kara, Semra; Güzoğlu, Nilüfer; Göçer, Emine; Arıkan, Fatma Inci; Dilmen, Uğur; Dallar Bilge, Yıldız

    2016-03-01

    Metabolic bone disease (MBD) is one of the important complications of prematurity. Early and adequate nutritional interventions may reduce the incidence and potential complications of MBD. The present study aimed to evaluate bone metabolism in twins via biochemical parameters and quantitative ultrasound (QUS) and to compare the results between twin pairs. Moreover, twin infants were evaluated in terms of potential risk factors likely to have impact on MBD. Forty-three pairs of twins were included in the study. Serum calcium, phosphorus, magnesium, and alkaline phosphatase concentrations were assessed and bone mineral density was measured using QUS (speed of sound, SOS) at postnatal 30 d. Co-twin with the higher birth weight was assigned to Group 1 (n = 36) and the other twin was assigned to Group 2 (n = 36). Birth weight and head circumference were significantly higher in the infants of Group 1 compared with Group 2. No significant difference was found among the groups in terms of gender, history of resuscitation, length of stay in intensive care unit (ICU) or in the incubator, duration of total parenteral nutrition (TPN), type of nutrition, vitamin D use, biochemical parameters, and the SOS value. The factors likely to affect SOS, including type of pregnancy, maternal drug use, gender of infant, birth weight, head circumference at birth, gestational week, length of stay at the ICU, duration of TPN, type of nutrition, resuscitation, vitamin D use, and levels of calcium, phosphorus, magnesium, and alkaline phosphatase were entered into the model. The phosphorus level and the maternal drug use were found to be the factors that significantly reduced SOS, whereas pregnancy after assisted reproductive techniques was found to be a significant enhancing factor. PMID:25777793

  17. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous reference genes of rice, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four rice reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties of various geographic and phylogenetic origins, and evaluated their quantification performance using quantitative real-time PCR and GeNorm analysis, which computes an expression-stability measure (the M value) that is inversely related to gene stability. The sequencing analysis showed that the reported GOS9 and PLD TaqMan probe regions had detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, while no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detectable in the cultivars with SNPs using the GOS9 and PLD quantitative PCR systems. Although the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF quantitative PCR systems to be applicable for rice endogenous reference assays, with less variation among Ct values, good reproducibility in quantitative assays, and low M values. PMID:20961039
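The GeNorm M value used here is the mean pairwise variation of a candidate gene against every other candidate: for gene j, the average over all other genes k of the standard deviation, across samples, of log2(expression_j / expression_k). A minimal sketch on synthetic data (invented for the example, not the study's measurements):

```python
import numpy as np

def genorm_m(expr):
    """geNorm expression-stability measure M for each gene.
    expr: (n_samples, n_genes) array of positive relative expression.
    M_j = mean over k != j of std across samples of log2(expr_j/expr_k);
    a lower M indicates a more stable candidate reference gene."""
    log_expr = np.log2(expr)
    n_genes = log_expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        vs = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
              for k in range(n_genes) if k != j]
        m[j] = np.mean(vs)
    return m

# Two tightly co-varying (stable) genes and one noisy (unstable) gene
rng = np.random.default_rng(1)
base = rng.uniform(1, 2, size=(8, 1))                   # shared sample effect
stable = base * rng.normal(1.0, 0.02, (8, 2))           # low-noise genes
noisy = base * rng.normal(1.0, 0.5, (8, 1)).clip(0.1)   # high-noise gene
m = genorm_m(np.hstack([stable, noisy]))
print(np.round(m, 3))  # the third gene gets the largest (worst) M
```

Because M is built from ratios, a shared sample-to-sample effect cancels out, and only disagreement between candidate genes inflates the score; this is why SNPs in a probe region, which distort quantification for some cultivars, surface as high M values.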

  18. Retrospective analysis of a listeria monocytogenes contamination episode in raw milk goat cheese using quantitative microbial risk assessment tools.

    PubMed

    Delhalle, L; Ellouze, M; Yde, M; Clinquart, A; Daube, G; Korsak, N

    2012-12-01

    In 2005, the Belgian authorities reported a Listeria monocytogenes contamination episode in cheese made from raw goat's milk. The presence of an asymptomatic shedder goat in the herd caused this contamination. On the basis of data collected at the time of the episode, a retrospective study was performed using an exposure assessment model covering the production chain from the milking of goats up to delivery of cheese to the market. Predictive microbiology models were used to simulate the growth of L. monocytogenes during the cheese process as a function of temperature, pH, and water activity. The model showed significant growth of L. monocytogenes during chilling and storage of the milk collected the day before cheese production (median increase of 2.2 log CFU/ml) and during the addition of starter and rennet to the milk (median increase of 1.2 log CFU/ml). The L. monocytogenes concentration in the fresh unripened cheese was estimated at 3.8 log CFU/g (median). This result is consistent with the number of L. monocytogenes in the fresh cheese (3.6 log CFU/g) reported during the contamination episode. A variance-based sensitivity analysis identified the most important factors affecting the cheese contamination, and a scenario analysis then evaluated several options for risk mitigation. Thus, by using quantitative microbial risk assessment tools, this study provides reliable information to identify and control critical steps in a local production chain of cheese made from raw goat's milk. PMID:23212008
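Predictive-microbiology models of the kind used here typically pair a primary growth model (how the log count rises over time) with a secondary model relating growth rate to environmental conditions such as temperature. A sketch using a Ratkowsky square-root secondary model with illustrative L. monocytogenes parameters (the parameter values, and the lag-free primary model, are simplifying assumptions, not the study's fitted model):

```python
def sqrt_model_mu(temp_c, b=0.023, t_min=-1.18):
    """Ratkowsky square-root secondary model for the maximum growth
    rate (log10 CFU per hour): sqrt(mu) = b * (T - Tmin). The values
    of b and t_min are illustrative for L. monocytogenes."""
    if temp_c <= t_min:
        return 0.0
    return (b * (temp_c - t_min)) ** 2

def log_growth(log_n0, temp_c, hours):
    """Exponential-phase primary model on the log10 scale (no lag)."""
    return log_n0 + sqrt_model_mu(temp_c) * hours

# Overnight (16 h) storage of milk at 10 C, starting from 1 log CFU/ml
print(round(log_growth(1.0, 10.0, 16), 2))  # -> 2.06
```

Under these assumed parameters, 16 hours at 10 °C adds roughly 1 log CFU/ml, the same order of magnitude as the median 2.2-log increase the study attributes to chilling and overnight storage of the milk.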

  19. Quantitative phase evaluation of dynamic changes on cell membrane during laser microsurgery.

    PubMed

    Yu, Lingfeng; Mohanty, Samarendra; Liu, Gangjun; Genc, Suzanne; Chen, Zhongping; Berns, Michael W

    2008-01-01

    The ability to inject exogenous material as well as to alter subcellular structures in a minimally invasive manner using a laser microbeam has been useful for cell biologists to study the structure-function relationship in complex biological systems. We describe a quantitative phase laser microsurgery system, which takes advantage of the combination of laser microirradiation and short-coherence interference microscopy. Using this method, quantitative phase images and the dynamic changes of phase during the process of laser microsurgery of red blood cells (RBCs) can be evaluated in real time. This system would enable absolute quantitation of localized alteration/damage to transparent phase objects, such as the cell membrane or intracellular structures, being exposed to the laser microbeam. Such quantitation was not possible using conventional phase-contrast microscopy. PMID:19021378

  20. Evaluation of Chair-Side Assays in High Microbiological Caries-Risk Subjects.

    PubMed

    Saravia, Marta Estela; Silva, Lea Assed Bezerra; Silva, Raquel Assed Bezerra; Lucisano, Marília Pacífico; Echevarría, Andrea Uribe; Echevarría, Jorge Uribe; Nelson-Filho, Paulo

    2015-01-01

    The aim of this study was to evaluate the commercial chair-side assays Saliva-Check Mutans and Clinpro™ Cario L-Pop™ in dental students at high microbiological caries risk, using a conventional semi-quantitative, culture-based colony-counting technique as the reference method. Saliva samples from 93 subjects of both sexes aged 18-26 years were seeded (Köhler and Bratthall method) on plates containing SB-20M culture medium, and 12 subjects with high caries risk were selected. These 12 individuals underwent caries risk determination with the two commercial rapid-detection chair-side assays (Saliva-Check Mutans and Clinpro™ Cario L-Pop™) according to the manufacturers' instructions. The results were analyzed by the Kappa correlation test using SAS statistical software. There was perfect agreement (Kappa=1) among the three caries risk evaluation methods, the two chair-side assays and the semi-quantitative CFU count (control), in all subjects. The results suggest that the commercial chair-side assays evaluated in this study may be practical and useful for identifying subjects at high microbiological caries risk. PMID:26963201

  1. Modelling public risk evaluation of natural hazards: a conceptual approach

    NASA Astrophysics Data System (ADS)

    Plattner, Th.

    2005-04-01

    In recent years, the approach to natural hazards in Switzerland has shifted from being hazard-oriented towards being risk-based. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources, creates an optimization problem. The new focus therefore lies on mitigating the hazard's risk in accordance with economic, ecological and social considerations. This modern approach requires that not only technological, engineering or scientific aspects of the hazard definition and risk computation be considered, but also public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the risk situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they do not know what risk level the public is willing to accept. Consequently, the authorities need to know what society thinks about risks. A formalized model that allows at least a crude simulation of public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach for such an evaluation model using perception-affecting factors (PAF), evaluation criteria (EC) and several factors related not to the risk itself but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by comparing the perceived risk R_i,perc with the acceptable risk R_i,acc.

  2. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    PubMed Central

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-01-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output. PMID:26430292

  3. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  4. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  5. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  6. Quantitative risk assessment for lung cancer from exposure to metal ore dust

    SciTech Connect

    Fu, H.; Jing, X.; Yu, S.; Gu, X.; Wu, K.; Yang, J.; Qiu, S. )

    1992-09-01

    To quantitatively assess the risk of lung cancer among metal miners, a historical cohort study was conducted. The cohort consisted of 1113 miners who were employed in underground work for at least 12 months between January 1, 1960 and December 12, 1974. From the records of dust concentration, a cumulative dust dose was estimated for each miner in the cohort. There were 162 deaths in total, 45 of them from lung cancer, giving an SMR of 2184. The SMR for lung cancer increased from 1019 for those with a cumulative dust dose of less than 500 mg-year to 2469 for those with a dose greater than 4500 mg-year. Furthermore, the risk in the highest category of combined cumulative dust dose and cigarette smoking was 46-fold greater than in the lowest category of dust dose and smoking. This study showed an exposure-response relationship between metal ore dust and lung cancer, and an interaction between smoking and metal ore dust exposure on lung cancer risk.
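    The SMRs quoted above are standardized mortality ratios: observed deaths divided by the deaths expected from reference-population rates, scaled by 100. A minimal sketch; the expected count below is hypothetical, back-calculated to reproduce the reported figure, and is not taken from the study:

```python
# Standardized mortality ratio (SMR): observed deaths / expected deaths, x100.
def smr(observed: float, expected: float) -> float:
    """SMR scaled by 100, the convention used in the abstract."""
    return observed / expected * 100.0

# Hypothetical expected count, back-calculated so that the 45 observed
# lung-cancer deaths reproduce the reported SMR of 2184.
expected_lung_cancer = 45 / 21.84
print(round(smr(45, expected_lung_cancer)))  # -> 2184
```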

  7. Approaches for Assessing Risks to Sensitive Populations: Lessons Learned from Evaluating Risks in the Pediatric Population

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a "tool chest" of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of t...

  8. Approaches for assessing risks to sensitive populations: Lessons learned from evaluating risks in the pediatric populations*

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a 'tool chest' of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of...

  9. Evaluating Risk Of Failure With Limited Information

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Creager, M.; Newlin, L. E.; Sutharshana, S.

    1993-01-01

    Report describes probabilistic failure assessment (PFA), developed for application to spaceflight systems for which sufficient testing of hardware to ensure reliability is not feasible, yet for which it must be ascertained that critical failure modes are extremely unlikely to occur during service. PFA can be applied to any failure mode described by quantitative models of the physics and mechanics of the failure phenomena, such as fatigue crack initiation or propagation in structures, leakage of seals, wear in bearings, and erosion of arcjet thruster cathodes.
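    The PFA approach described above estimates failure probability from quantitative physics-of-failure models rather than exhaustive hardware testing. A Monte Carlo stress-strength sketch illustrates the general idea; the distributions and parameters here are invented for illustration and are not from the report:

```python
import random

def failure_probability(n_trials: int = 100_000, seed: int = 1) -> float:
    """Monte Carlo stress-strength sketch: the part fails whenever the
    applied load exceeds its strength. Both distributions and their
    parameters are illustrative assumptions, not values from the report."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        strength = rng.gauss(100.0, 10.0)  # hypothetical strength, MPa
        load = rng.gauss(60.0, 15.0)       # hypothetical service load, MPa
        if load > strength:
            failures += 1
    return failures / n_trials

print(failure_probability())  # roughly 0.013 for these assumed parameters
```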

  10. Quantitative approach of risk management strategies for hepatitis a virus-contaminated oyster production areas.

    PubMed

    Thebault, A; Le Saux, J-C; Pommepuy, M; Le Guyader, S; Lailler, R; Denis, J-B

    2012-07-01

    It is not yet known whether using the new molecular tools to monitor hepatitis A virus (HAV) in shellfish production areas could be useful for improving food safety. HAV contamination can be acute in coastal areas, such as Brittany, France, where outbreaks of hepatitis A have already occurred and have been linked to the consumption of raw shellfish. A quantitative probabilistic approach was carried out to estimate the mean annual risk of hepatitis A in an adult population of raw oyster consumers. Two hypothetical scenarios of contamination were considered, the first a rare and brief event and the second regular and prolonged episodes of contamination. Fourteen monitoring and management strategies were simulated. Their effects were assessed by the relative reduction in mean annual risk. The duration of closure after abnormal detection in the shellfish area was also considered. Among the strategies tested, results show that monthly molecular reverse transcription PCR monitoring of HAV is more useful than bacterial surveys. In terms of management measures, early closure of the shellfish area without waiting for confirmatory analysis was shown to be the most efficient strategy. When contamination is very short-lived and homogeneous in the shellfish production area, waiting for three negative results before reopening the area for harvest wastes time. When contamination is not well identified or is heterogeneous, it can be harmful not to wait for three negative results. In addition, any preventive measures, such as improving sewage treatment or producing shellfish in safer areas, that can reduce contamination by at least 2 log units are more efficient and less costly. Finally we show that controlling and managing transferred shellfish are useful and can play an important role in preventing cases. Qualitative results from HAV monitoring can advantageously supplement other measures that improve the safety of shellfish products in exposed

  11. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in production of pork meat and products across MSs. For the case study MSs the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10⁴ CFU/g). Therefore, it is concluded that interventions should be focused on either decreasing the level of Salmonella in the feces of infected pigs, the introduction of a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or a control step to reduce the level of Salmonella on the carcass post-evisceration. PMID:27002672
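    Under an independence assumption, per-serving probabilities like those quoted above (between 1 in 100,000 and 1 in 10 million) can be converted into an annual risk; the serving count below is a hypothetical illustration, not a figure from the QMRA:

```python
def annual_risk(p_serving: float, servings_per_year: int) -> float:
    """Probability of at least one illness in a year, treating each
    serving as an independent exposure with per-serving risk p_serving."""
    return 1.0 - (1.0 - p_serving) ** servings_per_year

# Hypothetical consumer eating 50 pork servings a year, evaluated at the
# two per-serving risk bounds quoted in the abstract.
for p in (1e-5, 1e-7):
    print(f"p_serving={p:.0e}  annual risk={annual_risk(p, 50):.2e}")
```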

  12. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue to prevent the occurrence of cirrhosis and to initiate appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography can visualize tissue hardness/softness, and its clinical usefulness has been studied to detect and evaluate tumors. We have recently reported that the texture of elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression and simulated the process by which hepatic fibrosis affects elasticity images and compared the results with those clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  13. Evaluation of reference genes for quantitative RT-PCR in Lolium temulentum under abiotic stress

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Lolium temulentum is a valuable model grass species for the study of stress in forage and turf grasses. Gene expression analysis by quantitative real time RT-PCR relies on the use of proper internal standards. The aim of this study was to identify and evaluate reference genes for use in real-time q...

  14. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  15. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  16. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  17. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  18. Toward Web-Site Quantitative Evaluation: Defining Quality Characteristics and Attributes.

    ERIC Educational Resources Information Center

    Olsina, L; Rossi, G.

    This paper identifies World Wide Web site characteristics and attributes and groups them in a hierarchy. The primary goal is to classify the elements that might be part of a quantitative evaluation and comparison process. In order to effectively select quality characteristics, different users' needs and behaviors are considered. Following an…

  19. Interdisciplinary program for quantitative nondestructive evaluation. Semi-annual report, October 1, 1982-February 28, 1983

    SciTech Connect

    Not Available

    1983-01-01

    Separate abstracts were prepared for the papers published in the following areas: (1) Application of Ultrasonic Quantitative Nondestructive Evaluation to Radio Frequency System Window Problems, (a) Improvements in Probability of Detection and (b) Sizing of Internal Flaws in Bore and Web Geometries; (2) Electromagnetic Detection and Sizing; (3) New Technical Opportunities; and (4) New Flaw Detection Techniques.

  20. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  1. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  2. Evaluating Uncertainty to Strengthen Epidemiologic Data for Use in Human Health Risk Assessments

    PubMed Central

    Burns, Carol J.; Wright, J. Michael; Bateson, Thomas F.; Burstyn, Igor; Goldstein, Daniel A.; Klaunig, James E.; Luben, Thomas J.; Mihlan, Gary; Ritter, Leonard; Schnatter, A. Robert; Symons, J. Morel; Don Yi, Kun

    2014-01-01

    Background: There is a recognized need to improve the application of epidemiologic data in human health risk assessment especially for understanding and characterizing risks from environmental and occupational exposures. Although there is uncertainty associated with the results of most epidemiologic studies, techniques exist to characterize uncertainty that can be applied to improve weight-of-evidence evaluations and risk characterization efforts. Methods: This report derives from a Health and Environmental Sciences Institute (HESI) workshop held in Research Triangle Park, North Carolina, to discuss the utility of using epidemiologic data in risk assessments, including the use of advanced analytic methods to address sources of uncertainty. Epidemiologists, toxicologists, and risk assessors from academia, government, and industry convened to discuss uncertainty, exposure assessment, and application of analytic methods to address these challenges. Synthesis: Several recommendations emerged to help improve the utility of epidemiologic data in risk assessment. For example, improved characterization of uncertainty is needed to allow risk assessors to quantitatively assess potential sources of bias. Data are needed to facilitate this quantitative analysis, and interdisciplinary approaches will help ensure that sufficient information is collected for a thorough uncertainty evaluation. Advanced analytic methods and tools such as directed acyclic graphs (DAGs) and Bayesian statistical techniques can provide important insights and support interpretation of epidemiologic data. Conclusions: The discussions and recommendations from this workshop demonstrate that there are practical steps that the scientific community can adopt to strengthen epidemiologic data for decision making. Citation: Burns CJ, Wright JM, Pierson JB, Bateson TF, Burstyn I, Goldstein DA, Klaunig JE, Luben TJ, Mihlan G, Ritter L, Schnatter AR, Symons JM, Yi KD. 2014. Evaluating uncertainty to strengthen

  3. Evaluation of perioperative risk in elderly patients.

    PubMed

    Aubrun, F; Gazon, M; Schoeffler, M; Benyoub, K

    2012-05-01

    From a medical point of view, aging is characterized by a potential failure to maintain homeostasis under conditions of physiological stress, a failure associated with increased vulnerability. Physiological changes associated with aging are progressive, but concomitant injury or disease may rapidly worsen the health status of the patient. Increasing age independently predicts morbidity and mortality. Hypertension and dyspnea are probably two of the most frequent risk factors in elderly patients. The history of the elderly patient should assess functional status, including whether cardiovascular reserve is sufficient to withstand very stressful operations. The type of surgery has important implications for perioperative risk, and emergency surgery, particularly in the elderly, is associated with a high risk of morbidity. Elderly patients who are otherwise acceptable surgical candidates should not be denied surgery based solely on their age and concerns about postoperative renal, cardiovascular, cognitive, or pulmonary complications. Renal impairment becomes more prevalent with advancing age as the glomerular filtration rate decreases. The surgical site is the single most important predictor of pulmonary complications. Concerning postoperative comfort and neurological complications, age is the highest risk factor for developing dementia. Pain is underassessed and undermanaged, and the elderly are at higher risk of adverse consequences from unrelieved or undertreated pain. PMID:22269928

  4. Quantitative evaluation of strategies for erosion control on a railway embankment batter

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Y.; Sibley, J.; Ashwath, N.

    2001-12-01

    Strategies for erosion control on a railway embankment batter (side slope) are quantitatively evaluated in this paper. The strategies were centred on control (do nothing treatment), grass seeding, gypsum application, jute mat (an erosion control blanket) placement and planting hedgerows of Monto vetiver grass. Rainfall and runoff were monitored at 1 min intervals on 10 m wide embankment batter plots during 1998 and 1999. Total bedload and suspended sediment eroded from the plots were also measured but only for a group of storm events within sampling intervals. It has been demonstrated that vetiver grass is not cost-effective in controlling erosion on railway batters within Central Queensland region. Seeding alone could cause 60% reduction in the erosion rate compared with the control treatment. Applying gypsum to the calcium-deficient soil before seeding yielded an additional 25% reduction in the erosion rate. This is the result, primarily, of 100% grass cover establishment within seven months of sowing. Therefore, for railway embankment batter erosion control, the emphasis needs to be on rapid establishment of 100% grass cover. For rapid establishment of grass cover, irrigation is necessary during the initial stages of growth as the rainfall is unpredictable and the potential evaporation exceeds rainfall in the study region. The risk of seeds and fertilizers being washed out by short-duration and high-intensity rainfall events during the establishment phase may be reduced by the use of erosion control blankets on sections of the batters. Accidental burning of grasses on some plots caused serious erosion problems, resulting in very slow recovery of grass growth. It is therefore recommended that controlled burning of grasses on railway batters should be avoided to protect batters from being exposed to severe erosion.

  5. Investigation of the genetic association between quantitative measures of psychosis and schizophrenia: a polygenic risk score analysis.

    PubMed

    Derks, Eske M; Vorstman, Jacob A S; Ripke, Stephan; Kahn, Rene S; Ophoff, Roel A

    2012-01-01

    The presence of subclinical levels of psychosis in the general population may imply that schizophrenia is the extreme expression of more or less continuously distributed traits in the population. In a previous study, we identified five quantitative measures of schizophrenia (positive, negative, disorganisation, mania, and depression scores). The aim of this study is to examine the association between a direct measure of genetic risk of schizophrenia and the five quantitative measures of psychosis. Estimates of the log of the odds ratios of case/control allelic association tests were obtained from the Psychiatric GWAS Consortium (PGC) (minus our sample), which included genome-wide genotype data of 8,690 schizophrenia cases and 11,831 controls. These data were used to calculate genetic risk scores in 314 schizophrenia cases and 148 controls from the Netherlands for whom genotype data and quantitative symptom scores were available. The genetic risk score of schizophrenia was significantly associated with case-control status (p < 0.0001). In the case-control sample, the five psychosis dimensions were found to be significantly associated with genetic risk scores; the correlations ranged between .15 and .27 (all p < .001). However, these correlations were not significant in schizophrenia cases or controls separately. While this study confirms the presence of a genetic risk for schizophrenia as a categorical diagnostic trait, we did not find evidence for the genetic risk underlying quantitative schizophrenia symptom dimensions. This does not necessarily imply that a genetic basis is nonexistent, but does suggest that it is distinct from the polygenic risk score for schizophrenia. PMID:22761660

  6. Investigation of the Genetic Association between Quantitative Measures of Psychosis and Schizophrenia: A Polygenic Risk Score Analysis

    PubMed Central

    Ripke, Stephan; Kahn, Rene S.; Ophoff, Roel A.

    2012-01-01

    The presence of subclinical levels of psychosis in the general population may imply that schizophrenia is the extreme expression of more or less continuously distributed traits in the population. In a previous study, we identified five quantitative measures of schizophrenia (positive, negative, disorganisation, mania, and depression scores). The aim of this study is to examine the association between a direct measure of genetic risk of schizophrenia and the five quantitative measures of psychosis. Estimates of the log of the odds ratios of case/control allelic association tests were obtained from the Psychiatric GWAS Consortium (PGC) (minus our sample), which included genome-wide genotype data of 8,690 schizophrenia cases and 11,831 controls. These data were used to calculate genetic risk scores in 314 schizophrenia cases and 148 controls from the Netherlands for whom genotype data and quantitative symptom scores were available. The genetic risk score of schizophrenia was significantly associated with case-control status (p < 0.0001). In the case-control sample, the five psychosis dimensions were found to be significantly associated with genetic risk scores; the correlations ranged between .15 and .27 (all p < .001). However, these correlations were not significant in schizophrenia cases or controls separately. While this study confirms the presence of a genetic risk for schizophrenia as a categorical diagnostic trait, we did not find evidence for the genetic risk underlying quantitative schizophrenia symptom dimensions. This does not necessarily imply that a genetic basis is nonexistent, but does suggest that it is distinct from the polygenic risk score for schizophrenia. PMID:22761660
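    A polygenic risk score of the kind used in the two records above is typically the allele-dosage-weighted sum of per-SNP log odds ratios from a discovery GWAS. A minimal sketch with invented weights (not PGC estimates):

```python
import math

def polygenic_risk_score(dosages, log_odds_ratios):
    """Sum of risk-allele dosages (0, 1, or 2 copies per SNP) weighted by
    the log odds ratios estimated in a discovery GWAS."""
    return sum(d * w for d, w in zip(dosages, log_odds_ratios))

# Hypothetical individual genotyped at four SNPs; the odds ratios are
# invented for illustration and are not PGC estimates.
weights = [math.log(1.10), math.log(1.05), math.log(0.95), math.log(1.20)]
dosages = [2, 1, 0, 1]
print(f"{polygenic_risk_score(dosages, weights):.4f}")  # -> 0.4217
```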

  7. Evaluating the benefits of risk prevention initiatives

    NASA Astrophysics Data System (ADS)

    Di Baldassarre, G.

    2012-04-01

    The likelihood and adverse impacts of water-related disasters, such as floods and landslides, are increasing in many countries because of changes in climate and land-use. This presentation illustrates some preliminary results of a comprehensive demonstration of the benefits of risk prevention measures, carried out within the European FP7 KULTURisk project. The study is performed by using a variety of case studies characterised by diverse socio-economic contexts, different types of water-related hazards (floods, debris flows and landslides, storm surges) and space-time scales. In particular, the benefits of state-of-the-art prevention initiatives, such as early warning systems, non-structural options (e.g. mapping and planning), risk transfer strategies (e.g. insurance policy), and structural measures, are showed. Lastly, the importance of homogenising criteria to create hazard inventories and build memory, efficient risk communication and warning methods as well as active dialogue with and between public and private stakeholders, is highlighted.

  8. Evaluation of volcanic risk management in Merapi and Bromo Volcanoes

    NASA Astrophysics Data System (ADS)

    Bachri, S.; Stöetter, J.; Sartohadi, J.; Setiawan, M. A.

    2012-04-01

    Merapi (Central Java Province) and Bromo (East Java Province) volcanoes have human-environmental systems with unique characteristics, with specific consequences for their risk management. Various efforts have been carried out by many parties (government institutions, scientists, and non-governmental organizations) to reduce the risk in these areas. However, most of these actions appear to have been undertaken for temporary and partial purposes, leading to overlapping work and, ultimately, to a non-integrated scheme of volcanic risk management. This study therefore aims to identify and evaluate actions for risk and disaster reduction at Merapi and Bromo Volcanoes. To achieve this aim, a thorough literature review was carried out to identify earlier studies in both areas. Afterward, the basic concept of the risk management cycle, consisting of risk assessment, risk reduction, event management, and regeneration, is used to map those earlier studies and already implemented risk management actions at Merapi and Bromo. The results show that risk studies at Merapi have focused predominantly on the physical aspects of volcanic eruptions, i.e., models of lahar flows, hazard maps, and other geophysical modeling. After the 2006 eruption of Merapi, research on risk communication, social vulnerability, and cultural vulnerability appeared on the social side of risk management research. Disaster risk management activities in the Bromo area, by contrast, emphasized physical processes and historical-religious aspects. This overview of both study areas provides information on how risk studies have been used for managing the volcano disaster, and confirms that most earlier studies emphasize risk assessment while only few consider the risk reduction phase. Further fieldwork in the near future will complement these findings and contribute to formulating integrated volcanic risk management cycles for both

  9. Credit Risk Evaluation of Power Market Players with Random Forest

    NASA Astrophysics Data System (ADS)

    Umezawa, Yasushi; Mori, Hiroyuki

    A new method is proposed for credit risk evaluation in a power market. Credit risk evaluation measures the bankruptcy risk of a company. Power system liberalization has created a new environment that puts emphasis on profit maximization and risk minimization. There is a high probability that electricity transactions create risk between companies, so power market players are concerned with risk minimization. As a management strategy, a risk index is needed to evaluate the worth of a business partner. This paper proposes a new method for evaluating credit risk with Random Forest (RF), which performs ensemble learning over decision trees. RF is an efficient data mining technique for clustering data and extracting relationships between input and output data. In addition, a method of generating pseudo-measurements is proposed to improve the performance of RF. The proposed method is successfully applied to real financial data of energy utilities in the power market, and a comparison is made between the proposed and conventional methods.

  10. EVALUATING RISK IN OLDER ADULTS USING PHYSIOLOGICALLY BASED PHARMACOKINETIC MODELS

    EPA Science Inventory

    The rapid growth in the number of older Americans has many implications for public health, including the need to better understand the risks posed by environmental exposures to older adults. An important element for evaluating risk is the understanding of the doses of environment...

  11. ASSESSMENT TOOLS FOR THE EVALUATION OF RISK (ASTER)

    EPA Science Inventory

    ASTER (ASsessment Tools for the Evaluation of Risk) was developed by the U.S. EPA Mid-Continent Ecology Division - Duluth (MED-Duluth) to assist regulators in performing ecological risk assessments. ASTER is an integration of the AQUIRE (AQUatic toxicity Information REtrieval) to...

  12. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses.

    PubMed

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-08-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazard identification for C. perfringens on cheese was based on the literature, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage time, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10⁻¹¹) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α₁ = 1, α₂ = 91; α₁ = 1, α₂ = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed for exposure assessment under simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10⁻¹⁴ and 3.58×10⁻¹⁴, respectively. These results indicate that the probability of C. perfringens foodborne illness
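
    Under the stated exponential dose-response model and the abstract's parameter values, a per-serving risk of this order can be approximated with a small Monte Carlo simulation. Combining the beta and uniform distributions multiplicatively, and the serving-size distribution used here, are simplifying assumptions rather than the authors' exact @RISK model:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Parameter values from the abstract (natural cheese); the way they are
# combined below is a simplified reading of the model.
r = 1.82e-11                                   # exponential dose-response parameter
level = rng.beta(1, 91, N) * rng.uniform(0, 2, N)        # CFU/g in cheese
serving = np.clip(rng.normal(12.40, 19.43, N), 0, None)  # g consumed per day
dose = level * serving                                   # CFU ingested
p_ill = 1.0 - np.exp(-r * dose)                # probability of illness per serving
print(f"mean risk per person per day: {p_ill.mean():.2e}")
```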

  14. Presenting quantitative information about decision outcomes: a risk communication primer for patient decision aid developers

    PubMed Central

    2013-01-01

    Background Making evidence-based decisions often requires comparison of two or more options. Research-based evidence may exist which quantifies how likely the outcomes are for each option. Understanding these numeric estimates improves patients’ risk perception and leads to better informed decision making. This paper summarises current “best practices” in communication of evidence-based numeric outcomes for developers of patient decision aids (PtDAs) and other health communication tools. Method An expert consensus group of fourteen researchers from North America, Europe, and Australasia identified eleven main issues in risk communication. Two experts for each issue wrote a “state of the art” summary of best evidence, drawing on the PtDA, health, psychological, and broader scientific literature. In addition, commonly used terms were defined, and a set of guiding principles and key messages were derived from the results. Results The eleven key components of risk communication were: 1) Presenting the chance an event will occur; 2) Presenting changes in numeric outcomes; 3) Outcome estimates for test and screening decisions; 4) Numeric estimates in context and with evaluative labels; 5) Conveying uncertainty; 6) Visual formats; 7) Tailoring estimates; 8) Formats for understanding outcomes over time; 9) Narrative methods for conveying the chance of an event; 10) Important skills for understanding numerical estimates; and 11) Interactive web-based formats. Guiding principles from the evidence summaries advise that risk communication formats should reflect the task required of the user, should always define a relevant reference class (i.e., denominator) over time, should aim to use a consistent format throughout documents, should avoid “1 in x” formats and variable denominators, should consider the magnitude of numbers used and the possibility of format bias, and should take into account the numeracy and graph literacy of the audience. Conclusion A substantial and
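
    The guidance on avoiding “1 in x” formats with variable denominators can be illustrated with a small helper that renders any probability against one fixed reference class. The function and the denominator choice are illustrative, not part of the paper:

```python
def frequency_format(p: float, denominator: int = 1000) -> str:
    """Express a probability as 'N in <denominator>' with a fixed reference
    class, following the guidance to avoid variable '1 in x' formats."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must be a probability")
    n = round(p * denominator)
    return f"{n} in {denominator:,}"

# The same two risks in a consistent format are easier to compare than
# '1 in 8' vs '1 in 43':
print(frequency_format(1 / 8))    # 125 in 1,000
print(frequency_format(1 / 43))   # 23 in 1,000
```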

  15. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    PubMed

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice. PMID:23056174
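
    A minimal presence/absence grid model in the spirit of the generic spread models described might look like the sketch below; the grid size, dispersal radius, habitat layer, and introduction point are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
size, years, radius = 50, 10, 2           # grid cells, time steps, dispersal radius

habitat = rng.random((size, size)) > 0.3  # True where habitat/climate is suitable
habitat[size // 2, size // 2] = True      # make the introduction point suitable
occupied = np.zeros((size, size), bool)
occupied[size // 2, size // 2] = True     # single introduction

for _ in range(years):
    new = occupied.copy()
    for y, x in zip(*np.nonzero(occupied)):
        # every occupied cell colonises all cells within the dispersal radius
        new[max(0, y - radius):y + radius + 1, max(0, x - radius):x + radius + 1] = True
    occupied = new & habitat              # spread is constrained to suitable cells

print("cells occupied after", years, "steps:", int(occupied.sum()))
```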

  17. Evaluation of Risk from Contaminants Migrating by Groundwater

    NASA Astrophysics Data System (ADS)

    Andričević, Roko; Cvetković, Vladimir

    1996-03-01

    The general formulation of the environmental risk problem captures the entire process of identifying the source term of the risk agent, its fate and transport through porous media, estimation of human exposure, and conversion of such exposure into the risk level. The contaminant fate and transport is modeled using the solute flux formulation evaluated with its first two moments, which explicitly account for the spatial variability of the velocity field, sorption properties, and parametric uncertainty through the first-order analysis. The risk level is quantified on the basis of carcinogenicity using the risk factor (which describes the risk per unit dose or unit intake) applied to the total doses for individuals potentially consuming radionuclide-contaminated groundwater. As a result of the probabilistic formulation in the solute flux and uncertainty in the water intake and dose-response functions, the total risk level is expressed as a distribution rather than a single estimate. The results indicate that the geologic heterogeneity and uncertainty in the sorption estimate are the two most important factors for the risk evaluation from the physical and chemical processes, while the mean risk factor is a crucial parameter in the risk formulation.
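
    The idea of risk as a distribution rather than a point estimate can be sketched with a short Monte Carlo over uncertain concentration, intake, and risk factor. Every distribution and parameter value below is invented for illustration; none comes from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Illustrative inputs (all assumed): uncertain well concentration, water
# intake, and risk-per-unit-intake factor.
conc = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=N)   # Bq/L at the well
intake = np.clip(rng.normal(2.0, 0.3, N), 0.5, None)        # L/day water intake
risk_factor = rng.lognormal(np.log(5e-8), 0.5, N)           # risk per Bq ingested

dose = conc * intake * 365.0                  # Bq ingested per year
risk = 1.0 - np.exp(-risk_factor * dose)      # annual risk, as a distribution

print(f"median annual risk: {np.median(risk):.1e}")
print(f"95th percentile:    {np.percentile(risk, 95):.1e}")
```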

  18. The risk-adjusted cost evaluation of electric resource alternatives

    SciTech Connect

    Duane, T.P.

    1989-01-01

    Partial deregulation of the electric utility industry has occurred under the Public Utilities Regulatory Policies Act of 1978 (PURPA), which shifts the balance of both costs and risks between rate payers and electric utilities. Cost comparisons of potential electric resource alternatives currently rely on techniques which do not explicitly incorporate risk consideration. This reflects the traditional role of regulation for rate stabilization. Risk-averse residential rate payers with low demand elasticities may highly value price risk reduction, but risk is not explicitly considered by present planning systems. There is a need to quantify the value of such price risk reduction. This research attempts to develop a Risk-Adjusted Cost Evaluation (RACE) methodology for direct comparisons of competing alternatives by a single risk-adjusted cost criterion. Methodologies have previously been developed for risk pricing in financial and commodities markets, and those techniques are evaluated for extension to the electricity market problem. Each has important deficiencies in the institutional context of electricity markets under PURPA; each also offers important insights for development of a simplified RACE methodology synthesizing those models. The methodology is applied to a large California utility, and major implementation problems are identified. The approach requires strict limiting conditions, and price risk reduction does not have a significant value to residential customers of PG&E. This may be less true for less well-diversified utilities, and several conditions are identified where more detailed assessment of risk implications is warranted. Future risk analysis research should instead focus on large, asymmetric risks. Suggestions are made for assessment of such risks through an insurance market metaphor and decision analysis methods.

  19. Effect of cholesterol lowering and cardiovascular risk factors on the progression of aortoiliac arteriosclerosis: a quantitative cineangiography study.

    PubMed

    Campeau, Lucien; Lespérance, Jacques; Bilodeau, Luc; Fortier, Annik; Guertin, Marie-Claude; Knatterud, Genell L

    2005-01-01

    The post-Coronary Artery Bypass Graft (Post-CABG) trial has shown that aggressive compared to moderate lowering of low-density lipoprotein cholesterol (LDL-C) delayed the progression of obstructive disease in aortocoronary saphenous vein grafts and in the left main coronary artery. Patients had been allocated to high- and low-dose lovastatin therapy for a 4-5 year period. The present study evaluated the effect of LDL-C lowering and the role of cardiovascular risk factors on the progression of arteriosclerosis in the distal abdominal aorta and common iliac arteries. From one of the participating centers of the post-CABG trial, 145 patients who had adequate imaging of the aortoiliac arteries at baseline and follow-up were included. Angiographic outcomes, presumed to reflect progression of arteriosclerosis and obtained from lumen diameter (LD) measurements using quantitative cineangiography, were as follows: significant decrease of the minimum LD and increase of the maximum LD, percent lumen stenosis, and percent lumen dilatation. These outcomes were not significantly less frequent in patients randomly allocated to aggressive compared to moderate LDL-C lowering. Of 9 cardiovascular risk factors, only 2 were significantly related to progression of aortoiliac arteriosclerosis. Current smoking predicted both percent lumen stenosis increase and, to a lesser degree, percent lumen dilatation increase (p = 0.010 and p = 0.055, respectively). Abnormally high body mass index (BMI ≥ 25 kg/m²) correlated with percent lumen dilatation increase (p = 0.006). Aggressive compared to moderate LDL-C lowering did not prevent or delay the progression of aortoiliac arteriosclerosis. Smoking predicted both lumen narrowing and dilatation presumably caused by arteriosclerosis. Abnormally high BMI, reflecting overweight or obesity, was strongly associated with vessel dilatation. PMID:15793608

  20. Approaches for Assessing Risks to Sensitive Populations: Lessons Learned from Evaluating Risks in the Pediatric Population

    PubMed Central

    Hines, Ronald N.; Sargent, Dana; Autrup, Herman; Birnbaum, Linda S.; Brent, Robert L.; Doerrer, Nancy G.; Cohen Hubal, Elaine A.; Juberg, Daland R.; Laurent, Christian; Luebke, Robert; Olejniczak, Klaus; Portier, Christopher J.; Slikker, William

    2010-01-01

    Assessing the risk profiles of potentially sensitive populations requires a “tool chest” of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of the pediatric population. The Health and Environmental Sciences Institute Subcommittee on Risk Assessment of Sensitive Populations evaluated key references in the area of pediatric risk to identify a spectrum of methodological approaches. These approaches are considered in this article for their potential to be extrapolated for the identification and assessment of other sensitive populations. Recommendations as to future research needs and/or alternate methodological considerations are also made. PMID:19770482

  1. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  2. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of a GM soybean event MON89788. The conversion factor (Cf), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were both less than 20%. These results suggest that the established method would be suitable for practical detection and quantification of MON89788. PMID:21071908
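
    The quantification scheme the abstract refers to reduces to a simple calculation: the ratio of event-specific to endogenous-gene copy numbers, scaled by the experimentally determined conversion factor Cf. The helper below sketches this; the copy numbers and Cf value are placeholders, not the values determined for MON89788:

```python
def gmo_percent(event_copies: float, endogenous_copies: float, cf: float) -> float:
    """GM content (%) from real-time PCR copy-number estimates: the
    event/endogenous copy ratio divided by the conversion factor Cf."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical run: 450 event copies, 10 000 endogenous copies, Cf = 0.45
print(round(gmo_percent(450, 10_000, 0.45), 2))  # 10.0
```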

  3. Evaluation of quantitative accuracy in CZT-based pre-clinical SPECT for various isotopes

    NASA Astrophysics Data System (ADS)

    Park, S.-J.; Yu, A. R.; Kim, Y.-s.; Kang, W.-S.; Jin, S. S.; Kim, J.-S.; Son, T. J.; Kim, H.-J.

    2015-05-01

    In vivo pre-clinical single-photon emission computed tomography (SPECT) is a valuable tool for functional small animal imaging, but several physical factors, such as scatter radiation, limit the quantitative accuracy of conventional scintillation crystal-based SPECT. Semiconductor detectors such as CZT overcome these deficiencies through superior energy resolution. To our knowledge, little scientific information exists regarding the accuracy of quantitative analysis in CZT-based pre-clinical SPECT systems for different isotopes. The aim of this study was to assess the quantitative accuracy of CZT-based pre-clinical SPECT for four isotopes: 201Tl, 99mTc, 123I, and 111In. The quantitative accuracy of the CZT-based Triumph X-SPECT (Gamma-Medica Ideas, Northridge, CA, U.S.A.) was compared with that of a conventional SPECT using GATE simulation. Quantitative errors due to the attenuation and scatter effects were evaluated for all four isotopes with energy windows of 5%, 10%, and 20%. A spherical source containing the isotope was placed at the center of the air- or water-filled mouse-sized cylinder phantom. The CZT-based pre-clinical SPECT was more accurate than the conventional SPECT. For example, in the conventional SPECT with an energy window of 10%, scatter effects degraded quantitative accuracy by up to 11.52%, 5.10%, 2.88%, and 1.84% for 201Tl, 99mTc, 123I, and 111In, respectively. However, with the CZT-based pre-clinical SPECT, the degradations were only 9.67%, 5.45%, 2.36%, and 1.24% for 201Tl, 99mTc, 123I, and 111In, respectively. As the energy window was increased, the quantitative errors increased in both SPECT systems. Additionally, the isotopes with lower energy of photon emissions had greater quantitative error. Our results demonstrated that the CZT-based pre-clinical SPECT had lower overall quantitative errors due to reduced scatter and high detection efficiency. Furthermore, the results of this systematic assessment quantifying the accuracy of these SPECT

  4. Quantitative Methods for Evaluating the Efficacy of Thalamic Deep Brain Stimulation in Patients with Essential Tremor

    PubMed Central

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Background Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. Methods We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. Results The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Discussion Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life. PMID:24255800

  5. Evaluating risk factor assumptions: a simulation-based approach

    PubMed Central

    2011-01-01

    Background Microsimulation models are an important tool for estimating the comparative effectiveness of interventions through prediction of individual-level disease outcomes for a hypothetical population. To estimate the effectiveness of interventions targeted toward high risk groups, the mechanism by which risk factors influence the natural history of disease must be specified. We propose a method for evaluating these risk factor assumptions as part of model-building. Methods We used simulation studies to examine the impact of risk factor assumptions on the relative rate (RR) of colorectal cancer (CRC) incidence and mortality for a cohort with a risk factor compared to a cohort without the risk factor using an extension of the CRC-SPIN model for colorectal cancer. We also compared the impact of changing age at initiation of screening colonoscopy for different risk mechanisms. Results Across CRC-specific risk factor mechanisms, the RR of CRC incidence and mortality decreased (towards one) with increasing age. The rate of change in RRs across age groups depended on both the risk factor mechanism and the strength of the risk factor effect. Increased non-CRC mortality attenuated the effect of CRC-specific risk factors on the RR of CRC when both were present. For each risk factor mechanism, earlier initiation of screening resulted in more life years gained, though the magnitude of life years gained varied across risk mechanisms. Conclusions Simulation studies can provide insight into both the effect of risk factor assumptions on model predictions and the type of data needed to calibrate risk factor models. PMID:21899767
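
    The qualitative pattern the simulations report, relative rates declining toward one with age when competing mortality is present, can be reproduced in a few lines. The rate and survival functions below are invented for illustration and are not the CRC-SPIN model:

```python
import numpy as np

# Toy setup: a risk factor that doubles CRC incidence, with the observed
# relative rate (RR) attenuating toward one at older ages because competing
# (non-CRC) mortality depletes the exposed cohort faster.
ages = np.array([55, 65, 75, 85])
rr_incidence = 2.0                           # assumed true effect on incidence
surv_exposed = np.exp(-0.02 * (ages - 55))   # exposed-cohort survival (assumed)
surv_unexposed = np.exp(-0.01 * (ages - 55))

observed_rr = rr_incidence * surv_exposed / surv_unexposed
for a, r in zip(ages, observed_rr):
    print(f"age {a}: observed RR = {r:.2f}")
```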

  6. Novel methods to evaluate fracture risk models

    PubMed Central

    Donaldson, M.G.; Cawthon, P. M.; Schousboe, J.T.; Ensrud, K.E.; Lui, L.Y.; Cauley, J.A.; Hillier, T.A.; Taylor, B.C.; Hochberg, M.C.; Bauer, D.C.; Cummings, S.R.

    2013-01-01

    Fracture prediction models help identify individuals at high risk who may benefit from treatment. Area Under the Curve (AUC) is used to compare prediction models. However, the AUC has limitations and may miss important differences between models. Novel reclassification methods quantify how accurately models classify patients who benefit from treatment and the proportion of patients above/below treatment thresholds. We applied two reclassification methods, using the NOF treatment thresholds, to compare two risk models: femoral neck BMD and age (“simple model”) and FRAX (“FRAX model”). The Pepe method classifies based on case/non-case status and examines the proportion of each above and below thresholds. The Cook method examines fracture rates above and below thresholds. We applied these to the Study of Osteoporotic Fractures. There were 6036 (1037 fractures) and 6232 (389 fractures) participants with complete data for major osteoporotic and hip fracture respectively. Both models for major osteoporotic fracture (0.68 vs. 0.69) and hip fracture (0.75 vs. 0.76) had similar AUCs. In contrast, using reclassification methods, each model classified a substantial number of women differently. Using the Pepe method, the FRAX model (vs. simple model), missed treating 70 (7%) cases of major osteoporotic fracture but avoided treating 285 (6%) non-cases. For hip fracture, the FRAX model missed treating 31 (8%) cases but avoided treating 1026 (18%) non-cases. The Cook method (both models, both fracture outcomes) had similar fracture rates above/below the treatment thresholds. Compared with the AUC, new methods provide more detailed information about how models classify patients. PMID:21351143
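
    The core observation, that two models with near-identical AUCs can still treat different patients, is easy to demonstrate on synthetic data. The scores, prevalence, and threshold below are invented; this is not the SOF data or the FRAX/simple-model comparison:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000
y = rng.random(n) < 0.10                     # ~10% synthetic fracture cases

def auc(y, score):
    """Mann-Whitney form of the AUC (continuous scores, so ties are negligible)."""
    ranks = np.empty(n)
    ranks[np.argsort(score)] = np.arange(1, n + 1)
    n_pos = y.sum()
    return (ranks[y].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * (n - n_pos))

# Two synthetic risk models with near-identical discrimination
score_a = y + rng.normal(0, 1.4, n)
score_b = y + rng.normal(0, 1.4, n)
auc_a, auc_b = auc(y, score_a), auc(y, score_b)
print(f"AUC A: {auc_a:.2f}  AUC B: {auc_b:.2f}")

# Pepe-style view: similar AUCs, yet the models place different cases above
# an (arbitrary, illustrative) treatment threshold.
thr = 1.0
a_only = y & (score_a >= thr) & (score_b < thr)
b_only = y & (score_b >= thr) & (score_a < thr)
print("cases treated by A only:", int(a_only.sum()), "| by B only:", int(b_only.sum()))
```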

  7. Primer for evaluating ecological risk at petroleum release sites.

    PubMed

    Claff, R

    1999-02-01

    Increasingly, risk-based approaches are being used to guide decision making at sites such as service stations and petroleum product terminals, where petroleum products have been inadvertently released to the soil. For example, the API Decision Support System software, DSS, evaluates site human health risk along six different routes of exposure. The American Society for Testing and Materials' Risk-Based Corrective Action (RBCA) standard, ASTM 1739, establishes a tiered framework for evaluating petroleum release sites on the basis of human health risk. Though much of the risk assessment focus has been on human health risk, regulatory agencies recognize that protection of human health may not fully protect the environment; and EPA has developed guidance on identifying ecological resources to be protected through risk-based decision making. Not every service station or petroleum product terminal site warrants a detailed ecological risk assessment. In some cases, a simple preliminary assessment will provide sufficient information for decision making. Accordingly, the American Petroleum Institute (API) is developing a primer for site managers, to assist them in conducting this preliminary assessment, and in deciding whether more detailed ecological risk assessments are warranted. The primer assists the site manager in identifying relevant ecological receptors and habitats, in identifying chemicals and exposure pathways of concern, in developing a conceptual model of the site to guide subsequent actions, and in identifying conditions that may warrant immediate response. PMID:10189585

  8. Application of quantitative estimates of fecal hemoglobin concentration for risk prediction of colorectal neoplasia

    PubMed Central

    Liao, Chao-Sheng; Lin, Yu-Min; Chang, Hung-Chuen; Chen, Yu-Hung; Chong, Lee-Won; Chen, Chun-Hao; Lin, Yueh-Shih; Yang, Kuo-Ching; Shih, Chia-Hui

    2013-01-01

    AIM: To determine the role of the fecal immunochemical test (FIT), used to evaluate fecal hemoglobin concentration, in the prediction of histological grade and risk of colorectal tumors. METHODS: We enrolled 17881 individuals who attended the two-step colorectal cancer screening program in a single hospital between January 2010 and October 2011. Colonoscopy was recommended to the participants with an FIT of ≥ 12 ngHb/mL buffer. We classified colorectal lesions as cancer (C), advanced adenoma (AA), adenoma (A), and others (O) by their colonoscopic and histological findings. Multiple linear regression analysis adjusted for age and gender was used to determine the association between the FIT results and colorectal tumor grade. The risk of adenomatous neoplasia was estimated by calculating the positive predictive values for different FIT concentrations. RESULTS: The positive rate of the FIT was 10.9% (1948/17881). The attendance rate for colonoscopy was 63.1% (1229/1948). The number of false positive results was 23. Of these 1229 cases, the numbers of O, A, AA, and C were 759, 221, 201, and 48, respectively. Regression analysis revealed a positive association between histological grade and FIT concentration (β = 0.088, P < 0.01). A significant log-linear relationship was found between the concentration and positive predictive value of the FIT for predicting colorectal tumors (R2 > 0.95, P < 0.001). CONCLUSION: Higher FIT concentrations are associated with more advanced histological grades. Risk prediction for colorectal neoplasia based on individual FIT concentrations is significant and may help to improve the performance of screening programs. PMID:24363529
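
    The reported log-linear relationship between FIT concentration and positive predictive value can be fitted with an ordinary least-squares regression on log concentration. The data points below are invented for illustration and are not the study's measurements:

```python
import numpy as np

# Hypothetical PPV observations at increasing FIT concentrations
conc = np.array([20, 50, 100, 200, 400, 800])          # ng Hb/mL buffer
ppv = np.array([0.28, 0.33, 0.38, 0.42, 0.47, 0.52])   # PPV for any neoplasia

b, a = np.polyfit(np.log(conc), ppv, 1)   # fit PPV ≈ a + b·log(conc)
pred = a + b * np.log(conc)
ss_res = ((ppv - pred) ** 2).sum()
ss_tot = ((ppv - ppv.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(f"slope = {b:.3f}, R^2 = {r2:.3f}")
```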

  9. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk.

    PubMed

    Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan

    2013-01-01

    The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; whereas, methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in goat milk using their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w), in relation to characteristics of the milk such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index for the studied pesticides were higher in goat milk, suggesting that these are the better determinants for pesticide residue prediction in milk. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that daily intakes of cyhalothrin, chlorpyrifos and cypermethrin through goat milk are, respectively, 2.68, 5.19 and 2.71 times higher than the provisional tolerable daily intakes. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality. PMID:23369514
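
A QSAR model of this kind is, at its core, a multiple linear regression of residue levels on physicochemical descriptors. The sketch below uses hypothetical descriptor and residue values (loosely patterned on the pesticides named above, not the study's measurements):

```python
import numpy as np

# Hypothetical physicochemical descriptors for five pesticides:
# molecular weight, melting point (deg C), log Ko/w -- illustrative values only.
X = np.array([
    [449.9,  49.2, 6.9],   # cyhalothrin-like
    [406.9, 106.0, 3.8],   # endosulfan-like
    [350.6,  42.0, 4.7],   # chlorpyrifos-like
    [416.3,  61.0, 6.6],   # cypermethrin-like
    [263.2,  37.0, 3.0],   # methyl-parathion-like
])
# Hypothetical measured residues (ppm) in milk.
y = np.array([0.34, 0.063, 0.034, 0.092, 0.01])

# Ordinary least squares with an intercept column, in the spirit of a
# QSAR model relating descriptors to residue levels.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

For the risk-analysis step, a hazard ratio would then be the estimated daily intake of each residue divided by its provisional tolerable daily intake.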

  11. Quantitative risk assessment of lung cancer in U. S. uranium miners

    SciTech Connect

    Hornung, R.W.; Meinhardt, T.J.

    1986-01-16

    The mortality experience of a cohort of 3346 underground uranium miners evaluated in 1977 was updated through 1982. As of 1982, there were 1214 miners who were deceased; 255 had died of lung cancer. Variables considered in the development of the model included cumulative exposure, exposure rate, cumulative cigarette smoking, smoking rate, age at initial exposure, calendar year of initial exposure, birth year, height, duration of underground employment, and years of prior hardrock mining. Cumulative cigarette smoking and cumulative radon daughter exposure had a joint effect intermediate between additive and multiplicative, implying a synergistic relationship. Results indicated that modeling cumulative exposure alone may not adequately predict the relative risk of lung cancer from chronic exposure to radon daughters. Miners receiving a given amount of cumulative exposure at lower rates for longer periods of time were at greater risk relative to those with the same cumulative exposure received at higher rates for shorter time periods. Data suggested that radon daughters act at a late stage in the carcinogenic process. The epidemiologic model developed for the study was found to provide a very good fit to data from 60 to 6000 working level months.

  12. Rape prevention with college men: evaluating risk status.

    PubMed

    Stephens, Kari A; George, William H

    2009-06-01

    This study evaluates the effectiveness of a theoretically based rape prevention intervention with college men who were at high or low risk to perpetrate sexually coercive behavior. Participants (N = 146) are randomly assigned to the intervention or control group. Outcomes include rape myth acceptance, victim empathy, attraction to sexual aggression, sex-related alcohol expectancies, and behavioral indicators, measured across three time points. Positive effects are found for rape myth acceptance, victim empathy, attraction to sexual aggression, and behavioral intentions to rape. Only the rape myth acceptance and victim empathy effects are sustained at the 5-week follow-up. High-risk men are generally unaffected by the intervention, although low-risk men show larger effects than the sample as a whole. Results suggest rape prevention studies must assess risk status moderation effects to maximize prevention for high-risk men. More research is needed to develop effective rape prevention with men who are at high risk to rape. PMID:18591366

  13. Food and Drug Administration Evaluation and Cigarette Smoking Risk Perceptions

    ERIC Educational Resources Information Center

    Kaufman, Annette R.; Waters, Erika A.; Parascandola, Mark; Augustson, Erik M.; Bansal-Travers, Maansi; Hyland, Andrew; Cummings, K. Michael

    2011-01-01

    Objectives: To examine the relationship between a belief about Food and Drug Administration (FDA) safety evaluation of cigarettes and smoking risk perceptions. Methods: A nationally representative, random-digit-dialed telephone survey of 1046 adult current cigarette smokers. Results: Smokers reporting that the FDA does not evaluate cigarettes for…

  14. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by any precise criteria. The ontological dichotomy implies the application of an objective, quantitative approach for the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into classes of equivalence and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed
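
One common way to obtain a global fractal dimension for a spatial distribution of points (such as cell nuclei) is box counting. This is a generic sketch of the technique, not the author's pipeline; the point sets are synthetic:

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the box-counting (capacity) dimension of a 2-D point set
    in the unit square by regressing log N(s) on log(1/s)."""
    counts = []
    for s in sizes:
        # Assign each point to a grid cell of side s and count occupied cells.
        cells = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
uniform = rng.random((20000, 2))  # space-filling set: dimension near 2
line = np.column_stack([rng.random(20000), np.full(20000, 0.5)])  # 1-D set

sizes = [0.5, 0.25, 0.125, 0.0625]
d_uniform = box_counting_dimension(uniform, sizes)
d_line = box_counting_dimension(line, sizes)
```

A clustered, non-space-filling distribution of nuclei would yield a dimension strictly between these two extremes, which is what makes the measure discriminative.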

  16. An overview of BWR Mark-1 containment venting risk implications: An evaluation of potential Mark-1 containment improvements

    SciTech Connect

    Wagner, K.C.; Dallman, R.J.; Galyean, W.J.

    1989-06-01

    This report supplements containment venting risk evaluations performed for the Mark-I Containment Performance Improvement (CPI) Program. Quantitative evaluations using simplified containment event trees for station blackout sequences were performed to evaluate the potential risk reduction offered by containment venting, an improved automatic depressurization system with a dedicated power source, and an additional supply of water to either the containment sprays or the vessel with a dedicated power source. The risk calculations were based on the Draft NUREG-1150 results for Peach Bottom with selected enhancements. Several sensitivity studies were performed to investigate phenomenological, operational, and equipment performance uncertainties. Qualitative risk evaluations were provided for loss of long-term containment heat removal and anticipated transients without scram for the same set of improvements. A limited discussion is provided on the generic applicability of these results to other plants with Mark-I containments. 23 refs., 15 figs., 13 tabs.

  17. Injury risk evaluation in sport climbing.

    PubMed

    Neuhof, A; Hennig, F F; Schöffl, I; Schöffl, V

    2011-10-01

    The aim of this study was to quantify and rate acute sport climbing injuries. Acute sport climbing injuries occurring from 2002 to 2006 were retrospectively assessed with a standardized web based questionnaire. A total number of 1962 climbers reported 699 injuries, which is equivalent to 0.2 injuries per 1 000 h of sport participation. Most (74.4%) of the injuries were of minor severity rated NACA I or NACA II. Injury distribution between the upper (42.6%) and lower extremities (41.3%) was similar, with ligament injuries, contusions and fractures being the most common injury types. Years of climbing experience (p<0.01), difficulty level (p<0.01), and climbing time per week during summer (p<0.01) and winter (p<0.01) months were correlated with the injury rate. Age (p=0.034), years of climbing experience (p<0.01) and average climbing level (p<0.01) were correlated with the injury severity rated through NACA scores. The risk of acute injuries per 1 000 h of sport participation in sport climbing was lower than in previous studies on general rock climbing and higher than in studies on indoor climbing. In order to perform inter-study comparisons of future studies on climbing injuries, the use of a systematic and standardized scoring system (UIAA score) is essential. PMID:21913158

  18. QUANTITATIVE EVALUATION OF ANTERIOR SEGMENT PARAMETERS IN THE ERA OF IMAGING

    PubMed Central

    Dorairaj, Syril; Liebmann, Jeffrey M.; Ritch, Robert

    2007-01-01

    Purpose To review the parameters for quantitative assessment of the anterior segment and iridocorneal angle and to develop a comprehensive schematic for the evaluation of angle anatomy and pathophysiology by high-resolution imaging. Methods The published literature of the last 15 years was reviewed, analyzed, and organized into a construct for assessment of anterior segment processes. Results Modern anterior segment imaging techniques have allowed us to devise new quantitative parameters to improve the information obtained. Ultrasound biomicroscopy, slit-lamp optical coherence tomography, and anterior segment optical coherence tomography provide high-resolution images for analysis of physiologic and pathologic processes. These include iridocorneal angle analysis (eg, angle opening distance, angle recess area, trabecular-iris space area), anterior and posterior chamber depth and area, iris and ciliary body cross-sectional area and volume, quantitative anatomic relationships between structures, and videographic analysis of iris movement and accommodative changes under various conditions. Modern devices permit imaging of the entire anterior chamber, allowing calculation of anterior chamber and pupillary diameters and correlating these with measurement of anterior chamber dynamics in light vs dark conditions. We have tabulated all reported anterior segment measurement modalities and devised a construct for assessment of normal and abnormal conditions. Conclusion Quantitative measurement of static and dynamic anterior segment parameters, both normal and abnormal, provides a broad range of parameters for analysis of the numerous aspects of the pathophysiology of the anterior segment of the eye. PMID:18427599

  19. Risk-Based Evaluation of Total Petroleum Hydrocarbons in Vapor Intrusion Studies

    PubMed Central

    Brewer, Roger; Nagashima, Josh; Kelley, Michael; Heskett, Marvin; Rigby, Mark

    2013-01-01

    This paper presents a quantitative method for the risk-based evaluation of Total Petroleum Hydrocarbons (TPH) in vapor intrusion investigations. Vapors from petroleum fuels are characterized by a complex mixture of aliphatic and, to a lesser extent, aromatic compounds. These compounds can be measured and described in terms of TPH carbon ranges. Toxicity factors published by USEPA and other parties allow development of risk-based, air and soil vapor screening levels for each carbon range in the same manner as done for individual compounds such as benzene. The relative, carbon range makeup of petroleum vapors can be used to develop weighted, site-specific or generic screening levels for TPH. At some critical ratio of TPH to a targeted, individual compound, the overwhelming proportion of TPH will drive vapor intrusion risk over the individual compound. This is particularly true for vapors associated with diesel and other middle distillate fuels, but can also be the case for low-benzene gasolines or even for high-benzene gasolines if an adequately conservative, target risk is not applied to individually targeted chemicals. This necessitates a re-evaluation of the reliance on benzene and other individual compounds as a stand-alone tool to evaluate vapor intrusion risk associated with petroleum. PMID:23765191
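
The carbon-range weighting described above is commonly implemented as a fraction-weighted harmonic mean of the per-range screening levels, so the most toxic ranges dominate the mixture value. The fractions and screening levels below are illustrative assumptions, not regulatory values:

```python
# Hypothetical carbon-range makeup of a middle-distillate vapor and
# illustrative per-range screening levels (ug/m3); not regulatory values.
fractions = {"aliphatic C5-C8": 0.45,
             "aliphatic C9-C12": 0.40,
             "aromatic C9-C10": 0.15}
screening_levels = {"aliphatic C5-C8": 600.0,
                    "aliphatic C9-C12": 300.0,
                    "aromatic C9-C10": 100.0}

# Fraction-weighted (harmonic-mean) screening level for the TPH mixture:
# SL_mix = 1 / sum(f_i / SL_i)
sl_mix = 1.0 / sum(f / screening_levels[k] for k, f in fractions.items())
```

Comparing a site's TPH vapor concentration against `sl_mix`, rather than against benzene alone, is the kind of re-evaluation the paper argues for.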

  1. Quantitative evaluation of image registration techniques in the case of retinal images

    NASA Astrophysics Data System (ADS)

    Gavet, Yann; Fernandes, Mathieu; Pinoli, Jean-Charles

    2012-04-01

    In human retina observation (with non-mydriatic optical microscopes), an image registration process is often employed to enlarge the field of view. Analyzing all the images takes a lot of time. Numerous techniques have been proposed to perform the registration process, and properly evaluating them is a difficult question. This article presents the use of two quantitative criteria to evaluate and compare some classical feature-based image registration techniques. The images are first segmented and the resulting binary images are then registered. Registration quality is evaluated with a normalized criterion based on the ɛ dissimilarity criterion and with the figure of merit criterion (fom), for 25 pairs of images with a manual selection of control points. These criteria are normalized by the results of the affine method (considered the simplest method). Then, for each pair, the influence of the number of points used to perform the registration is evaluated.
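
The figure of merit criterion (Pratt's fom) rewards result pixels that lie close to reference pixels, reaching 1.0 for a perfect match. A minimal sketch for small binary images, using brute-force distances and illustrative test images:

```python
import numpy as np

def figure_of_merit(reference, result, alpha=1.0 / 9.0):
    """Pratt's figure of merit between two binary images: 1.0 means the
    result pixels coincide exactly with the reference pixels."""
    ref_pts = np.argwhere(reference)
    res_pts = np.argwhere(result)
    if len(res_pts) == 0:
        return 0.0
    # Distance from each result pixel to the nearest reference pixel
    # (brute force; fine for small illustrative images).
    d = np.sqrt(((res_pts[:, None, :] - ref_pts[None, :, :]) ** 2).sum(-1)).min(1)
    return float(np.sum(1.0 / (1.0 + alpha * d ** 2)) / max(len(ref_pts), len(res_pts)))

a = np.zeros((32, 32), dtype=bool)
a[16, 4:28] = True            # reference: a segmented vessel-like line
b = np.roll(a, 1, axis=0)     # same line registered one pixel off

fom_perfect = figure_of_merit(a, a)
fom_shifted = figure_of_merit(a, b)
```

Normalizing such scores by the affine method's score, as the article does, turns the raw fom into a relative measure across registration techniques.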

  2. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  3. Evaluation and Quantitative Prediction of Renal Transporter-Mediated Drug-Drug Interactions.

    PubMed

    Feng, Bo; Varma, Manthena V

    2016-07-01

    With numerous drugs cleared renally, inhibition of uptake transporters localized on the basolateral membrane of renal proximal tubule cells, eg, organic anion transporters (OATs) and organic cation transporters (OCTs), may lead to clinically meaningful drug-drug interactions (DDIs). Additionally, clinical evidence for the possible involvement of efflux transporters, such as P-glycoprotein (P-gp) and multidrug and toxin extrusion protein 1/2-K (MATE1/2-K), in the renal DDIs is emerging. Herein, we review recent progress regarding mechanistic understanding of transporter-mediated renal DDIs as well as the quantitative predictability of renal DDIs using static and physiologically based pharmacokinetic (PBPK) models. Generally, clinical DDI data suggest that the magnitude of plasma exposure changes attributable to renal DDIs is less than 2-fold, unlike the DDIs associated with inhibition of cytochrome P-450s and/or hepatic uptake transporters. It is concluded that although there is a need for risk assessment early in drug development, current available data imply that safety concerns related to the renal DDIs are generally low. Nevertheless, consideration must be given to the therapeutic index of the victim drug and potential risk in a specific patient population (eg, renal impairment). Finally, in vitro transporter data and clinical pharmacokinetic parameters obtained from the first-in-human studies have proven useful in support of quantitative prediction of DDIs associated with inhibition of renal secretory transporters, OATs or OCTs. PMID:27385169
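
The static model mentioned above is often written as an AUC ratio in which only the secretory fraction of renal clearance is attenuated by competitive inhibition. A sketch with illustrative parameter values (not drawn from a specific drug pair):

```python
def auc_ratio(fe_secreted, i_u, ki):
    """Static-model AUC ratio for a renally cleared victim drug when the
    secretory pathway (e.g., OAT/OCT-mediated) is competitively inhibited.
    fe_secreted: fraction of total clearance via the inhibited pathway;
    i_u: unbound inhibitor concentration; ki: inhibition constant."""
    inhibited = fe_secreted / (1.0 + i_u / ki)
    return 1.0 / (inhibited + (1.0 - fe_secreted))

# Illustrative case: half of clearance via OAT-mediated secretion and an
# unbound inhibitor concentration equal to Ki.
r = auc_ratio(fe_secreted=0.5, i_u=1.0, ki=1.0)
```

Even with complete inhibition, the ratio is capped at 1/(1 - fe_secreted), which is consistent with the review's observation that renal DDIs rarely exceed a 2-fold exposure change.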

  4. Quantitative morphological evaluation of laser ablation on calculus using full-field optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; Lü, T.; Li, Z.; Fu, L.

    2011-10-01

    The quantitative morphological evaluation at high resolution is of significance for the study of laser-tissue interaction. In this paper, a full-field optical coherence microscopy (OCM) system with high resolution of ~2 μm was developed to investigate the ablation of urinary calculus by a free-running Er:YAG laser. We studied the morphological variation quantitatively corresponding to changes of the energy setting of the Er:YAG laser. The experimental results show that the full-field OCM enables quantitative evaluation of the morphological shape of craters and material removal, and particularly the fine structure. We also built a heat conduction model to simulate the process of laser-calculus interaction by using the finite element method. Through the simulation, the removal region of the calculus was calculated according to the temperature distribution. As a result, the depth, width, volume, and the cross-sectional profile of the crater in calculus measured by full-field OCM matched well with the theoretical results based on the heat conduction model. Both experimental and theoretical results confirm that the thermal interaction is the dominant effect in the ablation of calculus by Er:YAG laser, demonstrating the effectiveness of full-field OCM in studying laser-tissue interactions.

  5. Quantitative morphologic evaluation of magnetic resonance imaging during and after treatment of childhood leukemia

    PubMed Central

    Reddick, Wilburn E.; Laningham, Fred H.; Glass, John O.; Pui, Ching-Hon

    2008-01-01

    Introduction Medical advances over the last several decades, including CNS prophylaxis, have greatly increased survival in children with leukemia. As survival rates have increased, clinicians and scientists have been afforded the opportunity to further develop treatments to improve the quality of life of survivors by minimizing the long-term adverse effects. When evaluating the effect of antileukemia therapy on the developing brain, magnetic resonance (MR) imaging has been the preferred modality because it quantifies morphologic changes objectively and noninvasively. Method and results Computer-aided detection of changes on neuroimages enables us to objectively differentiate leukoencephalopathy from normal maturation of the developing brain. Quantitative tissue segmentation algorithms and relaxometry measures have been used to determine the prevalence, extent, and intensity of white matter changes that occur during therapy. More recently, diffusion tensor imaging has been used to quantify microstructural changes in the integrity of the white matter fiber tracts. MR perfusion imaging can be used to noninvasively monitor vascular changes during therapy. Changes in quantitative MR measures have been associated, to some degree, with changes in neurocognitive function during and after treatment. Conclusion In this review, we present recent advances in quantitative evaluation of MR imaging and discuss how these methods hold the promise to further elucidate the pathophysiologic effects of treatment for childhood leukemia. PMID:17653705

  6. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring the effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The method uses a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that the statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. PMID:27566933
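
The Cronbach's alpha reliability used to validate extracted factors can be computed directly from an item-score matrix. The Likert responses below are hypothetical, not the Serengeti survey data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Illustrative 5-point Likert responses for one governance factor
# (four statements, five respondents; hypothetical data).
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(scores)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a factor.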

  7. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge, practical skills are measured by the use of endovascular simulators, yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time. PMID:26848840

  8. Effect of Surface Sampling and Recovery of Viruses and Non-Spore-Forming Bacteria on a Quantitative Microbial Risk Assessment Model for Fomites.

    PubMed

    Weir, Mark H; Shibata, Tomoyuki; Masago, Yoshifumi; Cologgi, Dena L; Rose, Joan B

    2016-06-01

    Quantitative microbial risk assessment (QMRA) is a powerful decision analytics tool, yet it faces challenges when modeling health risks for the indoor environment. One limitation is uncertainty in fomite recovery for evaluating the efficiency of decontamination. Addressing this data gap has become more important for response to, and recovery from, a potential malicious pathogen release. To develop more accurate QMRA models, recovery efficiency from non-porous fomites (aluminum, ceramic, glass, plastic, steel, and wood laminate) was investigated. Fomite material, surface area (10, 100, and 900 cm²), recovery tool (swabs and wipes), initial concentration on the fomites, and eluent (polysorbate 80, trypticase soy broth, and beef extract) were evaluated in this research. Recovery was shown to be optimized using polysorbate 80, sampling with wipes, and sampling a surface area of 10-100 cm². The QMRA model demonstrated, through a relative risk comparison, the need for recovery efficiency to be used in these models to prevent underestimated risks. PMID:27154208
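
The underestimation the QMRA model demonstrates follows directly from dividing the measured surface loading by the recovery efficiency before computing dose. A sketch using an exponential dose-response form with illustrative, non-pathogen-specific parameters:

```python
import math

def infection_risk(dose, k=0.00855):
    """Exponential dose-response model, P = 1 - exp(-k * dose).
    The parameter k here is illustrative, not a fitted pathogen value."""
    return 1.0 - math.exp(-k * dose)

measured = 50.0            # organisms recovered from the fomite sample
recovery_efficiency = 0.4  # fraction of surface loading the wipe recovers

# Ignoring recovery efficiency underestimates the true surface loading
# (measured / efficiency) and therefore the risk.
naive_risk = infection_risk(measured)
corrected_risk = infection_risk(measured / recovery_efficiency)
```

The gap between `naive_risk` and `corrected_risk` is the bias the paper argues must be removed from fomite QMRA models.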

  9. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194
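
Steps (d) and (e) of the risk paradigm above can be sketched as a Monte Carlo simulation: draw exposures for 500,000 modeled otters, extract the 99.9% quantile, and compare it with a chronic toxicity reference value. The dose distribution and reference value below are placeholders, not the study's exposure model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500_000  # modeled sea otters, matching the assessment's simulation size

# Hypothetical lognormal PAH dose distribution (mg/kg-day); the median and
# spread are placeholder parameters, not the study's values.
doses = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=n)

# The 99.9% quantile identifies the maximally exposed individuals.
d999 = np.quantile(doses, 0.999)

# Compare against an illustrative chronic toxicity reference value.
trv = 1.0  # mg/kg-day
hazard_quotient = d999 / trv
```

A hazard quotient below 1 for even the maximally exposed individuals is the form of the "no plausible toxicological risk" conclusion reported here.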

  10. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852

  11. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    PubMed Central

    Song, Yu; Kwak, Shin; Yoshida, Sohei; Yamamoto, Yoshiharu

    2014-01-01

    Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities and can be used for quantitative assessment of symptoms in various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders, including motor and nonmotor symptoms of Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods for assessing further neurological or psychiatric disorders using actigraphy records. PMID:25214709

  12. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    PubMed Central

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
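    The two-step point counting can be illustrated with the Cavalieri principle (volume = points hit × area per point × section spacing) followed by a tissue-fraction tally. All counts and grid constants below are invented:

```python
# Sketch of two-step point counting on systematic parallel sections.
# Grid constants and point counts are invented for illustration only.

AREA_PER_POINT_MM2 = 0.01   # a/p of the counting grid (assumed)
SECTION_SPACING_MM = 0.5    # distance between parallel sections (assumed)

# Step 1 (Cavalieri): points hitting repair tissue on each of 8 sections
points_per_section = [50, 62, 71, 80, 77, 66, 54, 40]
defect_volume_mm3 = (sum(points_per_section)
                     * AREA_PER_POINT_MM2 * SECTION_SPACING_MM)

# Step 2: classify every counted point into a morphological category
category_counts = {"hyaline": 180, "fibrocartilage": 150, "fibrous": 90,
                   "bone": 50, "scaffold": 20, "other": 10}
total_points = sum(category_counts.values())
tissue_fractions = {k: v / total_points for k, v in category_counts.items()}

print(defect_volume_mm3)             # total repair-tissue volume in mm^3
print(tissue_fractions["hyaline"])   # fraction of repair tissue that is hyaline
```

Multiplying each fraction by the step-1 volume then yields the absolute volume of each tissue type, which is the quantity the scoring systems mentioned above only estimate semiquantitatively.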

  13. Evaluation of the predictability of real-time crash risk models.

    PubMed

    Xu, Chengcheng; Liu, Pan; Wang, Wei

    2016-09-01

    The primary objective of the present study was to investigate the predictability of crash risk models that were developed using high-resolution real-time traffic data. More specifically, the present study sought answers to the following questions: (a) how to evaluate the predictability of a real-time crash risk model; and (b) how to improve the predictability of a real-time crash risk model. The predictability is defined as the crash probability given the crash precursor identified by the crash risk model. An equation was derived based on Bayes' theorem for approximately estimating the predictability of crash risk models. The estimated predictability was then used to quantitatively evaluate the effects of the threshold of crash precursors, the matched and unmatched case-control designs, and the control-to-case ratio on the predictability of crash risk models. It was found that: (a) the predictability of a crash risk model can be measured as the product of the prior crash probability and the ratio between sensitivity and false alarm rate; (b) there is a trade-off between the predictability and sensitivity of a real-time crash risk model; (c) for a given level of sensitivity, the predictability of a crash risk model developed using an unmatched case-control sample is always better than that of a model developed using a matched case-control sample; and (d) when the control-to-case ratio is beyond 4:1, increasing the control-to-case ratio does not lead to clear improvements in predictability. PMID:27332063
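    Finding (a), that predictability equals the product of the prior crash probability and the sensitivity-to-false-alarm ratio, follows from Bayes' theorem when crashes are rare. A sketch with invented numbers:

```python
# Bayes' theorem gives P(crash | precursor) exactly; when the prior crash
# probability is tiny, it reduces to prior * sensitivity / false-alarm rate,
# which is the product form stated in the abstract. Numbers are illustrative.

def predictability(prior: float, sensitivity: float, far: float) -> float:
    """Exact posterior P(crash | precursor) via Bayes' theorem."""
    p_precursor = sensitivity * prior + far * (1 - prior)
    return sensitivity * prior / p_precursor

prior = 1e-4        # prior crash probability in a time window (assumed)
sensitivity = 0.7   # P(precursor flagged | crash)         (assumed)
far = 0.2           # false alarm rate, P(flagged | no crash) (assumed)

exact = predictability(prior, sensitivity, far)
approx = prior * sensitivity / far   # the rare-event product form
print(exact, approx)                 # nearly identical because crashes are rare
```

The trade-off in finding (b) is also visible here: lowering the precursor threshold raises sensitivity but raises the false alarm rate faster, shrinking the ratio.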

  14. Perception of risks from electromagnetic fields: A psychometric evaluation of a risk-communication approach

    SciTech Connect

    MacGregor, D.G.; Slovic, P.; Morgan, M.G.

    1994-10-01

    Potential health risks from exposure to power-frequency electromagnetic fields (EMF) have become an issue of significant public concern. This study evaluates a brochure designed to communicate EMF health risks from a scientific perspective. The study utilized a pretest-posttest design in which respondents judged various sources of EMF (and other) health and safety risks, both before reading the brochure and after. Respondents assessed risks on dimensions similar to those utilized in previous studies of risk perception. In addition, detailed ratings were made that probed respondents' beliefs about the possible causal effects of EMF exposure. The findings suggest that naive beliefs about the potential of EMF exposure to cause harm were highly influenced by specific content elements of the brochure. The implications for using risk-communication approaches based on communicating scientific uncertainty are discussed. 19 refs., 1 fig., 11 tabs.

  15. Food and Drug Administration Evaluation and Cigarette Smoking Risk Perceptions

    PubMed Central

    Kaufman, Annette R.; Waters, Erika A.; Parascandola, Mark; Augustson, Erik M.; Bansal-Travers, Maansi; Hyland, Andrew; Cummings, K. Michael

    2013-01-01

    Objectives To examine the relationship between a belief about Food and Drug Administration (FDA) safety evaluation of cigarettes and smoking risk perceptions. Methods A nationally representative, random-digit-dialed telephone survey of 1046 adult current cigarette smokers. Results Smokers reporting that the FDA does not evaluate cigarettes for safety (46.1%) exhibited greater comprehension of the health risks of smoking and were more likely (48.5%) than other participants (33.6%) to report quit intentions. Risk perceptions partially mediated the relationship between FDA evaluation belief and quit intentions. Conclusions These findings highlight the need for proactive, effective communication to the public about the aims of new tobacco product regulations. PMID:22251767

  16. Quantitative Risk Assessment of CO2 Sequestration in a commercial-scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and the economic opportunities of associated oil production. Probably the most significant source of uncertainty in CO2 storage forecasts is heterogeneity of reservoir properties. Quantification of storage-forecast uncertainty is critical for accurate assessment of risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After stepwise regression analysis, a series of response surface models of the output variables were constructed at each step and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and the NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for the FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of the dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at the 95th percentile after 5 years are about 3.65 times, and 1
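    The goodness-of-fit screening described for the proxy models (R² above 0.9, NRMSE below 5%) can be sketched as follows; the simulated/predicted pairs are invented placeholders, not FWU results:

```python
# Sketch of the goodness-of-fit checks used to verify a response-surface
# (proxy) model: R^2 and normalized RMSE between the simulator's output and
# the proxy's predictions. All data values are invented for illustration.

def r_squared(y, yhat):
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1 - ss_res / ss_tot

def nrmse(y, yhat):
    rmse = (sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)) ** 0.5
    return rmse / (max(y) - min(y))   # normalized by the observed range

simulated = [10.2, 12.5, 15.1, 18.0, 21.4, 24.9]   # e.g. cumulative oil (toy units)
predicted = [10.0, 12.8, 15.0, 18.3, 21.0, 25.2]   # proxy-model output (toy)

r2 = r_squared(simulated, predicted)
err = nrmse(simulated, predicted)
print(r2, err)   # acceptance rule in the abstract: R^2 > 0.9 and NRMSE < 5%
```

Only proxies passing both thresholds would then be trusted to generate the CDFs of the dependent variables.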

  17. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contributions of sheep, beef and pork products to human T. gondii infections by quantitative microbial risk assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating, as estimated from published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities by the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated from the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for
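    The multiplication chain described above (bradyzoite number × processing reduction, then dose-response, then prevalence × portions) can be sketched with an exponential dose-response model; every parameter value here is invented for illustration, not taken from the study:

```python
import math

# Toy version of the QMRA chain: bradyzoites per raw portion, reduction by
# processing, a dose-response for infection probability, then scaling by
# prevalence and portions consumed per year. All parameters are invented.

def p_infection(dose: float, r: float = 0.05) -> float:
    """Exponential dose-response model; r is a hypothetical parameter."""
    return 1 - math.exp(-r * dose)

def expected_infections(bradyzoites_raw: float, reduction_factor: float,
                        prevalence: float, portions_per_year: float) -> float:
    dose = bradyzoites_raw * reduction_factor   # bradyzoites surviving processing
    return p_infection(dose) * prevalence * portions_per_year

# Heating leaves far fewer surviving bradyzoites, so far fewer infections:
raw = expected_infections(100, reduction_factor=1.0, prevalence=0.02,
                          portions_per_year=1_000_000)
heated = expected_infections(100, reduction_factor=0.001, prevalence=0.02,
                             portions_per_year=1_000_000)
print(raw, heated)   # heating cuts the predicted number of infections sharply
```

The sensitivity of the output to `reduction_factor` mirrors the abstract's finding that heating temperature has the strongest influence on the predicted number of infections.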

  18. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
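    The multiple-strain idea, implemented as a multinomial process, can be sketched as drawing a strain per contamination event and counting only enterotoxin-producing strains toward exposure. The strain labels and probabilities below are invented, not the Lombardy field data:

```python
import random

# Minimal sketch of a multinomial multiple-strain step: each contamination
# event draws one strain from an assumed strain mix, and only the
# enterotoxin-producing strain contributes to exposure. All labels and
# probabilities are invented for illustration.
random.seed(1)

strains = ["tox-pos", "tox-neg-1", "tox-neg-2"]   # hypothetical strain labels
probs = [0.25, 0.45, 0.30]                        # assumed strain mix
toxigenic = {"tox-pos"}                           # produces enterotoxin A here

n_events = 10_000
draws = random.choices(strains, weights=probs, k=n_events)
toxigenic_events = sum(1 for s in draws if s in toxigenic)
toxigenic_fraction = toxigenic_events / n_events
print(toxigenic_fraction)   # close to the assumed 0.25 share
```

In a full model each draw would carry its own strain-specific toxin production parameters, which is how variability in pathogenicity enters the output distribution.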

  19. Conceptual Model of Offshore Wind Environmental Risk Evaluation System

    SciTech Connect

    Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.

    2010-06-01

    In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.

  20. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    NASA Astrophysics Data System (ADS)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  1. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder

    PubMed Central

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7–11 years (27 males, six females) and twenty-five adult participants aged 21–29 years (19 males, six females) participated in our experiments. Our results suggested that the pronation and supination function in children with ADHD has a tendency to lag behind that of typically developing children by several years. From these results, our system has the potential to objectively evaluate the neurodevelopmental delay of children with ADHD. PMID:26797613

  2. A quantitative evaluation of various deconvolution methods and their applications in the deconvolution of plasma spectra

    NASA Astrophysics Data System (ADS)

    Xiong, Yanwei; Shi, Yuejiang; Li, Yingying; Fu, Jia; Lu, Bo; Zhang, Hongming; Wang, Xiaoguang; Wang, Fudi; Shen, Yongcai

    2013-06-01

    A quantitative evaluation of various deconvolution methods and their applications in processing plasma emission spectra was performed. The iterative deconvolution algorithms evaluated here include Jansson's method, the Richardson–Lucy method, the maximum a posteriori method, and Gold's method. The evaluation criteria include minimization of the sum of squared errors and of the sum of squared relative errors of the parameters, and the rate of convergence. After comparing the deconvolved results of these methods, it was concluded that Jansson's and Gold's methods were able to provide good profiles that are visually close to the original spectra. Additionally, Gold's method generally gives the best results when all the criteria above are considered. Applications of these methods to actual plasma spectra obtained from the EAST tokamak are also presented in this paper. The deconvolution results with Gold's and Jansson's methods show that instrumental effects can be satisfactorily eliminated and clear spectra recovered.
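    Of the algorithms compared, the Richardson–Lucy iteration is compact enough to sketch in full for a 1-D spectrum. This toy implementation and test signal are illustrative, not the paper's code:

```python
# Toy 1-D Richardson–Lucy deconvolution, one of the iterative methods the
# paper compares. The broadening kernel, test spectrum, and iteration count
# are invented; real plasma spectra would be processed the same way.

def conv_same(x, h):
    """Zero-padded 1-D convolution returning len(x) samples (h has odd length)."""
    n, m = len(x), len(h)
    half = m // 2
    out = []
    for i in range(n):
        s = 0.0
        for j in range(m):
            k = i + half - j
            if 0 <= k < n:
                s += h[j] * x[k]
        out.append(s)
    return out

def richardson_lucy(y, h, iterations=200):
    """Multiplicative RL update: x <- x * conv(y / conv(x, h), h_flipped)."""
    h_flip = h[::-1]
    x = [1.0] * len(y)   # flat, nonnegative starting estimate
    for _ in range(iterations):
        blurred = conv_same(x, h)
        ratio = [yi / max(bi, 1e-12) for yi, bi in zip(y, blurred)]
        x = [xi * ci for xi, ci in zip(x, conv_same(ratio, h_flip))]
    return x

h = [0.25, 0.5, 0.25]                   # instrument broadening kernel (toy)
true = [0, 0, 10, 0, 0, 0, 8, 0, 0]     # two spectral lines
y = conv_same(true, h)                  # observed (blurred) spectrum
recovered = richardson_lucy(y, h)
print([round(v, 2) for v in recovered]) # lines at indices 2 and 6 sharpen back
```

The multiplicative update keeps the estimate nonnegative, which is one reason RL-type schemes are popular for emission spectra.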

  3. Automatic quantitative evaluation of autoradiographic band films by computerized image analysis

    SciTech Connect

    Masseroli, M.; Messori, A.; Bendotti, C.; Ponti, M.; Forloni, G.

    1993-01-01

    The present paper describes a new image processing method for automatic quantitative analysis of autoradiographic band films. It was developed in a specific image analysis environment (IBAS 2.0), but the algorithms and methods can be utilized elsewhere. The program is easy to use and offers several features particularly useful for the evaluation of autoradiographic band films, such as the choice of whole-film or single-lane background determination, the possibility of evaluating bands with film scratch artifacts, and quantification in absolute terms or relative to reference values. The method was tested by comparison with laser-scanner densitometric quantification of the same autoradiograms. The results show the full compatibility of the two methods and demonstrate the reliability and sensitivity of image analysis. The method can be used not only to evaluate autoradiographic band films, but also to analyze any type of signal band on other materials (e.g., electrophoresis gels, chromatographic paper, etc.).

  5. Reanalysis of the DEMS Nested Case-Control Study of Lung Cancer and Diesel Exhaust: Suitability for Quantitative Risk Assessment

    PubMed Central

    Crump, Kenny S; Van Landingham, Cynthia; Moolgavkar, Suresh H; McClellan, Roger

    2015-01-01

    The International Agency for Research on Cancer (IARC) in 2012 upgraded its hazard characterization of diesel engine exhaust (DEE) to “carcinogenic to humans.” The Diesel Exhaust in Miners Study (DEMS) cohort and nested case-control studies of lung cancer mortality in eight U.S. nonmetal mines were influential in IARC’s determination. We conducted a reanalysis of the DEMS case-control data to evaluate its suitability for quantitative risk assessment (QRA). Our reanalysis used conditional logistic regression and adjusted for cigarette smoking in a manner similar to the original DEMS analysis. However, we included additional estimates of DEE exposure and adjustment for radon exposure. In addition to applying three DEE exposure estimates developed by DEMS, we applied six alternative estimates. Without adjusting for radon, our results were similar to those in the original DEMS analysis: all but one of the nine DEE exposure estimates showed evidence of an association between DEE exposure and lung cancer mortality, with trend slopes differing only by about a factor of two. When exposure to radon was adjusted, the evidence for a DEE effect was greatly diminished, but was still present in some analyses that utilized the three original DEMS DEE exposure estimates. A DEE effect was not observed when the six alternative DEE exposure estimates were utilized and radon was adjusted. No consistent evidence of a DEE effect was found among miners who worked only underground. This article highlights some issues that should be addressed in any use of the DEMS data in developing a QRA for DEE. PMID:25857246

  6. Quantitative local cerebral blood flow measurements with technetium-99m HM-PAO: evaluation using multiple radionuclide digital quantitative autoradiography

    SciTech Connect

    Lear, J.L.

    1988-08-01

    We investigated the d,l-[99mTc]hexamethylpropyleneamine oxime complex (HM-PAO) as a tracer for quantitative measurement of local cerebral blood flow (LCBF) in a series of awake male rats. LCBF measurements with HM-PAO were compared to those of two other tracers, [14C]iodoantipyrine (IAP) and [201Tl]diethyldithiocarbamate (DDC), using quantitative double- and triple-tracer digital autoradiography. LCBF values with HM-PAO averaged 64% of those of IAP and were generally linearly related. Detailed analysis suggested that the underestimation of LCBF by HM-PAO was related to blood constituent binding and/or rapid conversion to a noncerebrophilic compound, as well as noninstantaneous cerebral trapping, rather than to diffusion limitation.

  7. Quantitative evaluation of unrestrained human gait on change in walking velocity.

    PubMed

    Makino, Yuta; Tsujiuchi, Nobutaka; Ito, Akihito; Koizumi, Takayuki; Nakamura, Shota; Matsuda, Yasushi; Tsuchiya, Youtaro; Hayashi, Yuichiro

    2014-01-01

    In human gait motion analysis, a useful tool for efficient physical rehabilitation, various quantitative evaluation indices, such as ground reaction force, joint angle, and joint loads, are measured during gait. To obtain these data without restraining the subject, a novel gait motion analysis system using mobile force plates and attitude sensors has been developed. A human also maintains a high correlation among the motions of all joints during gait. Analysis of the correlation in recorded joint motion extracts a few simultaneously activated segmental coordination patterns, and the structure of this intersegmental coordination is attracting attention for its expected relationship with a control strategy. However, when the evaluation method using singular value decomposition has been applied to joint angles of the lower limb as representative kinematic parameters, joint moments related to the rotational motion of the joints have not been considered. In this paper, joint moments, as kinetic parameters applied to the lower limb during gait, are analyzed for a normal subject and a trans-femoral amputee under changes in walking velocity using the wearable gait motion analysis system, and the effectiveness of using joint moments to quantitatively evaluate rotational motion patterns in the joints of the lower limb is validated. PMID:25570503

  8. The quantitative evaluation of the correlation between the magnification and the visibility-contrast value

    NASA Astrophysics Data System (ADS)

    Okubo, Shohei; Shibata, Takayuki; Kodera, Yoshie

    2015-03-01

    The Talbot–Lau interferometer, which consists of a conventional x-ray tube, an x-ray detector, and three gratings arranged between them, is a new x-ray imaging system that uses a phase-contrast method for excellent visualization of soft tissue. It is therefore expected to be applied to soft-tissue imaging in the medical field, for example in mammography. The visibility-contrast image, one of the reconstruction images obtained with the Talbot–Lau interferometer, reflects the reduction of coherence caused by both x-ray small-angle scattering and x-ray refraction due to the object's structures. Previously, these two phenomena were not distinguished in quantitative evaluations of the visibility signal; however, we consider that they should be distinguished for such evaluation. In this study, to evaluate how much magnification affects the visibility signal, we investigated the variability of the visibility signal for object positions at heights of 0 to 50 cm above the diffraction grating, examining the scattering signal and the refraction signal separately. We measured the edge signal of a glass sphere to examine the scattering signal, and the internal signal of the glass sphere and several kinds of sheet to examine the refraction signal. We observed a difference in variability between the edge signal and the internal signal, and we propose an estimation method that uses magnification.

  9. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera was performed with multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes, and one eye with choroidal atrophy were examined. Among normal eyes, refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences between the normal and glaucoma eyes were observed in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  10. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology is reviewed from the perspective of nondestructive evaluation approaches to material strength prediction and property verification. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  11. A quantitative method evaluating the selective adsorption of molecularly imprinted polymer.

    PubMed

    Zhang, Z B; Hu, J Y

    2012-01-01

    Adsorption isotherms of four estrogenic compounds, estrone, 17β-estradiol, 17α-ethinylestradiol, and bisphenol A, on a molecularly imprinted polymer (MIP) were studied. The isotherms can be described by the Langmuir model. Based on the adsorption isotherms and the template's mass balance, an experimental concept, the selective adsorption ratio (SAR), was proposed to assess how many of the template molecules extracted from the MIP create selective binding sites. The SAR of the molecularly imprinted polymer was 74.3% for E2. This concept can be used to evaluate selective adsorption quantitatively. PMID:22423989
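    The SAR idea can be sketched numerically. Note that the abstract derives selective sites from the template's mass balance; the sketch below instead approximates them by comparing a hypothetical imprinted polymer with a non-imprinted control, and every parameter value is invented:

```python
# Illustrative Langmuir-isotherm and selective-adsorption-ratio (SAR)
# calculation. Parameters and the MIP/NIP comparison are invented for the
# sketch and are not the study's mass-balance procedure.

def langmuir(c_eq: float, q_max: float, b: float) -> float:
    """Langmuir isotherm: adsorbed amount q at equilibrium concentration c_eq."""
    return q_max * b * c_eq / (1 + b * c_eq)

# Hypothetical fitted parameters for imprinted (MIP) and non-imprinted (NIP)
# polymers; their difference near saturation approximates selective binding.
q_mip = langmuir(c_eq=50.0, q_max=40.0, b=0.8)   # umol/g, invented
q_nip = langmuir(c_eq=50.0, q_max=18.0, b=0.8)   # umol/g, invented
selective_binding = q_mip - q_nip

templates_extracted = 29.0   # umol/g of template removed from the MIP (invented)
sar = selective_binding / templates_extracted
print(round(sar, 3))   # fraction of extracted templates yielding selective sites
```

A SAR near 1 would mean almost every extracted template molecule left behind a usable selective site; values well below 1 indicate collapsed or inaccessible cavities.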

  12. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  13. Quantitative and qualitative evaluation of PERCEPT indoor navigation system for visually impaired users.

    PubMed

    Ganz, Aura; Schafer, James; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    In this paper we introduce a qualitative and quantitative evaluation of the PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT system trials with 24 blind and visually impaired users in a multi-story building show the system's effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows Orientation and Mobility principles. These results encourage us to generalize the solution to large indoor spaces and test it with a significantly larger visually impaired population in diverse settings. We hope that PERCEPT will become a standard deployed in all indoor public spaces. PMID:23367251

  14. Interdisciplinary program for quantitative nondestructive evaluation. Semiannual report, 1 October 1981-31 March 1982

    SciTech Connect

    Not Available

    1982-01-01

    This report constitutes the semiannual report of the Air Force/Defense Advanced Research Project Agency research program in quantitative nondestructive evaluation covering the period October 1, 1981 to March 31, 1982. It is organized by projects, each of which contains the reports of individual investigations. Because the goals of the projects are largely such that strong interdisciplinary interactions are necessary in order to achieve them, the individual reports reflect a close cooperation between various investigators. Projects included in this year's effort are: application of ultrasonic QNDE to RFC window problems; electromagnetic detection and sizing; new technical opportunities; and new flaw detection techniques. Twenty-three project reports are presented.

  15. Evaluation of ViroCyt® Virus Counter for rapid filovirus quantitation.

    PubMed

    Rossi, Cynthia A; Kearney, Brian J; Olschner, Scott P; Williams, Priscilla L; Robinson, Camenzind G; Heinrich, Megan L; Zovanyi, Ashley M; Ingram, Michael F; Norwood, David A; Schoepp, Randal J

    2015-03-01

    Development and evaluation of medical countermeasures for diagnostics, vaccines, and therapeutics requires production of standardized, reproducible, and well characterized virus preparations. For filoviruses this includes plaque assay for quantitation of infectious virus, transmission electron microscopy (TEM) for morphology and quantitation of virus particles, and real-time reverse transcription PCR for quantitation of viral RNA (qRT-PCR). The ViroCyt® Virus Counter (VC) 2100 (ViroCyt, Boulder, CO, USA) is a flow-based instrument capable of quantifying virus particles in solution. Using a proprietary combination of fluorescent dyes that stain both nucleic acid and protein in a single 30 min step, rapid, reproducible, and cost-effective quantification of filovirus particles was demonstrated. Using a seed stock of Ebola virus variant Kikwit, the linear range of the instrument was determined to be 2.8E+06 to 1.0E+09 virus particles per mL with coefficient of variation ranging from 9.4% to 31.5% for samples tested in triplicate. VC particle counts for various filovirus stocks were within one log of TEM particle counts. A linear relationship was established between the plaque assay, qRT-PCR, and the VC. VC results significantly correlated with both plaque assay and qRT-PCR. These results demonstrated that the VC is an easy, fast, and consistent method to quantify filoviruses in stock preparations. PMID:25710889
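    The coefficient of variation reported for triplicate samples is the sample standard deviation expressed as a percentage of the mean; a minimal sketch (not the instrument's own software):

```python
import statistics

def coefficient_of_variation(replicates):
    """CV (%) = sample standard deviation / mean * 100, as is typically
    reported for replicate virus particle counts."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0
```

    For example, triplicate counts of 9.0E+06, 1.0E+07, and 1.1E+07 particles/mL give a CV of 10%.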

  16. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
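    Stiffness as the slope of the moment-displacement curve can be estimated with an ordinary least-squares fit; a minimal sketch (not the authors' analysis code):

```python
def bending_stiffness(displacements, moments):
    """Least-squares slope of the moment-displacement curve, i.e. the
    flexural stiffness of the fused segment under 4-point bending."""
    n = len(displacements)
    mean_x = sum(displacements) / n
    mean_y = sum(moments) / n
    s_xy = sum((x - mean_x) * (y - mean_y)
               for x, y in zip(displacements, moments))
    s_xx = sum((x - mean_x) ** 2 for x in displacements)
    return s_xy / s_xx
```

    A stiffer (more solidly fused) segment produces a steeper moment-displacement curve and hence a larger slope.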

  17. Dynamic phase differences based on quantitative phase imaging for the objective evaluation of cell behavior

    NASA Astrophysics Data System (ADS)

    Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim

    2015-11-01

    Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
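    The core of the DPDs method, subtracting the antecedent from the subsequent quantitative phase image, can be sketched as follows; this is a simplified illustration on plain nested lists, assuming the phase maps are already calibrated to dry mass per pixel:

```python
def dynamic_phase_difference(frame_prev, frame_next):
    """Pixel-wise dry-mass change between two consecutive QPI frames
    (e.g. picograms per pixel); positive values indicate mass gain."""
    return [[b - a for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_prev, frame_next)]

def net_mass_change(dpd_frame):
    """Total dry-mass change of the cell between the two time points."""
    return sum(sum(row) for row in dpd_frame)
```

    Summing the difference image over the cell area yields the time course of mass change quantified in picograms, as described above.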

  18. Experimental Evaluation of Quantitative Diagnosis Technique for Hepatic Fibrosis Using Ultrasonic Phantom

    NASA Astrophysics Data System (ADS)

    Koriyama, Atsushi; Yasuhara, Wataru; Hachiya, Hiroyuki

    2012-07-01

    Since clinical diagnosis using ultrasonic B-mode images depends on the skill of the doctor, a quantitative diagnosis method using the ultrasound echo signal is highly desirable. We have been investigating a quantitative diagnosis technique, mainly for hepatic disease. In this paper, we present basic experimental results evaluating the accuracy of the proposed quantitative diagnosis technique for hepatic fibrosis using a simple ultrasonic phantom. By placing a region of interest across the boundary between two scatterer areas with different densities in the phantom, we can simulate the change of the echo amplitude distribution from normal tissue to fibrotic tissue in liver disease. The probability density function is well approximated by our fibrosis distribution model, which is a mixture of normal and fibrotic tissue components. The fibrosis parameters of the amplitude distribution model can be estimated relatively well at mixture rates from 0.2 to 0.6. In the inversion processing, the standard deviations of the estimated fibrosis results at mixture ratios below 0.2 and above 0.6 are relatively large. Although the probability density is not large at high amplitudes, the estimated variance ratio and mixture rate of the model are strongly affected by higher-amplitude data.
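    A two-component amplitude mixture of the kind described can be sketched as follows. Rayleigh components are a common assumption for echo amplitude statistics; the paper's exact distribution model may differ, so treat this as a generic illustration:

```python
import math

def rayleigh_pdf(amplitude, sigma):
    """Rayleigh density, a standard model for echo envelope amplitude."""
    return (amplitude / sigma ** 2) * math.exp(-amplitude ** 2 / (2 * sigma ** 2))

def mixture_pdf(amplitude, mix_rate, sigma_normal, sigma_fibrotic):
    """Amplitude density as a mixture of normal and fibrotic components,
    weighted by the fibrosis mixture rate (parameter names illustrative)."""
    return ((1.0 - mix_rate) * rayleigh_pdf(amplitude, sigma_normal)
            + mix_rate * rayleigh_pdf(amplitude, sigma_fibrotic))
```

    Estimating the mixture rate and the variance ratio sigma_fibrotic/sigma_normal from measured amplitude histograms is the inversion step whose accuracy the phantom experiment evaluates.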

  19. Risk evaluation of liquefaction on the site of Damien (Haiti)

    NASA Astrophysics Data System (ADS)

    Jean, B. J.; Boisson, D.; Thimus, J.; Schroeder, C.

    2013-12-01

    As part of the proposed relocation of all faculties to the Damien campus owned by the Université d'Etat d'Haïti (UEH), the Unité de Recherche en Géotechnique (URGéo) of the Faculté des Sciences (FDS) of UEH conducted several operations whose objective was to evaluate the risk of liquefaction on this site. This abstract presents, in a comprehensive and coherent manner, the entire process of assessing the risk of liquefaction. The evaluation was conducted mainly using seismic techniques, laboratory tests, and the response of a one-dimensional soil column. We then summarize the results of the various techniques through synthetic maps interpreting the MASW 1D and H/V surveys, together with on-site measurements of the response to seismic loading from the SPT test, applied to the evaluation of liquefaction potential.

  20. Evaluation of risk factors for degenerative joint disease associated with hip dysplasia in dogs.

    PubMed

    Smith, G K; Popovitch, C A; Gregor, T P; Shofer, F S

    1995-03-01

    Passive coxofemoral joint laxity of dogs, as quantitated by a distraction-stress radiographic method, may have important prognostic value in determining susceptibility to hip dysplasia. Data from 151 dogs, representing 13 breeds, were included in a logistic regression model to evaluate the contribution of factors such as age, breed, weight, sex, distraction index, and Norberg angle to the risk of developing degenerative joint disease (DJD) of the coxofemoral joint. Of the factors studied, the amount of passive hip laxity, as quantitated by the distraction index, was the most significant (P < 0.0001) determinant of the risk to develop DJD of the coxofemoral joint. In the longitudinal and cross-sectional components of the study, distraction index was a significant (P < 0.001) risk factor for DJD, irrespective of age at evaluation (4, 12, or 24 months). The strength of the hip laxity:DJD correlation increased with the age of dog. In contrast, the Norberg angle, a measure of hip laxity on the standard hip-extended radiograph, was not found to be a significant risk factor for DJD, either in the longitudinal or cross-sectional analyses. Breed-specific probability curves of DJD susceptibility indicated that German Shepherd Dogs had a significantly (P < 0.05) greater risk of developing DJD than did the pool of non-German Shepherd Dogs. The information derived from this statistical model will help to scientifically characterize the role of passive hip laxity as a component in the pathogenesis of DJD of the coxofemoral joint. PMID:7744684
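    The logistic regression relationship between distraction index and DJD risk has the following general form; the coefficients below are illustrative placeholders, not the values fitted in the study:

```python
import math

def djd_probability(distraction_index, b0=-4.0, b1=8.0):
    """Logistic model of DJD risk versus passive hip laxity:
    P(DJD) = 1 / (1 + exp(-(b0 + b1 * DI))).
    b0 and b1 are hypothetical coefficients for illustration only."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * distraction_index)))
```

    The model is monotonic in the distraction index, so greater passive laxity maps to a higher predicted probability of degenerative joint disease, consistent with the finding above.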

  1. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN, resulting in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural
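    The weighted geometric mean at the heart of the FWGM RPN can be sketched generically as follows, without the fuzzy-number and Copula machinery of the full method:

```python
import math

def weighted_geometric_mean(values, weights):
    """Weighted geometric mean, e.g. of severity, occurrence, and
    detectability ratings in an RPN-style risk score."""
    total_weight = sum(weights)
    log_sum = sum(w * math.log(v) for v, w in zip(values, weights))
    return math.exp(log_sum / total_weight)
```

    Unlike the classical multiplicative RPN, the weighted geometric mean lets the analyst emphasize some risk factors over others while keeping the score on the same scale as the ratings.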

  2. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials testing new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case is composed of two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the changes in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of RECIST prediction, with a prediction accuracy of 60% (17/30) and a Kappa coefficient of 0.062. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.
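    The Kappa coefficient used above to compare the CAD scheme with RECIST measures agreement beyond chance; a minimal sketch of its definition:

```python
def cohens_kappa(p_observed, p_expected):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e the agreement expected by chance."""
    return (p_observed - p_expected) / (1.0 - p_expected)
```

    A kappa of 0 means no better than chance and 1 means perfect agreement, which is why a kappa of 0.493 indicates substantially better chance-corrected prediction than a kappa of 0.062 even before comparing raw accuracies.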

  3. Experimental approaches for evaluating the invasion risk of biofuel crops

    NASA Astrophysics Data System (ADS)

    Flory, S. Luke; Lorentz, Kimberly A.; Gordon, Doria R.; Sollenberger, Lynn E.

    2012-12-01

    There is growing concern that non-native plants cultivated for bioenergy production might escape and result in harmful invasions in natural areas. Literature-derived assessment tools used to evaluate invasion risk are beneficial for screening, but cannot be used to assess novel cultivars or genotypes. Experimental approaches are needed to help quantify invasion risk but protocols for such tools are lacking. We review current methods for evaluating invasion risk and make recommendations for incremental tests from small-scale experiments to widespread, controlled introductions. First, local experiments should be performed to identify conditions that are favorable for germination, survival, and growth of candidate biofuel crops. Subsequently, experimental introductions in semi-natural areas can be used to assess factors important for establishment and performance such as disturbance, founder population size, and timing of introduction across variable habitats. Finally, to fully characterize invasion risk, experimental introductions should be conducted across the expected geographic range of cultivation over multiple years. Any field-based testing should be accompanied by safeguards and monitoring for early detection of spread. Despite the costs of conducting experimental tests of invasion risk, empirical screening will greatly improve our ability to determine if the benefits of a proposed biofuel species outweigh the projected risks of invasions.

  4. 77 FR 61446 - Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-09

    ... COMMISSION Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors..., ``Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors.'' DATES: Submit comments by... No. ML081430087) concerning the review of probabilistic risk assessment (PRA) information and...

  5. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  6. Rape Prevention with College Men: Evaluating Risk Status

    ERIC Educational Resources Information Center

    Stephens, Kari A.; George, William H.

    2009-01-01

    This study evaluates the effectiveness of a theoretically based rape prevention intervention with college men who were at high or low risk to perpetrate sexually coercive behavior. Participants (N = 146) are randomly assigned to the intervention or control group. Outcomes include rape myth acceptance, victim empathy, attraction to sexual…

  7. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2014-01-01

    Rock falls are common in Yosemite Valley, California, posing substantial hazard and risk to the approximately four million annual visitors to Yosemite National Park. Rock falls in Yosemite Valley over the past few decades have damaged structures and caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock-fall hazard and risk in Yosemite Valley and focuses on the hazard and risk to structures posed by relatively frequent fragmental-type rock falls as large as approximately 100,000 cubic meters in volume.

  8. A quantitative evaluation study of four-dimensional gated cardiac SPECT reconstruction.

    PubMed

    Jin, Mingwu; Yang, Yongyi; Niu, Xiaofeng; Marin, Thibault; Brankov, Jovan G; Feng, Bing; Pretorius, P Hendrik; King, Michael A; Wernick, Miles N

    2009-09-21

    In practice, gated cardiac SPECT images suffer from a number of degrading factors, including distance-dependent blur, attenuation, scatter and increased noise due to gating. Recently, we proposed a motion-compensated approach for four-dimensional (4D) reconstruction for gated cardiac SPECT and demonstrated that use of motion-compensated temporal smoothing could be effective for suppressing the increased noise due to lowered counts in individual gates. In this work, we further develop this motion-compensated 4D approach by also taking into account attenuation and scatter in the reconstruction process, which are two major degrading factors in SPECT data. In our experiments, we conducted a thorough quantitative evaluation of the proposed 4D method using Monte Carlo simulated SPECT imaging based on the 4D NURBS-based cardiac-torso (NCAT) phantom. In particular, we evaluated the accuracy of the reconstructed left ventricular myocardium using a number of quantitative measures including regional bias-variance analyses and wall intensity uniformity. The quantitative results demonstrate that use of motion-compensated 4D reconstruction can improve the accuracy of the reconstructed myocardium, which in turn can improve the detectability of perfusion defects. Moreover, our results reveal that while traditional spatial smoothing could be beneficial, its merit would become diminished with the use of motion-compensated temporal regularization. As a preliminary demonstration, we also tested our 4D approach on patient data. The reconstructed images from both simulated and patient data demonstrated that our 4D method can improve the definition of the LV wall. PMID:19724094

  9. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  10. Quantitation of aortic and mitral regurgitation in the pediatric population: evaluation by radionuclide angiocardiography

    SciTech Connect

    Hurwitz, R.A.; Treves, S.; Freed, M.; Girod, D.A.; Caldwell, R.L.

    1983-01-15

    The ability to quantitate aortic (AR) or mitral regurgitation (MR), or both, by radionuclide angiocardiography was evaluated in children and young adults at rest and during isometric exercise. Regurgitation was estimated by determining the ratio of left ventricular stroke volume to right ventricular stroke volume obtained during equilibrium ventriculography. The radionuclide measurement was compared with results of cineangiography, with good correlation between both studies in 47 of 48 patients. Radionuclide stroke volume ratio was used to classify severity: the group with equivocal regurgitation differed from the group with mild regurgitation (p less than 0.02); patients with mild regurgitation differed from those with moderate regurgitation (p less than 0.001); and those with moderate regurgitation differed from those with severe regurgitation (p less than 0.01). The stroke volume ratio was responsive to isometric exercise, remaining constant or increasing in 16 of 18 patients. After surgery to correct regurgitation, the stroke volume ratio significantly decreased from preoperative measurements in all 7 patients evaluated. Results from the present study demonstrate that a stroke volume ratio greater than 2.0 is compatible with moderately severe regurgitation and that a ratio greater than 3.0 suggests the presence of severe regurgitation. Thus, radionuclide angiocardiography should be useful for noninvasive quantitation of AR or MR, or both, helping define the course of young patients with left-side valvular regurgitation.
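    The stroke volume ratio and the severity thresholds reported above can be sketched as follows; the label used below the 2.0 threshold is our simplification, since the abstract only assigns labels above it:

```python
def stroke_volume_ratio(lv_stroke_volume, rv_stroke_volume):
    """Ratio of left to right ventricular stroke volume from equilibrium
    ventriculography; regurgitation inflates the LV stroke volume."""
    return lv_stroke_volume / rv_stroke_volume

def grade_regurgitation(ratio):
    """Thresholds from the abstract: > 2.0 is compatible with moderately
    severe regurgitation, > 3.0 suggests severe regurgitation."""
    if ratio > 3.0:
        return "severe"
    if ratio > 2.0:
        return "moderately severe"
    return "below moderately severe"
```

    In a patient without regurgitation the two stroke volumes are roughly equal, so the ratio is near 1.0; regurgitant volume recirculated by the left ventricle drives it upward.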

  11. Quantitative evaluation of noise reduction and vesselness filters for liver vessel segmentation on abdominal CTA images

    NASA Astrophysics Data System (ADS)

    Luu, Ha Manh; Klink, Camiel; Moelker, Adriaan; Niessen, Wiro; van Walsum, Theo

    2015-05-01

    Liver vessel segmentation in CTA images is a challenging task, especially for noisy images. This paper investigates whether pre-filtering improves liver vessel segmentation in 3D CTA images. We introduce a quantitative evaluation of several well-known filters based on a proposed liver vessel segmentation method for CTA images. We compare the effect of different diffusion techniques, i.e., Regularized Perona-Malik, Hybrid Diffusion with Continuous Switch, and Vessel Enhancing Diffusion, as well as the vesselness approaches proposed by Sato, Frangi, and Erdt. Liver vessel segmentation of the pre-processed images is performed using histogram-based region growing with local maxima as seed points. Quantitative measurements (sensitivity, specificity, and accuracy) are determined based on manual landmarks inside and outside the vessels, followed by t-tests for statistical comparisons on 51 clinical CTA images. The evaluation demonstrates that all the filters give liver vessel segmentation a significantly higher accuracy than segmentation without a filter (p < 0.05), with Hybrid Diffusion with Continuous Switch achieving the best performance. Compared to the diffusion filters, vesselness filters have greater sensitivity but less specificity. In addition, the proposed liver vessel segmentation method with pre-filtering is shown to perform robustly on clinical datasets with low contrast-to-noise ratios of up to 3 dB. The results indicate that the pre-filtering step significantly improves liver vessel segmentation on 3D CTA images.
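    The sensitivity, specificity, and accuracy measures used in the evaluation follow the standard confusion-matrix definitions; a minimal sketch:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of vessel landmarks correctly labeled as vessel."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of non-vessel landmarks correctly labeled as background."""
    return true_neg / (true_neg + false_pos)

def accuracy(true_pos, true_neg, false_pos, false_neg):
    """Fraction of all landmarks labeled correctly."""
    return (true_pos + true_neg) / (true_pos + true_neg + false_pos + false_neg)
```

    The trade-off reported above, vesselness filters gaining sensitivity at the cost of specificity, corresponds to trading false negatives for false positives in these formulas.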

  12. [Quantitative evaluation of acute myocardial infarction by In-111 antimyosin Fab myocardial imaging].

    PubMed

    Naruse, H; Morita, M; Itano, M; Yamamoto, J; Kawamoto, H; Fukutake, N; Ohyanagi, M; Iwasaki, T; Fukuchi, M

    1991-11-01

    For quantitative evaluation of acute myocardial infarction, In-111 antimyosin Fab myocardial imaging (InAM) was performed in 17 patients with myocardial infarction who underwent Tl-201 (TL) and Tc-99m pyrophosphate (PYP) myocardial imaging in the acute phase. For calculating infarct size, the voxel counter method was used for analysis of PYP and InAM, and extent and severity scores were used on the bull's-eye polar map for TL. The most appropriate cut-off level ranged from 65 to 80% in a fundamental experiment using a cardiac phantom. Cut-off levels of 0.70 (InAM) and 0.65 (PYP) were used for the clinical application of voxel counter analysis. The infarct size calculated by InAM and PYP was compared with the wall motion abnormality index by echocardiography (WMAI), TL extent score, TL severity score, peak CK, and sigma CK. Infarct size by InAM showed the following correlations with the other indices. PYP: r = 0.26 (ns); TL extent score: r = 0.72 (p less than 0.01); TL severity score: r = 0.65 (p less than 0.05); WMAI: r = 0.69 (p less than 0.05). The infarct size by PYP did not show any correlation with these indices. Therefore, infarct size by InAM correlated better with TL and WMAI than did that by PYP, and InAM was considered superior to PYP for the quantitative evaluation of acute myocardial infarction. PMID:1770642

  13. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia. PMID:25798757

  14. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal control devices of a solar telescope. The internal seeing induced by its temperature rise will degrade the imaging quality significantly. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics and geometric optics is proposed in this paper. Firstly, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by computational fluid dynamics calculation. Secondly, the temperature field is transformed into a refractive index field by the corresponding equations. Thirdly, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analytical results show that the maximum acceptable temperature rise of the heat-stop is 5 K above the ambient air at any telescope pointing direction, under the condition that the root-mean-square wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, it is found that the magnitude of the wavefront aberration gradually increases with the heat-stop temperature rise for a given telescope pointing direction. Meanwhile, as the telescope pointing varies from the horizontal to the vertical direction, the magnitude of the wavefront aberration first decreases and then increases for the same heat-stop temperature rise. PMID:26367657
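    The temperature-to-refractive-index step can be approximated for air at constant pressure by scaling the refractivity with inverse temperature; this is a rough sketch under ideal-gas assumptions, with illustrative reference values, and the paper's "corresponding equations" may be more detailed:

```python
def air_refractive_index(temp_kelvin, n_ref=1.000293, temp_ref=288.15):
    """Ideal-gas, constant-pressure approximation: the refractivity
    (n - 1) scales as temp_ref / temp_kelvin, since warmer air is less
    dense. n_ref is the index at the reference temperature temp_ref
    (both values are illustrative placeholders)."""
    return 1.0 + (n_ref - 1.0) * (temp_ref / temp_kelvin)
```

    Warmer air in the heat-affected zone therefore has a lower refractive index than the ambient air, and integrating rays through this perturbed index field yields the internal-seeing wavefront aberration.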

  15. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase three-dimensional active contour implemented with a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variation of user expertise, biased a priori information and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed with comparison to manually labeled data and computed false positive and false negative assignments of voxels for the three organs. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.

  16. Early Prediction and Evaluation of Breast Cancer Response to Neoadjuvant Chemotherapy Using Quantitative DCE-MRI.

    PubMed

    Tudorica, Alina; Oh, Karen Y; Chui, Stephen Y-C; Roy, Nicole; Troxell, Megan L; Naik, Arpana; Kemmer, Kathleen A; Chen, Yiyi; Holtorf, Megan L; Afzal, Aneela; Springer, Charles S; Li, Xin; Huang, Wei

    2016-02-01

    The purpose is to compare quantitative dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) metrics with imaging tumor size for early prediction of breast cancer response to neoadjuvant chemotherapy (NACT) and evaluation of residual cancer burden (RCB). Twenty-eight patients with 29 primary breast tumors underwent DCE-MRI exams before, after one cycle of, at midpoint of, and after NACT. MRI tumor size in the longest diameter (LD) was measured according to the RECIST (Response Evaluation Criteria In Solid Tumors) guidelines. Pharmacokinetic analyses of DCE-MRI data were performed with the standard Tofts and Shutter-Speed models (TM and SSM). After one NACT cycle the percent changes of DCE-MRI parameters K(trans) (contrast agent plasma/interstitium transfer rate constant), ve (extravascular and extracellular volume fraction), kep (intravasation rate constant), and SSM-unique τi (mean intracellular water lifetime) are good to excellent early predictors of pathologic complete response (pCR) vs. non-pCR, with univariate logistic regression C statistics value in the range of 0.804 to 0.967. ve values after one cycle and at NACT midpoint are also good predictors of response, with C ranging 0.845 to 0.897. However, RECIST LD changes are poor predictors with C = 0.609 and 0.673, respectively. Post-NACT K(trans), τi, and RECIST LD show statistically significant (P < .05) correlations with RCB. The performances of TM and SSM analyses for early prediction of response and RCB evaluation are comparable. In conclusion, quantitative DCE-MRI parameters are superior to imaging tumor size for early prediction of therapy response. Both TM and SSM analyses are effective for therapy response evaluation. However, the τi parameter derived only with SSM analysis allows the unique opportunity to potentially quantify therapy-induced changes in tumor energetic metabolism. PMID:26947876
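
    The standard Tofts model named in this abstract relates the tissue contrast-agent concentration to the plasma input via Ct(t) = K(trans) ∫ Cp(u) exp(-kep(t-u)) du, with kep = K(trans)/ve. A minimal numerical sketch follows, using illustrative parameter values and an idealized constant input function rather than a measured arterial input:

```python
import numpy as np

def tofts_tissue_curve(t, cp, ktrans, ve):
    # Standard Tofts model: Ct(t) = Ktrans * int_0^t Cp(u)*exp(-kep*(t-u)) du,
    # with kep = Ktrans / ve (intravasation rate constant).
    kep = ktrans / ve
    dt = t[1] - t[0]
    ct = np.zeros_like(t)
    for i in range(len(t)):
        u = t[: i + 1]
        ct[i] = ktrans * np.sum(cp[: i + 1] * np.exp(-kep * (t[i] - u))) * dt
    return ct

t = np.linspace(0.0, 10.0, 2001)   # minutes
cp = np.ones_like(t)               # idealized constant plasma input function
ct = tofts_tissue_curve(t, cp, ktrans=0.25, ve=0.3)
```

    For a constant Cp the curve approaches ve·Cp as t grows, which is a convenient sanity check when implementing the model.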

  17. A novel integrated approach to quantitatively evaluate the efficiency of extracellular polymeric substances (EPS) extraction process.

    PubMed

    Sun, Min; Li, Wen-Wei; Yu, Han-Qing; Harada, Hideki

    2012-12-01

    A novel integrated approach is developed to quantitatively evaluate the extracellular polymeric substances (EPS) extraction efficiency after taking into account EPS yield, EPS damage, and cell lysis. This approach incorporates grey relational analysis and fuzzy logic analysis, in which the evaluation procedure is established on the basis of grey relational coefficients generation, membership functions construction, and fuzzy rules description. The flocculation activity and DNA content of EPS are chosen as the two evaluation responses. To verify the feasibility and effectiveness of this integrated approach, EPS from Bacillus megaterium TF10 are extracted using five different extraction methods, and their extraction efficiencies are evaluated as one real case study. Based on the evaluation results, the maximal extraction grades and corresponding optimal extraction times of the five extraction methods are ordered as EDTA, 10 h > formaldehyde + NaOH, 60 min > heating, 120 min > ultrasonication, 30 min > H₂SO₄, 30 min > control. The proposed approach here offers an effective tool to select appropriate EPS extraction methods and determine the optimal extraction conditions. PMID:23064456
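
    The grey relational analysis step can be illustrated with Deng's classical coefficient. The sketch below is a generic illustration, not the paper's exact procedure: responses are assumed pre-normalized to [0, 1], the reference series is an ideal extraction (maximal flocculation activity, zero DNA contamination), and the method scores are invented.

```python
import numpy as np

def grey_relational_coefficients(reference, series, zeta=0.5):
    # Deng's grey relational coefficient of each comparison series against a
    # reference series: gamma_i(k) = (d_min + zeta*d_max) / (delta_i(k) + zeta*d_max),
    # where delta_i(k) = |x0(k) - xi(k)| and zeta is the distinguishing coefficient.
    delta = np.abs(np.asarray(series) - np.asarray(reference))
    d_min, d_max = delta.min(), delta.max()
    return (d_min + zeta * d_max) / (delta + zeta * d_max)

# Hypothetical normalized responses: [flocculation activity, DNA contamination].
ref = np.array([1.0, 0.0])          # ideal extraction outcome
methods = np.array([[0.9, 0.1],     # e.g. a gentle chelation method
                    [0.6, 0.4]])    # e.g. a harsher thermal method
gamma = grey_relational_coefficients(ref, methods)
grades = gamma.mean(axis=1)         # grey relational grade per method
```

    Higher grades indicate responses closer to the ideal; the paper combines such grades with fuzzy membership functions and rules to rank the extraction methods.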

  18. Development and evaluation of an improved quantitative 90Y bremsstrahlung SPECT method

    PubMed Central

    Rong, Xing; Du, Yong; Ljungberg, Michael; Rault, Erwann; Vandenberghe, Stefaan; Frey, Eric C.

    2012-01-01

    Purpose: Yttrium-90 (90Y) is one of the most commonly used radionuclides in targeted radionuclide therapy (TRT). Since it decays with essentially no gamma photon emissions, surrogate radionuclides (e.g., 111In) or imaging agents (e.g., 99mTc MAA) are typically used for treatment planning. It would, however, be useful to image 90Y directly in order to confirm that the distributions measured with these other radionuclides or agents are the same as for the 90Y labeled agents. As a result, there has been a great deal of interest in quantitative imaging of 90Y bremsstrahlung photons using single photon emission computed tomography (SPECT) imaging. The continuous and broad energy distribution of bremsstrahlung photons, however, imposes substantial challenges on accurate quantification of the activity distribution. The aim of this work was to develop and evaluate an improved quantitative 90Y bremsstrahlung SPECT reconstruction method appropriate for these imaging applications. Methods: Accurate modeling of image degrading factors such as object attenuation and scatter and the collimator-detector response is essential to obtain quantitatively accurate images. All of the image degrading factors are energy dependent. Thus, the authors separated the modeling of the bremsstrahlung photons into multiple categories and energy ranges. To improve the accuracy, the authors used a bremsstrahlung energy spectrum previously estimated from experimental measurements and incorporated a model of the distance between 90Y decay location and bremsstrahlung emission location into the SIMIND code used to generate the response functions and kernels used in the model. This improved Monte Carlo bremsstrahlung simulation was validated by comparison to experimentally measured projection data of a 90Y line source. The authors validated the accuracy of the forward projection model for photons in the various categories and energy ranges using the validated Monte Carlo (MC) simulation method. The

  19. QUANTITATION OF MOLECULAR ENDPOINTS FOR THE DOSE-RESPONSE COMPONENT OF CANCER RISK ASSESSMENT

    EPA Science Inventory

    Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment and risk characterization. The rapid advances in the use of molecular biology approaches has had an impact on all four components, but the greatest overall current...

  20. Quantitative Risk-Benefit Analysis of Probiotic Use for Irritable Bowel Syndrome and Inflammatory Bowel Disease.

    PubMed

    Bennett, William E

    2016-04-01

    Probiotics have seen widespread use for a variety of gastrointestinal problems, especially in two common disorders: irritable bowel syndrome and inflammatory bowel disease. Since a wide variety of probiotic preparations has been used, and despite a large number of studies performed, a great deal of heterogeneity exists among them. Straightforward evidence-based recommendations for the use of probiotics in irritable bowel syndrome and inflammatory bowel disease have thus been difficult to formulate. In an effort to improve understanding of the risk-benefit balance of probiotics in these conditions, this study (1) queried the US FDA Adverse Event Reporting System (FAERS) database for all reported adverse drug events related to probiotics in 2013, and (2) constructed risk-benefit planes for both irritable bowel syndrome and inflammatory bowel disease using a geometric approximation of the confidence region between risk and benefit. The results show that adverse events from probiotics vary widely by disease, and when they occur, they are mild and may be difficult to distinguish from the natural history of the underlying disorders they are used to treat. The risk-benefit plane for irritable bowel syndrome straddles the risk-benefit threshold, so patients can expect a balance between a low chance of risk and also a low chance of benefit. The risk-benefit plane for inflammatory bowel disease largely lies above the risk-benefit threshold, so patients may expect more benefit than risk in most cases. More standardized and high-quality research is needed to improve our understanding of risk and benefit for these complex biopharmaceuticals. PMID:26467550

  1. Assessing the risk of impact of farming intensification on calcareous grasslands in Europe: a quantitative implementation of the MIRABEL framework.

    PubMed

    Petit, Sandrine; Elbersen, Berien

    2006-09-01

    Intensification of farming practices is still a major driver of biodiversity loss in Europe, despite the implementation of policies that aim to reverse this trend. A conceptual framework called MIRABEL was previously developed that enabled a qualitative and expert-based assessment of the impact of agricultural intensification on ecologically valuable habitats. We present a quantitative update of the previous assessment that uses newly available pan-European spatially explicit data on pressures and habitats at risk. This quantitative assessment shows that the number of calcareous grasslands potentially at risk of eutrophication and overgrazing is rapidly increasing in Europe. Decreases in nitrogen surpluses and stocking densities that occurred between 1990 and 2000 have rarely led to values that were below the ecological thresholds. At the same time, a substantial proportion of calcareous grassland that has so far experienced low values for indicators of farming intensification has faced increases between 1990 and 2000 and could well become at high risk from farming intensification in the near future. As such, this assessment is an early warning signal, especially for habitats located in areas that have traditionally been farmed extensively. When comparing the outcome of this assessment with the previous qualitative MIRABEL assessment, it appears that if pan-European data are useful to assess the intensity of the pressures, more work is needed to identify regional variations in the response of biodiversity to such pressures. This is where a qualitative approach based on regional expertise should be used to complement data-driven assessments. PMID:17240762

  2. Quantitative Risk Assessment for African Horse Sickness in Live Horses Exported from South Africa

    PubMed Central

    Sergeant, Evan S.

    2016-01-01

    African horse sickness (AHS) is a severe, often fatal, arbovirus infection of horses, transmitted by Culicoides spp. midges. AHS occurs in most of sub-Saharan Africa and is a significant impediment to export of live horses from infected countries, such as South Africa. A stochastic risk model was developed to estimate the probability of exporting an undetected AHS-infected horse through a vector protected pre-export quarantine facility, in accordance with OIE recommendations for trade from an infected country. The model also allows for additional risk management measures, including multiple PCR tests prior to and during pre-export quarantine and optionally during post-arrival quarantine, as well as for comparison of risk associated with exports from a demonstrated low-risk area for AHS and an area where AHS is endemic. If 1 million horses were exported from the low-risk area with no post-arrival quarantine we estimate the median number of infected horses to be 5.4 (95% prediction interval 0.5 to 41). This equates to an annual probability of 0.0016 (95% PI: 0.00015 to 0.012) assuming 300 horses exported per year. An additional PCR test while in vector-protected post-arrival quarantine reduced these probabilities by approximately 12-fold. Probabilities for horses exported from an area where AHS is endemic were approximately 15 to 17 times higher than for horses exported from the low-risk area under comparable scenarios. The probability of undetected AHS infection in horses exported from an infected country can be minimised by appropriate risk management measures. The final choice of risk management measures depends on the level of risk acceptable to the importing country. PMID:26986002
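
    The abstract's conversion from a per-million estimate to an annual probability can be reproduced under one assumed reading (independent exports with a constant per-horse probability):

```python
# A median of 5.4 undetected infected horses per 1,000,000 exported gives a
# per-horse probability p = 5.4e-6; with 300 exports per year, the annual
# probability of at least one undetected infected horse is 1 - (1 - p)^300.
p_per_horse = 5.4 / 1_000_000
annual_prob = 1.0 - (1.0 - p_per_horse) ** 300
```

    For small p this is approximately 300p = 0.00162, matching the reported annual probability of 0.0016.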

  3. Quantitative Risk Assessment for African Horse Sickness in Live Horses Exported from South Africa.

    PubMed

    Sergeant, Evan S; Grewar, John D; Weyer, Camilla T; Guthrie, Alan J

    2016-01-01

    African horse sickness (AHS) is a severe, often fatal, arbovirus infection of horses, transmitted by Culicoides spp. midges. AHS occurs in most of sub-Saharan Africa and is a significant impediment to export of live horses from infected countries, such as South Africa. A stochastic risk model was developed to estimate the probability of exporting an undetected AHS-infected horse through a vector protected pre-export quarantine facility, in accordance with OIE recommendations for trade from an infected country. The model also allows for additional risk management measures, including multiple PCR tests prior to and during pre-export quarantine and optionally during post-arrival quarantine, as well as for comparison of risk associated with exports from a demonstrated low-risk area for AHS and an area where AHS is endemic. If 1 million horses were exported from the low-risk area with no post-arrival quarantine we estimate the median number of infected horses to be 5.4 (95% prediction interval 0.5 to 41). This equates to an annual probability of 0.0016 (95% PI: 0.00015 to 0.012) assuming 300 horses exported per year. An additional PCR test while in vector-protected post-arrival quarantine reduced these probabilities by approximately 12-fold. Probabilities for horses exported from an area where AHS is endemic were approximately 15 to 17 times higher than for horses exported from the low-risk area under comparable scenarios. The probability of undetected AHS infection in horses exported from an infected country can be minimised by appropriate risk management measures. The final choice of risk management measures depends on the level of risk acceptable to the importing country. PMID:26986002

  4. [A quantitative assessment of health risk induced by occupational exposure to inorganic arsenic].

    PubMed

    Szymczak, W

    1997-01-01

    The risk of neoplastic disease, primarily lung cancer, induced by occupational inhalation exposure to inorganic arsenic was assessed. To identify the individual risk in the linear dose-response relationship that would serve as a basis for risk assessment among persons exposed occupationally, the author also analysed the latest epidemiological studies performed in Sweden, as well as re-analyses of American studies. This allowed the estimate of individual risk to be reduced severalfold. In the light of the most up-to-date epidemiological studies, this diminished value of individual risk is considered closer to reality than the value proposed by the Environmental Protection Agency (EPA). With an individual risk related to occupational exposure of 1.79 x 10(-4), the lung cancer risk after forty years of employment at the currently binding MAC value for arsenic (0.05 mg/m3) amounts to 8.95 x 10(-3), thus slightly exceeding the adopted value of 1 x 10(-3). With the new value of 0.01 mg/m3, proposed in 1996 by the Expert Group for Chemical Factors of the International Commission for Updating the List of MAC and MAI Values, the risk for forty years of employment amounts to 1.79 x 10(-3), in fact a value corresponding to that already approved. In addition, the assessment indicated that smoking increases the risk of lung cancer induced by exposure to arsenic by 4-6 times. PMID:9558633
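
    The reported figures are mutually consistent if 1.79 x 10(-4) is read as a 40-year occupational unit risk per ug/m3 and lifetime risk scales linearly with air concentration; the check below makes that assumed reading explicit.

```python
# Assumed reading of the abstract's numbers: 1.79e-4 is a unit risk per ug/m3
# of inorganic arsenic over forty years of employment, and risk is linear in
# concentration (the linear dose-response relationship the author uses).
unit_risk = 1.79e-4            # per ug/m3, 40-year exposure (assumed units)
risk_mac_old = unit_risk * 50  # 0.05 mg/m3 = 50 ug/m3
risk_mac_new = unit_risk * 10  # 0.01 mg/m3 = 10 ug/m3
```

    This reproduces both reported risks: 8.95 x 10(-3) at the old MAC value and 1.79 x 10(-3) at the proposed one.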

  5. Quantitative risk model for polycyclic aromatic hydrocarbon photoinduced toxicity in Pacific herring following the Exxon Valdez oil spill.

    PubMed

    Sellin Jeffries, Marlo K; Claytor, Carrie; Stubblefield, William; Pearson, Walter H; Oris, James T

    2013-05-21

    Phototoxicity occurs when exposure to ultraviolet radiation increases the toxicity of certain contaminants, including polycyclic aromatic hydrocarbons (PAHs). This study aimed to (1) develop a quantitative model to predict the risk of PAH phototoxicity to fish, (2) assess the predictive value of the model, and (3) estimate the risk of PAH phototoxicity to larval and young-of-the-year Pacific herring (Clupea pallasi) following the Exxon Valdez oil spill (EVOS) in Prince William Sound, Alaska. The model, in which median lethal times (LT50 values) are estimated from whole-body phototoxic PAH concentrations and ultraviolet A (UVA) exposure, was constructed from previously reported PAH phototoxicity data. The predictive value of this model was confirmed by the overlap of model-predicted and experimentally derived LT50 values. The model, along with UVA characterization data, was used to generate estimates for depths of de minimis risk for PAH phototoxicity in young herring in 2003/2004 and immediately following the 1989 EVOS, assuming average and worst-case conditions. Depths of de minimis risk were estimated to be between 0 and 2 m when worst-case UVA and PAH conditions were considered. A post hoc assessment determined that <1% of the young herring population would have been present at depths associated with significant risk of PAH phototoxicity in 2003/2004 and 1989. PMID:23600964
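
    The depth-of-de-minimis-risk logic can be illustrated with a reciprocity-style sketch. The functional form and the potency constant below are assumed for illustration only (the paper fits its own model to reported phototoxicity data): time to 50% mortality is taken as inversely proportional to the product of whole-body PAH concentration and UVA irradiance, so LT50 lengthens as UVA is attenuated with depth.

```python
# Reciprocity-style sketch (assumed form, not the paper's fitted model):
# LT50 ~ k / (tissue PAH concentration * UVA irradiance), with k a
# hypothetical potency constant chosen purely for illustration.
def lt50_hours(pah_ng_per_g, uva_mw_per_cm2, k=5000.0):
    return k / (pah_ng_per_g * uva_mw_per_cm2)

shallow = lt50_hours(pah_ng_per_g=20.0, uva_mw_per_cm2=25.0)  # strong UVA near surface
deep = lt50_hours(pah_ng_per_g=20.0, uva_mw_per_cm2=2.5)      # UVA attenuated at depth
```

    Under this form, a tenfold attenuation of UVA with depth lengthens LT50 tenfold at the same tissue burden, which is why risk becomes de minimis below a characteristic depth.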

  6. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms transmitted by the fecal-oral route, zoonotic and prevalent in all socioeconomic segments with greater emphasis in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis of Potam dwellers consuming drinking water from communal well water. To achieve the goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by information collection rule method, (b) assessment of exposure to healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate a mean of annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish an educational and treatment program to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico. PMID:25494486
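
    Step (d) of the QMRA chain, the exponential dose-response model, has the form P_inf = 1 - exp(-r·d); the annual risk then compounds the daily risk over a year of exposure. The sketch below uses an r for Cryptosporidium of the magnitude commonly cited in the QMRA literature and a hypothetical daily dose; neither value is taken from this study.

```python
import math

# Exponential dose-response model: P_inf = 1 - exp(-r * d), where d is the
# mean daily dose of (oo)cysts ingested and r is a pathogen-specific
# parameter. Values are illustrative, not the study's fitted inputs.
r_crypto = 0.0042      # literature-style r for Cryptosporidium (assumed)
dose_per_day = 1.0     # hypothetical mean oocysts ingested per day
p_daily = 1.0 - math.exp(-r_crypto * dose_per_day)
p_annual = 1.0 - (1.0 - p_daily) ** 365
```

    Even a small daily infection probability compounds to a large annual risk, which is how contaminated wells can produce annual risk estimates approaching 1, as reported here.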

  7. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging☆

    PubMed Central

    Oishi, Kenichi; Faria, Andreia V.; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-01-01

    The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a “growth percentile chart,” which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability about the anatomical boundary definition, and hence, to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, have been introduced

  8. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2012-01-01

    caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock fall hazard and risk in Yosemite Valley (Wieczorek et al., 1998, 1999; Guzzetti et al., 2003; Wieczorek et al., 2008), and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls (Evans and Hungr, 1999), up to approximately 100,000 m3 in volume.

  9. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.

    PubMed

    Cox, Louis Anthony Tony

    2015-10-01

    Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example. PMID:26491992
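
    The exploration-exploitation balance the authors invoke can be illustrated with a classic low-regret bandit algorithm (UCB1). The sketch below is generic, not the paper's framework: each arm stands for a candidate policy with an uncertain payoff, and its invented payoff probabilities are selected by balancing the observed mean reward against a confidence bonus, rather than avoiding uncertain options outright.

```python
import math
import random

def ucb1(true_means, rounds=5000, seed=1):
    # UCB1: play each arm once, then pick the arm maximizing
    # mean reward + sqrt(2 ln t / n_i) (exploitation + exploration bonus).
    rng = random.Random(seed)
    n = [0] * len(true_means)          # pulls per arm
    total = [0.0] * len(true_means)    # accumulated reward per arm
    for t in range(1, rounds + 1):
        if t <= len(true_means):
            arm = t - 1
        else:
            arm = max(range(len(true_means)),
                      key=lambda i: total[i] / n[i]
                      + math.sqrt(2 * math.log(t) / n[i]))
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        n[arm] += 1
        total[arm] += reward
    return n

pulls = ucb1([0.3, 0.5, 0.7])          # arm 2 has the best true payoff
```

    The pull counts concentrate on the best arm while still sampling the others enough to bound regret, which is the "low-regret learning" behavior contrasted with learning aversion.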

  10. Evaluating biomarkers to model cancer risk post cosmic ray exposure.

    PubMed

    Sridharan, Deepa M; Asaithamby, Aroumougame; Blattnig, Steve R; Costes, Sylvain V; Doetsch, Paul W; Dynan, William S; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D; Peterson, Leif E; Plante, Ianik; Ponomarev, Artem L; Saha, Janapriya; Snijders, Antoine M; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  11. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  12. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

    The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecular biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly taken every two weeks for real-time tissue elastography (RTE) and quantitative analysis of tissue diffusion. The twelve characteristic quantities obtained included relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse difference moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were sacrificed and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0, S1, S2, S3 and S4 groups). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index and Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis holds promise for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
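
    Several of the characteristic quantities listed (CONT, ENT, IDM, ASM) are standard grey-level co-occurrence texture features. The sketch below shows how they are computed for a single horizontal pixel offset on an idealized uniform image; the RTE software's exact definitions, offsets, and grey-level quantization may differ.

```python
import numpy as np

def glcm_features(img, levels=4):
    # Build a grey-level co-occurrence matrix (GLCM) for horizontal
    # neighbor pairs, then derive four Haralick-style texture features.
    glcm = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            glcm[a, b] += 1
    p = glcm / glcm.sum()                       # normalize to probabilities
    i, j = np.indices(p.shape)
    nz = p[p > 0]
    return {
        "CONT": np.sum(p * (i - j) ** 2),       # contrast
        "ENT": -np.sum(nz * np.log(nz)),        # entropy
        "IDM": np.sum(p / (1 + (i - j) ** 2)),  # inverse difference moment
        "ASM": np.sum(p ** 2),                  # angular second moment
    }

uniform = np.zeros((8, 8), dtype=int)           # perfectly homogeneous "tissue"
feats = glcm_features(uniform)
```

    A perfectly homogeneous image gives zero contrast and entropy and maximal IDM and ASM; heterogeneous fibrotic texture moves each feature away from these extremes.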

  13. Research and Evaluations of the Health Aspects of Disasters, Part VIII: Risk, Risk Reduction, Risk Management, and Capacity Building.

    PubMed

    Birnbaum, Marvin L; Loretti, Alessandro; Daily, Elaine K; O'Rourke, Ann P

    2016-06-01

    There is a cascade of risks associated with a hazard evolving into a disaster that consists of the risk that: (1) a hazard will produce an event; (2) an event will cause structural damage; (3) structural damage will create functional damages and needs; (4) needs will create an emergency (require use of the local response capacity); and (5) the needs will overwhelm the local response capacity and result in a disaster (ie, the need for outside assistance). Each step along the continuum/cascade can be characterized by its probability of occurrence and the probability of possible consequences of its occurrence, and each risk is dependent upon the preceding occurrence in the progression from a hazard to a disaster. Risk-reduction measures are interventions (actions) that can be implemented to: (1) decrease the risk that a hazard will manifest as an event; (2) decrease the amounts of structural and functional damages that will result from the event; and/or (3) increase the ability to cope with the damage and respond to the needs that result from an event. Capacity building increases the level of resilience by augmenting the absorbing and/or buffering and/or response capacities of a community-at-risk. Risks for some hazards vary by the context in which they exist and by the Societal System(s) involved. Birnbaum ML , Loretti A , Daily EK , O'Rourke AP . Research and evaluations of the health aspects of disasters, part VIII: risk, risk reduction, risk management, and capacity building. Prehosp Disaster Med. 2016;31(3):300-308. PMID:27025980
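
    The cascade described above multiplies through as a chain of conditional probabilities: each step's risk is conditioned on the preceding occurrence. A minimal sketch with entirely hypothetical step probabilities (the paper gives none):

```python
# Hazard-to-disaster cascade as chained conditional probabilities.
# All step probabilities below are hypothetical illustrations.
cascade = {
    "event_given_hazard": 0.10,        # hazard produces an event
    "damage_given_event": 0.40,        # event causes structural damage
    "needs_given_damage": 0.50,        # damage creates functional needs
    "emergency_given_needs": 0.60,     # needs engage local response capacity
    "disaster_given_emergency": 0.20,  # needs overwhelm local capacity
}

def cascade_risk(steps):
    """P(disaster | hazard) = product of the conditional step probabilities."""
    p = 1.0
    for prob in steps.values():
        p *= prob
    return p

p_disaster = cascade_risk(cascade)
```

    Risk-reduction measures act by shrinking one or more factors in this product; capacity building lowers the final conditional probability.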

  14. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributed to a reactive process or due to clonal disorder. This distinction is important as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, complications of thrombocytosis with a special focus on thrombotic risk as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  15. Evaluating the Risks: A Bernoulli Process Model of HIV Infection and Risk Reduction.

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Abramson, Paul R.

    1993-01-01

    A Bernoulli process model of human immunodeficiency virus (HIV) is used to evaluate infection risks associated with various sexual behaviors (condom use, abstinence, or monogamy). Results suggest that infection is best mitigated through measures that decrease infectivity, such as condom use. (SLD)
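
    The Bernoulli-process model treats each sexual contact as an independent trial, so cumulative risk is one minus the probability of escaping infection in every contact. A sketch with hypothetical parameter values (the paper's infectivity estimates are not reproduced):

```python
def infection_risk(alpha, n):
    """Bernoulli-process model: probability of at least one transmission
    in n independent contacts, each with per-contact infectivity alpha."""
    return 1.0 - (1.0 - alpha) ** n

# Hypothetical per-contact infectivity; condoms modeled as reducing
# infectivity by a factor of 10 (an assumed efficacy, for illustration)
alpha = 0.001
risk_unprotected = infection_risk(alpha, 500)
risk_condom = infection_risk(alpha * 0.1, 500)
```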

  16. Environmental Risks to Public Health in the United Arab Emirates: A Quantitative Assessment and Strategic Plan

    PubMed Central

    Farah, Zeinab S.

    2012-01-01

    Background: Environmental risks to health in the United Arab Emirates (UAE) have shifted rapidly from infectious to noninfectious diseases as the nation has developed at an unprecedented rate. In response to public concerns over newly emerging environmental risks, the Environment Agency–Abu Dhabi commissioned a multidisciplinary environmental health strategic planning project. Objectives: In order to develop the environmental health strategic plan, we sought to quantify the illnesses and premature deaths in the UAE attributable to 14 environmental pollutant categories, prioritize these 14 risk factors, and identify interventions. Methods: We estimated the disease burden imposed by each risk factor using an attributable fraction approach, and we prioritized the risks using an empirically tested stakeholder engagement process. We then engaged government personnel, scientists, and other stakeholders to identify interventions. Results: The UAE’s environmental disease burden is low by global standards. Ambient air pollution is the leading contributor to premature mortality [~ 650 annual deaths; 95% confidence interval (CI): 140, 1,400]. Risk factors leading to > 10,000 annual health care facility visits included occupational exposures, indoor air pollution, drinking water contamination, seafood contamination, and ambient air pollution. Among the 14 risks considered, on average, outdoor air pollution was ranked by the stakeholders as the highest priority (mean rank, 1.4; interquartile range, 1–2) and indoor air pollution as the second-highest priority (mean rank 3.3; interquartile range, 2–4). The resulting strategic plan identified 216 potential interventions for reducing environmental risks to health. Conclusions: The strategic planning exercise described here provides a framework for systematically deciding how to invest public funds to maximize expected returns in environmental health, where returns are measured in terms of reductions in a population
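
    The attributable-fraction approach mentioned in the Methods can be sketched with Levin's formula, which converts an exposure prevalence and a relative risk into the share of a disease burden attributable to that risk factor. All numbers below are hypothetical, not the UAE estimates:

```python
def population_attributable_fraction(p_exposed, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical inputs: 30% of the population exposed, RR = 1.5
paf = population_attributable_fraction(0.30, 1.5)
# Attributable burden = PAF x total burden (hypothetical 5,000 deaths)
attributable_deaths = paf * 5000
```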

  17. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyze results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions answered correctly on average) compared with the baseline assessment (38% of questions answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program was successful in terms of participant satisfaction and increased community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships. PMID:22982849

  18. Quantitative risk assessment of human salmonellosis and listeriosis related to the consumption of raw milk in Italy.

    PubMed

    Giacometti, Federica; Bonilauri, Paolo; Albonetti, Sabrina; Amatiste, Simonetta; Arrigoni, Norma; Bianchi, Manila; Bertasi, Barbara; Bilei, Stefano; Bolzoni, Giuseppe; Cascone, Giuseppe; Comin, Damiano; Daminelli, Paolo; Decastelli, Lucia; Merialdi, Giuseppe; Mioni, Renzo; Peli, Angelo; Petruzzelli, Annalisa; Tonucci, Franco; Bonerba, Elisabetta; Serraino, Andrea

    2015-01-01

    Two quantitative risk assessment (RA) models were developed to describe the risk of salmonellosis and listeriosis linked to consumption of raw milk sold in vending machines in Italy. Exposure assessment considered the official microbiological records monitoring raw milk samples from vending machines performed by the regional veterinary authorities from 2008 to 2011, microbial growth during storage, destruction experiments, consumption frequency of raw milk, serving size, and consumption preference. Two separate RA models were developed: one for the consumption of boiled milk and the other for the consumption of raw milk. The RA models predicted no human listeriosis cases per year either in the best or worst storage conditions and with or without boiling raw milk, whereas the annual estimated cases of salmonellosis depend on the dose-response relationships used in the model, the milk storage conditions, and consumer behavior in relation to boiling raw milk or not. For example, the estimated salmonellosis cases ranged from no expected cases, assuming that the entire population boiled milk before consumption, to a maximum of 980,128 cases, assuming that the entire population drank raw milk without boiling, in the worst milk storage conditions, and with the lowest dose-response model. The findings of this study clearly show how consumer behavior could affect the probability and number of salmonellosis cases and in general, the risk of illness. Hence, the proposed RA models emphasize yet again that boiling milk before drinking is a simple yet effective tool to protect consumers against the risk of illness inherent in the consumption of raw milk. The models may also offer risk managers a useful tool to identify or implement appropriate measures to control the risk of acquiring foodborne pathogens. Quantification of the risks associated with raw milk consumption is necessary from a public health perspective. PMID:25581173
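
    The structure of such a model (exposure assessment feeding a dose-response relationship) can be sketched as expected cases = exposed servings x per-serving illness probability. The exponential dose-response form and every parameter value below are illustrative assumptions, not the paper's fitted models:

```python
import math

def p_ill_exponential(dose_cfu, r):
    """Exponential dose-response: P(ill | dose) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_cfu)

def expected_annual_cases(population, servings_per_year, dose_cfu, r,
                          fraction_drinking_raw):
    """Expected cases = exposed servings x per-serving illness risk."""
    exposed_servings = population * fraction_drinking_raw * servings_per_year
    return exposed_servings * p_ill_exponential(dose_cfu, r)

# Boiling drives the ingested dose to ~0, so predicted cases fall to 0
cases_raw = expected_annual_cases(1_000_000, 100, dose_cfu=50, r=1e-5,
                                  fraction_drinking_raw=1.0)
cases_boiled = expected_annual_cases(1_000_000, 100, dose_cfu=0, r=1e-5,
                                     fraction_drinking_raw=1.0)
```

    This mirrors the paper's qualitative finding: the boiled-milk scenario predicts no cases, while the raw-milk scenario's case count scales with storage-driven dose and consumer behavior.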

  19. Quantitative estimation of landslide risk from rapid debris slides on natural slopes in the Nilgiri hills, India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2011-06-01

    A quantitative procedure for estimating landslide risk to life and property is presented and applied in a mountainous area in the Nilgiri hills of southern India. Risk is estimated for elements at risk located in both initiation zones and run-out paths of potential landslides. Loss of life is expressed as individual risk and as societal risk using F-N curves, whereas the direct loss of properties is expressed in monetary terms. An inventory of 1084 landslides was prepared from historical records available for the period between 1987 and 2009. A substantially complete inventory was obtained for landslides on cut slopes (1042 landslides), while for natural slopes information on only 42 landslides was available. Most landslides were shallow translational debris slides and debris flowslides triggered by rainfall. On natural slopes most landslides occurred as first-time failures. For landslide hazard assessment the following information was derived: (1) landslides on natural slopes grouped into three landslide magnitude classes, based on landslide volumes, (2) the number of future landslides on natural slopes, obtained by establishing a relationship between the number of landslides on natural slopes and cut slopes for different return periods using a Gumbel distribution model, (3) landslide susceptible zones, obtained using a logistic regression model, and (4) distribution of landslides in the susceptible zones, obtained from the model fitting performance (success rate curve). The run-out distance of landslides was assessed empirically using landslide volumes, and the vulnerability of elements at risk was subjectively assessed based on limited historic incidents. Direct specific risk was estimated individually for tea/coffee and horticulture plantations, transport infrastructures, buildings, and people both in initiation and run-out areas. Risks were calculated by considering the minimum, average, and maximum landslide volumes in each magnitude class and the
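
    Step (2) above uses a Gumbel distribution to relate landslide frequency to return period. A method-of-moments sketch on hypothetical annual counts (the paper's inventory data are not reproduced):

```python
import math

def gumbel_fit(annual_counts):
    """Method-of-moments Gumbel fit: returns location u and scale alpha."""
    n = len(annual_counts)
    mean = sum(annual_counts) / n
    var = sum((x - mean) ** 2 for x in annual_counts) / (n - 1)
    alpha = math.sqrt(6.0 * var) / math.pi
    u = mean - 0.5772 * alpha  # 0.5772 ~ Euler-Mascheroni constant
    return u, alpha

def expected_count(u, alpha, return_period_years):
    """Event magnitude for return period T:
    x_T = u + alpha * (-ln(-ln(1 - 1/T)))."""
    T = return_period_years
    y = -math.log(-math.log(1.0 - 1.0 / T))
    return u + alpha * y

# Hypothetical yearly landslide counts
counts = [30, 45, 28, 60, 52, 39, 47, 55, 41, 36]
u, alpha = gumbel_fit(counts)
n_50yr = expected_count(u, alpha, 50)  # 50-year return-period count
```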

  20. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis

    PubMed Central

    Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-01-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than those in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878
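
    The cutoff selection described (maximizing discrimination between proven cases and controls on a ROC curve) can be sketched by scanning candidate cutoffs for the best Youden index. The IgG values below are hypothetical, not the study's data:

```python
def sens_spec(values_pos, values_neg, cutoff):
    """Sensitivity/specificity of a 'level >= cutoff' decision rule."""
    tp = sum(v >= cutoff for v in values_pos)
    tn = sum(v < cutoff for v in values_neg)
    return tp / len(values_pos), tn / len(values_neg)

def best_cutoff(values_pos, values_neg):
    """Cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    candidates = sorted(set(values_pos) | set(values_neg))
    return max(candidates,
               key=lambda c: sum(sens_spec(values_pos, values_neg, c)) - 1)

# Hypothetical Aspergillus IgG levels (mg/liter) for cases and controls
cases = [120, 85, 60, 55, 48, 200, 95, 70]
controls = [10, 25, 30, 45, 20, 15, 35, 40]
cut = best_cutoff(cases, controls)
```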

  1. Performance evaluation of quantitative adiabatic (13)C NMR pulse sequences for site-specific isotopic measurements.

    PubMed

    Thibaudeau, Christophe; Remaud, Gérald; Silvestre, Virginie; Akoka, Serge

    2010-07-01

    (2)H/(1)H and (13)C/(12)C site-specific isotope ratios determined by NMR spectroscopy may be used to discriminate pharmaceutically active ingredients based on the synthetic process used in production. Extending the Site-specific Natural Isotope Fractionation NMR (SNIF-NMR) method to (13)C is highly beneficial for complex organic molecules when measurements of (2)H/(1)H ratios lead to poorly defined molecular fingerprints. The current NMR methodology to determine (13)C/(12)C site-specific isotope ratios suffers from poor sensitivity and long experimental times. In this work, several NMR pulse sequences based on polarization transfer were evaluated and optimized to measure precise quantitative (13)C NMR spectra within a short time. Adiabatic 180 degrees (1)H and (13)C pulses were incorporated into distortionless enhancement by polarization transfer (DEPT) and refocused insensitive nuclei enhanced by polarization transfer (INEPT) to minimize the influence of 180 degrees pulse imperfections and of off-resonance effects on the precision of the measured (13)C peak areas. The adiabatic DEPT sequence was applied to draw up a precise site-specific (13)C isotope profile of ibuprofen. A modified heteronuclear cross-polarization (HCP) experiment featuring (1)H and (13)C spin-locks with adiabatic 180 degrees pulses is also introduced. This sequence enables efficient magnetization transfer across a wide (13)C frequency range although not enough for an application in quantitative (13)C isotopic analysis. PMID:20527737

  2. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has significant placement. This study aimed at quantitatively and qualitatively evaluating Iranian researchers’ scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, which were extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  3. A method for the quantitative evaluation of SAR distribution in deep regional hyperthermia.

    PubMed

    Baroni, C; Giri, M G; Meliadó, G; Maluta, S; Chierego, G

    2001-01-01

    Visualization of the Specific Absorption Rate (SAR) distribution pattern by a matrix of E-field light-emitting sensors has been demonstrated to be a useful tool for evaluating the characteristics of the applicators used in deep regional hyperthermia and for performing a quality assurance programme. A method to quantify the SAR from photographs of the sensor array--the so-called 'Power Stepping Technique'--has already been proposed. This paper presents a new approach to the quantitative determination of the SAR profiles in a liquid phantom exposed to electromagnetic fields from the Sigma-60 applicator (BSD-2000 system for deep regional hyperthermia). The method is based on the construction of a 'calibration curve' modelling the light-output of an E-field sensor as a function of the supplied voltage, and on the use of a reference light source to 'normalize' the light-output readings from the photos of the sensor array, in order to minimize the errors introduced by the non-uniformity of the photographic process. Once the calibration curve is obtained, it is possible, with only one photo, to obtain the quantitative SAR distribution in the operating conditions. For this reason, this method is suitable for equipment characterization and also for controlling the repeatability of power deposition over time. PMID:11587076
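
    The two ingredients of the method (a calibration curve mapping light output back to voltage, and reference-source normalization of photo readings) can be sketched as follows. The curve shape and all values are hypothetical; the paper builds its curve from measured sensor data:

```python
def make_calibration(voltages, light_outputs):
    """Piecewise-linear calibration curve: light output -> supplied voltage."""
    pts = sorted(zip(light_outputs, voltages))
    def to_voltage(light):
        for (l0, v0), (l1, v1) in zip(pts, pts[1:]):
            if l0 <= light <= l1:
                return v0 + (v1 - v0) * (light - l0) / (l1 - l0)
        raise ValueError("light output outside calibrated range")
    return to_voltage

def normalize(readings, reference_reading, reference_expected):
    """Scale photo readings by a reference light source to cancel
    non-uniformity of the photographic process."""
    scale = reference_expected / reference_reading
    return [r * scale for r in readings]

# Hypothetical calibration points (voltage, measured light output)
cal = make_calibration(voltages=[10, 20, 30], light_outputs=[100, 400, 900])
```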

  4. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565

  6. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but few have truly focused on the quantitative performance of the technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, UHPSFC quantitative performance was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization shows a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results in a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349
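
    The total-error idea behind an accuracy profile can be sketched at a single concentration level: the interval (relative bias plus/minus a coverage factor times the relative SD) must stay within the acceptance limits. This is a simplified stand-in; real accuracy profiles use beta-expectation tolerance intervals, and all values below are hypothetical:

```python
import statistics

def accuracy_profile_point(measured, nominal, k=2.0, acceptance=0.15):
    """Simplified total-error check at one concentration level.
    Returns (relative bias, (low, high) interval, within-limits flag)."""
    rel_errors = [(m - nominal) / nominal for m in measured]
    bias = statistics.mean(rel_errors)
    sd = statistics.stdev(rel_errors)
    low, high = bias - k * sd, bias + k * sd
    return bias, (low, high), (-acceptance < low and high < acceptance)

# Hypothetical replicate determinations at a nominal level of 100
bias, interval, valid = accuracy_profile_point([98, 101, 99, 102, 100], 100)
```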

  7. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis.

    PubMed

    Fujiuchi, Satoru; Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-06-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than those in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878

  8. A front-of-pack nutrition logo: a quantitative and qualitative process evaluation in the Netherlands.

    PubMed

    Vyth, Ellis L; Steenhuis, Ingrid H M; Mallant, Sanne F; Mol, Zinzi L; Brug, Johannes; Temminghoff, Marcel; Feunekes, Gerda I; Jansen, Leon; Verhagen, Hans; Seidell, Jacob C

    2009-01-01

    This study aimed to perform a quantitative and qualitative process evaluation of the introduction of the Choices logo, a front-of-pack nutrition logo on products with a favorable product composition, adopted by many food producers, retail and food service organizations, conditionally endorsed by the Dutch government, validated by scientists, and in the process of international dissemination. An online questionnaire was sent to adult consumers 4 months after the introduction of the logo (n = 1,032) and 1 year later (n = 1,127). Additionally, seven consumer focus groups (n = 41) were conducted to provide more insight into the questionnaire responses. Quantitative analyses showed that exposure to the logo had significantly increased. Elderly and obese respondents reported a greater need for a logo than did younger and normal-weight individuals. Women perceived the logo as more attractive and credible than men did. Further qualitative analyses indicated that the logo's credibility would improve if it became known that governmental and scientific authorities support it. Elderly respondents indicated that they needed a logo due to health concerns. Consumers interested in health reported that they used the logo. Further research focusing on specific target groups, the formation of healthful diets, and health outcomes is needed to investigate the effectiveness of the Choices logo. PMID:19851915

  9. 78 FR 30313 - Standardizing and Evaluating Risk Evaluation and Mitigation Strategies; Notice of Public Meeting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-22

    ... standardization and assessment of risk evaluation and mitigation strategies (REMS) for drug and biological... standardization and evaluation, FDA will hold a public meeting to give stakeholders, including health care... number for the public meeting as follows: ``Docket No. FDA-2013-N-0502, ``Standardization and...

  10. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or 50 mg (5+25 and 5+50) scored the highest clinical utility, supporting phase III development of solifenacin/mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. PMID:26422298
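
    At its core, an MCDA clinical-utility score is a weighted sum of normalized attribute scores. A minimal sketch: the attribute names, weights, and scores below are hypothetical illustrations, not the study's model or results:

```python
# Hypothetical attribute weights (must sum to 1)
weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

def clinical_utility(scores, weights):
    """Overall utility = sum of w_i * s_i over attributes,
    with scores normalized to [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in weights)

# Hypothetical normalized scores for three treatment options
options = {
    "solifenacin 5 mg":              {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.85},
    "mirabegron 50 mg":              {"efficacy": 0.60, "safety": 0.85, "tolerability": 0.80},
    "solifenacin 5 + mirabegron 50": {"efficacy": 0.80, "safety": 0.75, "tolerability": 0.75},
}
ranked = sorted(options, key=lambda o: clinical_utility(options[o], weights),
                reverse=True)
```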

  11. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era

    SciTech Connect

    Chiu, Weihsueh A.; Euling, Susan Y.; Scott, Cheryl Siegel; Subramaniam, Ravi P.

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA) — i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on “augmentation” of weight of evidence — using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards “integration” of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for “expansion” of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual “reorientation” of QRA towards approaches that more directly link environmental exposures to human outcomes.

  12. [Quantitative evaluation of film-screen combinations for x-ray diagnosis].

    PubMed

    Bronder, T; Heinze-Assmann, R

    1988-05-01

    The properties of screen/film combinations for radiographs set a lower limit for the x-ray exposure of the patient and an upper limit for the quality of the x-ray picture. Sensitivity, slope and resolution of different screen/film combinations were determined using a measuring phantom which was developed in the PTB. For all screens used the measurements show the same relation between screen sensitivity and resolution. This allows quantitative evaluation of image quality. A classification scheme derived from these results facilitates the selection of screen/film combinations for practical use. In addition for quality assurance gross differences in material properties and conditions of film development can be detected with the aid of the measuring phantom. PMID:3399512

  13. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated in a precisely targeted manner with single 100 MeV oxygen ions at the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins that are involved directly or indirectly in repair processes accumulate as clusters (so-called foci) at the DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic image data in order to analyze the movement of the foci and their changing size.

  14. Rapid Risk-Based Evaluation of Competing Conceptual Designs

    SciTech Connect

    Bott, T.F.; Butner, J.M.

    1999-08-22

    In this paper, the authors have shown how a qualitative analysis can provide good input to a risk-reduction design problem. Traditional qualitative analyses such as FMEA can be supplemented by qualitative fault trees and event trees to produce logic models of the accident sequences for the different design options. These models can be compared using rule-based manipulations of qualitative branch-point probabilities. A qualitative evaluation of other considerations, such as collateral safety effects, operational impacts, and worker-safety impacts, can provide a more complete picture of the trade-offs between options. The authors believe that their risk-reduction analysis approach, which combines logic models with qualitative and possibilistic metrics, provides an excellent tool for incorporating safety concerns rapidly and effectively into a conceptual design evaluation.

  15. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs. PMID:26076424

  16. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is needed by both the endoscope industry and medical device regulatory agencies; however, no such method is yet available. While image-correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which makes them difficult to understand. Commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion and, based on it, ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, it can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.

  17. Technology efficacy in active prosthetic knees for transfemoral amputees: a quantitative evaluation.

    PubMed

    El-Sayed, Amr M; Hamzaid, Nur Azah; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combines both insights, that is, a technical examination of the components used with an evaluation of how these improved the gait of their users. The study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. It systematically discusses the current technology in active transfemoral prostheses with respect to functional walking performance among above-knee amputee users, in order to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage deviation of amputees' gait from normal gait profile points. The results identified particular components that contributed closest-to-normal gait parameters. However, the generalizability of the conclusions is limited by the small number of studies; more clinical validation of active prosthetic knee technology is needed to better understand the contribution of each component to the most functional development. PMID:25110727

  19. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with a very short acquisition time and good reliability in volume construction for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD), and to compare it with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transversal T1-weighted spin-echo sequences (0.5 T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through their electronic segmentation with a reference filter, total volume and adipose tissue volumes were calculated manually, a process that took up to 2 h per patient. The MR data were compared with anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with healthy reference controls. Mean fat mass calculated by MR is significantly higher than mean fat mass obtained using anthropometric measurements (p < 0.001). Our MR protocol proved accurate and easy to apply, although it was time-consuming. We recommend it for monitoring the progression of the disease and planning DMD patients' diets. PMID:12386760

  20. Quantitative analysis of topoisomerase IIα to rapidly evaluate cell proliferation in brain tumors

    SciTech Connect

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A. . E-mail: jat@kuhp.kyoto-u.ac.jp

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for the evaluation of tumor malignancy. Topoisomerase IIα (topo IIα), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIα mRNA for rapidly evaluating cell proliferation in brain tumors. A protocol to quantify topo IIα mRNA was developed with real-time RT-PCR; quantification from a specimen took only 3 h. A total of 28 brain tumors were analyzed, and the level of topo IIα mRNA was significantly correlated with its immunostaining index (p < 0.0001, r = 0.9077). Furthermore, the assay sharply detected the decrease of topo IIα mRNA in growth-inhibited glioma cells. These results support topo IIα mRNA as a good and rapid indicator for evaluating cell proliferative potential in brain tumors.

  1. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates drilled cavities quantitatively, in real time, and, to the best of our knowledge, for the first time during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main goal is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities were used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations were made of the safe RDT and of the critical RDT, at which the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the right therapy to follow, endodontic treatment or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779

  2. Quantitative evaluation of automatic methods for lesions detection in breast ultrasound images

    NASA Astrophysics Data System (ADS)

    Marcomini, Karem D.; Schiabel, Homero; Carneiro, Antonio Adilton O.

    2013-02-01

    Ultrasound (US) is a useful diagnostic tool to distinguish benign from malignant breast masses, providing more detailed evaluation in dense breasts. Because of the subjectivity of image interpretation, computer-aided diagnosis (CAD) schemes have been developed, extending the mammography analysis process to include ultrasound images as complementary exams. As one of the most important tasks in the evaluation of such images is mass detection and the interpretation of mass contours, automated segmentation techniques have been investigated in order to determine a suitable procedure for performing this analysis. Thus, the main goal of this work is to investigate the effect of several processing techniques used to provide information on the detection of suspicious breast lesions as well as their accurate boundaries in ultrasound images. In tests, 80 phantom and 50 clinical ultrasound images were preprocessed and 5 segmentation techniques were tested. Using quantitative evaluation metrics, the results were compared to a reference image delineated by an experienced radiologist. A self-organizing map artificial neural network provided the most relevant results, demonstrating high accuracy and a low error rate in the representation of the lesions, and hence corresponds to the segmentation process for US images in our CAD scheme under test.

  3. Quantitative Ultrasonic Evaluation of Radiation-Induced Late Tissue Toxicity: Pilot Study of Breast Cancer Radiotherapy

    SciTech Connect

    Liu Tian; Zhou Jun; Yoshida, Emi J.; Woodhouse, Shermian A.; Schiff, Peter B.; Wang, Tony J.C.; Lu Zhengfeng; Pile-Spellman, Eliza; Zhang Pengpeng; Kutcher, Gerald J.

    2010-11-01

    Purpose: To investigate the use of advanced ultrasonic imaging to quantitatively evaluate normal-tissue toxicity in breast-cancer radiation treatment. Methods and Materials: Eighteen breast cancer patients who received radiation treatment were enrolled in an institutional review board-approved clinical study. Radiotherapy involved a radiation dose of 50.0 to 50.4 Gy delivered to the entire breast, followed by an electron boost of 10.0 to 16.0 Gy delivered to the tumor bed. Patients underwent scanning with ultrasound during follow-up, which ranged from 6 to 94 months (median, 22 months) postradiotherapy. Conventional ultrasound images and radio-frequency (RF) echo signals were acquired from treated and untreated breasts. Three ultrasound parameters, namely, skin thickness, Pearson coefficient, and spectral midband fit, were computed from RF signals to measure radiation-induced changes in dermis, hypodermis, and subcutaneous tissue, respectively. Ultrasound parameter values of the treated breast were compared with those of the untreated breast. Ultrasound findings were compared with clinical assessment using Radiation Therapy Oncology Group (RTOG) late-toxicity scores. Results: Significant changes were observed in ultrasonic parameter values of the treated vs. untreated breasts. Average skin thickness increased by 27.3%, from 2.05 ± 0.22 mm to 2.61 ± 0.52 mm; Pearson coefficient decreased by 31.7%, from 0.41 ± 0.07 to 0.28 ± 0.05; and midband fit increased by 94.6%, from -0.92 ± 7.35 dB to 0.87 ± 6.70 dB. Ultrasound evaluations were consistent with RTOG scores. Conclusions: Quantitative ultrasound provides a noninvasive, objective means of assessing radiation-induced changes to the skin and subcutaneous tissue. This imaging tool will become increasingly valuable as we continue to improve radiation therapy technique.

  4. Quantitative evaluation of interaction force between functional groups in protein and polymer brush surfaces.

    PubMed

    Sakata, Sho; Inoue, Yuuki; Ishihara, Kazuhiko

    2014-03-18

    To understand the interactions between polymer surfaces and the different functional groups in proteins, interaction forces were quantitatively evaluated by force-versus-distance curve measurements using atomic force microscopy with functional-group-functionalized cantilevers. Various polymer brush surfaces were systematically prepared by surface-initiated atom transfer radical polymerization as well-defined model surfaces for understanding protein adsorption behavior. The polymer brush layers contained phosphorylcholine groups (zwitterionic/hydrophilic), trimethylammonium groups (cationic/hydrophilic), sulfonate groups (anionic/hydrophilic), hydroxyl groups (nonionic/hydrophilic), or n-butyl groups (nonionic/hydrophobic) in their side chains. The interaction forces between these polymer brush surfaces and carboxyl, amino, and methyl groups, typical functional groups in proteins, were measured in the same way. Furthermore, the amount of protein adsorbed on the polymer brush surfaces was quantified by surface plasmon resonance, using albumin, which carries a negative net charge, and lysozyme, which carries a positive net charge, under physiological conditions. The amount of adsorbed protein corresponded to the interaction forces generated between the functional groups on the cantilever and the polymer brush surfaces. The weakest interaction force and the least protein adsorption were observed on the polymer brush surface with phosphorylcholine groups in the side chain. In contrast, the positively and negatively charged surfaces generated strong forces against the oppositely charged functional groups and showed significant adsorption of albumin and lysozyme, respectively.
These results indicated that the interaction force at the functional group level might be

  5. Quantitative evaluation of his-tag purification and immunoprecipitation of tristetraprolin and its mutant proteins from transfected human cells

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Histidine (His)-tag is widely used for affinity purification of recombinant proteins, but the yield and purity of expressed proteins are quite different. Little information is available about quantitative evaluation of this procedure. The objective of the current study was to evaluate the His-tag pr...

  6. Quantitative evaluation of susceptibility effects caused by dental materials in head magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Strocchi, S.; Ghielmi, M.; Basilico, F.; Macchi, A.; Novario, R.; Ferretti, R.; Binaghi, E.

    2016-03-01

    This work quantitatively evaluates the effects induced by the susceptibility characteristics of materials commonly used in dental practice on the quality of head MR images from a clinical 1.5 T device. The proposed evaluation procedure measures the image artifacts induced by susceptibility in MR images, providing an index consistent with the global degradation as perceived by experts. Susceptibility artifacts were evaluated in a near-clinical setup, using a phantom with susceptibility and geometric characteristics similar to those of a human head. We tested different dental materials (PAL Keramit, Ti6Al4V-ELI, Keramit NP, ILOR F, and Zirconia) and different clinical MR acquisition sequences, including "classical" SE as well as fast, gradient, and diffusion sequences. The evaluation is designed as a matching process between reference and artifact-affected images recording the same scene; the extent of the degradation induced by susceptibility is then measured in terms of similarity with the corresponding reference image. The matching process involves a multimodal registration task and the use of an adequate, psychophysically validated similarity index based on the correlation coefficient. The proposed analyses are integrated within a computer-supported procedure that interactively guides users through the different phases of the evaluation method. Two-dimensional and three-dimensional indexes were computed for each material and each acquisition sequence, and from these a ranking of the materials was drawn by averaging the results obtained. Zirconia and ILOR F appear to be the best choices from the susceptibility-artifact point of view, followed, in order, by PAL Keramit, Ti6Al4V-ELI, and Keramit NP.

  7. Field evaluation of an avian risk assessment model

    USGS Publications Warehouse

    Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.

    2006-01-01

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential for adverse effects to birds in the field. We tested technical-grade diazinon and its DZN 50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in underestimation of the risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.
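
    The deterministic screen discussed above reduces to a risk quotient: an estimated environmental concentration (EEC) in the diet divided by a toxicity endpoint such as the subacute dietary LC50, compared against a regulatory level of concern. A minimal sketch, with illustrative numbers only (not values from the study):

```python
def risk_quotient(eec, lc50):
    """Deterministic screen: RQ = exposure estimate / toxicity endpoint.

    eec:  estimated environmental concentration in the diet (mg a.i./kg diet)
    lc50: subacute dietary LC50 for the test species (mg a.i./kg diet)
    """
    return eec / lc50

# Hypothetical numbers for illustration; US EPA's acute level of concern
# (LOC) for birds is commonly taken as 0.5.
rq = risk_quotient(eec=120.0, lc50=300.0)   # RQ = 0.4
exceeds_loc = rq >= 0.5                     # screens out, yet field results
                                            # (as above) may still show effects
```

    The study's point is that a laboratory-derived LC50 can understate field toxicity, so an RQ below the level of concern is not a guarantee of safety.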

  8. Evaluation of potential risks from ash disposal site leachate

    SciTech Connect

    Mills, W.B.; Loh, J.Y.; Bate, M.C.; Johnson, K.M.

    1999-04-01

    A risk-based approach is used to evaluate potential human health risks associated with a discharge from an ash disposal site into a small stream. The RIVRISK model was used to estimate downstream concentrations and the corresponding risks. The modeling and risk analyses focus on boron, the constituent of greatest potential concern to public health at the site investigated, in Riddle Run, Pennsylvania. Prior to performing the risk assessment, the model was validated by comparing observed and predicted results; the agreement is good, and an uncertainty analysis is provided to explain it. The hazard quotient (HQ) for boron is predicted to be greater than 1 at presently regulated compliance points over a range of flow rates. The reference dose (RfD) currently recommended by the United States Environmental Protection Agency (US EPA) was used for the analyses. However, the toxicity of boron as expressed by the RfD is now under review by both the US EPA and the World Health Organization; the alternative reference doses being examined would produce predicted boron hazard quotients of less than 1 under nearly all flow conditions.
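
    The screening arithmetic behind the abstract's conclusion is simple: a hazard quotient divides the chronic daily intake implied by the downstream concentration by the reference dose, with HQ > 1 flagging potential concern. A minimal sketch with hypothetical inputs (the concentration, intake rate, body weight, and RfD below are illustrative, not taken from the study):

```python
def hazard_quotient(conc_mg_per_l, intake_l_per_day, body_weight_kg,
                    rfd_mg_per_kg_day):
    """HQ = chronic daily intake / reference dose; HQ > 1 flags concern."""
    cdi = conc_mg_per_l * intake_l_per_day / body_weight_kg  # mg/kg-day
    return cdi / rfd_mg_per_kg_day

# Hypothetical exposure: 2 mg/L boron, 2 L/day ingestion, 70 kg adult,
# with an assumed RfD of 0.09 mg/kg-day (a historical US EPA value).
hq = hazard_quotient(2.0, 2.0, 70.0, 0.09)
```

    Because the RfD sits in the denominator, a higher alternative RfD lowers the computed HQ for the same exposure, which is why the RfD review described above drives whether the compliance points pass or fail.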

  9. Quantitative risk assessment of CO2 transport by pipelines--a review of uncertainties and their impacts.

    PubMed

    Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André

    2010-05-15

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. The sources of uncertainty assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release, and the dose-effect relationship for CO2. A sensitivity analysis over these parameters is performed using release, dispersion, and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines: in this study, the individual risk contour varies between 0 and 204 m from the pipeline depending on the assumptions made, while in existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship, and development of a good-practice guide for QRAs for CO2 pipelines. PMID:20022693
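
    The kind of sensitivity analysis described above can be illustrated with a one-at-a-time (OAT) screen: vary each uncertain QRA input across its assessed range while holding the others at baseline, and record the span of the resulting risk metric. The surrogate model and parameter ranges below are placeholders for illustration, not a validated release/dispersion model:

```python
def risk_contour_m(pressure_bar, orifice_mm, vapour_fraction):
    """Stand-in for 'distance to the individual risk contour' that a chained
    release/dispersion/impact model would produce (illustrative only)."""
    return 0.01 * pressure_bar * orifice_mm * (1.0 + vapour_fraction)

baseline = {"pressure_bar": 100.0, "orifice_mm": 50.0, "vapour_fraction": 0.5}
ranges = {
    "pressure_bar": (10.0, 200.0),
    "orifice_mm": (5.0, 500.0),      # pinhole leak up to full-bore rupture
    "vapour_fraction": (0.0, 1.0),
}

spans = {}
for name, (lo, hi) in ranges.items():
    # Evaluate the model at both ends of this parameter's range,
    # holding every other parameter at its baseline value.
    outputs = [risk_contour_m(**dict(baseline, **{name: v})) for v in (lo, hi)]
    spans[name] = max(outputs) - min(outputs)
# Ranking `spans` shows which input uncertainties dominate the assessed risk.
```

    Ranking the spans is what lets a review like this one single out the parameters (here, orifice size) whose uncertainty most needs to be reduced.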

  10. Quantitative risk assessment to compare the risk of rabies entering the UK from Turkey via quarantine, the Pet Travel Scheme and the EU Pet Movement Policy.

    PubMed

    Ramnial, V; Kosmider, R; Aylan, O; Freuling, C; Müller, T; Fooks, A R

    2010-08-01

    Rabies was eradicated from the UK in 1922 through strict controls on dog movement and investigation of every incident of disease. Amendments were made to the UK quarantine laws, and the Pet Travel Scheme (PETS) was subsequently introduced in 2000 for animals entering the UK from qualifying listed countries. European Regulation 998/2003 on the non-commercial movement of pet animals initiated the European Union Pet Movement Policy (EUPMP) in July 2004. The introduction of the EUPMP harmonized the movement of pet animals within the EU (EUPMP(listed)) but raised the possibility of domestic animals entering the UK from a non-EU state where rabies is endemic (EUPMP(unlisted)). A quantitative risk assessment was developed to estimate the risk of rabies entering the UK from Turkey via companion animals that are incubating the disease and enter through PETS or the EUPMP, compared to quarantine. Specifically, the risk was assessed by estimating, for each scheme, the annual probability of rabies entering the UK and the number of years between rabies entries. The model identified that the probability of rabies entering the UK via the three schemes is highly dependent on compliance. If 100% compliance is assumed, PETS and EUPMP(unlisted) (at the current level of importation) present a lower risk than quarantine, i.e., the number of years between rabies entries is more than 170,721 years for PETS and 60,163 years for EUPMP(unlisted), compared to 41,851 years for quarantine (with 95% certainty). If less than 100% compliance is assumed, PETS and EUPMP(unlisted) (at the current level of importation) present a higher risk. In addition, EUPMP(listed) and EUPMP(unlisted) (at an increased level of importation) present a higher risk than quarantine or PETS, both at 100% compliance and at an uncertain level of compliance. PMID:20018127
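
    The headline "years between entries" figures above are simply the reciprocal of the assessed annual entry probability: assuming independent years, an annual probability p implies an expected waiting time of 1/p years between introductions. A small sketch of that conversion (the probabilities below are illustrative, not the model's outputs):

```python
def years_between_entries(annual_entry_probability):
    """Expected waiting time between introductions, assuming independent
    years: E[T] = 1 / p (the mean of a geometric distribution)."""
    return 1.0 / annual_entry_probability

def prob_entry_within(horizon_years, annual_entry_probability):
    """Probability of at least one introduction over the given horizon."""
    return 1.0 - (1.0 - annual_entry_probability) ** horizon_years

# e.g. an assessed annual probability of 1/170721 corresponds to an
# expected 170,721 years between entries, yet still a small nonzero
# chance of an introduction over a 10-year policy horizon.
p = 1.0 / 170721.0
risk_10yr = prob_entry_within(10, p)
```

    Framing the output as a waiting time makes very small annual probabilities easier to compare across the three schemes.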

  11. Quantitative evaluation of 3D dosimetry for stereotactic volumetric-modulated arc delivery using COMPASS.

    PubMed

    Vikraman, Subramani; Manigandan, Durai; Karrthick, Karukkupalayam Palaniappan; Sambasivaselli, Raju; Senniandavar, Vellaingiri; Ramu, Mahendran; Rajesh, Thiyagarajan; Lutz, Muller; Muthukumaran, Manavalan; Karthikeyan, Nithyanantham; Tejinder, Kataria

    2015-01-01

    The purpose of this study was to quantitatively evaluate the patient-specific 3D dosimetry tool COMPASS, with the 2D array MatriXX detector, for stereotactic volumetric-modulated arc delivery. CT images and RT structures for twenty-five patients treated at different sites (brain, head & neck, thorax, abdomen, and spine) were taken from the CyberKnife Multiplan planning system for this study. All of these patients underwent radical stereotactic treatment with CyberKnife. For each patient, linac-based volumetric-modulated arc therapy (VMAT) stereotactic plans were generated in Monaco TPS v3.1 using the Elekta Beam Modulator MLC. The dose prescription was in the range of 5-20 Gy per fraction, and the target prescription and critical-organ constraints were matched to the delivered treatment plans as closely as possible. The quality of each plan was analyzed using the conformity index (CI), conformity number (CN), gradient index (GI), target coverage (TC), and dose to 95% of the volume (D95). The delivery accuracy of the Monaco Monte Carlo (MC)-calculated treatment plans was quantitatively evaluated against the COMPASS-calculated (CCA) dose and the COMPASS indirectly measured (CME) dose based on dose-volume histogram metrics. In order to ascertain the potential of COMPASS 3D dosimetry for stereotactic plan delivery, 2D fluence verification was performed with MatriXX using the MultiCube phantom, and routine quality assurance of absolute point-dose verification was performed to check the overall delivery accuracy. Quantitative analyses of the dose delivery verification were compared with pass/fail criteria of 3 mm distance to agreement and 3% dose difference, and the gamma passing rate was compared with the 2D fluence verification from MatriXX with MultiCube. The COMPASS dose reconstructed from the measured fluence and the COMPASS computed dose showed very good agreement with the TPS-calculated dose. Each plan was evaluated based on dose-volume parameters for target volumes, such as dose at 95% of volume (D95) and average dose.
For critical organs dose at 20% of volume (D20), dose
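
    The 3 mm/3% pass/fail criteria mentioned in the abstract refer to the gamma index, which scores each evaluated dose point by the best combined distance-to-agreement and dose-difference match to the reference distribution. A minimal 1D global-gamma sketch (illustrative only; clinical tools such as COMPASS operate on full 3D dose grids):

```python
import math

def gamma_pass_rate(ref, evl, spacing_mm, dta_mm=3.0, dd_frac=0.03):
    """1D global gamma analysis: for each evaluated point, find the minimum
    combined (distance, dose-difference) metric over the reference profile;
    a point passes when that minimum is <= 1. Returns the passing percentage.

    ref, evl: dose profiles sampled on the same uniform grid (same length).
    """
    d_max = max(ref)                      # global normalization dose
    passed = 0
    for i, de in enumerate(evl):
        best = float("inf")
        for j, dr in enumerate(ref):
            dist = abs(i - j) * spacing_mm
            dose_diff = de - dr
            g = math.sqrt((dist / dta_mm) ** 2
                          + (dose_diff / (dd_frac * d_max)) ** 2)
            best = min(best, g)
        if best <= 1.0:
            passed += 1
    return 100.0 * passed / len(evl)
```

    Identical profiles pass everywhere; a profile shifted by less than the DTA criterion still passes, which is what distinguishes gamma analysis from a plain point-by-point dose comparison.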

  12. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.

  13. Quantitative assessment of cumulative carcinogenic risk for multiple genotoxic impurities in a new drug substance.

    PubMed

    Bercu, Joel P; Hoffman, Wherly P; Lee, Cindy; Ness, Daniel K

    2008-08-01

    In pharmaceutical development, significant effort is made to minimize the carcinogenic potential of new drug substances (NDS). This involves appropriate genotoxicity and carcinogenicity testing of the NDS, and understanding the genotoxic potential of its impurities. Current available guidance recommends the use of the threshold of toxicological concern (TTC) for a single impurity where mutagenicity but no carcinogenicity information exists. Despite best efforts, the presence of more than one genotoxic impurity in an NDS may occur at trace levels. This paper repeats the analysis performed by others for a single genotoxic compound, but also uses statistical simulations to assess the impact on cancer risk for a mixture of genotoxic compounds. In summary, with the addition of multiple impurities all controlled to the TTC, an increase in cancer risk was observed. This increase is relatively small when considering the conservative assumptions of the TTC. If structurally similar compounds had an assumed strong correlation (+/-10-fold from the first randomly selected impurity) in cancer potency, the resulting cancer risk was not negatively impacted. Findings based on probabilistic analysis here can be very useful in making appropriate decisions about risk management of multiple genotoxic impurities measured in the final drug substance. PMID:18550240
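The probabilistic analysis described above can be illustrated with a toy Monte Carlo simulation. This is not the authors' model: the lognormal potency distribution, its spread, and the 1-in-100,000 median risk anchor (the nominal TTC risk level) are assumptions for illustration only:

```python
import math
import random

random.seed(0)

def simulate_total_risk(n_impurities, n_trials=20000, median_risk=1e-5, sigma=1.0):
    """Each impurity controlled to the TTC contributes a lifetime excess cancer
    risk drawn from a hypothetical lognormal distribution; the total risk of the
    mixture is the sum over impurities. Returns the median total risk."""
    totals = []
    for _ in range(n_trials):
        total = sum(
            math.exp(random.gauss(math.log(median_risk), sigma))
            for _ in range(n_impurities)
        )
        totals.append(total)
    totals.sort()
    return totals[len(totals) // 2]

one = simulate_total_risk(1)    # single impurity at the TTC
five = simulate_total_risk(5)   # five impurities, each at the TTC
```

As the abstract notes, adding impurities increases the cumulative risk, though the increase is modest relative to the conservatism built into the TTC itself.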

  14. Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments

    EPA Science Inventory

    The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...

  15. Quantitative assessment of the association between CYP17 rs743572 polymorphism and prostate cancer risk.

    PubMed

    Wang, Yinglei; Zhang, YingYing; Meng, Haihong; Hou, Xianghua; Li, Zhonghai; Liu, Qingpeng; Meng, Lin

    2015-03-01

    Published data on the association between the CYP17 rs743572 polymorphism and the risk of prostate cancer (PC) have shown inconclusive results. The aim of this study was to further estimate the pooled effect size of the association between the rs743572 polymorphism and PC via large-scale meta-analysis. We searched for case-control studies of the rs743572 polymorphism and PC risk in the PubMed, Embase, and Web of Science databases up to February 2014. Odds ratios (ORs) along with 95% confidence intervals (CIs) were pooled using both fixed-effects and random-effects models. A total of 38 publications comprising 42 studies with 15,735 cases and 17,825 controls were included in this meta-analysis. Overall, no significant association was found between the rs743572 polymorphism and PC risk. Stratified analyses by control source and sample size did not provide significant results. However, there was a borderline association in the African population under the A2A2 versus A1A2 + A1A1 genetic model (OR = 1.39, 95% CI: 1.01-1.92, P = 0.975, I(2) = 0.0%). Results from the current meta-analysis suggest that the CYP17 rs743572 polymorphism might modify the risk of PC in subjects of African descent. PMID:25323563
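Fixed-effects pooling of odds ratios, as used in this meta-analysis, is conventionally done by inverse-variance weighting on the log scale, with each study's standard error recovered from its reported confidence interval. A sketch on made-up study data (not the paper's 42 studies):

```python
import math

def pool_odds_ratios(ors, cis):
    """Fixed-effects inverse-variance pooling of study odds ratios.

    ors: per-study odds ratios.
    cis: per-study (lower, upper) 95% confidence intervals, from which each
         study's standard error on the log scale is recovered.
    Returns (pooled OR, (lower, upper) 95% CI)."""
    log_ors, weights = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log(OR)
        log_ors.append(math.log(or_))
        weights.append(1.0 / se ** 2)                     # inverse-variance weight
    pooled_log = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# Illustrative (fabricated) studies:
pooled, ci = pool_odds_ratios(
    [1.2, 0.9, 1.1],
    [(0.9, 1.6), (0.7, 1.16), (0.85, 1.42)],
)
```

A random-effects model would additionally widen the weights by a between-study variance term when heterogeneity is present.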

  16. Development of a Knee-gap Force Measurement Device to Evaluate Quantitative Lower Limb Muscular Strength of the Elderly

    NASA Astrophysics Data System (ADS)

    Yamashita, Kazuhiko; Imaizumi, Kazuya; Iwakami, Yumi; Sato, Mitsuru; Nakajima, Sawako; Ino, Shuichi; Koyama, Hironori; Kawasumi, Masashi; Ifukube, Toru

    Falling is one of the most serious problems for the elderly, and lower limb muscular strength is thought to greatly affect falls. The aim of this study was to develop a safe, easy-to-use, quantitative knee-gap force measurement device for evaluating lower limb muscular strength, and to examine its efficacy from three viewpoints. First, the knee-gap force was clearly associated with the strength of muscle contraction, estimated by electromyogram, in each of the hip joint adductor muscles; the proposed device therefore correctly estimates the activity of the hip joint adductors, which is closely related to activities of daily living. Second, knee-gap force was measured in 170 people ranging from middle-aged to elderly, including some suffering from physical frailty on clinical assessment: relative to the middle-aged group, knee-gap force was decreased by 16% in the healthy elderly and by 34% in the physically frail elderly. Third, the correlation coefficient between knee-gap force and 10 m obstacle walking time was -0.57, a negative correlation indicating that ambulatory ability decreases as knee-gap force decreases. Because decreased lower limb muscular strength and ambulatory ability are factors in increased falling risk, knee-gap force may offer an easy way to estimate the risk of falling.

  17. Quantitative Approach for Incorporating Methylmercury Risks and Omega-3 Fatty Acid Benefits in Developing Species-Specific Fish Consumption Advice

    PubMed Central

    Ginsberg, Gary L.; Toal, Brian F.

    2009-01-01

    Background Despite general agreement about the toxicity of methylmercury (MeHg), fish consumption advice remains controversial. Concerns have been raised that negative messages will steer people away from fish and omega-3 fatty acid (FA) benefits. One approach is to provide advice for individual species that highlights beneficial fish while cautioning against riskier fish. Objectives Our goal in this study was to develop a method to quantitatively analyze the net risk/benefit of individual fish species based on their MeHg and omega-3 FA content. Methods We identified dose–response relationships for MeHg and omega-3 FA effects on coronary heart disease (CHD) and neurodevelopment. We used the MeHg and omega-3 FA content of 16 commonly consumed species to calculate the net risk/benefit for each species. Results Estimated omega-3 FA benefits outweigh MeHg risks for some species (e.g., farmed salmon, herring, trout); however, the opposite was true for others (swordfish, shark). Other species were associated with a small net benefit (e.g., flounder, canned light tuna) or a small net risk (e.g., canned white tuna, halibut). These results were used to place fish into one of four meal frequency categories, with the advice tentative because of limitations in the underlying dose–response information. Separate advice appears warranted for the neurodevelopmental risk group versus the cardiovascular risk group because we found a greater net benefit from fish consumption for the cardiovascular risk group. Conclusions This research illustrates a framework for risk/benefit analysis that can be used to develop categories of consumption advice ranging from “do not eat” to “unlimited,” with the caveat that unlimited may need to be tempered for certain fish (e.g., farm-raised salmon) because of other contaminants and end points (e.g., cancer risk). Uncertainties exist in the underlying dose–response relationships, pointing in particular to the need for more research on

  18. Human-Associated Fecal Quantitative Polymerase Chain Reaction Measurements and Simulated Risk of Gastrointestinal Illness in Recreational Waters Contaminated with Raw Sewage

    EPA Science Inventory

    We used quantitative microbial risk assessment (QMRA) to estimate the risk of gastrointestinal (GI) illness associated with swimming in recreational waters containing different concentrations of human-associated fecal qPCR markers from raw sewage, HF183 and HumM2. The volume/volu...

  19. Evaluation of the Reproductive and Developmental Risks of Caffeine

    PubMed Central

    Brent, Robert L; Christian, Mildred S; Diener, Robert M

    2011-01-01

    A risk analysis of in utero caffeine exposure is presented utilizing epidemiological studies and animal studies dealing with congenital malformation, pregnancy loss, and weight reduction. These effects are of interest to teratologists, because animal studies are useful in their evaluation. Many of the epidemiology studies did not evaluate the impact of the “pregnancy signal,” which identifies healthy pregnancies and permits investigators to identify subjects with low pregnancy risks. The spontaneous abortion epidemiology studies were inconsistent, and the majority did not account for the confounding introduced by ignoring the pregnancy signal. The animal studies do not support the concept that caffeine is an abortifacient for the wide range of human caffeine exposures. Almost all the congenital malformation epidemiology studies were negative. Animal pharmacokinetic studies indicate that the teratogenic plasma level of caffeine has to reach or exceed 60 µg/ml, which is not attainable from ingesting large amounts of caffeine in foods and beverages. No epidemiological study described the “caffeine teratogenic syndrome.” Six of the 17 recent epidemiology studies dealing with the risk of caffeine and fetal weight reduction were negative. Seven of the positive studies had growth reductions that were clinically insignificant, and none of the studies cited the animal literature. Analysis of caffeine's reproductive toxicity considers reproducibility and plausibility of clinical, epidemiological, and animal data. Moderate or even high amounts of beverages and foods containing caffeine do not increase the risks of congenital malformations, miscarriage or growth retardation. Pharmacokinetic studies markedly improve the ability to perform the risk analyses. Birth Defects Res (Part B) 92:152–187, 2011. © 2011 Wiley-Liss, Inc. PMID:21370398

  20. Evaluations of Risks from the Lunar and Mars Radiation Environments

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Hayat, Matthew J.; Feiveson, Alan H.; Cucinotta, Francis A.

    2008-01-01

    Protecting astronauts from the space radiation environment requires accurate projections of radiation in future space missions. Characterizing the ionizing radiation environment is challenging because the interplanetary plasma and radiation fields are modulated by solar disturbances, and the radiation doses received by astronauts in interplanetary space are likewise influenced. The galactic cosmic radiation (GCR) flux for the next solar cycle was estimated as a function of the interplanetary deceleration potential, which has been derived from GCR flux and Climax neutron monitor rate measurements over the last four decades. Given the stochastic nature of solar particle event (SPE) occurrence, the mean frequency of SPEs at any given proton fluence threshold during a defined mission duration was obtained from a Poisson process model using proton fluence measurements of SPEs during the past five solar cycles (19-23). Analytic energy spectra of 34 historically large SPEs were constructed over broad energy ranges extending to GeV. Using an integrated space radiation model (which includes the transport codes HZETRN [1] and BRYNTRN [2], and the quantum nuclear interaction model QMSFRG [3]), the propagation and interaction properties of energetic nucleons through various media were predicted. Risk from GCR and SPEs was assessed at specific organs inside a typical spacecraft using the CAM [4] model. The representative risk level at each event size and its standard deviation were obtained from the analysis of the 34 SPEs. Risks from different event sizes and their frequencies of occurrence in a specified mission period were evaluated with respect to acute health effects, especially during extra-vehicular activities (EVA). The results will be useful for the development of an integrated strategy for optimizing radiation protection on lunar and Mars missions. Keywords: Space Radiation Environments; Galactic Cosmic Radiation; Solar Particle Event; Radiation Risk; Risk
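The Poisson process model for SPE occurrence mentioned above yields mission-level event probabilities directly. A sketch, using a hypothetical event rate rather than one actually fitted to five solar cycles of fluence data:

```python
import math

def prob_at_least_n_spes(mean_rate_per_year, mission_years, n=1):
    """Probability of observing >= n solar particle events above a given
    fluence threshold during a mission, assuming event occurrence follows
    a homogeneous Poisson process with the given mean rate."""
    mu = mean_rate_per_year * mission_years  # expected number of events
    # P(N >= n) = 1 - P(N < n), summing the Poisson pmf for k = 0..n-1
    p_less = sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n))
    return 1.0 - p_less

# Hypothetical example: 2 large SPEs per year near solar maximum,
# over a 6-month lunar mission:
p = prob_at_least_n_spes(2.0, 0.5)
```

With an expected count of one event, the chance of at least one large SPE during the mission is 1 - e^(-1), about 63%, which is why EVA planning treats such events as likely rather than exceptional.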

  1. Quantitative evaluation of fiber fuse initiation with exposure to arc discharge provided by a fusion splicer

    NASA Astrophysics Data System (ADS)

    Todoroki, Shin-Ichi

    2016-05-01

    The optical communication industry and power-over-fiber applications face a dilemma as a result of the expanding demand for light power delivery and the potential risks of high-power light manipulation, including the fiber fuse phenomenon, a continuous destruction of the fiber core pumped by the propagating light and triggered by a heat-induced strong absorption of silica glass. However, we have limited knowledge of its initiation process from the viewpoint of energy flow in the reactive area. Therefore, the conditions required for fiber fuse initiation in standard single-mode fibers were determined quantitatively, namely the power of a 1480 nm fiber laser and the intensity of an arc discharge provided by a fusion splicer for one second as an external heat source. Systematic investigation of the energy flow balance between these energy sources revealed that the initiation process consists of two steps: the generation of a precursor at the heated spot and the transition to a stable fiber fuse. The latter step needs a certain degree of heat accumulation at the core, where waveguide deformation is ongoing competitively. This method is useful for comparing the tolerance to fiber fuse initiation among various fibers using a fixed energy input, which was not possible before.

  2. Quantitative evaluation of fiber fuse initiation with exposure to arc discharge provided by a fusion splicer.

    PubMed

    Todoroki, Shin-Ichi

    2016-01-01

    The optical communication industry and power-over-fiber applications face a dilemma as a result of the expanding demand for light power delivery and the potential risks of high-power light manipulation, including the fiber fuse phenomenon, a continuous destruction of the fiber core pumped by the propagating light and triggered by a heat-induced strong absorption of silica glass. However, we have limited knowledge of its initiation process from the viewpoint of energy flow in the reactive area. Therefore, the conditions required for fiber fuse initiation in standard single-mode fibers were determined quantitatively, namely the power of a 1480 nm fiber laser and the intensity of an arc discharge provided by a fusion splicer for one second as an external heat source. Systematic investigation of the energy flow balance between these energy sources revealed that the initiation process consists of two steps: the generation of a precursor at the heated spot and the transition to a stable fiber fuse. The latter step needs a certain degree of heat accumulation at the core, where waveguide deformation is ongoing competitively. This method is useful for comparing the tolerance to fiber fuse initiation among various fibers using a fixed energy input, which was not possible before. PMID:27140935

  3. Quantitative evaluation of fiber fuse initiation with exposure to arc discharge provided by a fusion splicer

    PubMed Central

    Todoroki, Shin-ichi

    2016-01-01

    The optical communication industry and power-over-fiber applications face a dilemma as a result of the expanding demand for light power delivery and the potential risks of high-power light manipulation, including the fiber fuse phenomenon, a continuous destruction of the fiber core pumped by the propagating light and triggered by a heat-induced strong absorption of silica glass. However, we have limited knowledge of its initiation process from the viewpoint of energy flow in the reactive area. Therefore, the conditions required for fiber fuse initiation in standard single-mode fibers were determined quantitatively, namely the power of a 1480 nm fiber laser and the intensity of an arc discharge provided by a fusion splicer for one second as an external heat source. Systematic investigation of the energy flow balance between these energy sources revealed that the initiation process consists of two steps: the generation of a precursor at the heated spot and the transition to a stable fiber fuse. The latter step needs a certain degree of heat accumulation at the core, where waveguide deformation is ongoing competitively. This method is useful for comparing the tolerance to fiber fuse initiation among various fibers using a fixed energy input, which was not possible before. PMID:27140935

  4. Risk evaluation and mitigation strategies (REMS): educating the prescriber.

    PubMed

    Nicholson, Susan C; Peterson, Janet; Yektashenas, Behin

    2012-02-01

    The US FDA Amendments Act of 2007 was signed into law on 27 September 2007. A provision of this law granted the FDA new powers to enhance drug safety by requiring the pharmaceutical industry to develop Risk Evaluation and Mitigation Strategies (REMS). REMS are deemed necessary when a question exists as to whether the benefits of a drug outweigh its risks. REMS constitute a safety plan with several potential components, including a medication guide, a communication plan, elements to ensure safe use and an implementation system to help guide the prescribers, pharmacists and patients. This applies to existing drugs on the market, new drug applications (NDAs), abbreviated NDAs (generics) and biologics licence applications. REMS represent an 'upgrade' from previously required risk minimization action plans, based on the strengthening of FDA powers of authority and enforceability to incur monetary penalties against individuals representing the pharmaceutical industry who fail to comply. For illustrative purposes, we chose the drug romiplostim (Nplate®) to present an REMS, as all components were utilized to help assuage risks associated with the drug. Romiplostim is an FDA-approved drug used to treat thrombocytopenia in patients with chronic immune (idiopathic) thrombocytopenic purpura that has a significant adverse safety profile based on the risk of changes in bone marrow reticulin formation and bone marrow fibroses, and other associated risks. This review of current REMS policy is intended to provide the prescriber with a better understanding of current modalities in FDA-mandated drug safety programmes, which will impact day-to-day healthcare provider practices. PMID:22171604

  5. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans, one that enables a fast assessment of coffee grade and can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green, low-cost, fast-response technique that requires no sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors around 5% show that NIRS can be a valuable analytical tool for coffee roasters, enabling a simple, fast, and quantitative evaluation of green coffee quality. PMID:22953929
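The partial least squares regression used above to relate NIR spectra to the defective-bean mass fraction can be sketched with a minimal PLS1 (NIPALS) implementation. The data below are synthetic, not coffee spectra, and real chemometric work would add preprocessing and cross-validated component selection:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal PLS1 via NIPALS: relates a spectra matrix X (samples x
    wavelengths) to a scalar property y (e.g. defective mass fraction)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr                 # weight vector: covariance direction
        w /= np.linalg.norm(w)
        t = Xr @ w                    # scores
        p = Xr.T @ t / (t @ t)        # X loadings
        qk = (yr @ t) / (t @ t)       # y loading
        Xr = Xr - np.outer(t, p)      # deflate X
        yr = yr - qk * t              # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression vector
    return B, x_mean, y_mean

def pls1_predict(model, X):
    B, x_mean, y_mean = model
    return (X - x_mean) @ B + y_mean

# Synthetic data with two latent factors, so two components fit exactly:
rng = np.random.default_rng(0)
T = rng.normal(size=(12, 2))          # latent scores
X = T @ rng.normal(size=(2, 6))       # "spectra" built from the latents
y = T @ np.array([1.5, -0.7])         # property driven by the same latents
model = pls1_fit(X, y, n_components=2)
pred = pls1_predict(model, X)
```

On this rank-2 synthetic problem the two-component fit reproduces y exactly; real spectra would leave a residual error, analogous to the ~5% relative errors reported above.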

  6. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-08-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve 10-day to 5-month sea ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett ice severity index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  7. Exploring the utility of quantitative network design in evaluating Arctic sea-ice thickness sampling strategies

    NASA Astrophysics Data System (ADS)

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-03-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, an area particularly relevant for maritime traffic and offshore resource exploration, as well as two areas related to the Barnett Ice Severity Index (BSI), a standard measure of shipping conditions along the Alaskan coast that is routinely issued by ice services. Our analysis quantifies the benefits of sampling upstream of the target area and of reducing the sampling uncertainty. We demonstrate how observations of sea-ice and snow thickness can constrain ice and snow variables in a target region and quantify the complementarity of combining two flight transects. We further quantify the benefit of improved atmospheric forecasts and a well-calibrated model.

  8. Quantitative Evaluation of Peptide-Material Interactions by a Force Mapping Method: Guidelines for Surface Modification.

    PubMed

    Mochizuki, Masahito; Oguchi, Masahiro; Kim, Seong-Oh; Jackman, Joshua A; Ogawa, Tetsu; Lkhamsuren, Ganchimeg; Cho, Nam-Joon; Hayashi, Tomohiro

    2015-07-28

    Peptide coatings on material surfaces have demonstrated wide application across materials science and biotechnology, facilitating the development of nanobio interfaces through surface modification. A guiding motivation in the field is to engineer peptides with a high and selective binding affinity to target materials. Herein, we introduce a quantitative force mapping method in order to evaluate the binding affinity of peptides to various hydrophilic oxide materials by atomic force microscopy (AFM). Statistical analysis of adhesion forces and probabilities obtained on substrates with a materials contrast enabled us to simultaneously compare the peptide binding affinity to different materials. On the basis of the experimental results and corresponding theoretical analysis, we discuss the role of various interfacial forces in modulating the strength of peptide attachment to hydrophilic oxide solid supports as well as to gold. The results emphasize the precision and robustness of our approach to evaluating the adhesion strength of peptides to solid supports, thereby offering guidelines to improve the design and fabrication of peptide-coated materials. PMID:26125092

  9. A quantitative and standardized robotic method for the evaluation of arm proprioception after stroke.

    PubMed

    Simo, Lucia S; Ghez, Claude; Botzer, Lior; Scheidt, Robert A

    2011-01-01

    Stroke often results in both motor and sensory deficits, which may interact in the manifested functional impairment. Proprioception is known to play important roles in the planning and control of limb posture and movement; however, the impact of proprioceptive deficits on motor function has been difficult to elucidate due in part to the qualitative nature of available clinical tests. We present a quantitative and standardized method for evaluating proprioception in tasks directly relevant to those used to assess motor function. Using a robotic manipulandum that exerted controlled displacements of the hand, stroke participants were evaluated, and compared with a control group, in their ability to detect such displacements in a 2-alternative, forced-choice paradigm. A psychometric function parameterized the decision process underlying the detection of the hand displacements. The shape of this function was determined by a signal detection threshold and by the variability of the response about this threshold. Our automatic procedure differentiates between participants with and without proprioceptive deficits and quantifies functional proprioceptive sensation on a magnitude scale that is meaningful for ongoing studies of degraded motor function in comparable horizontal movements. PMID:22256252
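The psychometric function described above, parameterised by a detection threshold and response variability, is commonly modelled as a cumulative Gaussian. A rough sketch with a grid-search maximum-likelihood fit on synthetic detection data (not the authors' estimation procedure; displacement units and data are made up):

```python
import math

def cum_gauss(x, threshold, sigma):
    """Psychometric function: probability of detecting a displacement of size
    x, modelled as a cumulative Gaussian with a 50%-point threshold and a
    spread sigma capturing response variability."""
    return 0.5 * (1 + math.erf((x - threshold) / (sigma * math.sqrt(2))))

def fit_psychometric(displacements, n_detected, n_trials):
    """Crude grid-search maximum-likelihood fit of (threshold, sigma)
    to binomial detection counts."""
    best, best_ll = None, -float("inf")
    for th in (i * 0.05 for i in range(1, 60)):
        for sg in (i * 0.05 for i in range(1, 40)):
            ll = 0.0
            for x, k, n in zip(displacements, n_detected, n_trials):
                p = min(max(cum_gauss(x, th, sg), 1e-9), 1 - 1e-9)
                ll += k * math.log(p) + (n - k) * math.log(1 - p)
            if ll > best_ll:
                best_ll, best = ll, (th, sg)
    return best

# Synthetic data: true threshold 1.0 cm, sigma 0.5 cm, 40 trials per level
xs = [0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 2.0]
ns = [40] * 7
ks = [round(40 * cum_gauss(x, 1.0, 0.5)) for x in xs]
th, sg = fit_psychometric(xs, ks, ns)
```

The fitted threshold and spread recover the generating values, illustrating how the two parameters separately quantify sensitivity and decision variability.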

  10. Quantitative evaluation of a low-cost noninvasive hybrid interface based on EEG and eye movement.

    PubMed

    Kim, Minho; Kim, Byung Hyung; Jo, Sungho

    2015-03-01

    This paper describes a low-cost noninvasive brain-computer interface (BCI) hybridized with eye tracking. It also discusses its feasibility through a Fitts' law-based quantitative evaluation method. Noninvasive BCI has recently received a lot of attention. To bring the BCI applications into real life, user-friendly and easily portable devices need to be provided. In this work, as an approach to realize a real-world BCI, electroencephalograph (EEG)-based BCI combined with eye tracking is investigated. The two interfaces can be complementary to attain improved performance. Especially to consider public availability, a low-cost interface device is intentionally used for test. A low-cost commercial EEG recording device is integrated with an inexpensive custom-built eye tracker. The developed hybrid interface is evaluated through target pointing and selection experiments. Eye movement is interpreted as cursor movement and noninvasive BCI selects a cursor point with two selection confirmation schemes. Using Fitts' law, the proposed interface scheme is compared with other interface schemes such as mouse, eye tracking with dwell time, and eye tracking with keyboard. In addition, the proposed hybrid BCI system is discussed with respect to a practical interface scheme. Although further advancement is required, the proposed hybrid BCI system has the potential to be practically useful in a natural and intuitive manner. PMID:25376041
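The Fitts' law comparison above rests on the index of difficulty of each pointing task and the resulting throughput. A minimal sketch using the Shannon formulation; the pixel distances and timing are illustrative, not the paper's measurements:

```python
import math

def fitts_throughput(distance, width, movement_time_s):
    """Fitts' law (Shannon formulation): returns the index of difficulty in
    bits, ID = log2(D/W + 1), and the throughput in bits/s, ID divided by
    movement time. Throughput is a standard basis for comparing pointing
    interfaces such as mouse, eye tracking, and hybrid BCI schemes."""
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty, index_of_difficulty / movement_time_s

# e.g. a 480-pixel reach to a 60-pixel target selected in 1.5 s:
id_bits, tp = fitts_throughput(480, 60, 1.5)
```

Averaging throughput over many target distances and widths gives a single figure of merit per interface, which is how such schemes are ranked against a mouse baseline.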

  11. Laboratory design and test procedures for quantitative evaluation of infrared sensors to assess thermal anomalies

    SciTech Connect

    Chang, Y.M.; Grot, R.A.; Wood, J.T.

    1985-06-01

    This report presents a description of the laboratory apparatus and preliminary results of the quantitative evaluation of three high-resolution and two low-resolution infrared imaging systems. These systems, which are commonly used for building diagnostics, are tested under various background temperatures (from -20 °C to 25 °C) for their minimum resolvable temperature differences (MRTD) at spatial frequencies from 0.03 to 0.25 cycles per milliradian. The calibration curves of absolute and differential temperature measurements are obtained for three systems. The signal transfer function and line spread function at ambient temperature of another three systems are also measured. Comparisons of the measured dependence of the MRTD on background temperature with the predicted values given in ASHRAE Standard 101-83 are also included. The dependence of absolute temperature measurements on background temperature is presented, along with a comparison of measured data and data given by the manufacturer. Horizontal on-axis magnification factors of the geometric transfer function of two systems are also established to calibrate the horizontal axis of the measured line spread function and so obtain the modulation transfer function. Variations in the horizontal display uniformity of these two sensors are also observed. Included are detailed descriptions of the laboratory design, equipment setup, and evaluation procedures for each test. 10 refs., 38 figs., 12 tabs.

  12. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M. ); Brown, A.M. ); Goede, R.W. )

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina), which is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina), which receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analyses) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as each of the other, more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
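The HAI scoring scheme, in which each organ or tissue observation receives a numerical value by severity and the values are summed per fish, can be sketched as follows. The tissue names and severity values here are illustrative placeholders, not the published HAI tables:

```python
# Hypothetical severity scale: higher value = greater damage from stressors.
SEVERITY = {"normal": 0, "mild": 10, "moderate": 20, "severe": 30}

def health_assessment_index(observations):
    """Sum the severity values over all examined organs/tissues for one fish.

    observations: dict mapping organ/tissue name -> severity category."""
    return sum(SEVERITY[s] for s in observations.values())

# One fish's field necropsy observations (made up):
fish = {"liver": "mild", "gills": "normal", "spleen": "moderate", "kidney": "normal"}
hai = health_assessment_index(fish)
```

Because each fish reduces to a single number, site or population means can then be compared with ordinary statistics, which is the index's stated advantage over qualitative necropsy notes.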

  13. Evaluation of Reference Genes for Quantitative Real-Time PCR in Songbirds

    PubMed Central

    Zinzow-Kramer, Wendy M.; Horton, Brent M.; Maney, Donna L.

    2014-01-01

    Quantitative real-time PCR (qPCR) is becoming a popular tool for the quantification of gene expression in the brain and endocrine tissues of songbirds. Accurate analysis of qPCR data relies on the selection of appropriate reference genes for normalization, yet few papers on songbirds contain evidence of reference gene validation. Here, we evaluated the expression of ten potential reference genes (18S, ACTB, GAPDH, HMBS, HPRT, PPIA, RPL4, RPL32, TFRC, and UBC) in brain, pituitary, ovary, and testis in two species of songbird: zebra finch and white-throated sparrow. We used two algorithms, geNorm and NormFinder, to assess the stability of these reference genes in our samples. We found that the suitability of some of the most popular reference genes for target gene normalization in mammals, such as 18S, depended highly on tissue type. Thus, they are not the best choices for brain and gonad in these songbirds. In contrast, we identified alternative genes, such as HPRT, RPL4 and PPIA, that were highly stable in brain, pituitary, and gonad in these species. Our results suggest that the validation of reference genes in mammals does not necessarily extrapolate to other taxonomic groups. For researchers wishing to identify and evaluate suitable reference genes for qPCR in songbirds, our results should serve as a starting point and should help increase the power and utility of songbird models in behavioral neuroendocrinology. PMID:24780145
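The geNorm stability measure M used above ranks candidate reference genes by the average standard deviation of their pairwise log2 expression ratios against all other candidates across samples; stably expressed genes yield low M. A sketch on made-up expression values (not the study's data):

```python
import math
from statistics import stdev

def genorm_m(expr):
    """geNorm-style stability measure M for candidate reference genes.

    expr: dict mapping gene name -> list of relative expression values,
          one per sample. Lower M means more stable expression."""
    genes = list(expr)
    m = {}
    for g in genes:
        pairwise_sds = []
        for h in genes:
            if h == g:
                continue
            # log2 ratio of the two genes in each sample
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            pairwise_sds.append(stdev(ratios))
        m[g] = sum(pairwise_sds) / len(pairwise_sds)
    return m

# Illustrative data: geneC varies far more across samples than geneA/geneB.
expr = {
    "geneA": [1.0, 1.1, 0.95, 1.05],
    "geneB": [2.0, 2.1, 1.9, 2.05],
    "geneC": [1.0, 3.0, 0.5, 2.0],
}
m = genorm_m(expr)
```

The full geNorm procedure iteratively drops the gene with the highest M and recomputes, retaining the most stable pair; NormFinder instead uses a model-based variance decomposition.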

  14. Quantitative Evaluation of Iranian Radiology Papers and Its Comparison with Selected Countries

    PubMed Central

    Ghafoori, Mahyar; Emami, Hasan; Sedaghat, Abdolrasoul; Ghiasi, Mohammad; Shakiba, Madjid; Alavi, Manijeh

    2014-01-01

    Background: Recent technological developments in medicine, including modern radiology, have increased the impact of scientific research on social life. Scientific outputs such as articles and patents reflect scientists' efforts toward these achievements. Objectives: In the current study, we evaluated the standing of Iranian scientists in the field of radiology and compared it with that of selected countries in terms of scientific papers, using scientometric tools to quantitatively assess publications in the field. Materials and Methods: Radiology papers were evaluated in the context of a medical-field audit using a retrospective model. We searched the relevant biomedical databases to extract articles related to radiology, and then determined the standing of the country's radiology scientific output relative to the regional countries under study. Results: Iranian papers accounted for 0.19% of the papers published in the PubMed database in 2009 and 0.29% of the Scopus scientific database in the same year. The proportion of Iranian papers within the region under study was 7.6%. Conclusion: To narrow the gap between Iranian scientific radiology papers and those of competitor countries in the region, and to achieve the goals of the 2025 vision document, a manifold effort by the radiology community is necessary. PMID:24693301

  15. Quantitative evaluation of reactive nitrogen emissions with urbanization: a case study in Beijing megacity, China.

    PubMed

    Xian, Chaofan; Ouyang, Zhiyun; Lu, Fei; Xiao, Yang; Li, Yanmin

    2016-09-01

    The rapid increase in anthropogenic nitrogen (N) load in urbanized environments threatens urban sustainability. In this study, we estimated the amount of reactive N (Nr) as an index of the N pollution potential caused by human activities, using the megacity of Beijing as a case study. We investigated the temporal changes in Nr emissions to the environment from 2000 to 2012 using a multidisciplinary approach with quantitative evaluation. Nr emissions increased slightly over the study period, with an annual emission of 0.19 Tg N, mainly resulting from fuel combustion. Nevertheless, the Nr output intensity attributable to inhabitants' livelihoods and material production weakened over the study period. The evaluation showed that the environmental measures to remove Nr in Beijing were efficient in most years, suggesting significant progress in mitigating the growth of the Nr load in this urban environment. Further measures based on N offsetting are suggested that could help alleviate the environmental pressure resulting from anthropogenic Nr emissions and provide theoretical support for the sustainable development of megacities. PMID:27240830

  16. Quantitative evaluation of image-based distortion correction in diffusion tensor imaging.

    PubMed

    Netsch, Thomas; van Muiswinkel, Arianne

    2004-07-01

    A statistical method for evaluating image registration for a series of images, based on the assessment of consistency properties of the registration results, is proposed. Consistency is defined as the residual error of the composition of cyclic registrations. By combining the transformations of different algorithms, the consistency error allows a quantitative comparison without the use of ground truth; specifically, it allows a determination of whether the algorithms are compatible and hence provide comparable registrations. Consistency testing is applied to evaluate retrospective correction of eddy current-induced image distortion in diffusion tensor imaging of the brain. In the literature, several image transformations and similarity measures have been proposed, generally showing a significant reduction of distortion in side-by-side comparisons of parametric maps before and after registration. Transformations derived from imaging physics and a three-dimensional affine transformation, as well as mutual information (MI) and local correlation (LC) similarity, are compared to each other by means of consistency testing. The dedicated transformations could not demonstrate a significant difference for more than half of the series considered. LC similarity is well suited for distortion correction, providing more consistent registrations that are comparable to MI. PMID:15250631
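
    The cyclic-consistency idea can be illustrated with a toy one-dimensional example: composing the estimated registrations around a closed loop should yield the identity transform, and the residual displacement measures inconsistency. The affine parameters and sample points below are hypothetical, far simpler than the 3-D image transforms in the study.

```python
def compose(t1, t2):
    """Compose two 1-D affine transforms t(x) = a*x + b, given as (a, b) pairs,
    returning t1 applied after t2."""
    a1, b1 = t1
    a2, b2 = t2
    return (a1 * a2, a1 * b2 + b1)

def consistency_error(transforms, points):
    """Residual of the cyclic composition T1∘T2∘…∘Tn, which should be the
    identity for perfectly consistent registrations: mean |cycle(x) - x|."""
    cycle = (1.0, 0.0)  # start from the identity transform
    for t in transforms:
        cycle = compose(cycle, t)
    a, b = cycle
    return sum(abs((a * x + b) - x) for x in points) / len(points)

# Hypothetical registrations around a cycle A→B→C→A with slight inconsistency
t_ab = (1.02, 0.10)
t_bc = (0.99, -0.05)
t_ca = (0.992, -0.04)
err = consistency_error([t_ab, t_bc, t_ca], points=[0.0, 50.0, 100.0])
print(f"consistency error: {err:.3f}")
```

    No ground-truth alignment is needed: the loop residual alone tells us how mutually compatible the three registrations are.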

  17. Helicobacter pylori Infection and Risk of Gastric Cancer in Korea: A Quantitative Systematic Review

    PubMed Central

    2016-01-01

    Objectives: In the context of the global decrease in mortality due to gastric cancer, previous studies have reported that the effect of chronic Helicobacter pylori (H. pylori) infection on the incidence of gastric cancer varies among regions. This systematic review was conducted to investigate H. pylori as a risk factor for gastric cancer in Korea, where the incidence of gastric cancer is among the highest in the world. Methods: A search strategy was established to identify articles published in Korean as well as in English. Ultimately, we included observational studies conducted among Korean patients that were designed with an age-matched and sex-matched control group and that reported the odds ratio associated with H. pylori. Gastric cancer cases were subdivided into overall (OGC), cardia (CGC), non-cardia (NGC), early (EGC), advanced, intestinal (IGC), and diffuse forms of gastric cancer. Summary odds ratios (SORs) with 95% confidence intervals (CIs) were calculated in the meta-analysis using a random-effect model. Results: Eleven case-control studies were ultimately selected. H. pylori was associated with an SOR of 1.81 (95% CI, 1.29 to 2.54) for OGC. Additionally, statistically significant risks were observed for CGC, NGC, EGC, and IGC. Conclusions: Chronic H. pylori infection was found to raise the risk of gastric cancer among Koreans, with the highest risk observed for CGC and EGC (SOR=2.88 for both). Follow-up clinical epidemiologic studies are needed to assess the effects of current treatments aimed at eradicating H. pylori infections. PMID:27499162
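
    The random-effects pooling used for the SORs above is typically the DerSimonian-Laird estimator: log odds ratios are weighted by the inverse of within-study variance plus an estimated between-study variance. A minimal sketch follows; the per-study odds ratios and variances are hypothetical, not the eleven Korean studies.

```python
import math

def random_effects_sor(odds_ratios, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.
    `variances` are the within-study variances of each log(OR).
    Returns the summary OR and its 95% confidence interval."""
    y = [math.log(v) for v in odds_ratios]
    w = [1.0 / v for v in variances]               # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

# Hypothetical per-study odds ratios with variances of log(OR)
sor, (lo, hi) = random_effects_sor([1.5, 2.2, 1.2, 2.9], [0.04, 0.09, 0.06, 0.16])
print(f"SOR = {sor:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```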

  18. A quantitative risk assessment for metals in surface water following the application of biosolids to grassland.

    PubMed

    Clarke, Rachel; Peyton, Dara; Healy, Mark G; Fenton, Owen; Cummins, Enda

    2016-10-01

    During episodic rainfall events, land application of treated municipal sludge ('biosolids') may give rise to surface runoff of metals, which may be potentially harmful to human health if not fully treated in a water treatment plant (WTP). This study used surface runoff water quality data generated from a field-scale study in which three types of biosolids (anaerobically digested (AD), lime stabilised (LS), and thermally dried (TD)) were spread on micro-plots of land and subjected to three rainfall events at time intervals of 24, 48 and 360 h following application. Making the assumption that this water directly entered abstraction waters for a WTP without any grassed buffer zone being present, accounting for stream dilution, and modelling various performance scenarios within the WTP, the aim of this research was to conduct a human health risk assessment of metals (Cu, Ni, Pb, Zn, Cd and Cr), which may still be present in drinking water after the WTP. Different dose-response relationships were characterised for the different metals with reference to the lifetime average daily dose (LADD) and the Hazard Quotient (HQ). The results for the LADD show that child exposure concentrations were highest for Cu when the measured surface runoff concentrations from the LS biosolids treatment were used as input into the model. The results for the HQ showed that of all the scenarios considered, Cu had the highest HQ for children. However, values were below the threshold value of risk (HQ < 0.01; no existing risk). Under the conditions monitored, metal concentrations in the biosolids applied to grassland were not considered to result in a risk to human health in surface water systems. PMID:27213676
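
    The LADD and HQ metrics above follow the standard EPA-style intake equations; a sketch for a drinking-water pathway is below. All parameter values (concentration, intake rate, body weight, reference dose) are illustrative assumptions, not the study's inputs.

```python
def lifetime_average_daily_dose(conc_mg_per_l, intake_l_per_day, exposure_freq_days,
                                exposure_years, body_weight_kg, lifetime_years=70):
    """LADD (mg/kg/day) for ingestion of drinking water: chronic intake
    averaged over an assumed 70-year lifetime."""
    averaging_time_days = lifetime_years * 365
    return (conc_mg_per_l * intake_l_per_day * exposure_freq_days * exposure_years
            ) / (body_weight_kg * averaging_time_days)

def hazard_quotient(daily_dose, reference_dose):
    """HQ = dose / RfD; HQ below 1 indicates exposure under the level of concern."""
    return daily_dose / reference_dose

# Hypothetical child exposure to Cu in drinking water (values illustrative only)
ladd = lifetime_average_daily_dose(conc_mg_per_l=0.05, intake_l_per_day=1.0,
                                   exposure_freq_days=350, exposure_years=6,
                                   body_weight_kg=15)
hq = hazard_quotient(ladd, reference_dose=0.04)  # assumed RfD-style value for Cu
print(hq < 1)  # below the level of concern under these assumed inputs
```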

  19. Evaluation of Fourier Transform Profilometry for Quantitative Waste Volume Determination under Simulated Hanford Tank Conditions

    SciTech Connect

    Etheridge, J.A.; Jang, P.R.; Leone, T.; Long, Z.; Norton, O.P.; Okhuysen, W.P.; Monts, D.L.; Coggins, T.L.

    2008-07-01

    The Hanford Site is currently in the process of an extensive effort to empty and close its radioactive single-shell and double-shell waste storage tanks. Before this can be accomplished, it is necessary to know how much residual material is left in a given waste tank and the chemical makeup of the residue. The objective of Mississippi State University's Institute for Clean Energy Technology's (ICET) efforts is to develop, fabricate, and deploy inspection tools for the Hanford waste tanks that will (1) be remotely operable; (2) provide quantitative information on the amount of wastes remaining; and (3) provide information on the spatial distribution of chemical and radioactive species of interest. A collaborative arrangement has been established with the Hanford Site to develop probe-based inspection systems for deployment in the waste tanks. ICET is currently developing an in-tank inspection system based on Fourier Transform Profilometry (FTP), a non-contact, 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, FTP is capable of determining the height (depth) distribution (and hence volume distribution) of the target surface, thus reproducing the profile of the target accurately under a wide variety of conditions. Hence FTP has the potential to be utilized for quantitative determination of residual wastes within Hanford waste tanks. We are conducting a multi-stage performance evaluation of FTP in order to document the accuracy, precision, and operator dependence (minimal) of FTP under conditions similar to those that can be expected to pertain within Hanford waste tanks. The successive stages impose aspects that present increasing difficulty and increasingly more accurate approximations of in-tank environments. In this paper, we report our investigations of the dependence of the analyst upon FTP volume determination results and of the

  20. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    PubMed Central

    SAVAS, Selcuk; KAVRİK, Fevzi; KUCUKYİLMAZ, Ebru

    2016-01-01

    ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Material and Methods Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), the difference between the 1- and 4-week measurements was not significant (p>0.05). With regards to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). After the 1- and 4-week treatment periods, the calcium (Ca) and phosphate (P) concentrations and the Ca/P ratio were higher compared to those of the demineralized surfaces (p<0.05). Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use. PMID:27383699

  1. Simulation-based evaluation of the resolution and quantitative accuracy of temperature-modulated fluorescence tomography

    PubMed Central

    Lin, Yuting; Nouizi, Farouk; Kwong, Tiffany C.; Gulsen, Gultekin

    2016-01-01

    Conventional fluorescence tomography (FT) can recover the distribution of fluorescent agents within a highly scattering medium. However, poor spatial resolution remains its foremost limitation. Previously, we introduced a new fluorescence imaging technique termed “temperature-modulated fluorescence tomography” (TM-FT), which provides high-resolution images of fluorophore distribution. TM-FT is a multimodality technique that combines fluorescence imaging with focused ultrasound to locate thermo-sensitive fluorescence probes using a priori spatial information to drastically improve the resolution of conventional FT. In this paper, we present an extensive simulation study to evaluate the performance of the TM-FT technique on complex phantoms with multiple fluorescent targets of various sizes located at different depths. In addition, the performance of TM-FT is tested in the presence of background fluorescence. The results obtained using our new method are systematically compared with those obtained with conventional FT. Overall, TM-FT provides higher resolution and superior quantitative accuracy, making it an ideal candidate for in vivo preclinical and clinical imaging. For example, a 4 mm diameter inclusion positioned in the middle of a synthetic slab geometry phantom (D: 40 mm × W: 100 mm) is recovered as an elongated object in conventional FT (x = 4.5 mm; y = 10.4 mm), while TM-FT recovers it successfully in both directions (x = 3.8 mm; y = 4.6 mm). As a result, the quantitative accuracy of TM-FT is superior because it recovers the concentration of the agent with a 22% error, in contrast with the 83% error of conventional FT. PMID:26368884

  2. Quantitative Evaluation of Stomatal Cytoskeletal Patterns during the Activation of Immune Signaling in Arabidopsis thaliana

    PubMed Central

    Shimono, Masaki; Higaki, Takumi; Kaku, Hanae; Shibuya, Naoto; Hasezawa, Seiichiro

    2016-01-01

    Historically viewed as primarily functioning in the regulation of gas and water vapor exchange, it is now evident that stomata serve an important role in plant immunity. Indeed, in addition to classically defined functions related to cell architecture and movement, the actin cytoskeleton has emerged as a central component of the plant immune system, underpinning not only processes related to cell shape and movement, but also receptor activation and signaling. Using high-resolution quantitative imaging techniques, the temporal and spatial changes in the actin microfilament array during diurnal cycling of stomatal guard cells have revealed a highly orchestrated transition from random arrays to ordered bundled filaments. While recent studies have demonstrated that plant stomata close in response to pathogen infection, an evaluation of stimulus-induced changes in actin cytoskeletal dynamics during immune activation in the guard cell, as well as the relationship of these changes to the function of the actin cytoskeleton and stomatal aperture, remains undefined. In the current study, we employed quantitative cell imaging and hierarchical clustering analyses to define the response of the guard cell actin cytoskeleton to pathogen infection and the elicitation of immune signaling. Using this approach, we demonstrate that stomatal-localized actin filaments respond rapidly, and specifically, to both bacterial phytopathogens and purified pathogen elicitors. Notably, we demonstrate that higher-order temporal and spatial changes in the filament array show distinct patterns of organization during immune activation, that pathogens perturb the naïve diurnal oscillations of guard cell actin filaments, and that these changes parallel pathogen-induced stomatal gating. The data presented herein demonstrate the application of a highly tractable and quantifiable method to assign transitions in actin filament organization to the activation of immune signaling in

  3. Evaluation of residual antibacterial potency in antibiotic production wastewater using a real-time quantitative method.

    PubMed

    Zhang, Hong; Zhang, Yu; Yang, Min; Liu, Miaomiao

    2015-11-01

    While antibiotic pollution has attracted considerable attention due to its potential in promoting the dissemination of antibiotic resistance genes in the environment, the antibiotic activity of their related substances has been neglected, which may underestimate the environmental impacts of antibiotic wastewater discharge. In this study, a real-time quantitative approach was established to evaluate the residual antibacterial potency of antibiotics and related substances in antibiotic production wastewater (APW) by comparing the growth of a standard bacterial strain (Staphylococcus aureus) in tested water samples with a standard reference substance (e.g. oxytetracycline). Antibiotic equivalent quantity (EQ) was used to express antibacterial potency, which made it possible to assess the contribution of each compound to the antibiotic activity in APW. The real-time quantitative method showed better repeatability (Relative Standard Deviation, RSD 1.08%) compared with the conventional fixed growth time method (RSD 5.62-11.29%). Its quantification limits ranged from 0.20 to 24.00 μg L(-1), depending on the antibiotic. We applied the developed method to analyze the residual potency of water samples from four APW treatment systems, and confirmed a significant contribution from antibiotic transformation products to potent antibacterial activity. Specifically, neospiramycin, a major transformation product of spiramycin, was found to contribute 13.15-22.89% of residual potency in spiramycin production wastewater. In addition, some unknown related substances with antimicrobial activity were indicated in the effluent. This developed approach will be effective for the management of antibacterial potency discharge from antibiotic wastewater and other waste streams. PMID:26395288
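
    The paper's method compares real-time growth curves against a reference antibiotic; as a rough, hypothetical illustration of the underlying EQ idea, a sample's measured response can be interpolated on a standard curve (response versus log10 concentration of the reference) to express its potency as an equivalent concentration. The standard concentrations and responses below are invented, not the study's calibration data.

```python
import math

def antibiotic_equivalent_quantity(sample_response, std_concentrations, std_responses):
    """Interpolate a sample's antibacterial response on a standard curve
    (response vs. log10 concentration of a reference antibiotic) and return
    the equivalent concentration of the standard (the EQ)."""
    logc = [math.log10(c) for c in std_concentrations]
    pairs = list(zip(logc, std_responses))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        lo, hi = sorted((y0, y1))
        if lo <= sample_response <= hi:
            frac = (sample_response - y0) / (y1 - y0)
            return 10 ** (x0 + frac * (x1 - x0))
    raise ValueError("response outside the standard curve")

# Hypothetical oxytetracycline standards (µg/L) with growth-inhibition responses
eq = antibiotic_equivalent_quantity(0.5, [1, 10, 100], [0.1, 0.4, 0.9])
print(f"EQ ≈ {eq:.1f} µg/L oxytetracycline equivalents")
```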

  4. Evaluation of a quantitative fit testing method for N95 filtering facepiece respirators.

    PubMed

    Janssen, Larry; Luinenburg, Michael D; Mullins, Haskell E; Danisch, Susan G; Nelson, Thomas J

    2003-01-01

    A method for performing quantitative fit tests (QNFT) with N95 filtering facepiece respirators was developed by earlier investigators. The method employs a simple clamping device to allow the penetration of submicron aerosols through N95 filter media to be measured. The measured value is subtracted from total penetration, with the assumption that the remaining penetration represents faceseal leakage. The developers have used the clamp to assess respirator performance. This study evaluated the clamp's ability to measure filter penetration and determine fit factors. In Phase 1, subjects were quantitatively fit-tested with elastomeric half-facepiece respirators using both generated and ambient aerosols. QNFT were done with each aerosol with both P100 and N95 filters without disturbing the facepiece. In Phase 2 of the study, elastomeric half facepieces were sealed to subjects' faces to eliminate faceseal leakage. Ambient aerosol QNFT were performed with P100 and N95 filters without disturbing the facepiece. In both phases the clamp was used to measure N95 filter penetration, which was then subtracted from total penetration for the N95 QNFT. It was hypothesized that N95 fit factors corrected for filter penetration would equal the P100 fit factors. Mean corrected N95 fit factors were significantly different from the P100 fit factors in each phase of the study. In addition, there was essentially no correlation between corrected N95 fit factors and P100 fit factors. It was concluded that the clamp method should not be used to fit-test N95 filtering facepieces or otherwise assess respirator performance. PMID:12908863
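
    The correction tested above amounts to treating faceseal leakage as total penetration minus filter penetration, with the fit factor as the reciprocal of that residual leakage. A minimal sketch, with hypothetical measurement values:

```python
def corrected_fit_factor(total_penetration, filter_penetration):
    """Fit factor is the reciprocal of faceseal penetration. The clamp method
    estimates faceseal leakage as total penetration minus the separately
    measured filter penetration."""
    faceseal = total_penetration - filter_penetration
    if faceseal <= 0:
        raise ValueError("filter penetration exceeds total penetration")
    return 1.0 / faceseal

# Hypothetical N95 measurements: 2% total penetration, 1.5% through the filter
ff = corrected_fit_factor(total_penetration=0.02, filter_penetration=0.015)
print(round(ff))  # reciprocal of the 0.5% residual faceseal leakage
```

    The study's finding is precisely that this arithmetic does not recover the P100-based fit factors in practice, so the clamp correction should not be relied upon.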

  5. Survey and evaluation of aging risk assessment methods and applications

    SciTech Connect

    Sanzo, D.L.; Kvam, P.; Apostolakis, G.; Wu, J.; Milici, T.; Ghoniem, N.; Guarro, S.

    1993-11-01

    The Nuclear Regulatory Commission (NRC) initiated the nuclear power plant aging research (NPAR) program about 6 years ago to gather information about nuclear power plant aging. Since then, this program has collected a significant amount of information, largely qualitative, on plant aging and its potential effects on plant safety. However, this body of knowledge has not yet been integrated into formalisms that can be used effectively and systematically to assess plant risk resulting from aging, although models for assessing the effect of increasing failure rates on core damage frequency have been proposed. The purpose of this review is to survey the work conducted to address the aging of systems, structures, and components (SSCs) of nuclear power plants (NPPs), as well as the associated databases. The review takes a critical look at the need to revise probabilistic risk assessments (PRAs) so that they will include the contribution to risk from plant aging, the adequacy of existing methods for evaluating this contribution, and the adequacy of the data that have been used in these evaluation methods. A preliminary framework is identified for integrating the aging of SSCs into the PRA, including the identification of needed data for such an integration.

  6. Quantitative cancer risk assessment for occupational exposures to asphalt fumes during built-up roofing asphalt (BURA) operations.

    PubMed

    Rhomberg, Lorenz R; Mayfield, David B; Goodman, Julie E; Butler, Eric L; Nascarella, Marc A; Williams, Daniel R

    2015-01-01

    The International Agency for Research on Cancer qualitatively characterized occupational exposure to oxidized bitumen emissions during roofing as probably carcinogenic to humans (Group 2A). We examine chemistry, exposure, epidemiology and animal toxicity data to explore quantitative risks for roofing workers applying built-up roofing asphalt (BURA). Epidemiology studies do not consistently report elevated risks, and generally do not have sufficient exposure information or adequately control for confounders, precluding their use for dose-response analysis. Dermal carcinogenicity bioassays using mice report increased tumor incidence with single high doses. In order to quantify potential cancer risks, we develop time-to-tumor model methods [consistent with U.S. Environmental Protection Agency (EPA) dose-response analysis and mixtures guidelines] using the dose-time-response shape of concurrent exposures to benzo[a]pyrene (B[a]P) as concurrent controls (which had several exposure levels) to infer presumed parallel dose-time-response curves for BURA-fume condensate. We compare EPA relative potency factor approaches, based on observed relative potency of BURA to B[a]P in similar experiments, and direct observation of the inferred BURA dose-time-response (scaled to humans) as means for characterizing a dermal unit risk factor. We apply similar approaches to limited data on asphalt-fume inhalation and respiratory cancers in rats. We also develop a method for adjusting potency estimates for asphalts that vary in composition using measured fluorescence. Overall, the various methods indicate that cancer risks to roofers from both dermal and inhalation exposure to BURA are within a range typically deemed acceptable within regulatory frameworks. The approaches developed may be useful in assessing carcinogenic potency of other complex mixtures of polycyclic aromatic compounds. PMID:26515283

  7. Comparative measurement and quantitative risk assessment of alcohol consumption through wastewater-based epidemiology: An international study in 20 cities.

    PubMed

    Ryu, Yeonsuk; Barceló, Damià; Barron, Leon P; Bijlsma, Lubertus; Castiglioni, Sara; de Voogt, Pim; Emke, Erik; Hernández, Félix; Lai, Foon Yin; Lopes, Alvaro; de Alda, Miren López; Mastroianni, Nicola; Munro, Kelly; O'Brien, Jake; Ort, Christoph; Plósz, Benedek G; Reid, Malcolm J; Yargeau, Viviane; Thomas, Kevin V

    2016-09-15

    Quantitative measurement of drug consumption biomarkers in wastewater can provide objective information on community drug use patterns and trends. This study presents the measurement of alcohol consumption in 20 cities across 11 countries through the use of wastewater-based epidemiology (WBE), and reports the application of these data for the risk assessment of alcohol on a population scale using the margin of exposure (MOE) approach. Raw 24-h composite wastewater samples were collected over a one-week period from 20 cities following a common protocol. For each sample, a specific and stable alcohol consumption biomarker, ethyl sulfate (EtS), was determined by liquid chromatography coupled to tandem mass spectrometry. The EtS concentrations were used for estimation of per capita alcohol consumption in each city, which was further compared with international reports and applied for risk assessment by MOE. The average per capita consumption in the 20 cities ranged between 6.4 and 44.3 L/day/1000 inhabitants. An increase in alcohol consumption during the weekend occurred in all cities; however, the magnitude of this increase differed among cities. In contrast to conventional data (sales statistics and interviews), WBE revealed geographical differences in the level and pattern of actual alcohol consumption at an inter-city level. All the sampled cities were in the "high risk" category (MOE<10) and the average MOE for the whole population studied was 2.5. These results allowed direct comparisons of alcohol consumption levels, patterns and risks among the cities. This study shows that WBE can provide timely and complementary information on alcohol use and alcohol-associated risks in terms of exposure at the community level. PMID:27188267
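
    The WBE back-calculation above follows a general pattern: the EtS mass load (concentration × flow) is scaled by the ethanol/EtS molecular-weight ratio and an excretion fraction, then normalized per 1000 inhabitants; the MOE is then a benchmark dose divided by the population exposure. The excretion fraction, BMDL value, and city parameters below are assumed, literature-style values, not the study's.

```python
def alcohol_l_per_day_per_1000(ets_ng_per_l, flow_l_per_day, population,
                               excretion_fraction=0.00012,
                               mw_ethanol=46.07, mw_ets=126.13,
                               ethanol_density_g_per_ml=0.789):
    """Back-calculate pure-alcohol consumption from the EtS load in wastewater.
    excretion_fraction (share of ingested ethanol mass excreted as EtS) is an
    assumed value for illustration."""
    ets_g_per_day = ets_ng_per_l * flow_l_per_day * 1e-9
    ethanol_g = ets_g_per_day * (mw_ethanol / mw_ets) / excretion_fraction
    ethanol_l = ethanol_g / ethanol_density_g_per_ml / 1000.0
    return ethanol_l / (population / 1000.0)

def margin_of_exposure(bmdl_mg_per_kg_day, exposure_mg_per_kg_day):
    """MOE < 10 is treated as 'high risk' in the study's framework."""
    return bmdl_mg_per_kg_day / exposure_mg_per_kg_day

# Hypothetical city: 10 µg/L EtS, 200 ML/day flow, 1 million inhabitants
percap = alcohol_l_per_day_per_1000(10_000, 2e8, 1_000_000)
moe = margin_of_exposure(531.0, 212.0)  # assumed BMDL and population exposure
print(f"{percap:.1f} L/day/1000 inhabitants; high risk: {moe < 10}")
```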

  8. Evaluating the risk-reduction benefits of wind energy

    SciTech Connect

    Brower, M.C.; Bell, K.; Bernow, S.; Duckworth, M.; Spinney, P.

    1996-12-31

    This paper presents preliminary results of a study to evaluate the risk-reduction benefits of wind power for a case study utility system using decision analysis techniques. The costs and risks of two alternative decisions (whether to build a 400 MW gas-fired combined cycle plant or a 1600 MW wind plant in 2003) were compared through computer simulations as fuel prices, environmental regulatory costs, wind and conventional power plant availability, and load growth were allowed to vary. Three different market scenarios were examined: traditional regulation, a short-term power pool, and fixed-price contracts of varying duration. The study concludes that, from the perspective of ratepayers, wind energy provides a net levelized risk-reduction benefit of $3.4 to $7.8/MWh under traditional regulation, and less in the other scenarios. From the perspective of the utility plant owners, wind provides a significant risk benefit in the unregulated market scenarios but none in a regulated market. The methodology and findings should help inform utility resource planning and industry restructuring efforts. 2 figs., 3 tabs.

  9. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF EDC RISK MANAGEMENT METHODS

    EPA Science Inventory

    In Superfund risk management research, the performance of risk management techniques is typically evaluated by measuring "the concentrations of the chemicals of concern before and after risk management efforts. However, using bioassays and chemical data provides a more robust und...

  10. Quantitative Microbial Risk Assessment of Freshwater Impacted by Animal Fecal Material

    EPA Science Inventory

    We evaluated the potential for human illness from a hypothetical recreational exposure to freshwater impacted by land-applied, agricultural animal fecal material. The hypothetical exposure scenario included the following characteristics: 1) fresh cattle manure, pig slurry, or ch...

  11. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen

  12. Adenovirus-associated health risks for recreational activities in a multi-use coastal watershed based on site-specific quantitative microbial risk assessment.

    PubMed

    Kundu, Arti; McBride, Graham; Wuertz, Stefan

    2013-10-15

    We used site-specific quantitative microbial risk assessment (QMRA) to assess the probability of adenovirus illness for three groups of swimmers: adults with primary contact, children with primary contact, and secondary contact regardless of age. Human enteroviruses and adenoviruses were monitored by qPCR in a multi-use watershed and Adenovirus type 40/41 was detected in 11% of 73 samples, ranging from 147 to 4117 genomes per liter. Enterovirus was detected only once (32 genomes per liter). Seven of eight virus detections occurred when E. coli concentrations were below the single sample maximum water quality criterion for contact recreation, and five of eight virus detections occurred when fecal coliforms were below the corresponding criterion. We employed dose-harmonization to convert viral genome measurements to TCID50 values needed for dose-response curves. The three scenarios considered different amounts of water ingestion and Monte Carlo simulation was used to account for the variability associated with the doses. The mean illness risk in children based on adenovirus measurements obtained over 11 months was estimated to be 3.5%, which is below the 3.6% risk considered tolerable by the current United States EPA recreational criteria for gastrointestinal illnesses (GI). The mean risks of GI illness for adults and secondary contact were 1.9% and 1.0%, respectively. These risks changed appreciably when different distributions were fitted to the data as determined by Monte Carlo simulations. In general, risk was at a maximum for the log-logistic distribution and lowest for the hockey stick distribution in all three selected scenarios. Also, under default assumptions, the risk was lowered considerably when assuming that only a small proportion of Adenovirus 40/41 (3%) was as infectious as Adenovirus type 4, compared to the assumption that all genomes were Adenovirus 4. 
In conclusion, site-specific QMRA on water-borne adenoviruses in this watershed provided a similar
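
    The Monte Carlo risk procedure summarized above can be sketched as follows. All numerical inputs here (the concentration distribution, per-scenario ingestion volumes, dose-response parameter, infectious fraction, and illness probability) are illustrative placeholders, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Assumed adenovirus concentration (genomes/L): a lognormal chosen to span
# roughly the 147-4117 genomes/L range reported above.
conc = rng.lognormal(mean=np.log(800.0), sigma=1.0, size=N)

# Assumed per-event water ingestion volumes (L) for the three scenarios.
volumes = {"child_primary": 0.05, "adult_primary": 0.025, "secondary": 0.005}

# Exponential dose-response model: P(infection) = 1 - exp(-r * dose).
# r, the infectious fraction, and P(illness | infection) are placeholders.
r = 0.4172
infectious_fraction = 0.03
p_ill_given_inf = 0.5

mean_risk = {}
for scenario, vol in volumes.items():
    dose = conc * vol * infectious_fraction          # infectious units ingested
    p_ill = p_ill_given_inf * (1.0 - np.exp(-r * dose))
    mean_risk[scenario] = p_ill.mean()
    print(f"{scenario}: mean illness risk = {mean_risk[scenario]:.2%}")
```

    Larger ingestion volumes translate monotonically into higher mean risk under this model, which is why child primary contact dominates the three scenarios.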

  13. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies

    SciTech Connect

    Silva-Rodríguez, Jesús; Aguiar, Pablo; Sánchez, Manuel; Mosquera, Javier; Luna-Vega, Víctor; Cortés, Julia; Garrido, Miguel; Pombar, Miguel; Ruibal, Álvaro

    2014-05-15

    Purpose: Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG) positron emission tomography (PET) state that studies with visible dose extravasations should be rejected from quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standardized uptake values (SUVs) for this effect in clinical routine. Methods: One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI-based method. In addition, the 50 patients with the highest extravasated doses were also evaluated using a threshold-based method. Results: Simulation studies showed that the proposed methods for estimating extravasated doses allow the impact of extravasations on SUVs to be compensated for with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent occurrence (18%), with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify the fraction of patients whose studies might be corrected for the paravenous injection effect. Conclusions: The authors propose the use of a manual ROI-based method for estimating the effectively administered FDG dose and then correcting SUV quantification in those patients fulfilling the proposed criterion.
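
    A minimal sketch of the dose-rescaling idea: because SUV is inversely proportional to the administered dose, quantification can be corrected by substituting the effectively administered dose (injected minus extravasated). The function name and interface are assumptions for illustration, not from the paper.

```python
def corrected_suv(suv_measured: float, injected_dose_mbq: float,
                  extravasated_dose_mbq: float) -> float:
    """Rescale a measured SUV by the effectively administered dose.

    SUV = tissue concentration / (administered dose / body weight), so
    replacing the nominal injected dose with (injected - extravasated)
    scales the SUV up by injected / effective.
    """
    effective = injected_dose_mbq - extravasated_dose_mbq
    if effective <= 0:
        raise ValueError("extravasated dose must be less than injected dose")
    return suv_measured * injected_dose_mbq / effective

# e.g. 10% of a 370 MBq dose extravasated inflates a measured SUV of 4.5
print(corrected_suv(4.5, 370.0, 37.0))  # → 5.0
```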

  14. Neural net classification of liver ultrasonogram for quantitative evaluation of diffuse liver disease

    NASA Astrophysics Data System (ADS)

    Lee, Dong Hyuk; Kim, JongHyo; Kim, Hee C.; Lee, Yong W.; Min, Byong Goo

    1997-04-01

    There have been a number of studies on the quantitative evaluation of diffuse liver disease using texture analysis techniques. However, previous studies have focused on classification between only normal and abnormal patterns based on textural properties, and thus lack clinically useful information about the progressive status of liver disease. Based on our collaborative research experience with clinical experts, we judged that not only texture information but also several shape properties are necessary to successfully classify the various states of disease in liver ultrasonograms. Nine image parameters were selected experimentally: one texture parameter and eight shape parameters measured as length, area, and curvature. We developed a neural-net algorithm that classifies liver ultrasonograms into 9 categories of liver disease: 3 main categories with 3 sub-steps each. The nine parameters were collected semi-automatically from the user with a graphical user interface tool and then processed to give a grade for each parameter. The classifying algorithm consists of two steps. In the first step, each parameter is graded into pre-defined levels using a neural network. In the second step, a neural network classifier determines the disease status from the nine graded parameters. We implemented a PC-based computer-assisted diagnosis workstation and installed it in the radiology department of Seoul National University Hospital. Using this workstation we collected 662 cases over 6 months; some were used for training and the others for evaluating the accuracy of the developed algorithm. In conclusion, a liver ultrasonogram classification algorithm was developed using both texture and shape parameters and a neural network classifier. Preliminary results indicate that the proposed algorithm is useful for the evaluation of diffuse liver disease.
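
    The two-step pipeline (per-parameter grading followed by category classification) can be illustrated with untrained placeholder networks; this sketch shows only the data flow, not the trained models from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def grade_parameter(value: float, n_levels: int = 4) -> int:
    """Step 1: grade one image parameter into a discrete level.

    A tiny single-layer scorer with random (untrained) weights stands in
    for the paper's per-parameter neural network.
    """
    w = rng.normal(size=n_levels)
    b = rng.normal(size=n_levels)
    return int(np.argmax(w * value + b))

def classify(grades: np.ndarray, n_classes: int = 9) -> int:
    """Step 2: map the nine graded parameters to one of 9 disease
    categories (3 main categories x 3 sub-steps), again with random
    placeholder weights."""
    W = rng.normal(size=(n_classes, grades.size))
    return int(np.argmax(W @ grades))

params = rng.uniform(0.0, 1.0, size=9)   # 1 texture + 8 shape parameters
grades = np.array([grade_parameter(p) for p in params], dtype=float)
category = classify(grades)
print(f"grades: {grades.astype(int)}, predicted category: {category}")
```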

  15. Quantitative evaluation of changes in gait after extended cerebrospinal fluid drainage for normal pressure hydrocephalus.

    PubMed

    Yang, Felix; Hickman, Thu-Trang; Tinl, Megan; Iracheta, Christine; Chen, Grace; Flynn, Patricia; Shuman, Matthew E; Johnson, Tatyana A; Rice, Rebecca R; Rice, Isaac M; Wiemann, Robert; Johnson, Mark D

    2016-06-01

    Idiopathic normal pressure hydrocephalus (iNPH) is characterized by gait instability, urinary incontinence and cognitive dysfunction. These symptoms can be relieved by cerebrospinal fluid (CSF) drainage, but the time course and nature of the improvements are poorly characterized. Attempts to prospectively identify iNPH patients responsive to CSF drainage by evaluating presenting gait quality or via extended lumbar cerebrospinal fluid drainage (eLCD) trials are common, but the reliability of such approaches is unclear. Here we combine eLCD trials with computerized quantitative gait measurements to predict shunt responsiveness in patients undergoing evaluation for possible iNPH. In this prospective cohort study, 50 patients presenting with enlarged cerebral ventricles and gait, urinary, and/or cognitive difficulties were evaluated for iNPH using a computerized gait analysis system during a 3-day trial of eLCD. Gait speed, stride length, cadence, and the Timed Up and Go test were quantified before and during eLCD. Qualitative assessments of incontinence and cognition were obtained throughout the eLCD trial. Patients who improved after eLCD underwent ventriculoperitoneal shunt placement, and symptoms were reassessed serially over the next 3 to 15 months. There was no significant difference in presenting gait characteristics between patients who improved after drainage and those who did not. Gait improvement was not observed until 2 or more days of continuous drainage in most cases. Symptoms improved after eLCD in 60% of patients, and all patients who improved after eLCD also improved after shunt placement. The degree of improvement after eLCD correlated closely with that observed after shunt placement. PMID:26775149

  16. Quantitative evaluation of orbital hybridization in carbon nanotubes under radial deformation using π-orbital axis vector

    SciTech Connect

    Ohnishi, Masato; Suzuki, Ken; Miura, Hideo

    2015-04-15

    When a radial strain is applied to a carbon nanotube (CNT), the increase in local curvature induces orbital hybridization. The effect of this curvature-induced orbital hybridization on the electronic properties of CNTs, however, has not previously been evaluated quantitatively. In this study, the strength of orbital hybridization in CNTs under homogeneous radial strain was evaluated quantitatively. Our analyses revealed in detail how the electronic structure of CNTs changes under deformation. In addition, the dihedral angle, i.e., the angle between the π-orbital axis vectors of adjacent atoms, was found to effectively predict the strength of local orbital hybridization in deformed CNTs.
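
    The angle metric described above (the angle between the π-orbital axis vectors of two adjacent atoms) can be sketched as a simple vector computation; the example vectors are assumed for illustration, not taken from the paper's geometries.

```python
import numpy as np

def pi_orbital_angle(v1, v2) -> float:
    """Angle in degrees between the pi-orbital axis vectors of two
    adjacent atoms. On a flat sheet the axes are parallel (0 degrees);
    local curvature tilts them apart."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # clip guards against floating-point values just outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

print(pi_orbital_angle([0, 0, 1], [0, 0, 1]))        # parallel axes → 0.0
print(pi_orbital_angle([0, 0, 1], [0, 0.5, 0.866]))  # tilted, ~30 degrees
```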

  17. Hepatotoxic potential of asarones: in vitro evaluation of hepatotoxicity and quantitative determination in herbal products

    PubMed Central

    Patel, Dhavalkumar N.; Ho, Han K.; Tan, Liesbet L.; Tan, Mui-Mui B.; Zhang, Qian; Low, Min-Yong; Chan, Cheng-Leng; Koh, Hwee-Ling

    2015-01-01

    α- and β-asarones are natural constituents of some aromatic plants, especially species of the genus Acorus (Araceae). In addition to reports of beneficial properties of asarones, genotoxicity and carcinogenicity have also been reported. Due to the potential toxic effects of β-asarone, a limit of exposure from herbal products of ~2 μg/kg body weight/day has been set temporarily until a full benefit/risk assessment is carried out by the European Medicines Agency. It is therefore important to monitor levels of β-asarone in herbal products. In this study, we developed a simple, rapid and validated GC-MS method for the quantitative determination of asarones and applied it to 20 pediatric herbal products after detecting high concentrations of β-asarone in a product suspected to be implicated in hepatotoxicity in a 3-month-old infant. Furthermore, targeted toxicological effects were investigated in human hepatocytes (THLE-2 cells) using various in vitro assays, with the goal of elucidating possible mechanisms for the observed toxicity. Results showed that some of the products contained 4–25 times greater amounts of β-asarone than the recommended levels. In 4 of 10 samples found to contain asarones, the presence of asarones could not be linked to the labeled ingredients, possibly due to poor quality control. Cell-based investigations in THLE-2 cells confirmed the cytotoxicity of β-asarone (IC50 = 40.0 ± 2.0 μg/mL), which was associated with significant lipid peroxidation and glutathione depletion. This observed cytotoxic effect is likely due to induction of oxidative stress by asarones. Overall, the results of this study ascertained the usability of this GC-MS method for the quantitative determination of asarones in herbal products, and shed light on the importance of controlling the concentration of potentially toxic asarones in herbal products to safeguard consumers, especially when the target consumers are young children. Further

  18. Nanoparticle risk management and cost evaluation: a general framework

    NASA Astrophysics Data System (ADS)

    Fleury, Dominique; Bomfim, João A. S.; Metz, Sébastien; Bouillard, Jacques X.; Brignon, Jean-Marc

    2011-07-01

    Industrial production of nano-objects has grown fast during the last decade, and a wide range of products containing nanoparticles (NPs) is offered to the public in various markets (automotive, electronics, textiles...). The difficulty of monitoring the presence of nano-objects in any medium makes it hard to control the risk associated with the production stage. It is therefore very difficult to assess the efficiency of prevention and mitigation solutions, which potentially leads to overestimating the level of the protection barriers that are recommended. The extra costs of adding nano-objects to the process, especially those of nanosafety, must be estimated and optimized to ensure the competitiveness of future production lines and associated products. The risk management and cost evaluation methods presented herein have been designed for application in a pilot production line of injection-moulded nanocomposites.

  19. Dosimetry modeling of inhaled formaldehyde: binning nasal flux predictions for quantitative risk assessment.

    PubMed

    Kimbell, J S; Overton, J H; Subramaniam, R P; Schlosser, P M; Morgan, K T; Conolly, R B; Miller, F J

    2001-11-01

    Interspecies extrapolations of tissue dose and tumor response have been a significant source of uncertainty in formaldehyde cancer risk assessment. The ability to account for species-specific variation of dose within the nasal passages would reduce this uncertainty. Three-dimensional, anatomically realistic, computational fluid dynamics (CFD) models of nasal airflow and formaldehyde gas transport in the F344 rat, rhesus monkey, and human were used to predict local patterns of wall mass flux (pmol/(mm²·h·ppm)). The nasal surface of each species was partitioned by flux into smaller regions (flux bins), each characterized by surface area and an average flux value. Rat and monkey flux bins were predicted for steady-state inspiratory airflow rates corresponding to the estimated minute volume for each species. Human flux bins were predicted for steady-state inspiratory airflow at 7.4, 15, 18, 25.8, 31.8, and 37 l/min and were extrapolated to 46 and 50 l/min. Flux values higher than half the maximum flux value (flux median) were predicted for nearly 20% of human nasal surfaces at 15 l/min, whereas only 5% of rat and less than 1% of monkey nasal surfaces were associated with fluxes higher than flux medians at 0.576 l/min and 4.8 l/min, respectively. Human nasal flux patterns shifted distally and uptake percentage decreased as inspiratory flow rate increased. Flux binning captures anatomical effects on flux and is thereby a basis for describing the effects of anatomy and airflow on local tissue disposition and distributions of tissue response. Formaldehyde risk models that incorporate flux binning derived from anatomically realistic CFD models will have significantly reduced uncertainty compared with risk estimates based on default methods. PMID:11606807
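
    Flux binning as described, partitioning per-element wall fluxes into bins each summarized by total surface area and mean flux, might be sketched as follows. The flux values and element areas here are synthetic, not CFD model output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for CFD output: per-element wall flux
# (pmol/(mm^2 h ppm)) and element surface area (mm^2).
flux = rng.gamma(shape=2.0, scale=500.0, size=10_000)
area = np.full_like(flux, 0.1)

# Partition the flux range into bins; summarize each bin by its total
# surface area and area-weighted mean flux.
n_bins = 20
edges = np.linspace(0.0, flux.max(), n_bins + 1)
idx = np.clip(np.digitize(flux, edges) - 1, 0, n_bins - 1)

bin_area = np.bincount(idx, weights=area, minlength=n_bins)
bin_flux_sum = np.bincount(idx, weights=flux * area, minlength=n_bins)
with np.errstate(invalid="ignore", divide="ignore"):
    bin_mean_flux = np.where(bin_area > 0, bin_flux_sum / bin_area, 0.0)

# Fraction of surface area above half the maximum flux (the "flux
# median" used in the abstract above).
above = area[flux > flux.max() / 2].sum() / area.sum()
print(f"fraction of surface above flux median: {above:.1%}")
```

    Binning preserves total surface area while reducing thousands of surface elements to a handful of (area, mean flux) pairs, which is what makes the cross-species comparison tractable.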

  20. Caramel color in soft drinks and exposure to 4-methylimidazole: a quantitative risk assessment.

    PubMed

    Smith, Tyler J S; Wolfson, Julia A; Jiao, Ding; Crupain, Michael J; Rangan, Urvashi; Sapkota, Amir; Bleich, Sara N; Nachman, Keeve E

    2015-01-01

    Caramel color is added to many widely-consumed beverages as a colorant. Consumers of these beverages can be exposed to 4-methylimidazole (4-MEI), a potential carcinogen formed during its manufacture. California's Proposition 65 law requires that beverages containing 4-MEI concentrations corresponding to exposures that pose excess cancer risks > 1 case per 100,000 exposed persons (29 μg 4-MEI/day) carry warning labels. Using ultrahigh-performance liquid chromatography-tandem mass spectrometry, we assessed 4-MEI concentrations in 12 beverages purchased in California and a geographically distant metropolitan area (New York) in which warning labels are not required. In addition, we characterized beverage consumption by age and race/ethnicity (using weighted means calculated from logistic regressions) and assessed 4-MEI exposure and resulting cancer risks and US population cancer burdens attributable to beverage consumption. Data on beverage consumption were obtained from the National Health and Nutrition Examination Survey, dose-response data for 4-MEI were obtained from the California Environmental Protection Agency Office of Environmental Health Hazard Assessment, and data on population characteristics were obtained from the U.S. Census Bureau. Of the 12 beverages, Malta Goya had the highest 4-MEI concentration (915.8 to 963.3 μg/L), lifetime average daily dose (LADD: 8.04 × 10⁻³ mg/kg BW-day), lifetime excess cancer risk (1.93 × 10⁻⁴) and burden (5,011 cancer cases in the U.S. population over 70 years); Coca-Cola had the lowest value of each (4-MEI: 9.5 to 11.7 μg/L; LADD: 1.01 × 10⁻⁴ mg/kg BW-day; risk: 1.92 × 10⁻⁶; burden: 76 cases). 4-MEI concentrations varied considerably by soda and state/area of purchase, but were generally consistent across lots of the same beverage purchased in the same state/area. Routine consumption of certain beverages can result in 4-MEI exposures > 29 μg/day. State regulatory standards appear to have been effective in reducing exposure to
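
    The exposure-to-risk chain (concentration → lifetime average daily dose → excess cancer risk via linear low-dose extrapolation) can be sketched as below. The intake volume, body weight, and cancer slope factor are illustrative placeholders, not the study's inputs or CalEPA's published value.

```python
def lifetime_average_daily_dose(conc_ug_per_l: float, intake_l_per_day: float,
                                body_weight_kg: float = 70.0) -> float:
    """LADD in mg/(kg body weight * day) for lifelong daily consumption.

    Simplified sketch: exposure duration is assumed equal to the
    averaging time, so those terms cancel out of the standard formula.
    """
    return conc_ug_per_l * intake_l_per_day / 1000.0 / body_weight_kg

def excess_cancer_risk(ladd: float, slope_factor: float) -> float:
    """Linear low-dose extrapolation: risk = LADD * cancer slope factor."""
    return ladd * slope_factor

# Placeholder slope factor (per mg/kg-day) and intake; assumptions only.
CSF = 0.024
ladd = lifetime_average_daily_dose(conc_ug_per_l=940.0, intake_l_per_day=0.6)
print(f"LADD = {ladd:.2e} mg/kg-day, risk = {excess_cancer_risk(ladd, CSF):.2e}")
```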

  2. Food-chain contamination evaluations in ecological risk assessments

    SciTech Connect

    Linder, G.

    1994-12-31

    Food-chain models have become increasingly important within the ecological risk assessment process. This is particularly the case when acute effects are not readily apparent, or when the contaminants of concern are not readily detoxified, have a high likelihood of partitioning into lipids, or have specific target organs or tissues that may increase their significance in evaluating potential adverse effects. An overview of food-chain models -- conceptual, theoretical, and empirical -- will be considered through a series of papers that focus on their application within the ecological risk assessment process. Whether a food-chain evaluation is being developed to address relatively simple questions related to chronic effects of toxicants on target populations, or a more complex food-web model is being developed to address questions related to multiple-trophic-level transfers of toxicants, the elements of the food-chain contamination evaluation can be generalized to address the mechanisms of toxicant accumulation in individual organisms. These organismal-level processes can then be incorporated into more elaborate models that place them in the context of a species' life history or community-level responses that may be associated with long-term exposures.
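
    A minimal sketch of multiple-trophic-level toxicant transfer: the tissue concentration at each level is the level below it scaled by a biomagnification factor. The water concentration, bioconcentration factor, and biomagnification factors below are illustrative assumptions, not values from any particular assessment.

```python
# Assumed inputs (illustrative only).
water_conc_mg_per_l = 0.002
bcf_primary = 500.0        # water -> primary consumer (bioconcentration factor)
bmfs = [2.0, 3.5]          # successive trophic transfers (biomagnification)

# Tissue concentration at the first consumer, then propagate up the chain.
tissue = water_conc_mg_per_l * bcf_primary
levels = [tissue]
for bmf in bmfs:
    tissue *= bmf
    levels.append(tissue)

for i, c in enumerate(levels, start=1):
    print(f"trophic level {i}: {c:.2f} mg/kg")
```

    Even with modest per-step biomagnification factors, concentrations compound multiplicatively up the chain, which is why long-term, multiple-trophic-level exposures dominate such evaluations.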