Sample records for quantitative risk analysis

  1. Benefit-risk analysis: a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
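
    The RV-NNT decision rule described above can be illustrated with a short calculation. The numbers and the exact form of the relative-value adjustment (a single patient-elicited utility weight `rv` applied to the number-needed-to-harm) are assumptions for illustration, not Holden's published data:

```python
# Hypothetical illustration of the RV-NNT decision rule: treat if the
# number-needed-to-treat (NNT) is smaller than the relative-value
# adjusted number-needed-to-harm (RV-NNH).

def nnt(control_event_rate, treated_event_rate):
    """Number needed to treat = 1 / absolute risk reduction."""
    return 1.0 / (control_event_rate - treated_event_rate)

def rv_nnh(treated_ae_rate, control_ae_rate, rv):
    """Number needed to harm, scaled by a patient-elicited relative
    value rv in (0, 1] for the adverse event (form assumed here)."""
    return rv * (1.0 / (treated_ae_rate - control_ae_rate))

# Hypothetical rheumatoid arthritis drug data:
benefit_nnt = nnt(0.50, 0.30)            # 1/0.20 = 5 patients
adjusted_nnh = rv_nnh(0.12, 0.07, 0.8)   # 0.8 * (1/0.05) = 16 patients

# Favourable benefit-risk profile if NNT < RV-NNH:
print(benefit_nnt < adjusted_nnh)  # True: 5 < 16
```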

  2. Quantitative Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helms, J.

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  3. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
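
    A minimal sketch of the idea, assuming the risk reduction is monetized as an expected annual avoided fire loss discounted over the system's service life; all figures are hypothetical and Johansson's Bayesian treatment of uncertainty is not reproduced:

```python
# Sketch of a risk-adjusted net present value for a fire safety
# investment: the annual benefit is the expected fire loss avoided by
# the system, discounted over its service life.

def risk_adjusted_npv(investment, p_fire, loss_without, loss_with,
                      rate, years):
    # Expected annual avoided loss = fire frequency * loss reduction.
    expected_annual_benefit = p_fire * (loss_without - loss_with)
    pv = sum(expected_annual_benefit / (1 + rate) ** t
             for t in range(1, years + 1))
    return pv - investment

# Sprinkler system: 100 kEUR up front, fire frequency 0.02/year,
# losses 5000 kEUR unprotected vs 500 kEUR protected, 5% rate, 20 years.
npv = risk_adjusted_npv(100.0, 0.02, 5000.0, 500.0, 0.05, 20)
print(round(npv, 1))
```

A positive value means the intrinsic monetary value of the risk reduction exceeds the cost of the system.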

  4. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
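
    The core step of crossing fragility curves with PSHA output can be sketched as follows, with an invented lognormal fragility curve and an assumed discretized hazard (annual rate per peak-ground-acceleration bin):

```python
# Toy version of crossing a seismic fragility curve with PSHA output:
# the annual probability of tank failure is the fragility (probability
# of failure given a peak ground acceleration) weighted by the annual
# rate of each PGA bin from the hazard analysis.
import math

def fragility(pga, median=0.6, beta=0.5):
    """Lognormal fragility curve: P(failure | PGA = pga)."""
    return 0.5 * (1 + math.erf(math.log(pga / median) /
                               (beta * math.sqrt(2))))

# Discretized hazard curve: annual rate of PGA in each bin (assumed).
hazard_bins = [(0.1, 1e-2), (0.3, 3e-3), (0.6, 8e-4), (1.0, 2e-4)]

annual_p_failure = sum(rate * fragility(pga) for pga, rate in hazard_bins)
print(f"annual P(failure) = {annual_p_failure:.2e}")
```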

  5. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton(1) and Carey N. Pope(2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  6. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  7. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  8. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
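
    For readers unfamiliar with the odds ratios quoted above, a hypothetical 2x2 version of the association can be computed directly; the paper itself used logistic regression on continuous predictors, so the counts below are purely illustrative:

```python
# Odds ratio with a 95% Wald confidence interval from a 2x2 table of
# coronary segments: high plaque burden (exposed) vs. low, against
# presence of qualitative high-risk plaque (outcome). Counts invented.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed with/without outcome; c,d = unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(120, 80, 60, 140)
print(f"OR={or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```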

  9. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
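
    The before/after comparison at the heart of the methodology can be reduced to expected annual damage (EAD), obtained by integrating damage against annual exceedance probability; the return periods and damages below are invented, not SUFRI case-study data:

```python
# Expected annual damage (EAD) by trapezoidal integration of damage
# against annual exceedance probability, before and after a
# non-structural measure (e.g. an early-warning system).

def ead(return_periods, damages):
    """Trapezoidal EAD from paired (return period, damage) points."""
    probs = [1.0 / t for t in return_periods]   # exceedance probability
    total = 0.0
    for i in range(len(probs) - 1):
        total += 0.5 * (damages[i] + damages[i + 1]) * (probs[i] - probs[i + 1])
    return total

periods = [10, 50, 100, 500]
before = ead(periods, [1.0, 8.0, 15.0, 40.0])   # MEUR per event
after = ead(periods, [0.2, 4.0, 9.0, 30.0])     # with warning system
print(f"annual risk reduction: {before - after:.3f} MEUR/year")
```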

  10. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the plant, to stop operating, and can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment using the standard API 581 method thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing inspection activities.
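
    The ratings quoted above ("4C", "3C") index a probability category (1-5) and a consequence category (A-E) into the API 581 risk matrix. The level assignments in this sketch approximate a typical 5x5 matrix and are not copied from the standard:

```python
# Approximate API 581-style 5x5 risk matrix (level assignments assumed).
LEVELS = [
    # consequence: A        B              C              D              E
    ["low",    "low",         "low",         "medium",      "medium"],       # prob 1
    ["low",    "low",         "medium",      "medium",      "medium-high"],  # prob 2
    ["low",    "medium",      "medium",      "medium-high", "high"],         # prob 3
    ["medium", "medium",      "medium-high", "high",        "high"],         # prob 4
    ["medium", "medium-high", "high",        "high",        "high"],         # prob 5
]

def risk_level(rating):
    """Map an API 581-style rating like '4C' to a qualitative level."""
    prob = int(rating[0]) - 1
    cons = "ABCDE".index(rating[1])
    return LEVELS[prob][cons]

for item, rating in [("HP superheater", "4C"),
                     ("HP evaporator", "4C"),
                     ("HP economizer", "3C")]:
    print(item, rating, risk_level(rating))
```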

  11. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the plant, to stop operating, and can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment using the standard API 581 method thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing inspection activities.

  12. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
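
    The Monte Carlo step can be sketched as below: sample uncertain process parameters, push them through a response model, and estimate the probability that a critical quality attribute falls out of specification. The linear response surface and parameter distributions are placeholders, not the paper's ciprofloxacin model:

```python
# Monte Carlo estimate of the probability of failure for a candidate
# operating point: sample uncertain parameters, evaluate an assumed
# response model, count out-of-spec outcomes.
import random

random.seed(42)

def dissolution(compression_force, lubricant_pct):
    # Assumed linear response surface: % dissolved at 30 min.
    return 95.0 - 2.0 * compression_force - 4.0 * lubricant_pct

N = 100_000
failures = 0
for _ in range(N):
    force = random.gauss(5.0, 0.5)       # compression force, kN
    lube = random.gauss(1.0, 0.2)        # lubricant level, % w/w
    if dissolution(force, lube) < 80.0:  # spec: at least 80% dissolved
        failures += 1

print(f"P(failure) ~ {failures / N:.3f}")
```

Sweeping the nominal operating point over a grid of parameter values and keeping only points whose estimated failure probability is acceptably low traces out a quantitative design space boundary.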

  13. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  14. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice surveys) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  15. Quantitative influence of risk factors on blood glucose level.

    PubMed

    Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu

    2014-01-01

    The aim of this study is to quantitatively analyze the influence of risk factors on blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and confirming intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age. According to the minimum error principle, nine BP models are trained respectively. The quantitative values of the influence of different risk factors on blood glucose change are obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449). The next factors are cholesterol, age and triglyceride. The total contribution of these four factors reaches 77% of that of the nine screened risk factors. The sensitivity sequences can provide a judgment method for individual intervention. This method can be applied to quantitative analysis of risk factors for other diseases and can potentially be used by clinical practitioners to identify high-risk populations for type 2 diabetes as well as other diseases.

  16. Development of quantitative risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesmeyer, J. M.; Okrent, D.

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  17. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
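
    Kaplan's point can be shown in one short calculation: Bayes' theorem combines a prior over a component's failure rate with observed evidence (here a Poisson likelihood for 2 failures in 10 years) to give a defensible posterior. The rates, priors, and evidence below are illustrative:

```python
# Discrete Bayesian update of a failure-rate estimate from evidence.
import math

rates = [0.01, 0.1, 1.0]          # candidate failure rates (per year)
prior = [0.5, 0.4, 0.1]           # prior belief in each candidate

# Evidence: k = 2 failures observed over t = 10 years.
t, k = 10.0, 2
likelihood = [math.exp(-r * t) * (r * t) ** k / math.factorial(k)
              for r in rates]

# Bayes' theorem: posterior is proportional to prior * likelihood.
unnorm = [p * l for p, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]
print([round(p, 3) for p in posterior])
```

The evidence concentrates the posterior on the 0.1/year hypothesis, the rate most consistent with 2 failures in 10 years, regardless of which analyst supplied the (reasonable) prior.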

  18. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  19. Quantitative Risks

    DTIC Science & Technology

    2015-02-24

    Quantitative Risks. Technical Report SERC-2015-TR-040-4, February 24, 2015. Principal Investigator: Dr. Gary Witus, Wayne State... The Systems Engineering Research Center (SERC) is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.

  20. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and costs savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the necessary information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information comes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk ...
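
    The ranking step of such a QFMEA reduces to scoring each failure mode on probability, severity, and difficulty of detection, then sorting by the product (a classic risk priority number). The failure modes and scores below are invented for illustration:

```python
# Risk priority number (RPN) ranking of hypothetical CO2 sequestration
# failure modes: RPN = probability * severity * detection difficulty.

failure_modes = [
    # (name, probability 1-10, severity 1-10, detection difficulty 1-10)
    ("wellbore casing leak", 3, 9, 7),
    ("caprock fracture",     2, 10, 9),
    ("pipeline rupture",     4, 7, 3),
    ("compressor trip",      6, 3, 2),
]

ranked = sorted(failure_modes,
                key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, p, s, d in ranked:
    print(f"{name:22s} RPN = {p * s * d}")
```

In a full QFMEA the ranking would also fold in the cost fields (damage recovery, mitigation, savings) listed in the abstract.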

  21. Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi-Directional Road Tunnel.

    PubMed

    Caliendo, Ciro; De Guglielmo, Maria Luisa

    2017-01-01

    A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP, the percentage of HGVs, and failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary before stating that for the transport of dangerous products an itinerary running completely in the open air might be used if the latter passes through a populated area. The QRA may help decision makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs. © 2016 Society for Risk Analysis.
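
    The F/N curves used to express social risk plot, for each number of fatalities N, the cumulative annual frequency F of all accident scenarios causing N or more fatalities; the scenario set below is invented, not the tunnel study's data:

```python
# Societal risk F/N curve: for each fatality count N, sum the annual
# frequencies of all scenarios with N or more expected fatalities.

scenarios = [
    # (annual frequency, expected fatalities) - hypothetical
    (1e-3, 1),    # small fire
    (1e-4, 5),    # large fire, no explosion
    (1e-5, 20),   # explosion
    (1e-6, 100),  # toxic release
]

def fn_curve(scenarios):
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

for n, f in fn_curve(scenarios):
    print(f"N >= {n:3d}: F = {f:.2e} /year")
```

Comparing the tunnel's curve with the open-air route's curve scenario by scenario is what produces the "above and below" crossover described in the abstract.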

  22. IWGT report on quantitative approaches to genotoxicity risk ...

    EPA Pesticide Factsheets

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast

  23. Risk Factors for Chronic Subdural Hematoma Recurrence Identified Using Quantitative Computed Tomography Analysis of Hematoma Volume and Density.

    PubMed

    Stavrinou, Pantelis; Katsigiannis, Sotirios; Lee, Jong Hun; Hamisch, Christina; Krischek, Boris; Mpotsaris, Anastasios; Timmer, Marco; Goldbrunner, Roland

    2017-03-01

    Chronic subdural hematoma (CSDH), a common condition in elderly patients, presents a therapeutic challenge with recurrence rates of 33%. We aimed to identify specific prognostic factors for recurrence using quantitative analysis of hematoma volume and density. We retrospectively reviewed radiographic and clinical data of 227 CSDHs in 195 consecutive patients who underwent evacuation of the hematoma through a single burr hole, 2 burr holes, or a mini-craniotomy. To examine the relationship between hematoma recurrence and various clinical, radiologic, and surgical factors, we used quantitative image-based analysis to measure the hematoma and trapped air volumes and the hematoma densities. Recurrence of CSDH occurred in 35 patients (17.9%). Multivariate logistic regression analysis revealed that the percentage of hematoma drained and postoperative CSDH density were independent risk factors for recurrence. All 3 evacuation methods were equally effective in draining the hematoma (71.7% vs. 73.7% vs. 71.9%) without observable differences in postoperative air volume captured in the subdural space. Quantitative image analysis provided evidence that percentage of hematoma drained and postoperative CSDH density are independent prognostic factors for subdural hematoma recurrence. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    , four of the slides caused the formation of tsunami waves which washed up to 74 m above the lake level. Two of the slides resulted in many fatalities in the inner part of the Loen Valley, as well as great damage. There are three predominant joint structures in Ramnefjell Mountain which control the failure mode and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and third joint sets are almost perpendicular and parallel to the mountainside and scarp, respectively. These three joint sets form slices of rock columns with widths ranging between 7-10 m and heights of 400-450 m. It is stated that the joints in set II have opened by 1-2 m, which may allow water to collect during heavy rainfall or snowmelt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and reduces the normal stress on the sliding plane through the formation of an uplift force. Hence, rock slides on Ramnefjell Mountain occur in a plane failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences if a rock slide occurs. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM). In order to use the calculated probability of failure (Pf) in risk analyses, this Pf must be associated with frequency-based probabilities (i.e. Pf per year), since a computed failure probability is a measure of hazard, not of risk, unless it is associated with the consequences of the failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of previous rock slides in the previous century in
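
The failure probability Pf that feeds the FORM-based analysis above can also be approximated with a simple Monte Carlo limit-equilibrium sketch for plane failure with a water-driven uplift force. All parameter values below are hypothetical illustrations, not the Ramnefjell data:

```python
import math
import random

random.seed(42)

def factor_of_safety(cohesion_kpa, area_m2, weight_kn, beta_deg, phi_deg, uplift_kn):
    """Limit-equilibrium factor of safety for planar sliding, with an
    uplift force from water in the joints reducing the normal stress."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = cohesion_kpa * area_m2 + (weight_kn * math.cos(beta) - uplift_kn) * math.tan(phi)
    driving = weight_kn * math.sin(beta)
    return resisting / driving

trials = 20_000
failures = 0
for _ in range(trials):
    uplift = random.uniform(0.0, 6000.0)  # water-filled joints (hypothetical range, kN)
    phi = random.gauss(32.0, 2.0)         # friction angle (hypothetical, degrees)
    if factor_of_safety(50.0, 80.0, 9000.0, 38.0, phi, uplift) < 1.0:
        failures += 1

pf = failures / trials  # per-demand probability of failure
```

To use `pf` in a risk analysis it would still need to be converted to an annual frequency, as the abstract notes.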

  5. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication), to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  6. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    NASA Astrophysics Data System (ADS)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific topographic configuration and pedoclimatic conditions. Considering the trend of the last century in land management, a decrease in the surface of agricultural land to the advantage of built-up areas and grasslands, as well as an accelerated decrease in forest cover due to uncontrolled and irrational forest exploitation, has become obvious. Field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphological processes (landslides and soil erosion) which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphological processes, determining a state of vulnerability (the USLE model for soil erosion, and a quantitative model based on the morphometric characteristics of the territory, derived from H.G. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of risk exposure used a quantitative approach based on models and equations of spatial analysis, developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of risk took into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories with a high risk of geomorphological processes with a high degree of occurrence, and represent a useful tool in the process of spatial planning.
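
The USLE component of the vulnerability model above multiplies five empirical factors: A = R x K x LS x C x P. A minimal sketch with hypothetical factor values:

```python
def usle_soil_loss(r, k, ls, c, p):
    """Universal Soil Loss Equation: mean annual soil loss
    A = R * K * LS * C * P (t/ha/yr, with metric factor units):
    rainfall erosivity R, soil erodibility K, slope length-steepness LS,
    cover-management C, and support-practice P."""
    return r * k * ls * c * p

# Hypothetical factor values for an agricultural slope:
a = usle_soil_loss(r=80.0, k=0.3, ls=2.5, c=0.25, p=1.0)  # t/ha/yr
```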

  8. Safety analysis, risk assessment, and risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamali, K.; Stack, D.W.; Sullivan, L.H.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessment (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE-STD-3009-94. It must be noted that the primary area of application for DOE-STD-3009 is existing DOE facilities, and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussion. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe'. Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  9. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and their numbers may be affected by numerous factors within the food chain, all of which represent significant stages in food production, handling, and consumption, in a farm-to-table approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data
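
The dose-response and risk-characterization phases of QRA are often illustrated with the exponential dose-response model. A minimal sketch, with hypothetical exposure and dose-response parameter values:

```python
import math

def p_infection(dose_organisms, r):
    """Exponential dose-response model widely used in microbial QRA:
    P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_organisms)

def annual_risk(per_serving_risk, servings_per_year):
    """Risk characterization: chance of at least one infection per year
    over repeated independent exposures."""
    return 1.0 - (1.0 - per_serving_risk) ** servings_per_year

# Hypothetical exposure of 10 organisms per serving with r = 0.01:
per_serving = p_infection(10, 0.01)
yearly = annual_risk(per_serving, 50)
```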

  10. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while the outcomes of the quantitative method are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the available basic data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
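
Individual risk in a quantitative pipeline assessment like the one above is typically the sum, over accident scenarios, of failure frequency times the conditional probability of fatality at the location of interest. A minimal sketch with hypothetical frequencies and lethalities:

```python
def individual_risk(scenarios):
    """Individual risk at a location: sum over accident scenarios
    (e.g. jet flame, fireball, UVCE) of
    failure frequency (per year) x conditional probability of fatality."""
    return sum(freq * p_death for freq, p_death in scenarios)

# Hypothetical per-scenario frequencies and lethalities:
ir = individual_risk([
    (1e-5, 0.5),  # jet flame
    (2e-6, 0.9),  # fireball
    (5e-7, 1.0),  # UVCE
])  # fatalities per year
```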

  11. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  12. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent safety research on socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work through the application of a recent systemic method, the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potentially critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  13. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  14. Use of mechanistic simulations as a quantitative risk-ranking tool within the quality by design framework.

    PubMed

    Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G

    2014-11-20

    The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on computer simulation data the process failure mode and effects analysis of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.

  15. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  16. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    PubMed

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to identify effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the process parameters in poultry and the Salmonella concentration surveillance data of Jinan in 2012. The MPRM was simulated with @Risk software. The concentration of Salmonella on carcasses after chilling, calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the Salmonella concentration after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making these the primary factors influencing the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control the contamination of Salmonella on carcasses after chilling by reducing the Salmonella concentration after defeathering and in the chilling pool.
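
A modular process risk model chains slaughter-line steps, each shifting the log10 concentration of the pathogen. A minimal sketch of that structure; the step effects and starting concentration below are hypothetical, not the surveillance values used in the study:

```python
def apply_module(conc_log10_mpn_g, delta_log10):
    """One MPRM module: shift the log10 Salmonella concentration by the
    step's net change (cross-contamination adds, washing/chilling remove)."""
    return conc_log10_mpn_g + delta_log10

# Hypothetical net changes per slaughter step (log10 MPN/g):
steps = {"evisceration": +0.3, "washing": -0.5, "chilling": -0.8}

conc = 1.3  # log10 MPN/g after defeathering (hypothetical)
for delta in steps.values():
    conc = apply_module(conc, delta)

final_mpn_g = 10 ** conc  # back-transform to MPN/g
```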

  17. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.
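
Step (v) above reduces each scenario to quantitative probability-impact pairs that can be totalled and ranked. A minimal sketch; the scenario names mirror the abstract but all scores are hypothetical:

```python
def expected_impact(prob, impact):
    """Quantitative score for one risk element: probability x impact."""
    return prob * impact

# Hypothetical reduced risk matrix for the three delivery scenarios,
# each holding (probability, impact) pairs for its remaining risk elements:
scenarios = {
    "public build-operate":           [(0.4, 8), (0.2, 5)],
    "private build-operate-transfer": [(0.3, 9), (0.5, 6)],
    "private operation only":         [(0.2, 7), (0.3, 4)],
}
totals = {name: sum(expected_impact(p, i) for p, i in risks)
          for name, risks in scenarios.items()}
best = min(totals, key=totals.get)  # scenario with the lowest total risk
```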

  18. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  19. Quantitative risk-benefit analysis of fish consumption for women of child-bearing age in Hong Kong.

    PubMed

    Chen, M Y Y; Wong, W W K; Chung, S W C; Tran, C H; Chan, B T P; Ho, Y Y; Xiao, Y

    2014-01-01

    Maternal fish consumption is associated with both risks from methylmercury (MeHg) and beneficial effects from omega-3 fatty acids on the developing foetal brain. This paper assessed the dietary exposure to MeHg of women of child-bearing age (20-49 years) in Hong Kong and conducted a risk-benefit analysis in terms of effects on children's intelligence quotient (IQ), based on local data and the quantitative method derived by the FAO/WHO expert consultation. Results showed that average and high consumers consume 450 and 1500 g of fish (including seafood) per week, respectively. About 11% of women of child-bearing age had a dietary exposure to MeHg exceeding the PTWI of 1.6 µg kg⁻¹ bw. In pregnant women, such MeHg intake may pose health risks to the developing foetus. For average consumers, eating any of the 19 types of the most commonly consumed fish and seafood during pregnancy would result in a gain of 0.79-5.7 IQ points by their children. For high consumers, eating only tuna during pregnancy would cause a 2.3-point IQ reduction in their children. The results indicated that for pregnant women the benefit outweighs the risk associated with eating fish, provided they consume different varieties of fish in moderation.
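
The FAO/WHO-style risk-benefit comparison above nets the omega-3 IQ benefit against the MeHg-associated IQ loss. A minimal sketch of that trade-off; the coefficients and intakes below are hypothetical, not the consultation's actual dose-response values:

```python
def net_iq_change(omega3_benefit_points, mehg_mg_per_week, iq_loss_per_mg):
    """Net change in child IQ from maternal fish consumption:
    omega-3 benefit minus MeHg-associated loss (coefficients hypothetical)."""
    return omega3_benefit_points - mehg_mg_per_week * iq_loss_per_mg

# A varied-fish diet whose benefit outweighs its MeHg burden...
varied_diet = net_iq_change(3.0, 0.02, 50.0)
# ...versus a high-MeHg, tuna-only diet where the loss dominates:
tuna_only = net_iq_change(1.5, 0.08, 50.0)
```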

  20. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
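
The event-tree stage of the framework above multiplies branch probabilities along each path and weights the resulting scenario probabilities by their consequences. A minimal sketch with hypothetical fire-protection-system reliabilities and consequences:

```python
def scenario_probability(branches):
    """Probability of one fire scenario: product of the success/failure
    probabilities of the protection systems along its event-tree path."""
    p = 1.0
    for b in branches:
        p *= b
    return p

def life_safety_risk(scenarios):
    """Fire risk = sum over scenarios of probability x expected fatalities."""
    return sum(scenario_probability(branches) * consequence
               for branches, consequence in scenarios)

# Hypothetical event tree over (sprinkler operates?, alarm operates?),
# each scenario paired with an expected-fatality consequence:
risk = life_safety_risk([
    ([0.95, 0.90], 0.0),  # both systems operate
    ([0.95, 0.10], 0.5),  # alarm fails
    ([0.05, 0.90], 1.0),  # sprinkler fails
    ([0.05, 0.10], 5.0),  # both fail
])
```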

  1. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  2. Dispersal of Invasive Forest Insects via Recreational Firewood: A Quantitative Analysis

    Treesearch

    Frank H. Koch; Denys Yemshanov; Roger D. Magarey; William D. Smith

    2012-01-01

    Recreational travel is a recognized vector for the spread of invasive species in North America. However, there has been little quantitative analysis of the risks posed by such travel and the associated transport of firewood. In this study, we analyzed the risk of forest insect spread with firewood and estimated related dispersal parameters for application in...

  3. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict inflows directly, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failed forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the actually operated and the scenario-optimized, are evaluated for flood risks and hydropower profits. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks arise from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts with less bias for reservoir operational purposes.
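
The lead-time stage of the two-stage method counts the ensemble scenarios that exceed a critical value. A minimal sketch with hypothetical forecast peak levels:

```python
def lead_time_risk(scenario_peaks, critical_level):
    """Stage 1: flood risk within the forecast lead-time, estimated as the
    fraction of ensemble members whose peak water level exceeds the
    critical value."""
    exceed = sum(1 for peak in scenario_peaks if peak > critical_level)
    return exceed / len(scenario_peaks)

# Hypothetical ensemble of forecast peak reservoir levels (m)
# against a hypothetical critical level:
risk = lead_time_risk([172.1, 173.4, 175.2, 176.0, 174.8, 175.9], 175.0)
```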

  4. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    PubMed

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

    This work reports a quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils marketed in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high-performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in the oil samples was 54.37 μg kg⁻¹. Low-molecular-weight PAH compounds were the predominant contaminants. In particular, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg⁻¹, which is lower than the limits set by the European Union and China. A preliminary human health risk assessment for PAHs was accomplished using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10⁻⁶, indicating a high potential carcinogenic risk for the populations exposed via diet. Copyright © 2015 Elsevier Ltd. All rights reserved.
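
The BaP-equivalent and ILCR calculation referred to above can be sketched as follows. The PAH concentrations (other than the reported BaP mean), intake and body weight are hypothetical; the TEFs and slope factor follow commonly cited values but are used here only for illustration:

```python
def bap_teq(concentrations_ug_kg, tefs):
    """BaP toxic equivalents: sum of each PAH concentration x its TEF."""
    return sum(c * tefs[pah] for pah, c in concentrations_ug_kg.items())

def ilcr(teq_ug_kg, intake_g_day, slope_factor, body_weight_kg=60.0):
    """Simplified incremental lifetime cancer risk from daily oil intake:
    chronic daily dose (mg/kg/day) x oral slope factor ((mg/kg/day)^-1)."""
    dose_mg_kg_day = teq_ug_kg * 1e-3 * intake_g_day * 1e-3 / body_weight_kg
    return dose_mg_kg_day * slope_factor

# Commonly cited TEFs (BaP = 1) and hypothetical co-occurring PAHs:
tefs = {"BaP": 1.0, "chrysene": 0.01, "naphthalene": 0.001}
teq = bap_teq({"BaP": 1.28, "chrysene": 5.0, "naphthalene": 20.0}, tefs)
risk = ilcr(teq, intake_g_day=25.0, slope_factor=7.3)
```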

  5. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was
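
The quantitative BPU measure defined above is simply a ratio of mean ROI intensities. A minimal sketch with hypothetical pixel values:

```python
# Quantitative BPU as a unitless ratio: mean pixel intensity (counts/pixel) in a
# fibroglandular ROI divided by the mean intensity in a fat ROI. Values hypothetical.

def quantitative_bpu(fibroglandular_roi, fat_roi):
    mean = lambda px: sum(px) / len(px)
    return mean(fibroglandular_roi) / mean(fat_roi)

fibro = [140, 150, 160, 150]   # counts/pixel inside fibroglandular tissue
fat = [100, 95, 105, 100]      # counts/pixel inside fat

print(round(quantitative_bpu(fibro, fat), 2))  # 1.5
```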

  6. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  7. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  8. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  9. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 17 Commodity and Securities Exchanges 3 2014-04-01 2014-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  10. 17 CFR 229.305 - (Item 305) Quantitative and qualitative disclosures about market risk.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false (Item 305) Quantitative and... Information § 229.305 (Item 305) Quantitative and qualitative disclosures about market risk. (a) Quantitative information about market risk. (1) Registrants shall provide, in their reporting currency, quantitative...

  11. Methods of quantitative risk assessment: The case of the propellant supply system

    NASA Astrophysics Data System (ADS)

    Merz, H. A.; Bienz, A.

    1984-08-01

    As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered replacing the manual supply of propellant hoppers with a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level where an accidental ignition would no longer lead to a detonation, drastically limiting the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.

  12. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336
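
The pooled prevalence estimates quoted above come from a proportion meta-analysis. A minimal sketch of fixed-effect inverse-variance pooling (the review itself may have used a random-effects model; the study counts below are hypothetical):

```python
# Inverse-variance pooling of proportions across studies, with a Wald 95% CI.

import math

def pool_proportions(studies):
    """studies: list of (events, n). Returns pooled proportion and 95% CI
    via a fixed-effect inverse-variance average on the raw proportion scale."""
    weights, estimates = [], []
    for events, n in studies:
        p = events / n
        var = p * (1 - p) / n        # binomial variance of the proportion
        weights.append(1 / var)
        estimates.append(p)
    pooled = sum(w * p for w, p in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

pooled, lo, hi = pool_proportions([(30, 100), (45, 120), (20, 80)])
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```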

  13. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered because of the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks with those perceived by pilots. This method is an ethically appealing alternative to the collision-based analysis for fast, reliable and effective safety assessment, thus possessing great potential for managing collision risks in port waters.

  14. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
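
The distribution-fitting-with-bootstrap approach described above can be sketched in a few lines. The storage-time data, the choice of an exponential model, and its rate parameter are all hypothetical illustrations, not the survey's actual results:

```python
# Fit a simple parametric distribution (exponential, via its MLE rate) to survey
# answers, then bootstrap the fit to describe uncertainty. Data hypothetical.

import random, statistics

storage_days = [1, 2, 2, 3, 3, 3, 4, 5, 7, 10]  # hypothetical survey answers

def exp_rate_mle(sample):
    """MLE of the exponential rate parameter is 1 / sample mean."""
    return 1 / statistics.fmean(sample)

def bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for statistic `stat`."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(sample) for _ in sample])
                  for _ in range(n_boot))
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]

rate = exp_rate_mle(storage_days)
lo, hi = bootstrap_ci(storage_days, exp_rate_mle)
print(round(rate, 3))   # point estimate of the rate
print(lo < rate < hi)   # the bootstrap CI brackets the point estimate
```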

  15. Quantitative analysis of visible surface defect risk in tablets during film coating using terahertz pulsed imaging.

    PubMed

    Niwa, Masahiro; Hiraishi, Yasuhiro

    2014-01-30

    Tablets are the most common solid oral dosage form produced by the pharmaceutical industry. There are several challenges to successful and consistent tablet manufacturing. One well-known quality issue is visible surface defects, which generally occur due to insufficient physical strength, causing breakage or abrasion during processing, packaging, or shipping. Techniques that allow quantitative evaluation of surface strength and the risk of surface defects would greatly aid quality control. Here terahertz pulsed imaging (TPI) was employed to evaluate the surface properties of core tablets with visible surface defects of varying severity after film coating. Other analytical methods, such as tensile strength measurements, friability testing, and scanning electron microscopy (SEM), were used to validate the TPI results. Tensile strength and friability provided no information on visible surface defect risk, whereas the TPI-derived unique parameter terahertz electric field peak strength (TEFPS) provided the spatial distribution of surface density/roughness information on core tablets, which helped in estimating tablet abrasion risk prior to film coating and in predicting the location of the defects. TPI also revealed the relationship between surface strength and blending condition, and is a nondestructive, quantitative approach to aid formulation development and quality control that can reduce visible surface defect risk in tablets. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to identify the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of each method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  17. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  18. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  19. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both causes and effects, were derived in the framework as nodes with a Bayesian network analysis approach. The method thus transforms the risk analysis results from failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical to the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influences and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform, and quality assurance actions were further defined to reduce the risk and improve product quality.
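
The core computation behind quantitative reliability evaluation on a Bayesian network is enumeration of joint probabilities through conditional probability tables. A toy sketch with a hypothetical two-cause network, not the paper's Shengmai model:

```python
# Enumerate joint probabilities over two independent root-cause nodes and
# propagate through a conditional probability table for the failure node.
# All probabilities are hypothetical toys.

from itertools import product

p_cause = {"sterilization_fault": 0.02, "raw_material_fault": 0.05}

def p_failure_given(s, r):
    """P(product failure | sterilization fault s, raw-material fault r)."""
    if s and r: return 0.95
    if s: return 0.60
    if r: return 0.30
    return 0.01

p_fail = 0.0
for s, r in product([True, False], repeat=2):
    p_s = p_cause["sterilization_fault"] if s else 1 - p_cause["sterilization_fault"]
    p_r = p_cause["raw_material_fault"] if r else 1 - p_cause["raw_material_fault"]
    p_fail += p_s * p_r * p_failure_given(s, r)

print(round(p_fail, 4))  # overall failure probability; reliability = 1 - p_fail
```

Node importance can then be probed by perturbing one root-cause probability and recomputing, which is essentially what the network tools automate.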

  20. Quantitative Assessment the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor, which can arrest cell proliferation and serve as a tumor suppressor. Though many studies were published to assess the relationship between p21 rs1059234 polymorphism and various cancer risks, there was no definite conclusion on this association. To derive a more precise quantitative assessment of the relationship, a large scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with the integral cancer risk for both dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] and recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15)]. However, further stratified analysis showed rs1059234 was greatly associated with the risk of squamous cell carcinoma of head and neck (SCCHN). Thus, larger scale primary studies are still required to further evaluate the interaction of p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.
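
The per-study building block of such a meta-analysis is the odds ratio from a 2x2 table. A minimal sketch for the dominant model, where carriers (T/T + C/T) are compared with C/C in cases versus controls; the counts below are hypothetical:

```python
# Odds ratio with a 95% CI from a 2x2 genotype table (Woolf's log-OR method).

import math

def odds_ratio_ci(case_exp, case_unexp, ctrl_exp, ctrl_unexp):
    or_ = (case_exp * ctrl_unexp) / (case_unexp * ctrl_exp)
    se_log = math.sqrt(1/case_exp + 1/case_unexp + 1/ctrl_exp + 1/ctrl_unexp)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical dominant-model counts: carriers vs C/C in cases and controls.
or_, lo, hi = odds_ratio_ci(case_exp=320, case_unexp=180,
                            ctrl_exp=600, ctrl_unexp=350)
print(round(or_, 2), lo < 1 < hi)  # a CI containing 1 means no significant association
```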

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
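
The summary measures are simple functions of the extracted counts. A minimal sketch with hypothetical counts chosen to echo the segment-based pooled values:

```python
# Sensitivity and specificity from true/false positive/negative counts.

def diagnostic_accuracy(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)   # fraction of diseased correctly detected
    specificity = tn / (tn + fp)   # fraction of healthy correctly cleared
    return sensitivity, specificity

# Hypothetical counts from one study arm.
sens, spec = diagnostic_accuracy(tp=88, fp=28, tn=72, fn=12)
print(round(sens, 2), round(spec, 2))  # 0.88 0.72
```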

  2. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the first step of which is designed to find out what the problems are; it comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: This step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central
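
The initial form of the RDI (often written as alpha) is the ratio of cumulative precipitation to cumulative potential evapotranspiration over the period of interest. A minimal sketch with hypothetical monthly values:

```python
# Initial RDI value for a period: sum of precipitation over sum of PET.

def rdi_alpha(precip_mm, pet_mm):
    """Initial RDI: sum(P) / sum(PET) over the same months."""
    return sum(precip_mm) / sum(pet_mm)

monthly_p = [12, 8, 30, 45, 20, 5]        # hypothetical precipitation (mm)
monthly_pet = [40, 55, 80, 95, 120, 150]  # hypothetical PET (mm)

alpha = rdi_alpha(monthly_p, monthly_pet)
print(round(alpha, 3))  # values well below 1 indicate drier conditions
```

In practice the index is then standardized across years so that drought severity classes can be compared between periods and locations.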

  3. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    PubMed

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    The toxicity of heavy metals from industrialization poses critical concern, and analysis of sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in its sub regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller scale grids and their spatial codes are used to identify the contribution of various pollution sources to each sub region (larger grid) and to assess the health risks posed by each source for each sub region. The results of the case study show that, for children (a sensitive population, with school and residential areas as the major regions of activity), the major pollution sources are an abandoned lead-acid battery plant (ALP), traffic emission and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories and human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful for understanding the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
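
A loss-distribution simulation of this kind is conventionally built from a claim-frequency model and a claim-severity model. A hedged sketch, assuming a Poisson frequency and lognormal severity with entirely hypothetical parameters (the paper's actual parametric choices are not reproduced here):

```python
# Simulate annual losses: draw a claim count (Poisson, via Knuth's method) and
# lognormal severities, then read off the expected loss and a high percentile.

import math, random

def poisson(rng, lam):
    """Knuth's algorithm: multiply uniforms until the product drops below exp(-lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_annual_losses(lam, mu, sigma, n_years=10000, seed=7):
    rng = random.Random(seed)
    return sorted(
        sum(rng.lognormvariate(mu, sigma) for _ in range(poisson(rng, lam)))
        for _ in range(n_years)
    )

losses = simulate_annual_losses(lam=25, mu=9.0, sigma=1.2)
expected = sum(losses) / len(losses)          # expected annual loss
p95 = losses[int(0.95 * len(losses))]         # 95th percentile of annual loss
print(expected < p95)  # the high percentile (unexpected loss) exceeds the mean
```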

  5. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk.

    PubMed

    Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan

    2013-01-04

    The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively, whereas methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in goat milk from their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w), in relation to milk characteristics such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index were higher for the studied pesticides in goat milk, suggesting that these are the better determinants for predicting pesticide residues in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that the daily intakes of cyhalothrin, chlorpyrifos and cypermethrin residues via goat milk in the present study are 2.68, 5.19 and 2.71 times higher, respectively, than the provisional tolerable levels. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.

  6. Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk

    PubMed Central

    2013-01-01

    The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid phase microextraction. The analysis showed that about 50% of milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively, whereas methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were suggested to predict the residues of unknown pesticides in goat milk from their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol to water partition coefficient (Ko/w), in relation to milk characteristics such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index were higher for the studied pesticides in goat milk, suggesting that these are the better determinants for predicting pesticide residues in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that the daily intakes of cyhalothrin, chlorpyrifos and cypermethrin residues via goat milk in the present study are 2.68, 5.19 and 2.71 times higher, respectively, than the provisional tolerable levels. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality. PMID:23369514

  7. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
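
The event-tree arithmetic behind such an ETA is multiplication of conditional probabilities along each branch, with branch results summed over outcomes of interest. A minimal sketch; all probabilities here are hypothetical, not the paper's estimates:

```python
# Annual probability of a fatal outcome along one event-tree branch is the
# product of the conditional probabilities at each node.

def branch_probability(*conditional_probs):
    prob = 1.0
    for p in conditional_probs:
        prob *= p
    return prob

# Hypothetical branch: ATPF occurs; flow reaches the zone; the individual is
# present; the exposure proves fatal.
p_atpf, p_reach, p_present, p_fatal = 0.1, 0.3, 0.05, 0.5

risk = branch_probability(p_atpf, p_reach, p_present, p_fatal)
print(round(risk, 5))  # 0.00075
```

Summing such branch probabilities over all fatal branches, and varying the presence probability per person type (resident, worker, tourist), reproduces the structure of an individual-risk estimate.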

  8. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

    Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep, chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and
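
The Bland-Altman comparison used above reduces to the mean of the paired differences (bias) and limits of agreement at bias ± 1.96 standard deviations. A minimal sketch; the paired percentage measurements are hypothetical:

```python
# Bland-Altman bias and 95% limits of agreement for paired measurements.

import statistics

def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

scan_60mas = [12.1, 25.3, 40.2, 33.8, 18.5]  # % nonaerated tissue, 60 mAs
scan_15mas = [12.5, 24.9, 41.0, 33.1, 19.0]  # same lungs, 15 mAs

bias, loa_lo, loa_hi = bland_altman(scan_60mas, scan_15mas)
print(loa_lo < bias < loa_hi)  # the limits of agreement bracket the bias
```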

  9. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  10. A Quantitative Risk-Benefit Analysis of Prophylactic Surgery Prior to Extended-Duration Spaceflight

    NASA Technical Reports Server (NTRS)

    Carroll, Danielle; Reyes, David; Kerstman, Eric; Walton, Marlei; Antonsen, Erik

    2017-01-01

    INTRODUCTION: Among otherwise healthy astronauts undertaking deep space missions, the risks for acute appendicitis (AA) and cholecystitis (AC) are not zero. If these conditions were to occur during spaceflight, they may require surgery for definitive care. The proposed study quantifies and compares the risks of developing de novo AA and AC in-flight to the surgical risks of prophylactic laparoscopic appendectomy (LA) and cholecystectomy (LC) using NASA's Integrated Medical Model (IMM). METHODS: The IMM is a Monte Carlo simulation that forecasts medical events during spaceflight missions and estimates the impact of these medical events on crew health. In this study, four Design Reference Missions (DRMs) were created to assess the probability of an astronaut developing in-flight small-bowel obstruction (SBO) following prophylactic 1) LA, 2) LC, 3) LA and LC, or 4) neither surgery (SR# S-20160407-351). Model inputs were drawn from a large, population-based 2011 Swedish study that examined the incidence and risks of post-operative SBO over a 5-year follow-up period. The study group included 1,152 patients who underwent LA, and 16,371 who underwent LC. RESULTS: Preliminary results indicate that prophylactic LA may yield higher mission risks than the control DRM. Complete analyses are pending and will be subsequently available. DISCUSSION: The risks versus benefits of prophylactic surgery in astronauts to decrease the probability of acute surgical events during spaceflight have only been qualitatively examined in prior studies. Within the assumptions and limitations of the IMM, this work provides the first quantitative guidance on this important question for future deep space exploration missions.

  11. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  12. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  13. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing prediction of fall risk. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and, especially, high sensitivity (~83%).

  14. Quantitative analysis of factors that affect oil pipeline network accident based on Bayesian networks: A case study in China

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan

    2018-06-01

    Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although some qualitative models of risk factors' effects exist, quantitative models still need to be developed. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. The incident evolution diagram is built to identify the risk factors, and the BN model is built based on the deployment rule for factor nodes in the BN and on expert knowledge combined via Dempster-Shafer evidence theory. The probabilities of incident consequences and of risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for taking optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
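The core inference step in such a model (once the network structure and conditional probability tables are fixed) is enumeration over the factor states. A minimal sketch with a hypothetical two-factor network and made-up, illustrative probabilities:

```python
from itertools import product

# Hypothetical structure: two independent binary risk factors F1, F2
# (e.g. delayed shutdown, nearby ignition source) influence a binary
# consequence C (severe outcome). All numbers are illustrative, not
# drawn from the cited study.
p_f1 = {True: 0.3, False: 0.7}          # P(F1)
p_f2 = {True: 0.2, False: 0.8}          # P(F2)
p_c = {                                  # P(C=True | F1, F2)
    (True, True): 0.90, (True, False): 0.60,
    (False, True): 0.50, (False, False): 0.05,
}

# Marginal probability of the severe consequence, by enumeration
p_consequence = sum(
    p_f1[f1] * p_f2[f2] * p_c[(f1, f2)]
    for f1, f2 in product([True, False], repeat=2)
)
```

Real pipeline networks have many more nodes and need a proper BN library, but the enumeration above is the semantics such tools implement.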

  15. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of a lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  16. COMPARATIVE ANALYSIS OF HEALTH RISK ASSESSMENTS FOR MUNICIPAL WASTE COMBUSTORS

    EPA Science Inventory

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. This article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most co...

  17. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ...] Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review AGENCY: Food and Drug... availability of a draft report entitled ``Quantitative Summary of the Benefits and Risks of Prescription Drugs... ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review.'' A literature...

  18. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    PubMed

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of records that could be exposed, and decreases as more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value, over all system users, of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one-million-record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM provides only a limited quantitative assessment of breach risk, it may be useful for objectively evaluating the security implications of alternative database organization approaches.
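The BRM definition above is simple enough to compute directly: for each user, take log10 of the records they can reach divided by the authentication steps that access requires, and report the maximum. A sketch with hypothetical access profiles:

```python
import math

def breach_risk_magnitude(users):
    """Breach risk magnitude (BRM): the maximum, over all system
    users, of log10(accessible records) / authentication steps.

    `users` maps user -> (accessible_records, auth_steps).
    """
    return max(math.log10(records) / steps
               for records, steps in users.values())

# Hypothetical access profiles for a one-million-record database
users = {
    "dba":   (1_000_000, 1),   # full access behind a single login
    "clerk": (10_000, 2),      # partial access, two auth steps
}
brm = breach_risk_magnitude(users)  # dba dominates: log10(1e6)/1 = 6.0
```

The metric is driven by the worst-case user, which is why architectures that fragment and separately encrypt records (shrinking any single user's reachable set) lower the BRM so sharply.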

  19. Quantitative Risk Mapping of Urban Gas Pipeline Networks Using GIS

    NASA Astrophysics Data System (ADS)

    Azari, P.; Karimi, M.

    2017-09-01

    Natural gas is considered an important source of energy worldwide. With the growth of urbanization, urban gas pipelines, which transmit natural gas from transmission pipelines to consumers, will become a dense network. The increasing density of urban pipelines will raise the probability of serious accidents in urban areas. These accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in urban planning and in the management of large cities, in order to decrease the probability of accidents and to control them. It is therefore important to assess risk values and determine their locations on the urban map using an appropriate method. In previous risk analyses of urban natural gas pipeline networks, pipelines have always been considered one by one, and their density in the urban area has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, where the EGIG historical data are used, and fatal length calculation, which involves calculation of gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is an individual risk, shown as a risk map.
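The aggregation step that distinguishes this approach (several pipelines contributing to one grid point, rather than one pipeline at a time) reduces to summing failure rate times fatal length over every segment whose hazard range covers the point. A minimal sketch with invented segment parameters:

```python
def individual_risk(segments):
    """Individual risk at one grid point: the sum, over all pipeline
    segments whose hazard range covers the point, of
    failure rate (failures/km/yr) x fatal length (km).
    """
    return sum(rate * fatal_len for rate, fatal_len in segments)

# Hypothetical: two pipeline segments near the same grid point
segments = [
    (3.0e-4, 0.12),   # (failures/km/yr, fatal length in km)
    (1.5e-4, 0.20),
]
ir = individual_risk(segments)  # 3.6e-5 + 3.0e-5 = 6.6e-5 per year
```

In a GIS workflow this sum is evaluated at every grid cell, and the resulting surface of annual individual risk is rendered as the risk map.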

  20. Quantitative risk stratification in Markov chains with limiting conditional distributions.

    PubMed

    Chan, David C; Pollett, Philip K; Weinstein, Milton C

    2009-01-01

    Many clinical decisions require patient risk stratification. The authors introduce the concept of limiting conditional distributions, which describe the equilibrium proportion of surviving patients occupying each disease state in a Markov chain with death. Such distributions can quantitatively describe risk stratification. The authors first establish conditions for the existence of a positive limiting conditional distribution in a general Markov chain and describe a framework for risk stratification using the limiting conditional distribution. They then apply their framework to a clinical example of a treatment indicated for high-risk patients, first to infer the risk of patients selected for treatment in clinical trials and then to predict the outcomes of expanding treatment to other populations of risk. For the general chain, a positive limiting conditional distribution exists only if patients in the earliest state have the lowest combined risk of progression or death. The authors show that in their general framework, outcomes and population risk are interchangeable. For the clinical example, they estimate that previous clinical trials have selected the upper quintile of patient risk for this treatment, but they also show that expanded treatment would weakly dominate this degree of targeted treatment, and universal treatment may be cost-effective. Limiting conditional distributions exist in most Markov models of progressive diseases and are well suited to represent risk stratification quantitatively. This framework can characterize patient risk in clinical trials and predict outcomes for other populations of risk.
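A limiting conditional (quasi-stationary) distribution of the kind described above can be approximated numerically: restrict the transition matrix to the transient (surviving) states, then repeatedly propagate and renormalise until the survivor proportions stabilise. The two-state disease model below is hypothetical:

```python
def limiting_conditional_distribution(P_transient, iters=2000):
    """Approximate the limiting conditional (quasi-stationary)
    distribution of a Markov chain with an absorbing death state.

    P_transient is the transition matrix restricted to the transient
    states, so rows sum to less than 1 where death is possible.
    Propagating one step and renormalising to the surviving mass,
    repeatedly, converges to the equilibrium proportion of survivors
    occupying each state.
    """
    n = len(P_transient)
    x = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(x[i] * P_transient[i][j] for i in range(n))
             for j in range(n)]
        total = sum(y)               # surviving probability mass
        x = [v / total for v in y]   # condition on survival
    return x

# Hypothetical progressive disease: mild -> severe -> death
P = [[0.90, 0.08],   # mild:   8%/yr progression, 2%/yr mortality
     [0.00, 0.85]]   # severe: 15%/yr mortality, no recovery
qsd = limiting_conditional_distribution(P)
```

Consistent with the existence condition in the abstract, the earliest (mild) state here has the lowest combined risk of progression or death (10% vs 15%), so the iteration converges to a positive distribution over both states.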

  1. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.

  2. Quantitative prediction of oral cancer risk in patients with oral leukoplakia.

    PubMed

    Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng

    2017-07-11

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We have developed an oral cancer risk index using DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia and cancer subjects (training set and validation set). Peaks were defined on the basis of first derivatives with positives, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model with high sensitivity (100%) and specificity (99.2%). Using the Peaks-Random Forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 less than 0.5 developed cancer (32 ± 31 months). OCRI2 is better than other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.

  3. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    PubMed

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the estimated gestational age by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.

  4. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  5. Asbestos exposure--quantitative assessment of risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, J.M.; Weill, H.

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.

  6. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  7. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
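The prioritization step described above can be sketched as a simple ranking. The scenario names, ordinal scales, and the severity-times-likelihood scoring rule below are all illustrative assumptions, not the paper's actual scheme:

```python
# Hypothetical ordinal scales: higher severity/likelihood = worse,
# higher modeling difficulty = harder to analyse quantitatively.
scenarios = [
    {"name": "wake encounter on approach", "severity": 4, "likelihood": 3, "difficulty": 2},
    {"name": "blunder during parallel ops", "severity": 5, "likelihood": 2, "difficulty": 4},
    {"name": "minor spacing violation",     "severity": 2, "likelihood": 4, "difficulty": 1},
]

def priority(s):
    # Favour severe, likely scenarios that are still tractable to model:
    # rank by severity * likelihood, breaking ties toward low difficulty.
    return (s["severity"] * s["likelihood"], -s["difficulty"])

ranked = sorted(scenarios, key=priority, reverse=True)
```

The point of the third metric is visible even in this toy version: a scenario that scores high on risk but is prohibitively hard to model can be deliberately deprioritized for quantitative treatment.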

  8. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  9. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for

  10. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  11. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim: Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods: A comprehensive literature search was performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results: A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions: Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimisation of protocol design in effectiveness trials. PMID:25940398

  12. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search was performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimisation of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  13. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 x 10(-4) per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. Copyright 2009 Elsevier B.V. All rights reserved.
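The Monte Carlo structure of such a QMRA (sample a raw-water concentration, apply treatment removal, convert dose to infection probability, annualise) can be sketched compactly. Every parameter below, including the dose-response coefficient and removal credit, is an illustrative assumption, not a value from the study:

```python
import math
import random

def annual_infection_risk(n_sims=100_000, seed=42):
    """Monte Carlo sketch of a QMRA for Cryptosporidium in tap water.

    Illustrative assumptions: lognormal raw-water oocyst concentration,
    a fixed 3-log treatment removal, 1 L/day consumption, and the
    exponential dose-response model P_inf = 1 - exp(-r * dose).
    """
    rng = random.Random(seed)
    r = 0.004                 # assumed dose-response parameter
    log_removal = 3.0         # assumed treatment removal (log10 units)
    litres_per_day = 1.0
    daily_risks = []
    for _ in range(n_sims):
        conc = rng.lognormvariate(mu=-2.0, sigma=1.0)      # oocysts/L, raw water
        dose = conc * 10 ** (-log_removal) * litres_per_day
        daily_risks.append(1 - math.exp(-r * dose))
    p_daily = sum(daily_risks) / n_sims
    # Annualise assuming independent daily exposures
    return 1 - (1 - p_daily) ** 365

risk = annual_infection_risk()
```

A real assessment layers in treatment-failure events, recovery efficiency of the detection assay, and separate dose-response parameters for immunocompromised groups; the skeleton above is where such refinements attach.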

  14. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  15. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  16. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.

  17. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, quantitatively determining a 3D density from a 2D image such as a mammogram is challenging. Furthermore, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used increasingly in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast (i.e., non-contrast) axial DCE-MRI images using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images acquired for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 years (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and found a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
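    Steps (b) and (c) of such a pipeline, thresholding breast voxels with Otsu's method and taking the dense-voxel fraction, can be sketched with NumPy alone. The synthetic bimodal intensities below are stand-ins for fatty vs. fibroglandular tissue, not real DCE-MRI data.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)             # weight of the below-threshold class
    mu = np.cumsum(p * centers)   # cumulative weighted mean
    mu_t = mu[-1]                 # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(sigma_b)]

def volumetric_density(breast_voxels):
    """Fraction of dense voxels among all breast voxels (step c)."""
    t = otsu_threshold(breast_voxels)
    return np.mean(breast_voxels > t)

# Synthetic intensities: 70% fatty (low signal), 30% dense (high signal).
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(60, 8, 7000),
                         rng.normal(200, 10, 3000)])
print(f"volumetric density: {volumetric_density(voxels):.2f}")
```

    On well-separated modes like these, the recovered density closely tracks the true dense-voxel fraction.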

  18. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural-disaster impacts is increasing as climate and ecological changes spread. The frequency of disasters is rising, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social and environmental factors in a changing world across multiple scales. We face new types of risk, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Within this framework of risk analysis models, risk perception is described as follows. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): R = Σ_{x,y} f(x,y)p(x,y). As shown, risk perception essentially influences the damage function. Based on prospect theory and on decision making under uncertainty, cognitive bias and the handling of risk, a modification of the damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), composed of a risk perception function rp and a function of education and long-term experience c, as α(t) = c - rp. The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for predicting socio-economic and

  19. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  20. Characterizing health risks associated with recreational swimming at Taiwanese beaches by using quantitative microbial risk assessment.

    PubMed

    Jang, Cheng-Shin; Liang, Ching-Ping

    2018-01-01

    Taiwan is surrounded by oceans, and its numerous pleasure beaches attract millions of tourists annually for recreational swimming. However, impaired water quality caused by fecal pollution poses a potential threat to tourists' health. This study probabilistically characterized the health risks posed to recreational swimmers by waterborne enterococci at 13 Taiwanese beaches using quantitative microbial risk assessment. First, data on enterococci concentrations at coastal beaches monitored by the Taiwan Environmental Protection Administration were reproduced using nonparametric Monte Carlo simulation (MCS). The ingestion volumes during recreational swimming, based on uniform and gamma distributions, were subsequently determined using MCS. Finally, after combining the distributions of the two parameters, the beta-Poisson dose-response function was employed to quantitatively estimate health risks to recreational swimmers. Moreover, various levels of risk to recreational swimmers were classified and spatially mapped to explore feasible recreational and environmental management strategies at the beaches. The results revealed that although the health risks associated with recreational swimming did not exceed an acceptable benchmark of 0.019 illnesses per day at any beach, they approached this benchmark at certain beaches. Beaches with relatively high risks are located in northwestern Taiwan owing to the prevailing current movements.
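    The per-event risk calculation this record describes, sampled exposure combined through an approximate beta-Poisson dose-response P(ill) = 1 - (1 + dose/β)^(-α), can be sketched as below. The α/β values and exposure distributions are illustrative assumptions, not the study's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sim = 50_000

# Enterococci concentration in beach water (CFU per 100 mL), assumed lognormal.
conc = rng.lognormal(mean=3.0, sigma=1.0, size=n_sim)

# Ingested water volume per swimming event (mL), gamma-distributed
# (the abstract mentions uniform and gamma options for this input).
volume = rng.gamma(shape=2.0, scale=15.0, size=n_sim)

dose = conc * volume / 100.0  # organisms ingested per event

# Approximate beta-Poisson dose-response (assumed parameters).
alpha, beta = 0.2, 100.0
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

print(f"mean per-event illness risk: {p_ill.mean():.3f}")
```

    Mapping the resulting per-event risks per beach against a benchmark (e.g., 0.019 illnesses per day) is then a post-processing step.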

  1. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  2. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to quantitative disease measures. Quantitative traits are desirable because they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because, in addition to performing linkage analysis, it includes programs to perform data-cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
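    The evidence statistic behind single-marker (two-point) linkage is the LOD score. As a textbook-style illustration (not S.A.G.E. output), for phase-known meioses with R recombinants and NR non-recombinants, LOD(θ) = log10[θ^R (1-θ)^NR / 0.5^(R+NR)], maximized over the recombination fraction θ:

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """Two-point LOD for phase-known meioses at recombination fraction theta."""
    n = recombinants + nonrecombinants
    if theta <= 0.0:
        theta = 1e-9  # avoid log(0) when recombinants are observed
    likelihood_ratio = (theta ** recombinants
                        * (1 - theta) ** nonrecombinants) / (0.5 ** n)
    return math.log10(likelihood_ratio)

# Example: 2 recombinants among 20 informative meioses, scored on a grid.
# The maximizing theta equals the MLE R/n = 0.10.
thetas = [i / 100 for i in range(1, 50)]
scores = [lod_score(2, 18, t) for t in thetas]
best = thetas[scores.index(max(scores))]
print(f"max LOD {max(scores):.2f} at theta = {best:.2f}")
```

    A maximum LOD above 3 is the conventional threshold for declaring linkage in such two-point analyses.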

  3. Quantitative Gait Markers and Incident Fall Risk in Older Adults

    PubMed Central

    Holtzer, Roee; Lipton, Richard B.; Wang, Cuiling

    2009-01-01

    Background: Identifying quantitative gait markers of falls in older adults may improve diagnostic assessments and suggest novel intervention targets. Methods: We studied 597 adults aged 70 and older (mean age 80.5 years, 62% women) enrolled in an aging study who received quantitative gait assessments at baseline. Association of speed and six other gait markers (cadence, stride length, swing, double support, stride length variability, and swing time variability) with incident fall rate was studied using generalized estimation equation procedures adjusted for age, sex, education, falls, chronic illnesses, medications, cognition, disability, as well as traditional clinical tests of gait and balance. Results: Over a mean follow-up period of 20 months, 226 (38%) of the 597 participants fell. Mean fall rate was 0.44 per person-year. Slower gait speed (risk ratio [RR] per 10 cm/s decrease 1.069, 95% confidence interval [CI] 1.001–1.142) was associated with higher risk of falls in the fully adjusted models. Among the six other markers, worse performance on swing (RR 1.406, 95% CI 1.027–1.926), double-support phase (RR 1.165, 95% CI 1.026–1.321), swing time variability (RR 1.007, 95% CI 1.004–1.010), and stride length variability (RR 1.076, 95% CI 1.030–1.111) predicted fall risk. The associations remained significant even after accounting for cognitive impairment and disability. Conclusions: Quantitative gait markers are independent predictors of falls in older adults. Gait speed and other markers, especially variability, should be further studied to improve current fall risk assessments and to develop new interventions. PMID:19349593
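    The reported risk ratios are per unit of change (e.g., per 10 cm/s decrease in gait speed), so, assuming the model's log-linearity holds over the range of interest, larger deficits multiply: a patient 30 cm/s slower carries roughly RR^3. Illustrative arithmetic only:

```python
# RR per 10 cm/s decrease in gait speed, taken from the abstract above.
rr_per_10cms = 1.069

# Multiplicative extrapolation to larger speed deficits (log-linear assumption).
for decrease_cms in (10, 20, 30):
    rr = rr_per_10cms ** (decrease_cms / 10)
    print(f"{decrease_cms} cm/s slower -> fall-rate ratio {rr:.3f}")
```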

  4. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  5. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk posed by landslide hazard changes continuously in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking, or left out altogether, in medium- to regional-scale studies in the scientific literature. In this research we modelled the temporal and spatial changes of debris-flow risk in a narrow alpine valley in the north-eastern Italian Alps. The debris-flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris-flow triggering areas and to simulate debris-flow run-out using the Flow-R regional-scale model. To determine debris-flow intensities, we used a linear relationship found between back-calibrated, physically based FLO-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This allowed us to assign flow depths to a total of 10 separate classes on a regional scale. Debris-flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with the building area and number of floors
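    The quantitative combination this record describes, hazard probability times an intensity-dependent vulnerability times element value, summed over elements at risk, can be sketched as follows. The vulnerability curve and building inventory below are invented for illustration, not the study's data.

```python
def vulnerability(flow_depth_m):
    """Toy vulnerability curve: damage fraction rising linearly with
    debris-flow depth, capped at total loss."""
    return min(1.0, 0.25 * flow_depth_m)

buildings = [
    # (annual hazard probability, modeled flow depth in m, building value in EUR)
    (0.010, 0.5, 250_000),
    (0.020, 1.5, 400_000),
    (0.005, 3.0, 180_000),
]

# Expected annual loss = sum over elements of p_hazard * vulnerability * value.
annual_risk_eur = sum(p * vulnerability(d) * v for p, d, v in buildings)
print(f"expected annual loss: EUR {annual_risk_eur:,.0f}")
```

    Repeating this sum with the intensity maps from different epochs is what makes the risk assessment multi-temporal.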

  6. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  7. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    PubMed

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. We examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and of 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) was in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability

  8. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state of the art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as the Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  9. Critical Race Quantitative Intersections: A "testimonio" Analysis

    ERIC Educational Resources Information Center

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  10. Quantitative risk assessment of human salmonellosis in Canadian broiler chicken breast from retail to consumption.

    PubMed

    Smadi, Hanan; Sargeant, Jan M

    2013-02-01

    The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to estimate the risk of human salmonellosis from consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway, using Canadian input parameter values where available. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year; potential reasons for this overestimation are discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, food-hygiene practices in private kitchens (such as cross-contamination from not washing cutting boards, utensils and hands after handling raw meat), and inadequate cooking contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
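    The growth and inactivation equations such retail-to-table models chain together typically work on a log10 scale: exponential-phase growth during temperature abuse, then log-linear thermal kill, where one D-value of heating time removes one log10 of organisms. A minimal sketch with invented parameter values (not the model's inputs):

```python
def log_growth(log_n0, rate_log10_per_h, hours, n_max_log10=9.0):
    """Exponential-phase growth on a log10 scale, capped at a maximum density."""
    return min(log_n0 + rate_log10_per_h * hours, n_max_log10)

def log_inactivation(log_n0, d_value_min, minutes):
    """Log-linear thermal kill: each D-value of time gives one log10 reduction."""
    return log_n0 - minutes / d_value_min

log_n = 1.2                                # log10 CFU on a breast at retail
log_n = log_growth(log_n, 0.05, 4.0)       # 4 h of mild temperature abuse
log_n = log_inactivation(log_n, 0.5, 3.0)  # 3 min cooking at a D = 0.5 min temperature
print(f"log10 CFU at consumption: {log_n:.1f}")
```

    Feeding the resulting dose into a dose-response function and summing over simulated servings yields the cases-per-100,000 figure the abstract reports.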

  11. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiation is a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power, for example), while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays, for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas are discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total-body dose, owing to an increased number of monitor units and the application of more radiation fields. The linacs used today were not designed for IMRT and are based on leakage standards decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  12. Quantitative molecular analysis in mantle cell lymphoma.

    PubMed

    Brízová, H; Hilská, I; Mrhalová, M; Kodet, R

    2011-07-01

    A molecular analysis has three major roles in modern oncopathology: as an aid in the differential diagnosis, in the molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive both for the differential diagnosis and for molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflects the development of the disease and predicts its clinical course. We employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIalpha, and TPX2, which are described as effective prognostic factors. Using the molecular approach it is possible to measure the proliferation rate in a reproducible, standard way, an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method broadening our diagnostic tools in hematopathology. Compared with interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming, and it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology that provides precise and reproducible quantitative information about the expression level. It may therefore be used to demonstrate a decrease or increase of a tumor-specific marker in bone marrow relative to a previously aspirated specimen, and thus has a powerful potential to monitor the course of the disease in correlation with clinical data.
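    Quantitative PCR monitoring of a marker such as cyclin D1 is commonly reported as relative expression via the standard 2^(-ΔΔCt) method: the target is normalized to a reference gene, then compared between a follow-up and a baseline specimen. The Ct values below are hypothetical:

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_base, ct_ref_base):
    """Livak 2^(-ddCt) relative quantification."""
    d_ct_sample = ct_target_sample - ct_ref_sample  # normalize to reference gene
    d_ct_base = ct_target_base - ct_ref_base
    return 2.0 ** (-(d_ct_sample - d_ct_base))

# Follow-up bone marrow vs. diagnostic baseline (hypothetical Ct values):
fc = fold_change(ct_target_sample=24.0, ct_ref_sample=20.0,
                 ct_target_base=21.0, ct_ref_base=20.0)
print(f"cyclin D1 fold change vs. baseline: {fc:.3f}")
```

    A fold change well below 1 in a follow-up specimen would indicate a falling tumor-marker level, the kind of dynamics the abstract describes for disease monitoring.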

  13. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    PubMed

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial
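    The diagnostic-performance comparison above rests on receiver-operating characteristic (ROC) analysis. The area under the curve can be computed directly from scores as the probability that a diseased case outranks a non-diseased one (the Mann-Whitney formulation); the myocardial blood flow values below are hypothetical, not CE-MARC data.

```python
import numpy as np

def roc_auc(scores, labels):
    """Rank-based AUC: probability a positive case scores above a negative one,
    with half credit for ties (equivalent to the Mann-Whitney U statistic)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical stress myocardial blood flow estimates (lower flow suggests
# ischemia), negated so that a higher score means more diseased.
mbf = [1.1, 2.6, 0.9, 3.0, 1.4, 2.2, 0.8, 2.8]
ischemic = [1, 0, 1, 0, 1, 0, 1, 0]
print(f"AUC: {roc_auc([-m for m in mbf], ischemic):.2f}")
```

    Comparing two such AUCs on the same patients, as the study does for visual vs. quantitative reads, additionally requires a correlated-ROC test (e.g., DeLong's method).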

  14. 76 FR 19311 - Update of the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From Foodborne... quantitative targets established in ``Healthy People 2010.'' In 2005, FoodNet data showed 0.30 L. monocytogenes... 4). In 2003, FDA and FSIS published a quantitative assessment of the relative risk to public health...

  15. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial-design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported progression of solifenacin-mirabegron combination therapy to phase III development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.

  16. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
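
    The uncertainty propagation discussed above can be illustrated with a minimal Monte Carlo sketch. The vulnerability curve and the distributions for hazard intensity and element resistance below are invented for illustration; none of these values come from the paper.

```python
import random

def degree_of_loss(intensity, resistance):
    """Toy vulnerability curve: degree of loss in [0, 1] grows with
    hazard intensity relative to the element's resistance."""
    return min(1.0, max(0.0, intensity / resistance))

def propagate_uncertainty(n=20000, seed=42):
    """Propagate input uncertainty to the loss estimate by sampling;
    returns the mean loss and a 5th-95th percentile band."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        intensity = rng.lognormvariate(0.0, 0.5)   # uncertain debris-flow intensity
        resistance = rng.uniform(1.5, 3.0)         # uncertain structural resistance
        losses.append(degree_of_loss(intensity, resistance))
    losses.sort()
    mean = sum(losses) / n
    return mean, losses[int(0.05 * n)], losses[int(0.95 * n)]

mean, p5, p95 = propagate_uncertainty()
```

    Reporting the percentile band instead of a single number makes the imprecision of the "degree of loss" estimate explicit to stakeholders.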

  17. Inconsistencies in reporting risk information: a pilot analysis of online news coverage of West Nile Virus.

    PubMed

    Birnbrauer, Kristina; Frohlich, Dennis Owen; Treise, Debbie

    2017-09-01

    West Nile Virus (WNV) has been reported as one of the worst epidemics in US history. This study sought to understand how WNV news stories were framed and how risk information was portrayed from its 1999 arrival in the US through the year 2012. The authors conducted a quantitative content analysis of online news articles obtained through Google News ( N = 428). The results of this analysis were compared to the CDC's ArboNET surveillance system. The following story frames were identified in this study: action, conflict, consequence, new evidence, reassurance and uncertainty, with the action frame appearing most frequently. Risk was communicated quantitatively without context in the majority of articles, and only in 2006, the year with the third-highest reported deaths, was risk reported with statistical accuracy. The results from the analysis indicated that at-risk communities were potentially under-informed as accurate risks were not communicated. This study offers evidence about how disease outbreaks are covered in relation to actual disease surveillance data.

  18. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical system properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  19. [Quantitative data analysis for live imaging of bone].

    PubMed

    Seno, Shigeto

Bone is a hard tissue, and it has long been difficult to observe the interior of living bone. With recent progress in microscopy and fluorescent-probe technology, it has become possible to observe the various activities of the many cell types that make up bone tissue. On the other hand, the resulting quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection alone, so methodologies for processing microscopic images and analyzing the data have been needed. In this article, we introduce bioimage informatics, a research field at the boundary of biology and information science, and then outline the basic image-processing techniques for quantitative analysis of live-imaging data of bone.

  20. Quantitative trait nucleotide analysis using Bayesian model selection.

    PubMed

    Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D

    2005-10-01

    Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
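
    The "posterior probability of effect" idea above can be illustrated with a common generic approximation: converting per-model BIC scores into posterior model probabilities under equal priors. This sketch is not the SOLAR/BQTN implementation, and the BIC values are invented.

```python
import math

def posterior_model_probs(bics):
    """Approximate posterior probabilities for competing models from
    their BIC scores, assuming equal prior model probabilities."""
    best = min(bics)
    weights = [math.exp(-0.5 * (b - best)) for b in bics]  # relative evidence
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical BICs for three models, each including a different
# candidate functional variant; lower BIC means stronger support.
probs = posterior_model_probs([1002.1, 1000.0, 1009.4])
```

    Variants whose inclusion lowers BIC the most receive the highest posterior probability and would be prioritized for molecular follow-up.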

  1. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved from using different soap formulations after diaper changing, using a microbial quantitative risk assessment approach. To achieve this, a probability of infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, and this risk can be further reduced by about 20% through the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
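
    The structure of such a probability-of-infection model can be sketched with a standard exponential dose-response form. The infectivity constant, transfer fraction, and log reductions below are illustrative assumptions, not the parameters fitted in the study.

```python
import math

def p_infection(dose, r=1e-3):
    """Exponential dose-response model: probability of infection from
    ingesting `dose` organisms, with per-organism infectivity r
    (r here is an illustrative value, not the study's Shigella value)."""
    return 1.0 - math.exp(-r * dose)

def post_wash_risk(hand_load, log_reduction, transfer_fraction=0.01, r=1e-3):
    """Risk after washing: the soap removes `log_reduction` log10 of
    the organisms on hands; a fraction of the survivors is ingested."""
    survivors = hand_load * 10.0 ** (-log_reduction)
    return p_infection(survivors * transfer_fraction, r)

plain_soap = post_wash_risk(1e6, log_reduction=2.0)   # non-antibacterial
triclosan = post_wash_risk(1e6, log_reduction=3.0)    # hypothetical extra log
```

    The extra log reduction attributed to the antibacterial formulation translates directly into a lower infection probability under this model.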

  2. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
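
    Box counting, the standard estimator behind fractal dimension measurements like those described above, can be sketched as follows. This is a generic implementation operating on point sets, not the authors' confocal image pipeline.

```python
import math

def box_count_dimension(points, sizes):
    """Estimate fractal dimension as the least-squares slope of
    log N(s) versus log(1/s), where N(s) is the number of boxes of
    side s that contain at least one point."""
    xy = []
    for s in sizes:
        occupied = {(int(x // s), int(y // s)) for x, y in points}
        xy.append((math.log(1.0 / s), math.log(len(occupied))))
    n = len(xy)
    mx = sum(x for x, _ in xy) / n
    my = sum(y for _, y in xy) / n
    slope = (sum((x - mx) * (y - my) for x, y in xy)
             / sum((x - mx) ** 2 for x, _ in xy))
    return slope

# Sanity check: a filled 100x100 grid of points in the unit square
# behaves two-dimensionally, so the estimate should be close to 2.
grid = [(i / 100, j / 100) for i in range(100) for j in range(100)]
d = box_count_dimension(grid, sizes=[0.5, 0.25, 0.125, 0.0625])
```

    On a scar image, the points would be the thresholded collagen-positive pixels; fibrotic tissue and normal dermis then separate by their estimated dimensions.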

  3. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  4. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
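
    The notional product at the heart of the framework can be sketched directly. The asset names and parameter values below are invented for illustration and are not from the article.

```python
def capra_risk(threat, vulnerability, consequence):
    """All-hazards risk as the notional product of the annual threat
    likelihood, the probability the hazard defeats the asset's
    protection (vulnerability), and the loss if it does (consequence)."""
    return threat * vulnerability * consequence

# Hypothetical portfolio: {asset: (threat/yr, vulnerability, consequence $M)}
portfolio = {
    "substation": (0.10, 0.40, 50.0),
    "control center": (0.02, 0.20, 400.0),
    "storage depot": (0.30, 0.10, 10.0),
}

# Rank assets by expected loss to prioritize mitigation spending.
ranked = sorted(portfolio, key=lambda a: capra_risk(*portfolio[a]), reverse=True)
```

    Note how the ranking differs from sorting by consequence alone: a high-consequence asset with a low threat likelihood can rank below a modest asset that is attacked often.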

  5. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    PubMed

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from a benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  6. Quantitative Relationship Between Cumulative Risk Alleles Based on Genome-Wide Association Studies and Type 2 Diabetes Mellitus: A Systematic Review and Meta-analysis.

    PubMed

    Kodama, Satoru; Fujihara, Kazuya; Ishiguro, Hajime; Horikawa, Chika; Ohara, Nobumasa; Yachi, Yoko; Tanaka, Shiro; Shimano, Hitoshi; Kato, Kiminori; Hanyu, Osamu; Sone, Hirohito

    2018-01-05

    Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. Logarithm of the estimated OR (log OR) of T2DM for 1 increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13-1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08-1.13). There was large heterogeneity in the magnitude of log OR (P < 0.001 for both cross-sectional studies and longitudinal studies). The top 10 commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest risk of prevalent or incident T2DM, although the heterogeneity in the used genes among studies requires us to interpret the results with caution.
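
    The pooling step, per-study log odds ratios combined under a random-effects model, can be sketched with the DerSimonian-Laird estimator. The study values below are invented and are not the meta-analysis dataset.

```python
import math

def pool_random_effects(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of per-study log odds
    ratios; returns the pooled OR and its 95% confidence interval."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)        # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Three hypothetical studies, each reporting a log OR per risk allele:
or_pooled, (lo, hi) = pool_random_effects([0.14, 0.16, 0.11],
                                          [0.001, 0.002, 0.004])
```

    When the between-study variance estimate is zero the result collapses to the fixed-effect answer; with the heterogeneity reported in the abstract (P < 0.001), tau-squared would widen the pooled interval.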

  7. Quantitative Relationship Between Cumulative Risk Alleles Based on Genome-Wide Association Studies and Type 2 Diabetes Mellitus: A Systematic Review and Meta-analysis

    PubMed Central

    Kodama, Satoru; Fujihara, Kazuya; Ishiguro, Hajime; Horikawa, Chika; Ohara, Nobumasa; Yachi, Yoko; Tanaka, Shiro; Shimano, Hitoshi; Kato, Kiminori; Hanyu, Osamu; Sone, Hirohito

    2018-01-01

    Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. Logarithm of the estimated OR (log OR) of T2DM for 1 increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13–1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08–1.13). There was large heterogeneity in the magnitude of log OR (P < 0.001 for both cross-sectional studies and longitudinal studies). The top 10 commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest risk of prevalent or incident T2DM, although the heterogeneity in the used genes among studies requires us to interpret the results with caution. PMID:29093303

  8. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first-of-a-kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
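
    The triplet structure lends itself to a simple computational form: a list of (scenario, frequency, consequence) triplets from which a dose-exceedance measure and a scenario ranking are assembled. The scenarios and numbers below are invented, not results from the West Valley assessment.

```python
# (scenario, frequency per year, dose to receptor in mSv) -- all hypothetical
scenarios = [
    ("surface erosion breach", 1e-3, 20.0),
    ("trench cover failure",   5e-4, 80.0),
    ("inadvertent intrusion",  1e-5, 500.0),
]

def exceedance_frequency(scenarios, dose):
    """Annual frequency of receiving a dose >= `dose`, summed over the
    scenarios whose consequence reaches that level."""
    return sum(f for _, f, d in scenarios if d >= dose)

def ranked_contributors(scenarios, dose):
    """Scenarios ordered by their contribution to the exceedance
    frequency -- the ranking used to target risk-management actions."""
    hits = [(name, f) for name, f, d in scenarios if d >= dose]
    return sorted(hits, key=lambda item: item[1], reverse=True)
```

    Repeating the exceedance calculation over a range of dose levels yields the frequency-of-exceedance curve that represents the total risk from the disposal area.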

  9. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  10. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and the various issues surrounding the design of experiments. The concepts and examples discussed within show how robust design and analysis lead to confident results and ensure that quantitative proteomics delivers on its promise.

  11. Supply chain risk management of newspaper industry: A quantitative study

    NASA Astrophysics Data System (ADS)

    Sartika, Viny; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

The newspaper industry has several distinctive features that make it stand out from other industries. The strict delivery deadline and zero inventory leave a very short time frame for production and distribution. There is pressure from the newsroom to start production as late as possible in order to include the latest news, while production and distribution push to start as early as possible. Supply chain risk management is needed to determine the best strategy for dealing with possible risks in the newspaper industry. In a case study of a newspaper in Surakarta, a quantitative approach is applied to supply chain risk management by calculating the expected cost of each risk from the magnitude of its impact and the probability of the risk event. The calculations show that the five highest-ranked risks are late delivery of the newspaper to the end customer, a broken plate, misprints, machine downtime, and delayed delivery of newspaper content. Appropriate mitigation strategies for coping with these risk events are then analyzed.
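
    The expected-cost calculation described above reduces to probability times impact per risk event. The probabilities and cost figures below are invented for illustration; the abstract does not report the study's actual values.

```python
# {risk event: (probability per edition, impact cost if it occurs)}
# -- all figures hypothetical
risk_register = {
    "newspaper late to end customer": (0.08, 2000.0),
    "broken plate":                   (0.03, 3500.0),
    "misprint":                       (0.05, 1500.0),
    "machine down":                   (0.01, 9000.0),
    "delayed content delivery":       (0.06, 1200.0),
}

def expected_cost(prob, impact):
    """Expected cost of a risk event: probability times impact."""
    return prob * impact

# Rank risk events by expected cost to prioritize mitigation.
ranking = sorted(risk_register.items(),
                 key=lambda kv: expected_cost(*kv[1]), reverse=True)
```

    A frequent moderate-cost event (a late newspaper) can outrank a rare expensive one (a machine breakdown), which is exactly why the expected-cost ranking, rather than impact alone, drives the mitigation strategy.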

  12. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    PubMed

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations in which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10^6 Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
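
    The simulation logic, 100 exposures per event with the illness count as the output, can be sketched with one dose-response draw per exposed person. The dose-response constant and the hand-to-food transfer fraction below are illustrative assumptions, not the study's fitted values.

```python
import math
import random

def illness_cases(log_reduction, n_exposed=100, hand_load=1e6,
                  transfer_fraction=1e-2, r=5e-3, seed=7):
    """One simulated event: a worker with `hand_load` Shigella washes
    with a product achieving `log_reduction` log10 removal, then handles
    food; each exposed person independently falls ill with exponential
    dose-response probability 1 - exp(-r * dose)."""
    rng = random.Random(seed)
    dose = hand_load * 10.0 ** (-log_reduction) * transfer_fraction
    p_ill = 1.0 - math.exp(-r * dose)
    return sum(rng.random() < p_ill for _ in range(n_exposed))

nonantibacterial = illness_cases(log_reduction=2.0)   # ~2-log product
antibacterial = illness_cases(log_reduction=3.5)      # ~3.5-log product
```

    Repeating such events many times yields the distribution of illness counts per 100 exposed that the study reports for each product class.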

  13. Quantitative Assessment of Cancer Risk from Exposure to Diesel Engine Emissions

    EPA Science Inventory

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. uman target organ dose was estimated with the aid of a comprehensive dosimetry model. This model accounted for rat-hum...

  14. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
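
    The overestimation mechanism is easy to reproduce in a toy inactivation-then-growth chain: a concentration model never lets a unit reach exactly zero bacteria, so subsequent growth always restores contamination, while an integer-count model lets units become sterile. All parameter values below are illustrative assumptions.

```python
import random

def risk_counts(n0, p_survive, growth, threshold, runs=5000, seed=3):
    """Track integer bacteria per unit: binomial survival through
    inactivation, then growth; a unit that reaches zero stays at zero.
    Returns the fraction of units ending at or above `threshold`."""
    rng = random.Random(seed)
    over = 0
    for _ in range(runs):
        survivors = sum(rng.random() < p_survive for _ in range(n0))
        over += (survivors * growth) >= threshold
    return over / runs

def risk_concentration(n0, p_survive, growth, threshold):
    """Track the mean concentration: inactivation scales it but never
    zeroes it, so growth always pushes it back over the threshold."""
    return 1.0 if n0 * p_survive * growth >= threshold else 0.0

# Drastic scenario: 5 bacteria, 3-log inactivation, then 4-log growth.
counts = risk_counts(5, p_survive=1e-3, growth=10 ** 4, threshold=10)
conc = risk_concentration(5, p_survive=1e-3, growth=10 ** 4, threshold=10)
```

    In this scenario the concentration model flags every unit as contaminated, while the count model shows that almost all units were sterilized before growth could occur, mirroring the >10-fold overestimation the authors describe.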

  15. Coffee consumption and risk of fractures: a meta-analysis

    PubMed Central

    Liu, Huifang; Yao, Ke; Zhang, Wenjie; Zhou, Jun; Wu, Taixiang

    2012-01-01

Introduction Recent studies have indicated a higher risk of fractures among coffee drinkers. To quantitatively assess the association between coffee consumption and the risk of fractures, we conducted this meta-analysis. Material and methods We searched MEDLINE and EMBASE for prospective studies reporting the risk of fractures with coffee consumption. Quality of included studies was assessed with the Newcastle-Ottawa scale. We conducted a meta-analysis and a cumulative meta-analysis of relative risk (RR) for an increment of one cup of coffee per day, and explored the potential dose-response relationship. Sensitivity analysis was performed where statistical heterogeneity existed. Results We included 10 prospective studies covering 214,059 participants and 9,597 cases. There was an overall 3.5% higher fracture risk for an increment of one cup of coffee per day (RR = 1.035, 95% CI: 1.019-1.052). Pooled RRs were 1.049 (95% CI: 1.022-1.077) for women and 0.910 (95% CI: 0.873-0.949) for men. Among women, the RR was 1.055 (95% CI: 0.999-1.114) for younger participants and 1.047 (95% CI: 1.016-1.080) for older ones. Cumulative meta-analysis indicated that risk estimates reached a stable level (RR = 1.035, 95% CI: 1.019-1.052) and revealed a positive dose-response relationship between coffee consumption and risk of fractures, both for men and women combined and for women specifically. Conclusions This meta-analysis suggests that coffee intake increases the risk of fractures, especially for women, but current data are insufficient to reach a convincing conclusion and further research is needed. PMID:23185185

  16. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    NASA Astrophysics Data System (ADS)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

Yogurt is a milk-based product with beneficial effects for health. The process for the production of yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail, prevent them, and determine their handling so that they can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment, raw material, and process variables. These include the critical risk of a failure of the aseptic process, specifically damage to the yogurt starter through contamination by fungus or other bacteria, and inadequate equipment sanitation. The quantitative FTA showed that the highest probability, 3.902%, is that of a failure of the aseptic process. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, improving production planning, and sanitizing equipment by hot-water immersion.
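
    The FTA arithmetic behind a top-event probability like the 3.902% figure combines basic-event probabilities through AND/OR gates. The sketch below is generic, and the basic-event probabilities are invented; the abstract does not report the study's actual inputs.

```python
def and_gate(*probs):
    """AND gate: the output event occurs only if all independent
    input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate: the output event occurs if any independent input
    event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: the aseptic process fails if the starter is
# contaminated (by fungus OR other bacteria) AND sanitation is
# inadequate at the same time.
starter_contaminated = or_gate(0.10, 0.15)             # fungus, other bacteria
aseptic_failure = and_gate(starter_contaminated, 0.17)  # + poor sanitation
```

    Larger trees are built by nesting the same two gate functions, so each intermediate event's probability is computed from the layer below it.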

  17. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines, or with other sectors such as the military, nuclear and aeronautics industries, were collected. These documents, in the form of published books or papers, standards, technical guides and company procedures, were gathered from throughout industry. From the collected documents, 112 were selected for analysis; 108 methods applied or potentially applicable to assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of that analysis.

  18. Quantitative risk assessment using empirical vulnerability functions from debris flow event reconstruction

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami

    2010-05-01

    For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but also to perform an analysis of its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard of a given magnitude and its vulnerability. A quantifiable integrated approach to both hazard and risk is becoming required practice in risk reduction management. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results are then applied to vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical Italian alpine valley lying in northern Italy (Lombardy Region). On 13th July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno. One of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. Also in the Valtellina valley, between the 22nd and the 23rd of May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage. At the same location, on the 26th of November 2002, another debris flow occurred that caused significant damage. For the quantification of a new scenario, the results obtained from the Selvetta event were applied in Tresenda. The Selvetta and Tresenda events were modelled with the FLO2D program. FLO2D uses an Eulerian formulation with a finite-difference numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of

  19. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection with their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted using this probabilistic modification. With the probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
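    The contrast between the two scorings can be sketched in a few lines: traditional FMEA multiplies three categorical scores into an RPN, while the probabilistic variant described above multiplies an estimated occurrence frequency by the estimated probability of non-detection. All scores and frequencies below are hypothetical, not values from the paper.

```python
# Traditional FMEA: RPN = Occurrence x Detection x Severity, each a categorical
# score (commonly 1-10). Hypothetical scores are used here.
def rpn(occurrence, detection, severity):
    return occurrence * detection * severity

# Probabilistic modification: occurrence and detection become estimated relative
# frequencies; the expected frequency of an *undetected* failure is then
# f_occurrence x (1 - p_detection). Severity stays categorical and is used
# separately to rank the surviving failure modes.
def undetected_frequency(f_occurrence, p_detection):
    return f_occurrence * (1.0 - p_detection)

print(rpn(3, 4, 8))                     # 96
print(undetected_frequency(0.01, 0.9))
```

    For a set of failure modes, the undetected frequencies can simply be summed (assuming rare, independent failures) to estimate the overall failure frequency of the procedure.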

  20. A Quantitative Analysis Method Of Trabecular Pattern In A Bone

    NASA Astrophysics Data System (ADS)

    Idesawa, Masanor; Yatagai, Toyohiko

    1982-11-01

    Orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and diseases of a bone appear as changes in the orientation and/or density distribution of its trabecular patterns. These have so far been treated from a qualitative point of view because no quantitative analysis method had been established. In this paper, the authors propose and investigate quantitative methods for analyzing the density and orientation of trabecular patterns observed in a bone. These methods give an index for quantitatively evaluating the orientation of a trabecular pattern; they have been applied to the trabecular pattern observed in the head of a femur, and their utility is confirmed. Key Words: Index of pattern orientation, Trabecular pattern, Pattern density, Quantitative analysis

  1. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve these limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only be applicable to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.

  2. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  3. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
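    The "combining them mathematically" step is conventionally a root-sum-of-squares over independent relative uncertainty components. A minimal sketch, with hypothetical component values rather than those estimated in the study:

```python
import math

# Combined relative uncertainty from independent components, added in quadrature
# (GUM-style root-sum-of-squares). The three component values below are
# hypothetical stand-ins for factors such as microorganism type, product
# matrix, and reading/interpretation error.
def combined_relative_uncertainty(components):
    return math.sqrt(sum(c * c for c in components))

u = combined_relative_uncertainty([0.20, 0.15, 0.10])
print(round(u, 3))  # 0.269
```

    The quadrature sum is only valid when the components are uncorrelated; correlated factors (or interactions identified by ANOVA) need covariance terms.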

  4. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials quantitatively expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system is needed for workplaces in the nanomaterial industry based on the precautionary principle. One of the problems in the risk management system is difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement especially at facilities with high levels of background NPs.

  5. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention.

    PubMed

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2010-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through the importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 x 10(-8) (95th percentile: 3.20 x 10(-7)). However, illegal smuggling may expose Taiwan to a much greater risk of rabies emergence. Reducing the quarantine and/or waiting period would affect the risk differently depending on the assumptions applied, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may still be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis should be conducted before recommending these alternative measures.
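    The stochastic-process style of model described above is typically evaluated by Monte Carlo simulation. A heavily simplified sketch, with entirely hypothetical parameters (the study's model chains together many more steps, e.g. vaccination status, incubation, and waiting periods):

```python
import random

# Minimal Monte Carlo sketch of a border-control penetration model. Each trial
# simulates one imported animal; a "hit" is an animal that is both rabid and
# missed by quarantine. All parameter values are hypothetical illustrations.
def simulate_penetration_risk(n_iter, p_infected, p_missed_quarantine, seed=1):
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = 0
    for _ in range(n_iter):
        infected = rng.random() < p_infected
        missed = rng.random() < p_missed_quarantine
        if infected and missed:
            hits += 1
    return hits / n_iter

risk = simulate_penetration_risk(100_000, p_infected=1e-3, p_missed_quarantine=0.05)
# The analytic value is 1e-3 * 0.05 = 5e-05; the simulated estimate fluctuates around it.
print(risk)
```

    In a full model, each probability would itself be a distribution (hence the reported median and 95th percentile rather than a point estimate).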

  6. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    PubMed Central

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through the importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan’s current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a much greater risk of rabies emergence. Reducing the quarantine and/or waiting period would affect the risk differently depending on the assumptions applied, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may still be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis should be conducted before recommending these alternative measures. PMID:19822125

  7. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help society deal proactively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the amplitude of yield change. The probability of occurrence can be calculated using the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5% and 64.6%, respectively, for the maximum temperature increase (88.3%), and its risk was 2.2%. The maximum yield decrease and its probability were 14.1% and 56.1%, respectively, for the maximum precipitation decrease (35.2%), and its risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6% and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
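    The risk definition used in this record, risk as the product of the degree of loss and its probability of occurrence, reduces to a one-line computation. The sketch below reproduces the precipitation example using the figures quoted in the abstract:

```python
# Risk = degree of loss x probability of occurrence, as defined in the abstract.
def risk(loss_fraction, probability):
    return loss_fraction * probability

# Precipitation example from the abstract: a 14.1% maximum yield decrease with
# a 56.1% probability of occurrence gives a risk of about 7.9%.
print(round(risk(0.141, 0.561) * 100, 1))  # 7.9
```

    The CCEAF concept enters only in how the probability is estimated from the historical yield-climate record; once that probability is in hand, the risk arithmetic is this simple product.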

  8. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques, and the inferences that rest upon them. Finally, by identifying the aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analysis have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement them. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?
    We hope that our guide to good practices for conducting and presenting bias analyses will encourage
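    One of the fundamental methods alluded to above, simple bias analysis for nondifferential exposure misclassification, can be sketched in a few lines: back-correct the observed exposed counts using assumed sensitivity and specificity, then recompute the odds ratio. The 2x2 counts and bias parameters below are hypothetical illustrations, not values from the paper.

```python
# Simple bias analysis for exposure misclassification. Given observed exposed
# counts, an assumed sensitivity (Se) and specificity (Sp), the expected true
# number exposed is (observed - (1 - Sp) * N) / (Se + Sp - 1).

def corrected_exposed(observed_exposed, total, sensitivity, specificity):
    return (observed_exposed - (1 - specificity) * total) / (sensitivity + specificity - 1)

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical observed data: 60/100 cases exposed, 40/100 controls exposed.
se, sp = 0.85, 0.95
a = corrected_exposed(60, 100, se, sp)   # corrected exposed cases
c = corrected_exposed(40, 100, se, sp)   # corrected exposed controls
or_observed = odds_ratio(60, 40, 40, 60)
or_corrected = odds_ratio(a, 100 - a, c, 100 - c)
print(round(or_observed, 2), round(or_corrected, 2))  # 2.25 2.83
```

    Probabilistic bias analysis extends this by drawing Se and Sp from distributions rather than fixing them, yielding an uncertainty interval around the corrected estimate.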

  9. Using quantitative risk information in decisions about statins: a qualitative study in a community setting.

    PubMed

    Polak, Louisa; Green, Judith

    2015-04-01

    A large literature informs guidance for GPs about communicating quantitative risk information so as to facilitate shared decision making. However, relatively little has been written about how patients utilise such information in practice. To understand the role of quantitative risk information in patients' accounts of decisions about taking statins. This was a qualitative study, with participants recruited and interviewed in community settings. Semi-structured interviews were conducted with 34 participants aged >50 years, all of whom had been offered statins. Data were analysed thematically, using elements of the constant comparative method. Interviewees drew frequently on numerical test results to explain their decisions about preventive medication. In contrast, they seldom mentioned quantitative risk information, and never offered it as a rationale for action. Test results were spoken of as objects of concern despite an often-explicit absence of understanding, so lack of understanding seems unlikely to explain the non-use of risk estimates. Preventive medication was seen as 'necessary' either to treat test results, or because of personalised, unequivocal advice from a doctor. This study's findings call into question the assumption that people will heed and use numerical risk information once they understand it; these data highlight the need to consider the ways in which different kinds of knowledge are used in practice in everyday contexts. There was little evidence from this study that understanding probabilistic risk information was a necessary or valued condition for making decisions about statin use. © British Journal of General Practice 2015.

  10. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
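    The arithmetic at the core of skin sensitization QRA can be sketched briefly: a no-expected-sensitization-induction level (NESIL) is divided by sensitization assessment factors (SAFs) to obtain an acceptable exposure level (AEL), which is then compared with the estimated consumer exposure level (CEL). All numerical values below are hypothetical illustrations, not figures from the review.

```python
# Skin sensitization QRA sketch: AEL = NESIL / (product of SAFs), then check
# whether the consumer exposure level falls below the AEL. Values hypothetical.

def acceptable_exposure_level(nesil_ug_cm2, safs):
    product = 1.0
    for f in safs:
        product *= f
    return nesil_ug_cm2 / product

nesil = 1000.0             # ug/cm2 per application, hypothetical
safs = [10.0, 3.0, 10.0]   # e.g. inter-individual, matrix, use considerations
ael = acceptable_exposure_level(nesil, safs)
cel = 2.0                  # ug/cm2 per application, hypothetical exposure estimate
print(round(ael, 2), cel <= ael)  # 3.33 True
```

    The review's two criteria map directly onto this calculation: "applied rigorously" governs the NESIL and SAF choices, and "exposure assessed adequately" governs the CEL.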

  11. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  12. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhancing ionization efficiency using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With a focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography, and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally the aim for automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field use. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  13. Quantitative risk assessment for a glass fiber insulation product.

    PubMed

    Fayerweather, W E; Bender, J R; Hadley, J G; Eastes, W

    1997-04-01

    California Proposition 65 (Prop65) provides a mechanism by which the manufacturer may perform a quantitative risk assessment to be used in determining the need for cancer warning labels. This paper presents a risk assessment under this regulation for professional and do-it-yourself insulation installers. It determines the level of insulation glass fiber exposure (specifically Owens Corning's R-25 PinkPlus with Miraflex) that, assuming a working lifetime exposure, poses no significant cancer risk under Prop65's regulations. "No significant risk" is defined under Prop65 as a lifetime risk of no more than one additional cancer case per 100,000 exposed persons, and nonsignificant exposure is defined as a working lifetime exposure associated with "no significant risk." This determination can be carried out despite the fact that the relevant underlying studies (i.e., chronic inhalation bioassays) of comparable glass wool fibers do not show tumorigenic activity. Nonsignificant exposures are estimated from (1) the most recent RCC chronic inhalation bioassay of nondurable fiberglass in rats; (2) intraperitoneal fiberglass injection studies in rats; (3) a distributional, decision analysis approach applied to four chronic inhalation rat bioassays of conventional fiberglass; (4) an extrapolation from the RCC chronic rat inhalation bioassay of durable refractory ceramic fibers; and (5) an extrapolation from the IOM chronic rat inhalation bioassay of durable E glass microfibers. When the EPA linear nonthreshold model is used, central estimates of nonsignificant exposure range from 0.36 fibers/cc (for the RCC chronic inhalation bioassay of fiberglass) through 21 fibers/cc (for the i.p. fiberglass injection studies). Lower 95% confidence bounds on these estimates vary from 0.17 fibers/cc through 13 fibers/cc. 
Estimates derived from the distributional approach or from applying the EPA linear nonthreshold model to chronic bioassays of durable fibers such as refractory ceramic fiber
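    Under the EPA linear nonthreshold model used throughout this record, lifetime risk is proportional to exposure, so the "nonsignificant exposure" under Prop65 is simply the risk limit divided by the unit-risk slope. The slope below is a hypothetical value chosen to reproduce the 0.36 fibers/cc central estimate quoted in the abstract, not a figure from the cited bioassays.

```python
# Linear nonthreshold model: risk = slope * exposure, so the exposure at the
# Prop65 "no significant risk" threshold is risk_limit / slope.

RISK_LIMIT = 1e-5  # Prop65: one additional cancer case per 100,000 exposed persons

def nonsignificant_exposure(unit_risk_per_fiber_cc):
    """Working-lifetime exposure (fibers/cc) at the no-significant-risk level."""
    return RISK_LIMIT / unit_risk_per_fiber_cc

# Hypothetical slope back-calculated to illustrate the abstract's 0.36 fibers/cc:
print(round(nonsignificant_exposure(2.8e-5), 2))  # 0.36
```

    Upper confidence bounds on the slope give correspondingly lower (more conservative) nonsignificant exposures, which is why the abstract reports both central estimates and lower 95% bounds.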

  14. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  15. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute numbers of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
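    The 500-year return period used in the global analysis translates into a probability of exceedance over any planning horizon via the standard stationary-Poisson relation, a minimal sketch (the window length below is an arbitrary illustration):

```python
import math

# Probability of at least one exceedance of a hazard with a given return period
# during an exposure window, assuming a stationary Poisson process:
# P = 1 - exp(-t / T)
def exceedance_probability(return_period_years, window_years):
    return 1.0 - math.exp(-window_years / return_period_years)

# A 500-year tsunami over a 50-year planning horizon:
print(round(exceedance_probability(500, 50), 3))  # 0.095
```

    This is why "rare" 500-year events still matter for infrastructure such as nuclear power plants: over long service lives, the cumulative exceedance probability becomes non-negligible.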

  16. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  17. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006

  18. Comparing listeriosis risks in at-risk populations using a user-friendly quantitative microbial risk assessment tool and epidemiological data.

    PubMed

    Falk, L E; Fader, K A; Cui, D S; Totton, S C; Fazil, A M; Lammerding, A M; Smith, B A

    2016-10-01

    Although infection by the pathogenic bacterium Listeria monocytogenes is relatively rare, consequences can be severe, with a high case-fatality rate in vulnerable populations. A quantitative, probabilistic risk assessment tool was developed to compare estimates of the number of invasive listeriosis cases in vulnerable Canadian subpopulations given consumption of contaminated ready-to-eat delicatessen meats and hot dogs, under various user-defined scenarios. The model incorporates variability and uncertainty through Monte Carlo simulation. Processes considered within the model include cross-contamination, growth, risk factor prevalence, subpopulation susceptibilities, and thermal inactivation. Hypothetical contamination events were simulated. Results demonstrated varying risk depending on the consumer risk factors and implicated product (turkey delicatessen meat without growth inhibitors ranked highest for this scenario). The majority (80%) of listeriosis cases were predicted in at-risk subpopulations comprising only 20% of the total Canadian population, with the greatest number of predicted cases in the subpopulation with dialysis and/or liver disease. This tool can be used to simulate conditions and outcomes under different scenarios, such as a contamination event and/or outbreak, to inform public health interventions.

  19. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  20. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  1. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms transmitted by the fecal-oral route, zoonotic and prevalent in all socioeconomic segments with greater emphasis in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis of Potam dwellers consuming drinking water from communal well water. To achieve the goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by information collection rule method, (b) assessment of exposure to healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate a mean of annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish an educational and treatment program to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico.
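
The risk-characterization step described above can be sketched with the standard exponential dose-response model used in QMRA; the dose and infectivity parameter below are illustrative placeholders, not values from the study.

```python
import math

def exponential_dose_response(dose: float, r: float) -> float:
    """Probability of infection from a single exposure: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk: float, exposures_per_year: int = 365) -> float:
    """Annual risk assuming independent daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** exposures_per_year

# Illustrative values only (not the study's parameters):
p_daily = exponential_dose_response(dose=1.0, r=0.018)
p_year = annual_risk(p_daily)
```

Even a small daily infection probability compounds to a large annual risk, which is how near-certain annual risks such as those reported above can arise.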

  2. Quantitative biomechanical analysis of wrist motion in bone-trimming jobs in the meat packing industry.

    PubMed

    Marklin, R W; Monroe, J F

    1998-02-01

    This study was motivated by the serious impact that cumulative trauma disorders (CTDs) of the upper extremities have on the meat packing industry. To date, no quantitative data have been gathered on the kinematics of hand and wrist motion required in bone-trimming jobs in the red-meat packing industry and how these motions are related to the risk of CTDs. The wrist motion of bone-trimming workers from a medium-sized plant was measured, and the kinematic data were compared to manufacturing industry's preliminary wrist motion benchmarks from industrial workers who performed hand-intensive, repetitive work in jobs that were of low and high risk of hand/wrist CTDs. Results of this comparison show that numerous wrist motion variables in both the left and right hands of bone-trimming workers are in the high-risk category. This quantitative analysis provides biomechanical support for the high incidence of CTDs in the meat packing industry. The research reported in this paper established a preliminary database of wrist and hand kinematics required in bone-trimming jobs in the red-meat packing industry. This kinematic database could augment the industry's efforts to reduce the severity and cost of CTDs. Ergonomics practitioners in the industry could use the kinematic methods employed in this research to assess the CTD risk of jobs that require repetitious, hand-intensive work.

  3. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
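
As a minimal illustration of the discrete-event idea (not the authors' model), the following sketch simulates a FIFO shelf replenished daily and depleted by random demand, recording each sold unit's storage time; successive storage times are mutually dependent because they share the same queue state, which is exactly the effect the abstract argues standard QMRA misses.

```python
import random
from collections import deque

def fifo_storage_times(days, replenish_qty, demand_fn, seed=1):
    """Minimal discrete-event sketch of a retail shelf: a FIFO queue is
    replenished daily and depleted by random demand; the storage time of
    each sold unit is recorded."""
    random.seed(seed)
    shelf = deque()            # arrival day of each unit, oldest first
    times = []
    for day in range(days):
        for _ in range(replenish_qty):
            shelf.append(day)
        for _ in range(demand_fn()):
            if shelf:
                times.append(day - shelf.popleft())
    return times

# Hypothetical parameters, not from the lettuce case study:
times = fifo_storage_times(days=30, replenish_qty=10,
                           demand_fn=lambda: random.randint(5, 12))
```

The distribution of `times` (and in particular its tail) is what would feed a growth model for L. monocytogenes in a QMRA.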

  4. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.

  5. Quantitative assessment of human health risk posed by polycyclic aromatic hydrocarbons in urban road dust.

    PubMed

    Ma, Yukun; Liu, An; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2017-01-01

Polycyclic aromatic hydrocarbons (PAHs) are among the most toxic chemical pollutants present in urban road dust and can pose a cancer risk to humans. The primary aim of the study was to develop a quantitative model to assess the cancer risk from PAHs in urban road dust based on traffic and land use factors and thereby to characterise the risk posed by PAHs in fine (<150 μm) and coarse (>150 μm) particles. The risk posed by PAHs was quantified as incremental lifetime cancer risk (ILCR), which was modelled as a function of traffic volume and percentages of different urban land uses. The study outcomes highlighted the fact that cancer risk from PAHs in urban road dust is primarily influenced by PAHs associated with fine solids. Heavy PAHs with 5 to 6 benzene rings, especially dibenzo[a,h]anthracene (D[a]A) and benzo[a]pyrene (B[a]P), contribute most to the risk in the mixture. The quantitative model developed based on traffic and land use factors will contribute to informed decision making in relation to the management of risk posed by PAHs in urban road dust. Copyright © 2016 Elsevier B.V. All rights reserved.
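
The study's fitted regression on traffic and land use is not reproduced here, but the generic EPA-style ILCR calculation that such assessments build on can be sketched as follows; all parameter values are illustrative defaults, not the study's numbers.

```python
def ilcr_ingestion(conc_mg_kg, slope_factor, ing_rate_mg_day=100,
                   exp_freq_days=350, exp_dur_years=30,
                   body_weight_kg=70, avg_time_days=70 * 365):
    """Incremental lifetime cancer risk via incidental dust ingestion
    (generic EPA-style form): ILCR = (C * IngR * EF * ED * 1e-6) / (BW * AT) * SF.
    The 1e-6 factor converts mg of dust to kg of dust."""
    cdi = (conc_mg_kg * ing_rate_mg_day * exp_freq_days *
           exp_dur_years * 1e-6) / (body_weight_kg * avg_time_days)
    return cdi * slope_factor

# Illustrative: 1 mg/kg B[a]P-equivalent with oral slope factor 7.3 (mg/kg-day)^-1
risk = ilcr_ingestion(conc_mg_kg=1.0, slope_factor=7.3)
```

A result above the commonly used 1e-6 screening threshold would flag the exposure for further attention.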

  6. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  7. A Quantitative Risk Analysis of Deficient Contractor Business System

    DTIC Science & Technology

    2012-04-30

Mathematically, Jorion's concept of VaR looks like this: P(L > VaR) ≤ 1 − c (2), where L is the loss and c is the confidence level. ...presents three models for calculating VaR. The local-valuation method determines the value of a portfolio once and uses mathematical derivatives...management. In the insurance industry, actuarial data is applied to model risk and risk capital reserves are "held" to cover the expected values for
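
Jorion-style VaR is defined so that the loss L exceeds VaR with probability at most 1 − c. A minimal historical-simulation estimate (one common valuation approach; the loss data here are invented for illustration) could look like:

```python
import math

def historical_var(losses, confidence=0.95):
    """Smallest empirical loss threshold v with P(L > v) <= 1 - confidence."""
    ordered = sorted(losses)
    k = max(0, math.ceil(confidence * len(ordered)) - 1)
    return ordered[k]

# Invented daily losses (positive = loss), not data from the report:
losses = [-2.0, -1.0, -0.5, 0.0, 0.3, 0.8, 1.2, 1.9, 2.5, 4.0]
var_90 = historical_var(losses, confidence=0.90)
```

Here only one of the ten observed losses exceeds `var_90`, satisfying the 10% exceedance bound.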

  8. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004)...assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and...www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC

  9. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  10. The risk factors for avian influenza on poultry farms: a meta-analysis.

    PubMed

    Wang, Youming; Li, Peng; Wu, Yangli; Sun, Xiangdong; Yu, Kangzhen; Yu, Chuanhua; Qin, Aijian

    2014-11-01

Avian influenza is a severe threat to both humans and poultry, but so far no systematic review on the identification and evaluation of the risk factors for avian influenza infection has been published. The objective of this meta-analysis is to provide evidence for decision-making and further research on AI prevention by identifying the risk factors associated with AI infection on poultry farms. The results from 15 selected studies on risk factors for AI infections on poultry farms were analyzed quantitatively by meta-analysis. Open water source (OR=2.89), infections on nearby farms (OR=4.54), other livestock (OR=1.90) and disinfection of the farm (OR=0.54) had significant associations with AI infection on poultry farms. The subgroup analysis results indicate that different risk factors exist for AI infections in different types of farms. The main risk factors for AI infection on poultry farms are environmental conditions (open water source, infections on nearby farms), keeping other livestock on the same farm and no disinfection of the farm. Copyright © 2014 Elsevier B.V. All rights reserved.
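
Pooled odds ratios like those above are typically obtained by inverse-variance weighting of study-level log odds ratios. A minimal fixed-effect sketch, using hypothetical study-level estimates for a single risk factor (not the paper's data):

```python
import math

def pooled_or(study_ors, study_cis):
    """Fixed-effect (inverse-variance) pooling of log odds ratios.
    Each CI is a (lower, upper) 95% interval; the SE of the log OR is
    recovered from the interval width."""
    weights, weighted_logs = [], []
    for or_, (lo, hi) in zip(study_ors, study_cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    return math.exp(sum(weighted_logs) / sum(weights))

# Hypothetical study-level estimates for one risk factor:
ors = [2.5, 3.2, 2.0]
cis = [(1.2, 5.2), (1.5, 6.8), (0.9, 4.4)]
pooled = pooled_or(ors, cis)
```

The pooled estimate necessarily lies between the smallest and largest study-level odds ratios.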

  11. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.

  12. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J

    2016-12-01

Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work, quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks associated with 20 US Food and Drug Administration (FDA) decisions on a set of candidate treatments submitted between 2003 and 2015 were analyzed. For benefit analysis, the median overall survival (OS) was used where available. When not available, OS was estimated based on overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on magnitude (or severity) of harm and likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate analysis of systematic uncertainty. FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  13. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  14. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  15. Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA

    PubMed Central

    Baixauli-Pérez, Mª Piedad

    2017-01-01

The size and complexity of industrial chemical plants, together with the nature of the products handled, mean that an analysis and control of the risks involved is required. This paper presents a methodology for risk analysis in chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. An analysis of a case study is performed; it consists of the terminal for unloading chemical and petroleum products, and the fuel storage facilities of two companies, in the port of Valencia (Spain). HAZOP analysis shows that loading and unloading areas are the most sensitive areas of the plant and where the most significant danger is a fuel spill. FTA analysis indicates that the most likely event is a fuel spill in the tank truck loading area. A sensitivity analysis from the FTA results shows the importance of the human factor in all sequences of the possible accidents, so it should be mandatory to improve the training of the staff of the plants. PMID:28665325

  16. Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA.

    PubMed

    Fuentes-Bargues, José Luis; González-Cruz, Mª Carmen; González-Gaya, Cristina; Baixauli-Pérez, Mª Piedad

    2017-06-30

The size and complexity of industrial chemical plants, together with the nature of the products handled, mean that an analysis and control of the risks involved is required. This paper presents a methodology for risk analysis in chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. An analysis of a case study is performed; it consists of the terminal for unloading chemical and petroleum products, and the fuel storage facilities of two companies, in the port of Valencia (Spain). HAZOP analysis shows that loading and unloading areas are the most sensitive areas of the plant and where the most significant danger is a fuel spill. FTA analysis indicates that the most likely event is a fuel spill in the tank truck loading area. A sensitivity analysis from the FTA results shows the importance of the human factor in all sequences of the possible accidents, so it should be mandatory to improve the training of the staff of the plants.
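
Fault tree quantification of the kind described here reduces to combining basic-event probabilities through AND and OR gates. A minimal sketch with invented probabilities (not the study's data):

```python
def or_gate(probs):
    """P(at least one event occurs), assuming independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), assuming independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical fuel-spill tree: spill occurs if a hose fails, OR
# an overfill happens AND the level alarm fails. Probabilities invented.
p_spill = or_gate([1e-4, and_gate([1e-2, 5e-2])])
```

Ranking minimal cut sets by their contribution to the top-event probability is what drives the prioritization of preventive measures.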

  17. Testicular Dysgenesis Syndrome and the Estrogen Hypothesis: A Quantitative Meta-Analysis

    PubMed Central

    Martin, Olwenn V.; Shialis, Tassos; Lester, John N.; Scrimshaw, Mark D.; Boobis, Alan R.; Voulvoulis, Nikolaos

    2008-01-01

Background: Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. Objectives: We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-α–mediated mode of action was specifically explored. Results: We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. Conclusions: The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population. PMID:18288311

  18. Testicular dysgenesis syndrome and the estrogen hypothesis: a quantitative meta-analysis.

    PubMed

    Martin, Olwenn V; Shialis, Tassos; Lester, John N; Scrimshaw, Mark D; Boobis, Alan R; Voulvoulis, Nikolaos

    2008-02-01

    Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-alpha-mediated mode of action was specifically explored. We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population.

  19. Integrated Quantitative Cancer Risk Assessment of Inorganic Arsenic

    EPA Science Inventory

This paper attempts to make an integrated risk assessment of arsenic, using data on humans exposed to arsenic via inhalation and ingestion. The data useful for making an integrated analysis and data gaps are discussed. Arsenic provides a rare opportunity to compare the cancer risk ...

  20. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.

  1. Quantitative Analysis of the Efficiency of OLEDs.

    PubMed

    Sim, Bomi; Moon, Chang-Ki; Kim, Kwon-Hyeon; Kim, Jang-Joo

    2016-12-07

    We present a comprehensive model for the quantitative analysis of factors influencing the efficiency of organic light-emitting diodes (OLEDs) as a function of the current density. The model takes into account the contribution made by the charge carrier imbalance, quenching processes, and optical design loss of the device arising from various optical effects including the cavity structure, location and profile of the excitons, effective radiative quantum efficiency, and out-coupling efficiency. Quantitative analysis of the efficiency can be performed with an optical simulation using material parameters and experimental measurements of the exciton profile in the emission layer and the lifetime of the exciton as a function of the current density. This method was applied to three phosphorescent OLEDs based on a single host, mixed host, and exciplex-forming cohost. The three factors (charge carrier imbalance, quenching processes, and optical design loss) were influential in different ways, depending on the device. The proposed model can potentially be used to optimize OLED configurations on the basis of an analysis of the underlying physical processes.
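
The factorization the abstract describes can be summarized, in simplified form, as the product of charge carrier balance, effective radiative quantum efficiency, and out-coupling efficiency; the values below are illustrative, not the paper's measurements.

```python
def external_quantum_efficiency(charge_balance, radiative_q_eff, outcoupling):
    """EQE as a product of loss factors, each a fraction in [0, 1]
    (a common simplified factorization; illustrative values only)."""
    for f in (charge_balance, radiative_q_eff, outcoupling):
        assert 0.0 <= f <= 1.0
    return charge_balance * radiative_q_eff * outcoupling

eqe = external_quantum_efficiency(0.95, 0.90, 0.25)
```

Because the factors multiply, measuring two of them (e.g. from exciton lifetime and optical simulation) isolates the third, which is the essence of the quantitative analysis described.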

  2. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
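
The paired t-test used to compare the two quantification methods can be sketched as follows; the measurement values are hypothetical, not the paper's data.

```python
import math
from statistics import mean, stdev

def paired_t_statistic(method_a, method_b):
    """Paired t statistic for two measurements on the same samples:
    t = mean(d) / (sd(d) / sqrt(n)), with d = a - b."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical gamma-oryzanol determinations (same five samples, two methods):
densitometric = [2.51, 2.48, 2.55, 2.60, 2.47]
image_based   = [2.49, 2.50, 2.53, 2.58, 2.48]
t = paired_t_statistic(densitometric, image_based)
# |t| below ~2.78 (the 5% critical value for 4 df) -> no significant difference
```

A non-significant paired t statistic is the basis for the paper's conclusion that the two methods are interchangeable.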

  3. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  4. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  5. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  6. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are also commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
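
The CEA step (part 2 above) amounts to ranking measures by cost per unit of risk reduction. A minimal sketch with invented measures and units (not values from the study):

```python
def cost_effectiveness_ranking(measures):
    """Rank risk-reduction measures by cost per unit of risk reduction
    (lower ratio = more cost-effective).
    Input: {name: (annual_cost, risk_reduction)}."""
    ratios = {name: cost / dr for name, (cost, dr) in measures.items() if dr > 0}
    return sorted(ratios, key=ratios.get)

# Hypothetical measures (cost in kEUR/yr, risk reduction in expected
# person-days of unsafe water avoided per year):
measures = {
    "UV disinfection": (120.0, 30.0),
    "extra chlorination": (40.0, 8.0),
    "protected intake": (200.0, 35.0),
}
ranking = cost_effectiveness_ranking(measures)
```

Note that the cheapest measure in absolute terms is not necessarily the most cost-effective one, which is why the ratio, not the cost, drives the ranking.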

  7. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
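
The effect of imperfect coverage on reliability can be illustrated with a textbook two-unit redundant system (a simplification, not the paper's F18 digraph model); failure rate and coverage values are invented.

```python
import math

def duplex_reliability(t, lam, c):
    """Reliability at time t of two active redundant units (failure rate lam)
    with fault coverage c: a covered first failure (probability c) degrades
    the system to a single unit; an uncovered one fails it immediately.
    R(t) = e^(-2*lam*t) + 2*c*e^(-lam*t) * (1 - e^(-lam*t))"""
    return (math.exp(-2 * lam * t)
            + 2 * c * math.exp(-lam * t) * (1 - math.exp(-lam * t)))

# Invented values: mission time 100 h, failure rate 1e-3 /h
r_perfect = duplex_reliability(t=100.0, lam=1e-3, c=1.0)
r_imperfect = duplex_reliability(t=100.0, lam=1e-3, c=0.95)
```

With c = 1 the expression reduces to the classic parallel-system reliability 1 − (1 − e^(−λt))²; any c < 1 strictly lowers reliability, which is the effect the paper reports as significant.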

  8. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  9. Deficient Contractor Business Systems: Applying the Value at Risk (VAR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-01

    measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the...quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost...[Kindle version]. Retrieved from Amazon.com 83 Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review

  10. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.

  11. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information with what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 in leafy greens lag time models, and validation of the importance of cross-contamination during the washing process.
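The paper's Monte Carlo step (run with @RISK over an Excel model) can be re-sketched in plain Python: sample storage temperature and time, apply a growth model capped near 1 log CFU/day, and track the resulting contamination level. The growth model and storage distributions below are illustrative stand-ins, not the paper's fitted inputs:

```python
import random

random.seed(1)

def log_growth(temp_c, days):
    """Illustrative growth model: no growth below 5 C, rate rising linearly
    to 1 log CFU/day at 30 C and above."""
    rate = max(0.0, min(1.0, (temp_c - 5.0) / 25.0))  # log CFU per day
    return rate * days

initial_log_cfu_per_g = -1.0  # field-level starting point used in the paper

levels = []
for _ in range(10_000):
    temp = random.uniform(4.0, 15.0)  # retail storage temperature, C (assumed)
    days = random.uniform(1.0, 5.0)   # storage time, days (assumed)
    levels.append(initial_log_cfu_per_g + log_growth(temp, days))

mean_level = sum(levels) / len(levels)
p_growth = sum(1 for x in levels if x > initial_log_cfu_per_g) / len(levels)
```

Each simulated serving draws its own temperature-abuse history, so the output is a distribution of levels at consumption rather than a single point estimate.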

  12. Fruit and vegetable intake and prostate cancer risk: a meta-analysis.

    PubMed

    Meng, Hongzhou; Hu, Wenyi; Chen, Zhaodian; Shen, Yuehong

    2014-06-01

    Recent reports have examined the effect of fruit and vegetable intake on the risk of prostate cancer, but the results are inconsistent. A meta-analysis of prospective studies was conducted to arrive at quantitative conclusions about the contribution of vegetable and fruit intake to the incidence of prostate cancer. A comprehensive, systematic search of medical literature published up to June 2012 was performed to identify relevant studies. Separate meta-analyses were conducted for fruit and vegetable consumption. The presence of publication bias was assessed using Egger and Begg tests. In total, 16 cohort studies met the inclusion criteria and were included in the meta-analysis. The combined adjusted relative risk comparing highest with lowest categories showed that there was no association between vegetable and fruit consumption and prostate cancer incidence. The pooled relative risk was 0.97 (95%CI 0.93, 1.01) for vegetables and 1.02 (95%CI 0.98, 1.07) for fruit. There was no heterogeneity among the studies. No publication bias was detected. This meta-analysis suggests that total fruit or vegetable consumption may not exert a protective role in the risk of prostate cancer. © 2013 Wiley Publishing Asia Pty Ltd.
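The pooling step behind such a meta-analysis is a short calculation: inverse-variance averaging of log relative risks, with each study's standard error recovered from its confidence-interval width. The three study results below are invented for illustration, not the review's 16 cohorts:

```python
import math

# (relative risk, 95% CI lower, 95% CI upper) per study -- illustrative values
studies = [(0.95, 0.85, 1.06), (1.01, 0.92, 1.11), (0.98, 0.88, 1.09)]

weights, weighted_logs = [], []
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se**2                                  # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * log_rr)

pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
```

This is the fixed-effect form; a random-effects model (as used when heterogeneity is present) would add a between-study variance component to each weight.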

  13. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
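The underlying idea of handling source-to-source variability, borrowing strength across comparable facilities instead of fitting each sparse dataset alone, can be shown with a minimal moment-matched Beta prior; the paper's actual technique uses a full hierarchical Bayesian (MCMC) treatment, and all counts below are invented:

```python
# (observed events, demands) at several comparable facilities -- illustrative
facilities = [(1, 1200), (0, 800), (3, 2500), (0, 600)]

rates = [e / n for e, n in facilities]
m = sum(rates) / len(rates)                              # pooled mean rate
v = sum((r - m) ** 2 for r in rates) / (len(rates) - 1)  # between-source variance

# Moment-matched Beta(a, b) hyperprior across facilities
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

def posterior_mean(events, demands):
    """Posterior mean rate for one facility under the shared Beta prior:
    its raw rate is shrunk toward the pooled estimate."""
    return (a + events) / (a + b + demands)

raw_rate_f2 = 0 / 800                    # naive estimate: exactly zero
pooled_rate_f2 = posterior_mean(0, 800)  # nonzero: borrows strength from peers
```

The zero-event facility gets a nonzero but small rate estimate, which is exactly the behavior needed when event occurrence data are scarce.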

  14. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
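One of the characteristics mentioned, the Zipf's law coefficient, can be estimated as the slope of a least-squares fit of log(frequency) against log(rank) over n-gram counts; Quantiprot's own implementation may differ in detail, and the sequence below is arbitrary:

```python
import math
from collections import Counter

def zipf_coefficient(seq, n=1):
    """Slope of the log-log rank-frequency line for n-grams of seq."""
    counts = sorted(
        Counter(seq[i:i + n] for i in range(len(seq) - n + 1)).values(),
        reverse=True)
    xs = [math.log(rank) for rank in range(1, len(counts) + 1)]
    ys = [math.log(c) for c in counts]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den  # typically negative: frequency falls with rank

slope = zipf_coefficient("MKVLAAGLLALLAASAASAKKK", n=1)
```

As a single scalar per sequence, such a coefficient slots directly into the alignment-free feature space the abstract describes.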

  15. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen

  16. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
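The separation of variability and uncertainty that mc2d automates can be sketched with two nested Monte Carlo loops in plain Python: an outer loop samples uncertain parameters, an inner loop samples serving-to-serving variability, so the two dimensions never mix. The dose distribution and dose-response parameter below are illustrative, not the E. coli O157:H7 model from the article:

```python
import math
import random

random.seed(7)
N_UNCERTAINTY, N_VARIABILITY = 50, 2_000

risk_per_uncertainty_draw = []
for _ in range(N_UNCERTAINTY):
    # Uncertainty dimension: imperfectly known dose-response parameter r
    r = random.lognormvariate(math.log(1e-3), 0.5)
    ill = 0.0
    for _ in range(N_VARIABILITY):
        # Variability dimension: dose differs from serving to serving
        dose = random.lognormvariate(math.log(10.0), 1.0)  # CFU per serving
        ill += 1.0 - math.exp(-r * dose)  # exponential dose-response model
    risk_per_uncertainty_draw.append(ill / N_VARIABILITY)

risks = sorted(risk_per_uncertainty_draw)
median_risk = risks[len(risks) // 2]
p95_risk = risks[int(0.95 * len(risks))]  # uncertainty band around the risk
```

Each outer draw yields one variability-averaged risk, so the spread between `median_risk` and `p95_risk` reflects parameter uncertainty alone, which is the separation the second-order method is designed to preserve.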

  17. Anesthesia patient risk: a quantitative approach to organizational factors and risk management options.

    PubMed

    Paté-Cornell, M E; Lakats, L M; Murphy, D M; Gaba, D M

    1997-08-01

    The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence, or failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on the patient risk. We develop first a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the "state of the anesthesiologist" characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and we compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.

  18. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  19. Genome-wide Linkage Analysis for Identifying Quantitative Trait Loci Involved in the Regulation of Lipoprotein a (Lpa) Levels

    PubMed Central

    López, Sonia; Buil, Alfonso; Ordoñez, Jordi; Souto, Juan Carlos; Almasy, Laura; Lathrop, Mark; Blangero, John; Blanco-Vaca, Francisco; Fontcuberta, Jordi; Soria, José Manuel

    2009-01-01

    Lipoprotein Lp(a) levels are highly heritable and are associated with cardiovascular risk. We performed a genome-wide linkage analysis to delineate the genomic regions that influence the concentration of Lp(a) in families from the Genetic Analysis of Idiopathic Thrombophilia (GAIT) Project. Lp(a) levels were measured in 387 individuals belonging to 21 extended Spanish families. A total of 485 DNA microsatellite markers were genotyped to provide a 7.1 cM genetic map. A variance component linkage method was used to evaluate linkage and to detect quantitative trait loci (QTLs). The main QTL that showed strong evidence of linkage with Lp(a) levels was located at the structural gene for apo(a) on Chromosome 6 (LOD score=13.8). Interestingly, another QTL influencing Lp(a) concentration was located on Chromosome 2 with a LOD score of 2.01. This region contains several candidate genes. One of them is the tissue factor pathway inhibitor (TFPI), which has antithrombotic action and also has the ability to bind lipoproteins. However, quantitative trait association analyses performed with 12 SNPs in TFPI gene revealed no association with Lp(a) levels. Our study confirms previous results on the genetic basis of Lp(a) levels. In addition, we report a new QTL on Chromosome 2 involved in the quantitative variation of Lp(a). These data should serve as the basis for further detection of candidate genes and to elucidate the relationship between the concentration of Lp(a) and cardiovascular risk. PMID:18560444

  20. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography which utilizes a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  1. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skandamis, Panagiotis N., E-mail: pskan@aua.gr; Andritsos, Nikolaos, E-mail: pskan@aua.gr; Psomas, Antonios, E-mail: pskan@aua.gr

    The current status of the food supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum numbers of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish; ii) Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the public health tolerable burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to interpret in terms of control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for performance of individual processes and products have been established, all of them assisting in the achievement of FSO and hence, ALOP. In order to achieve FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered an important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications, e.g., COMBASE and Microbial Responses Viewer), or introduced into user

  2. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum numbers of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish; ii) Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the public health tolerable burden (it addresses the total `failure' that may be handled at a national level), it is difficult to interpret in terms of control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for performance of individual processes and products have been established, all of them assisting in the achievement of FSO and hence, ALOP. In order to achieve FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered an important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications, e.g., COMBASE and Microbial Responses Viewer), or introduced into user-friendly software
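The ALOP/FSO metrics described in these two records are commonly applied through the standard ICMSF log10-scale relation H0 - ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, R the reductions (e.g., a kill step) and I the increases (growth, recontamination). A minimal worked example with invented numbers:

```python
# All quantities in log10 CFU/g; values are illustrative only.
H0 = 1.0            # initial contamination of the raw material
reductions = [5.0]  # a 5-log kill step during processing
increases = [2.5]   # growth during distribution and storage
FSO = -2.0          # maximum level allowed at the time of consumption

level_at_consumption = H0 - sum(reductions) + sum(increases)
meets_fso = level_at_consumption <= FSO
```

Here the process falls short of the FSO by 0.5 log, so either a stronger reduction step or tighter control of growth (a performance objective earlier in the chain) would be required.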

  3. The health impact of trade and investment agreements: a quantitative systematic review and network co-citation analysis.

    PubMed

    Barlow, Pepita; McKee, Martin; Basu, Sanjay; Stuckler, David

    2017-03-08

    Regional trade agreements are major international policy instruments that shape macro-economic and political systems. There is widespread debate as to whether and how these agreements pose risks to public health. Here we perform a comprehensive systematic review of quantitative studies of the health impact of trade and investment agreements. We identified studies from searches in PubMed, Web of Science, EMBASE, and Global Health Online. Research articles were eligible for inclusion if they were quantitative studies of the health impacts of trade and investment agreements or policy. We systematically reviewed study findings, evaluated quality using the Quality Assessment Tool from the Effective Public Health Practice Project, and performed network citation analysis to study disciplinary siloes. Seventeen quantitative studies met our inclusion criteria. There was consistent evidence that implementing trade agreements was associated with increased consumption of processed foods and sugar-sweetened beverages. Granting import licenses for patented drugs was associated with increased access to pharmaceuticals. Implementing trade agreements and associated policies was also correlated with higher cardiovascular disease incidence and higher Body Mass Index (BMI), whilst correlations with tobacco consumption, under-five mortality, maternal mortality, and life expectancy were inconclusive. Overall, the quality of studies is weak or moderately weak, and co-citation analysis revealed a relative isolation of public health from economics. We identified limitations in existing studies which preclude definitive conclusions of the health impacts of regional trade and investment agreements. Few address unobserved confounding, and many possible consequences and mechanisms linking trade and investment agreements to health remain poorly understood. Results from our co-citation analysis suggest scope for greater interdisciplinary collaboration. Notwithstanding these limitations, our

  4. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. B. LIMITATIONS The photograph analysis in this research is strictly limited to a quantitative analysis in...NAVAL POSTGRADUATE SCHOOL MONTEREY, CALIFORNIA THESIS Approved for public release. Distribution is unlimited. QUANTITATIVE

  5. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue, using a urea- and SDS-based buffer, followed by LC-MS/MS analysis was developed. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in quantitative analysis associated with the different sampling strategies to be evaluated, and a proper number of replicates for future quantitative analyses to be defined.

  6. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  7. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  8. Migraine Headache and Ischemic Stroke Risk: An Updated Meta-analysis

    PubMed Central

    Spector, June T.; Kahn, Susan R.; Jones, Miranda R.; Jayakumar, Monisha; Dalal, Deepan; Nazarian, Saman

    2010-01-01

    Background Observational studies, including recent large cohort studies which were unavailable for prior meta-analysis, have suggested an association between migraine headache and ischemic stroke. We performed an updated meta-analysis to quantitatively summarize the strength of association between migraine and ischemic stroke risk. Methods We systematically searched electronic databases, including MEDLINE and EMBASE, through February 2009 for studies of human subjects in the English language. Study selection using a priori selection criteria, data extraction, and assessment of study quality were conducted independently by reviewer pairs using standardized forms. Results Twenty-one (60%) of 35 studies met the selection criteria, for a total of 622,381 participants (13 case-control, 8 cohort studies) included in the meta-analysis. The pooled adjusted odds ratio of ischemic stroke comparing migraineurs to non-migraineurs using a random effects model was 2.30 (95% confidence interval [CI], 1.91-2.76). The pooled adjusted effect estimates for studies that reported relative risks and hazard ratios, respectively, were 2.41 (95% CI, 1.81-3.20) and 1.52 (95% CI, 0.99-2.35). The overall pooled effect estimate was 2.04 (95% CI, 1.72-2.43). Results were robust to sensitivity analyses excluding lower quality studies. Conclusions Migraine is associated with increased ischemic stroke risk. These findings underscore the importance of identifying high-risk migraineurs with other modifiable stroke risk factors. Future studies of the effect of migraine treatment and modifiable risk factor reduction on stroke risk in migraineurs are warranted. PMID:20493462

  9. Qualitative and quantitative analysis of palmar dermatoglyphics among smokeless tobacco users.

    PubMed

    Vijayaraghavan, Athreya; Aswath, Nalini

    2015-01-01

    Palm prints, once formed, do not change throughout life and are not influenced by the environment. Palmar dermatoglyphics can indicate the development of potentially malignant and malignant lesions and help identify persons at high risk of developing oral submucous fibrosis (OSMF) and oral squamous cell carcinoma (OSCC). To analyze the qualitative [finger ridge pattern and presence or absence of hypothenar pattern] and quantitative [mean ATD angle and total AB ridge count] variations in palmar dermatoglyphics in patients suffering from OSMF and OSCC. A prospective comparative study was conducted among 40 patients (Group I: 10 smokeless tobacco users with OSMF; Group II: 10 smokeless tobacco users with OSCC; Group III: 10 smokeless tobacco users without OSMF or OSCC; and Group IV: 10 controls without the smokeless tobacco habit and without OSMF or OSCC). The palm prints were recorded using an HP inkjet scanner. The patients were asked to place the palm gently on the scanner with the fingers wide apart from each other. The images of the palm prints were edited, and qualitative and quantitative analyses were done. Statistical analyses such as the Kruskal-Wallis test, post hoc tests and analysis of variance were performed. Highly significant differences in the finger ridge pattern, hypothenar pattern and mean ATD angle (P<0.001), and in the total AB ridge count (P=0.005), were obtained in OSMF and OSCC patients. There is a predominance of arches and loops, presence of the hypothenar pattern, and a decrease in mean ATD angle and total AB ridge count in OSMF and oral cancer patients. Palmar dermatoglyphics can predict the probable occurrence of OSMF and OSCC in smokeless tobacco users.

  10. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen at a specific site are treated as the risk. The aggregate crash occurrence probability over all exposed vehicles is estimated based on the empirical Bayesian method. As for consequences, crashes may cause not only direct losses (e.g., occupant injuries and property damage) but also indirect losses; the indirect losses are expressed as the extra delays calculated using the deterministic queuing diagram method. The direct and indirect losses are uniformly monetized and treated as the consequence of this risk. The potential cost of crashes, used as the criterion to rank high-risk sites, can then be explicitly expressed as the crash probability aggregated over all passing vehicles multiplied by the corresponding crash consequences. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
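
The ranking criterion just described, potential crash cost as aggregate probability times monetized consequence, can be sketched in a few lines. All numbers below (crash probabilities, traffic volumes, loss values) are invented for illustration and are not from the Shanghai case study.

```python
def site_risk(crash_prob_per_vehicle, vehicles_per_year, direct_loss,
              delay_veh_hours, value_of_time):
    """Potential crash cost of a site: expected crashes across all exposed
    vehicles times the monetized consequence per crash (direct losses plus
    delay-based indirect losses)."""
    expected_crashes = crash_prob_per_vehicle * vehicles_per_year
    consequence = direct_loss + delay_veh_hours * value_of_time
    return expected_crashes * consequence

# Hypothetical expressway segments:
sites = {"A": site_risk(2e-6, 5e5, 120_000, 300, 25),
         "B": site_risk(5e-7, 8e5, 120_000, 500, 25)}
ranked = sorted(sites, key=sites.get, reverse=True)   # hotspot ranking
```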

  11. Clinical and quantitative analysis of patients with crowned dens syndrome.

    PubMed

    Takahashi, Teruyuki; Tamura, Masato; Takasu, Toshiaki; Kamei, Satoshi

    2017-05-15

    Crowned dens syndrome (CDS) is a radioclinical entity defined by calcium deposition on the transverse ligament of the atlas (TLA). In this study, novel semi-quantitative diagnostic criteria for CDS, which evaluate the degree of calcification on the TLA by cervical CT, are proposed. From January 2010 to September 2014, 35 patients diagnosed with CDS by cervical CT were enrolled as subjects. Based on the novel criteria, calcium deposition on the TLA was classified by "Stage" and "Grade" and combined into a score that was evaluated semi-quantitatively. The correlation between the calcification score and CRP level or pain score, and the effects of treatments such as NSAIDs and corticosteroids, were statistically analyzed. The total calcification score, obtained by adding the "Stage" and "Grade" scores, demonstrated a significantly strong linear correlation with CRP level (R²=0.823, **p<0.01). In the multiple comparison test for treatment effects, significant improvements in CRP level and pain score were demonstrated after corticosteroid therapy (**p<0.01) compared with NSAIDs. In the conditional logistic regression analysis, rapid termination of corticosteroid therapy was an independent risk factor for relapse of cervico-occipital pain [OR=50.761, *p=0.0419]. The degree of calcification on the TLA evaluated by the novel semi-quantitative criteria significantly correlated with CRP level. In the treatment of CDS, it is recommended that a low dosage (15-30 mg) of corticosteroids be used as first-line therapy rather than conventional NSAIDs, and that the corticosteroid dosage then be tapered gradually. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Dose-Dependent Associations between Wine Drinking and Breast Cancer Risk - Meta-Analysis Findings.

    PubMed

    Chen, Jia-Yan; Zhu, Hong-Cheng; Guo, Qing; Shu, Zheng; Bao, Xu-Hui; Sun, Feng; Qin, Qin; Yang, Xi; Zhang, Chi; Cheng, Hong-Yan; Sun, Xin-Chen

    2016-01-01

    To investigate any potential association between wine drinking and breast cancer risk, we quantitatively assessed associations by conducting a meta-analysis of evidence from observational studies. In May 2014, we performed electronic searches in PubMed, EmBase and the Cochrane Library to identify studies examining the effect of wine drinking on breast cancer incidence. The relative risk (RR) or odds ratio (OR) was used to measure any such association. The analysis was further stratified by confounding factors that could influence the results. A total of twenty-six studies (eight case-control and eighteen cohort studies) involving 21,149 cases were included in our meta-analysis. Our study demonstrated that wine drinking was associated with breast cancer risk. A 36% increase in breast cancer risk was observed across overall studies based on the highest-versus-lowest model, with a combined RR of 1.0059 (95%CI 0.97-1.05) in the dose-response analysis. However, 5 g/d of ethanol from wine appeared protective in our non-linear model. Our findings indicate that wine drinking is associated with breast cancer risk in a dose-dependent manner: high consumption of wine contributes to breast cancer risk, with protection exerted at low doses. Further investigations are needed for clarification.

  13. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    PubMed

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area under the receiver operating characteristic curve (AUC) was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
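
The ratio computation itself is simple once the reference statistic is chosen. A small sketch of an SBR using the 75th percentile of whole-brain voxels as the non-displaceable reference, with synthetic voxel values standing in for real FP-CIT counts:

```python
import random
import statistics

def percentile(values, q):
    """Nearest-rank style percentile of a list (simplified helper)."""
    s = sorted(values)
    idx = min(len(s) - 1, int(round(q / 100 * (len(s) - 1))))
    return s[idx]

def specific_binding_ratio(striatal_voxels, reference_voxels):
    """SBR = (target - nondisplaceable reference) / reference, with the
    reference level taken as the 75th percentile of the reference-region
    (here: whole-brain) voxel intensities."""
    ref = percentile(reference_voxels, 75)
    target = statistics.fmean(striatal_voxels)
    return (target - ref) / ref

random.seed(0)
putamen = [random.gauss(8.0, 0.5) for _ in range(1000)]       # hypothetical counts
whole_brain = [random.gauss(2.0, 0.4) for _ in range(50000)]
sbr = specific_binding_ratio(putamen, whole_brain)
```

Using a high percentile of a large region keeps the reference estimate stable against a small fraction of noisy voxels, which is the noise-reduction rationale stated in the abstract.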

  14. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.
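
A probabilistic assessment of this kind samples a contamination level and an individual reaction threshold per simulated eating occasion. The toy Monte Carlo sketch below uses invented contamination values and an assumed lognormal threshold distribution; neither the numbers nor the threshold parameters are from the study.

```python
import random
random.seed(1)

def predicted_reaction_rate(contam_ppm, serving_g, n=100_000,
                            mu_log_mg=0.0, sigma_log_mg=1.5):
    """Monte Carlo sketch: dose = contamination (ppm = mg/kg) x serving size;
    a reaction occurs when the dose exceeds the simulated individual's
    (lognormally distributed) threshold. Parameters are illustrative."""
    reactions = 0
    for _ in range(n):
        dose_mg = random.choice(contam_ppm) * serving_g / 1000.0
        threshold_mg = random.lognormvariate(mu_log_mg, sigma_log_mg)
        if dose_mg > threshold_mg:
            reactions += 1
    return reactions / n

# Hypothetical residue survey of advisory-labeled products (ppm whole peanut):
rate = predicted_reaction_rate([0.0, 0.0, 2.5, 10.0, 40.0], serving_g=50)
```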

  15. Household physical activity and cancer risk: a systematic review and dose-response meta-analysis of epidemiological studies

    PubMed Central

    Shi, Yun; Li, Tingting; Wang, Ying; Zhou, Lingling; Qin, Qin; Yin, Jieyun; Wei, Sheng; Liu, Li; Nie, Shaofa

    2015-01-01

    Previous epidemiological studies have reported conflicting results on the association between household physical activity and cancer risk. We conducted a meta-analysis to investigate the relationship between household physical activity and cancer risk quantitatively, especially in a dose-response manner. PubMed, Embase, Web of Science and the Cochrane Library were searched for cohort or case-control studies that examined the association between household physical activity and cancer risks. Random-effects models were used to estimate the summary relative risks (RRs), and nonlinear or linear dose-response meta-analyses were performed to estimate the trend from the correlated log RR estimates across levels of household physical activity. In total, 30 studies including 41 comparisons met the inclusion criteria. Overall cancer risk was reduced by 16% among people with the highest household physical activity compared to those with the lowest (RR = 0.84, 95% CI = 0.76–0.93). The dose-response analyses indicated an inverse linear association between household physical activity and cancer risk: the relative risk was 0.98 (95% CI = 0.97–1.00) per additional 10 MET-hours/week and 0.99 (95% CI = 0.98–0.99) per additional 1 hour/week. These findings provide quantitative data supporting that household physical activity is associated with decreased cancer risk in a dose-dependent manner. PMID:26443426
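
Under a log-linear dose-response model, RR(x) = exp(beta·x), a pooled RR for one increment rescales to any other increment by exponentiation. A two-line illustration using the reported 0.98 per 10 MET-hours/week:

```python
import math

# Log-linear dose-response: RR(x) = exp(beta * x).
beta = math.log(0.98) / 10.0          # slope implied by RR 0.98 per 10 MET-h/wk
rr_per_20 = math.exp(beta * 20)       # RR for a 20 MET-h/wk increase (= 0.98**2)
rr_per_5 = math.exp(beta * 5)         # RR for a 5 MET-h/wk increase
```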

  16. Quantitative Microbial Risk Assessment of Pharmaceutical Products.

    PubMed

    Eissa, Mostafa Essam

    2017-01-01

    Monitoring of microbiological quality in the pharmaceutical industry is an important criterion required to justify safe product release to the drug market. Good manufacturing practice and efficient control of the bioburden level of product components are critical parameters that influence the microbiological cleanliness of medicinal products. However, because microbial dispersion through the samples follows a Poisson distribution, the rate of detection of microbiologically defective samples, lambda (λ), decreases as the number of defective units per batch decreases. When a dose-response model of infection (P_inf) for a specific objectionable microbe is integrated with a contamination module, the overall probability of infection from a single batch of pharmaceutical product can be estimated. Combining P_inf with the detectability of the test (P_det) yields a value that can be used as a quantitative measure of the possibility of passing contaminated batch units of product carrying a certain load of a specific pathogen, and infecting the final consumer, without detection in the firm. The simulation study can be used to assess the risk of contamination and infection from objectionable microorganisms for sterile and non-sterile products. LAY ABSTRACT: Microbial contamination of pharmaceutical products is a global problem that may lead to infection and possibly death. While reputable pharmaceutical companies strive to deliver microbiologically safe products, it would be helpful to apply an assessment system for the current risk associated with pharmaceutical batches delivered to the drug market. The current methodology may also be helpful in determining the degree of improvement or deterioration along the batch processing flow until reaching the final consumer. Moreover, the present system is flexible and can be applied in other industries, such as food, cosmetics, or medical device manufacturing and processing, to assess the microbiological risk of their products.
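
The chain described above, contamination, escape from detection, then dose-response infection, multiplies a few probabilities. A minimal sketch using an exponential dose-response model; the parameter values (defect rate, detectability, dose, r) are invented for illustration and are not from the article:

```python
import math

def batch_risk(p_unit_contaminated, p_detect, dose_cfu, r=1e-3):
    """Chance that a contaminated unit escapes detection and infects the
    consumer. Exponential dose-response model: P_inf = 1 - exp(-r * dose)."""
    p_inf = 1.0 - math.exp(-r * dose_cfu)
    return p_unit_contaminated * (1.0 - p_detect) * p_inf

# Illustrative numbers: 1 defective unit per 1000, 80% test detectability,
# ingested dose of 500 CFU of the objectionable organism.
risk = batch_risk(p_unit_contaminated=0.001, p_detect=0.8, dose_cfu=500)
```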

  17. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  18. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  19. Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature.

    PubMed

    Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier

    2016-02-01

    Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may yield an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens.
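
The storage-dependent likelihood of consumption can be folded into a simulation as a logistic function of time and temperature. The coefficients below are invented for illustration, not the values fitted from the sensorial analysis:

```python
import math

# Assumed coefficients: consumption becomes less likely as storage time
# (days) and temperature (deg C) grow, because the milk perceptibly spoils.
B0, B_TIME, B_TEMP = 6.0, -0.8, -0.25   # illustrative values only

def p_consumed(storage_days, temp_c):
    """Logistic probability that a stored raw-milk serving is actually drunk."""
    z = B0 + B_TIME * storage_days + B_TEMP * temp_c
    return 1.0 / (1.0 + math.exp(-z))

fresh = p_consumed(1, 4)      # short, refrigerated storage
abused = p_consumed(7, 20)    # long, warm storage: an "extreme scenario"
```

Weighting simulated servings by this probability down-weights exactly the long, warm storage scenarios that would otherwise dominate the upper tail of the risk estimate.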

  20. Depression as a risk factor for dementia and mild cognitive impairment: a meta-analysis of longitudinal studies.

    PubMed

    Gao, Yuan; Huang, Changquan; Zhao, Kexiang; Ma, Louyan; Qiu, Xuan; Zhang, Lei; Xiu, Yun; Chen, Lin; Lu, Wei; Huang, Chunxia; Tang, Yong; Xiao, Qian

    2013-05-01

    This study examined whether depression was a risk factor for onset of dementia including Alzheimer's disease (AD), vascular dementia (VD) and any dementia, and mild cognitive impairment (MCI) by using a quantitative meta-analysis of longitudinal studies. EMBASE and MEDLINE were searched for articles published up to February 2011. All studies that examined the relationship between depression and the onset of dementia or MCI were included. Pooled relative risk was calculated using fixed-effects models. Twelve studies met our inclusion criteria for this meta-analysis. All subjects were without dementia or MCI at baseline. Four, two, five, and four studies compared the incidence of AD, VD, any dementia, and MCI between subjects with or without depression, respectively. After pooling all the studies, subjects with depression had a higher incidence of AD (relative risk (RR): 1.66, 95% confidence interval (CI): 1.29-2.14), VD (RR: 1.89, 95% CI: 1.19-3.01), any dementia (RR: 1.55, 95% CI: 1.31-2.83), and MCI (RR: 1.97, 95% CI: 1.53-2.54) than those without depression. The quantitative meta-analysis showed that depression was a major risk factor for the incidence of dementia (including AD, VD, and any dementia) and MCI. Copyright © 2012 John Wiley & Sons, Ltd.

  1. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  2. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept we developed a software library, "RTToolbox", following the presented design principles. The RTToolbox is available as an open-source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
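
The "algorithmic decoupling via a dose iterator pattern" mentioned above can be sketched as follows: analysis algorithms consume an iterator of dose values and stay independent of how the doses are stored (3-D grid, sparse list, database). The class and function names are illustrative, not the RTToolbox API.

```python
from typing import Iterable, Iterator

class GridDoseAccessor:
    """One possible dose source: a dense grid of dose values."""
    def __init__(self, grid):            # e.g. nested lists of dose values
        self._grid = grid

    def doses(self) -> Iterator[float]:
        for row in self._grid:
            yield from row               # flatten the grid into a dose stream

def max_dose(doses: Iterable[float]) -> float:
    """Works for ANY dose source exposing an iterator, not just grids."""
    return max(doses)

def volume_above(doses: Iterable[float], threshold: float, voxel_cc: float) -> float:
    """Volume (cc) receiving at least `threshold` dose."""
    return sum(voxel_cc for d in doses if d >= threshold)

acc = GridDoseAccessor([[0.0, 1.2], [2.5, 1.9]])
```

Swapping in a sparse or database-backed accessor requires no change to the analysis functions, which is the point of the decoupling.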

  3. Reliability and safety, and the risk of construction damage in mining areas

    NASA Astrophysics Data System (ADS)

    Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz

    2018-04-01

    This article concerns the reliability and safety of building structures in mining areas, with particular emphasis on quantitative risk analysis of buildings. The issues of threat assessment and risk estimation in the design of facilities in mining exploitation areas are presented, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. The article presents a concept for quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated with an example of a building located in an area affected by mining exploitation.

  4. Evaluation of New Zealand's high-seas bottom trawl closures using predictive habitat models and quantitative risk assessment.

    PubMed

    Penney, Andrew J; Guinotte, John M

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur? 2) What is the likelihood of fisheries interaction with these VMEs? 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost:benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost:benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas.

  5. Application of preliminary risk analysis at marble finishing plants in Recife's metropolitan area.

    PubMed

    de Melo Neto, Rútilo P; Kohlman Rabbani, Emilia R

    2012-01-01

    The finishing of marble occurs in quarries all over Brazil and is the most significant segment of the ornamental stone sector, with 7,000 businesses. Recife's Metropolitan Area (RMR) contains approximately 106 marble quarries, 25 of them unionized. The study applied Preliminary Risk Analysis at two unionized quarries: M1, a small business, and M2, a micro enterprise. Both the administrative and the productive sectors were evaluated. The fieldwork was done in December 2010. The study revealed that the two quarries carried moderate risks in the administrative sector, mainly due to ergonomic factors, and high risks in the productive sectors, specifically because of excess noise, airborne dust, and precarious electrical installations. Building on the results of this qualitative analysis, a quantitative study is needed to determine the most adequate modes of protection, to assist in the management of these risks, guaranteeing the safety and health of the workers and consequently improving productivity in this sector.

  6. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
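
The Beer-Lambert linearity the method relies on makes a mixture spectrum a concentration-weighted sum of component spectra, A = S·c. As a simplified stand-in for the ICA resolution step, the sketch below unmixes a synthetic two-component mixture by ordinary least squares with *known* pure spectra; the ICA approach of the paper goes further and recovers the pure profiles themselves from the calibration spectra. All numbers are made up.

```python
# Pure-component spectra at five wavelengths (each tuple: component 1, component 2):
S = [(1.0, 0.1), (0.8, 0.3), (0.5, 0.6), (0.2, 0.9), (0.1, 1.0)]
c_true = (0.7, 0.4)                                    # true concentrations
A = [s1 * c_true[0] + s2 * c_true[1] for s1, s2 in S]  # noise-free mixture spectrum

# Least squares via the normal equations (S^T S) c = S^T A for two unknowns:
a11 = sum(s1 * s1 for s1, _ in S)
a12 = sum(s1 * s2 for s1, s2 in S)
a22 = sum(s2 * s2 for _, s2 in S)
b1 = sum(s1 * a for (s1, _), a in zip(S, A))
b2 = sum(s2 * a for (_, s2), a in zip(S, A))
det = a11 * a22 - a12 * a12
c_est = ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

With overlapping but linearly independent component spectra the concentrations are recovered exactly in the noise-free case, which is why resolving the profiles (by ICA or otherwise) is the hard part of reference-free analysis.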

  7. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semi-quantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
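
The batch workflow ImatraNMR automates, integrating the same signal regions across many spectra and exporting the areas as CSV, can be sketched in a few lines. The spectra below are synthetic, the ppm axis is assumed uniform, and a simple rectangle rule stands in for proper peak integration.

```python
def integrate(ppm, intensity, lo, hi):
    """Rectangle-rule area of the points falling inside [lo, hi] ppm
    (assumes a uniformly spaced ppm axis)."""
    step = abs(ppm[1] - ppm[0])
    return sum(y for x, y in zip(ppm, intensity) if lo <= x <= hi) * step

ppm = [i * 0.01 for i in range(1000)]          # 0.00 .. 9.99 ppm
spectra = {"sample1": [1.0 if 2.0 <= x <= 2.5 else 0.0 for x in ppm],
           "sample2": [2.0 if 2.0 <= x <= 2.5 else 0.0 for x in ppm]}
regions = [("signalA", 2.0, 2.5)]              # pre-saved integration areas

rows = ["spectrum,region,area"]                # CSV-style output
for name, y in spectra.items():
    for label, lo, hi in regions:
        rows.append(f"{name},{label},{integrate(ppm, y, lo, hi):.3f}")
```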

  8. Lung Cancer Risk in Painters: A Meta-Analysis

    PubMed Central

    Guha, Neela; Merletti, Franco; Steenland, Nelson Kyle; Altieri, Andrea; Cogliano, Vincent; Straif, Kurt

    2010-01-01

    Objective We conducted a meta-analysis to quantitatively compare the association between occupation as a painter and the incidence or mortality from lung cancer. Data sources PubMed and the reference lists of pertinent publications were searched and reviewed. For the meta-analysis, we used data from 47 independent cohort, record linkage, and case–control studies (from a total of 74 reports), including > 11,000 incident cases or deaths from lung cancer among painters. Data extraction Three authors independently abstracted data and assessed study quality. Data synthesis The summary relative risk (meta-RR, random effects) for lung cancer in painters was 1.35 [95% confidence interval (CI), 1.29–1.41; 47 studies] and 1.35 (95% CI, 1.21–1.51; 27 studies) after controlling for smoking. The relative risk was higher in never-smokers (meta-RR = 2.00; 95% CI, 1.09–3.67; 3 studies) and persisted when restricted to studies that adjusted for other occupational exposures (meta-RR = 1.57; 95% CI, 1.21–2.04; 5 studies). The results remained robust when stratified by study design, sex, and study location and are therefore unlikely due to chance or bias. Furthermore, exposure–response analyses suggested that the risk increased with duration of employment. Conclusion These results support the conclusion that occupational exposures in painters are causally associated with the risk of lung cancer. PMID:20064777

  9. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in production of pork meat and products across MSs. For the case study MSs the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10^4 CFU/g). Therefore, it is concluded that interventions should be focused on either decreasing the level of Salmonella in the feces of infected pigs, the introduction of a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or a control step to reduce the level of Salmonella on the carcass post-evisceration. © 2016 Society for Risk Analysis.

  10. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
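
Concentration of supply can be measured with the standard tool for industry concentration, a Herfindahl-Hirschman-style index, applied to country production shares rather than firm shares. A sketch with illustrative (not actual) production figures:

```python
def hhi(production_by_country):
    """Herfindahl-Hirschman-style index: sum of squared supply shares.
    Ranges from near 0 (highly diversified) to 1.0 (single-country supply)."""
    total = sum(production_by_country.values())
    return sum((v / total) ** 2 for v in production_by_country.values())

# Illustrative production figures, not real statistics:
ree = hhi({"China": 95, "Others": 5})                        # highly concentrated
copper = hhi({"Chile": 30, "Peru": 12, "China": 9, "Others": 49})
```

Multiplying each country's share by a political-risk score before summing would combine the two factors discussed above into a single comparative supply-risk measure.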

  11. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  12. Analysis of perceived risk among construction workers: a cross-cultural study and reflection on the Hofstede model.

    PubMed

    Martinez-Fiestas, Myriam; Rodríguez-Garzón, Ignacio; Delgado-Padial, Antonio; Lucas-Ruiz, Valeriano

    2017-09-01

    This article presents a cross-cultural study of perceived risk in the construction industry. Worker samples from three countries were studied: Spain, Peru and Nicaragua. The main goal was to explain how construction workers perceive their occupational hazards and to analyze how this perception is related to their national culture. The model used to measure perceived risk was the psychometric paradigm. The results show three very similar profiles, indicating that risk perception is independent of nationality. A cultural analysis was conducted using the Hofstede model. The results of this analysis and its relation to perceived risk showed that risk perception in construction is independent of national culture. Finally, a multiple linear regression analysis was conducted to determine which qualitative attributes predict the overall quantitative magnitude of risk perception. All of the findings have important implications for the management of safety in the workplace.

  13. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. In recent years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed together and may conditionally be expressed in one common currency. This allows the comparison of adverse and beneficial effects to be both qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision, much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so by working together with the fields of Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of six 'state of the art' papers covering the above areas, and a paper integrating the separate (re)views, are published in this volume. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and pollen and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
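
    The proposed adjustment lends itself to a short sketch. Everything below (function names, the reference concentration, and all numeric values) is hypothetical; it only illustrates the idea of scaling a tier I risk quotient's exposure term by a crop attractiveness factor derived from nectar sugar concentration:

```python
# Illustrative sketch of the proposed exposure adjustment: scale a tier I
# pollinator risk quotient by a crop attractiveness factor (CAF) derived
# from nectar sugar concentration. The reference value and all numbers
# are hypothetical stand-ins, not values from the paper.
def crop_attractiveness_factor(sugar_conc, reference_conc=0.5):
    """CAF as nectar sugar concentration (g/g) relative to a reference crop."""
    return sugar_conc / reference_conc

def risk_quotient(exposure_dose, toxicity_endpoint, caf=1.0):
    """Tier I risk quotient: (exposure x attractiveness) / toxicity endpoint."""
    return exposure_dose * caf / toxicity_endpoint

dose = 0.2   # hypothetical exposure dose (ug a.i./bee/day)
ld50 = 0.8   # hypothetical acute oral LD50 (ug a.i./bee)

rq_default = risk_quotient(dose, ld50)                                # unadjusted
rq_low = risk_quotient(dose, ld50, crop_attractiveness_factor(0.1))   # unattractive crop
rq_high = risk_quotient(dose, ld50, crop_attractiveness_factor(0.6))  # attractive crop
```

With the adjustment, an unattractive crop's risk quotient shrinks while an attractive crop's grows, making the screening assessment crop-specific.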

  15. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with greater impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed assessing the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. © 2015 Blackwell Verlag GmbH.
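
    The reported sensitivity figures are rank correlations between model inputs and the simulated risk. A toy Monte Carlo sketch of that kind of analysis, with an invented dose-response model and input distributions that are not the authors', looks like this:

```python
import numpy as np

# Toy Monte Carlo sketch of the kind of sensitivity analysis reported:
# simulate a simplified dose-response risk model, then rank-correlate each
# input with the output (Spearman r), as @RISK-style tools do. The model
# structure and all distributions are invented for illustration only.
rng = np.random.default_rng(0)
n = 10_000

larvae_surviving = rng.lognormal(mean=2.0, sigma=1.0, size=n)  # larvae per serving
p_inf_per_larva = rng.beta(2, 200, size=n)                     # infection prob. per larva
# Exponential-type dose-response: P(illness) = 1 - (1 - p)^dose
risk = 1.0 - (1.0 - p_inf_per_larva) ** larvae_surviving

def spearman(a, b):
    """Spearman rank correlation via Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

r_larvae = spearman(larvae_surviving, risk)  # larger r: the dominant driver
r_pinf = spearman(p_inf_per_larva, risk)     # smaller r: a weaker driver
```

Ranking inputs by |r| is what identifies surviving larvae as the stage with the greatest impact on risk, and hence the most promising intervention point.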

  16. Defining High-Risk Precursor Signaling to Advance Breast Cancer Risk Assessment and Prevention

    DTIC Science & Technology

    2017-03-01

    KEYWORDS: 3. ACCOMPLISHMENTS: Aim 1: Functional analysis of progenitor and stem cells in high-risk tissues. Major Task 1: Quantitation of LP (Luminal Progenitor) and basal stem cell (MASC) populations. A. Quantitation of LP and basal stem cell (MASC) populations. We have continued to add patients to the cohorts between months 12 and 24. (This reporting period ...)

  17. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  18. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we computed the morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared the results to those of a pathologist, demonstrating 70% agreement. Second, to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute’s Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829

  19. Quantitative proteomics analysis using 2D-PAGE to investigate the effects of cigarette smoke and aerosol of a prototypic modified risk tobacco product on the lung proteome in C57BL/6 mice.

    PubMed

    Elamin, Ashraf; Titz, Bjoern; Dijon, Sophie; Merg, Celine; Geertz, Marcel; Schneider, Thomas; Martin, Florian; Schlage, Walter K; Frentzel, Stefan; Talamo, Fabio; Phillips, Blaine; Veljkovic, Emilija; Ivanov, Nikolai V; Vanscheeuwijck, Patrick; Peitsch, Manuel C; Hoeng, Julia

    2016-08-11

    Smoking is associated with several serious diseases, such as lung cancer and chronic obstructive pulmonary disease (COPD). Within our systems toxicology framework, we are assessing whether potential modified risk tobacco products (MRTP) can reduce smoking-related health risks compared to conventional cigarettes. In this article, we evaluated to what extent 2D-PAGE/MALDI MS/MS (2D-PAGE) can complement the iTRAQ LC-MS/MS results from a previously reported mouse inhalation study, in which we assessed a prototypic MRTP (pMRTP). Selected differentially expressed proteins identified by both LC-MS/MS and 2D-PAGE approaches were further verified using reverse-phase protein microarrays. LC-MS/MS captured the effects of cigarette smoke (CS) on the lung proteome more comprehensively than 2D-PAGE. However, an integrated analysis of both proteomics data sets showed that 2D-PAGE data complement the LC-MS/MS results by supporting the overall trend of lower effects of pMRTP aerosol than CS on the lung proteome. Biological effects of CS exposure supported by both methods included increases in immune-related, surfactant metabolism, proteasome, and actin cytoskeleton protein clusters. Overall, while 2D-PAGE has its value, especially as a complementary method for the analysis of effects on intact proteins, LC-MS/MS approaches will likely be the method of choice for proteome analysis in systems toxicology investigations. Quantitative proteomics is anticipated to play a growing role within systems toxicology assessment frameworks in the future. To further understand how different proteomics technologies can contribute to toxicity assessment, we conducted a quantitative proteomics analysis using 2D-PAGE and isobaric tag-based LC-MS/MS approaches and compared the results produced from the 2 approaches. 
Using a prototypic modified risk tobacco product (pMRTP) as our test item, we show, in comparison with cigarette smoke, how 2D-PAGE results can complement and support LC-MS/MS data, demonstrating ...

  20. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    PubMed

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease-free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidence collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10⁻⁶ (95% CI = 7.25×10⁻⁸ to 1.92×10⁻⁵). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI = 0.042 to 0.97). The expected number of CBPP-infected animals exported in any given year is 1.28 (95% CI = 0.021 to 5.42). According to the risk estimate, an average of 2.73×10⁶ animals (90% CI = 10,674 to 5.9×10⁶) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI = 0.24 to 23.18) for the first infected animal to be included in a consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk. Copyright © 2015 Elsevier B.V. All rights reserved.
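
    Setting the Monte Carlo distributions aside, the headline quantities follow from the per-animal infection probability with simple formulas. The deterministic back-of-envelope values below use only the abstract's point estimate, so they differ from the paper's simulated medians and intervals:

```python
# Sketch of how a per-animal infection probability scales up to annual
# entry risk in an import risk assessment of this kind. Only the point
# estimates from the abstract are used; the paper's own figures come from
# propagating full uncertainty distributions, so these deterministic
# values are illustrative rather than reproductions of the results.
p_animal = 4.76e-6      # probability a single export animal is infected
n_per_year = 280_000    # animals exported per year

# Probability that at least one infected animal enters in a year
# (complement of all n animals being uninfected):
p_any = 1.0 - (1.0 - p_animal) ** n_per_year

# Expected number of infected animals exported per year:
expected_infected = p_animal * n_per_year

# Expected number of animals exported before the first infected one,
# and the corresponding waiting time in years:
animals_to_first = 1.0 / p_animal
years_to_first = animals_to_first / n_per_year
```

The gap between these point-estimate values and the paper's reported 0.53 probability and 10.15-year waiting time shows how much the skewed uncertainty distributions shift the simulated results.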

  1. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    NASA Astrophysics Data System (ADS)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermographic depth estimation. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and thus yield only a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a novel solution for quantitative depth estimation in frequency modulated thermal wave imaging.
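
    The gain from spectral zooming can be illustrated without the full chirp z-transform machinery: the CZT efficiently evaluates the spectrum on a dense grid over a narrow band, which the sketch below does directly (and slowly) with an explicit DTFT. All signal parameters are illustrative, not taken from the paper:

```python
import numpy as np

# Core idea behind chirp-z-transform spectral zooming: evaluate the DTFT
# on a dense frequency grid over a narrow band instead of the coarse FFT
# grid. A direct (slow) evaluation is used here for clarity; the CZT
# computes the same samples efficiently.
fs = 10.0                        # sampling rate (Hz)
t = np.arange(0, 50, 1 / fs)     # 50 s record -> FFT bin spacing 0.02 Hz
f_true = 2.0173                  # tone frequency, deliberately off the FFT grid
x = np.sin(2 * np.pi * f_true * t)

# Zoomed band 1.9-2.1 Hz at 0.0005 Hz spacing (40x finer than the FFT grid):
f_zoom = np.arange(1.9, 2.1, 0.0005)
dtft = np.exp(-2j * np.pi * np.outer(f_zoom, t)) @ x

# Peak location on the dense grid recovers the tone frequency to well
# under one FFT bin of error.
f_est = f_zoom[np.argmax(np.abs(dtft))]
```

In the thermographic setting, the same zoomed evaluation sharpens the frequency peaks that are subsequently mapped to anomaly depths.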

  2. Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang

    Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.
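
    A minimal sketch of such a frequency-dependent risk ranking might look like the following; all failure rates, severities, and the risk formula itself are hypothetical stand-ins for the paper's measured QA records and treatment-plan severity modeling:

```python
# Illustrative sketch of a frequency-dependent QA risk ranking: combine
# each test's failure occurrence rate with its modeled failure severity,
# scaled by how long a failure could go undetected at a given test
# interval. Every number and the formula are hypothetical stand-ins.
tests = {
    # name: (failures per year, relative severity if undetected)
    "output":               (6.0, 2.0),
    "lasers":               (5.0, 1.5),
    "imaging vs isocenter": (1.0, 4.0),
    "ODI":                  (0.5, 1.0),
}

def relative_risk(rate_per_year, severity, interval_days):
    # A failure persists, on average, half a test interval before the
    # next test catches it; risk scales with that undetected exposure.
    return rate_per_year * severity * (interval_days / 2.0) / 365.0

interval = {"daily": 1, "weekly": 7, "biweekly": 14}
ranking = sorted(
    ((name, relative_risk(r, s, interval["weekly"]))
     for name, (r, s) in tests.items()),
    key=lambda kv: kv[1], reverse=True,
)
# Tests at the top of the ranking are candidates for more frequent testing.
```

Re-evaluating the ranking at each candidate interval is what lets high-occurrence tests (here, output and lasers) justify daily testing while low-occurrence tests drop to weekly or biweekly.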

  3. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boquerón ...

  4. Quantitative analysis of pork and chicken products by droplet digital PCR.

    PubMed

    Cai, Yicun; Li, Xiang; Lv, Rong; Yang, Jielin; Li, Jian; He, Yuping; Pan, Liangwen

    2014-01-01

    In this project, a highly precise quantitative method based on the digital polymerase chain reaction (dPCR) technique was developed to determine the weight of pork and chicken in meat products. Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of species-specific DNAs in meat products. However, it is limited by amplification efficiency and by its reliance on standard curves based on Ct values, which hampers the detection and quantification of low-copy-number target DNA, as in some complex mixed meat products. Using the dPCR method, we found that the relationships between raw meat weight and DNA weight and between DNA weight and DNA copy number were both close to linear. This enabled us to establish formulae to calculate the raw meat weight from the DNA copy number. The accuracy and applicability of this method were tested and verified using samples of pork and chicken powder mixed in known proportions. Quantitative analysis indicated that dPCR is highly precise in quantifying pork and chicken in meat products and therefore has the potential to be used in routine analysis by government regulators and quality control departments of commercial food and feed enterprises.
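
    The two near-linear relationships (raw meat weight to DNA weight, DNA weight to copy number) make the inversion from a dPCR copy count back to a meat weight a matter of two divisions. The calibration constants and copy counts below are invented for illustration, not the study's values:

```python
# Sketch of the two near-linear relationships the study reports: raw meat
# weight -> DNA weight -> DNA copy number. With both calibrated as simple
# proportionality constants, a dPCR copy count can be inverted back to a
# raw meat weight. All constants are hypothetical.
COPIES_PER_NG_DNA = {"pork": 310.0, "chicken": 295.0}   # invented calibration
NG_DNA_PER_MG_MEAT = {"pork": 2.1, "chicken": 2.4}      # invented calibration

def meat_weight_mg(species, copies):
    """Estimate raw meat weight (mg) in a sample from a dPCR copy count."""
    ng_dna = copies / COPIES_PER_NG_DNA[species]
    return ng_dna / NG_DNA_PER_MG_MEAT[species]

# A mixed sample measured by species-specific dPCR (hypothetical counts):
pork_mg = meat_weight_mg("pork", 32_550)
chicken_mg = meat_weight_mg("chicken", 17_700)
pork_fraction = pork_mg / (pork_mg + chicken_mg)
```

Because dPCR yields absolute copy counts, no standard curve is needed at measurement time; the calibration constants are established once from known mixtures.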

  5. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  6. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    PubMed

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  8. Quantitative genetic analysis of injury liability in infants and toddlers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  9. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    In order to improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied a method to reduce the error of AES quantitative analysis. We selected Pt-Co, Cu-Au and Cu-Ag binary alloy thin films as samples and used XPS to correct the AES quantitative analysis results, adjusting the Auger relative sensitivity factors to bring the quantitative results of the two techniques into closer agreement. We then verified the accuracy of AES quantitative analysis with the revised sensitivity factors on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error of AES quantitative analysis to less than 10%. Peak definition is difficult in the integral form of the AES spectrum, since choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To make the analysis easier, we also processed the data in differential form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of quantitative analysis on further samples with different composition ratios. The result showed that the analytical error of AES quantitative analysis was reduced to less than 9%. This demonstrates that the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. Good consistency was observed, proving the feasibility of this method.

  10. Quantitative cancer risk assessment for occupational exposures to asphalt fumes during built-up roofing asphalt (BURA) operations.

    PubMed

    Rhomberg, Lorenz R; Mayfield, David B; Goodman, Julie E; Butler, Eric L; Nascarella, Marc A; Williams, Daniel R

    2015-01-01

    The International Agency for Research on Cancer qualitatively characterized occupational exposure to oxidized bitumen emissions during roofing as probably carcinogenic to humans (Group 2A). We examine chemistry, exposure, epidemiology and animal toxicity data to explore quantitative risks for roofing workers applying built-up roofing asphalt (BURA). Epidemiology studies do not consistently report elevated risks, and generally do not have sufficient exposure information or adequately control for confounders, precluding their use for dose-response analysis. Dermal carcinogenicity bioassays using mice report increased tumor incidence with single high doses. In order to quantify potential cancer risks, we develop time-to-tumor model methods [consistent with U.S. Environmental Protection Agency (EPA) dose-response analysis and mixtures guidelines] using the dose-time-response shape of concurrent exposures to benzo[a]pyrene (B[a]P) as concurrent controls (which had several exposure levels) to infer presumed parallel dose-time-response curves for BURA-fume condensate. We compare EPA relative potency factor approaches, based on observed relative potency of BURA to B[a]P in similar experiments, and direct observation of the inferred BURA dose-time-response (scaled to humans) as means for characterizing a dermal unit risk factor. We apply similar approaches to limited data on asphalt-fume inhalation and respiratory cancers in rats. We also develop a method for adjusting potency estimates for asphalts that vary in composition using measured fluorescence. Overall, the various methods indicate that cancer risks to roofers from both dermal and inhalation exposure to BURA are within a range typically deemed acceptable within regulatory frameworks. The approaches developed may be useful in assessing carcinogenic potency of other complex mixtures of polycyclic aromatic compounds.

  11. An ounce of prevention or a pound of cure: bioeconomic risk analysis of invasive species.

    PubMed

    Leung, Brian; Lodge, David M; Finnoff, David; Shogren, Jason F; Lewis, Mark A; Lamberti, Gary

    2002-12-07

    Numbers of non-indigenous species (species introduced from elsewhere) are increasing rapidly worldwide, causing both environmental and economic damage. Rigorous quantitative risk-analysis frameworks for invasive species, however, are lacking. We need to evaluate the risks posed by invasive species and quantify the relative merits of different management strategies (e.g. allocation of resources between prevention and control). We present a quantitative bioeconomic modelling framework to analyse risks from non-indigenous species to economic activity and the environment. The model identifies the optimal allocation of resources to prevention versus control, acceptable invasion risks, and the consequences of invasion for optimal investments (e.g. labour and capital). We apply the model to zebra mussels (Dreissena polymorpha), and show that society could benefit by spending up to US$324,000 per year to prevent invasions into a single lake with a power plant. By contrast, the US Fish and Wildlife Service spent US$825,000 in 2001 to manage all aquatic invaders in all US lakes. Thus, greater investment in prevention is warranted.

  12. Approaches to acceptable risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whipple, C

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of the acceptability of radiation exposures received in space. The approaches to the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include risk perceptions and how and by whom risk decisions are made.

  13. A Quantitative Threats Analysis for the Florida Manatee (Trichechus manatus latirostris)

    USGS Publications Warehouse

    Runge, Michael C.; Sanders-Reed, Carol A.; Langtimm, Catherine A.; Fonnesbeck, Christopher J.

    2007-01-01

    The Florida manatee (Trichechus manatus latirostris) is an endangered marine mammal endemic to the southeastern United States. The primary threats to manatee populations are collisions with watercraft and the potential loss of warm-water refuges. For the purposes of listing, recovery, and regulation under the Endangered Species Act (ESA), an understanding of the relative effects of the principal threats is needed. This work is a quantitative approach to threats analysis, grounded in the assumption that an appropriate measure of status under the ESA is based on the risk of extinction, as quantified by the probability of quasi-extinction. This is related to the qualitative threats analyses that are more common under the ESA, but provides an additional level of rigor, objectivity, and integration. In this approach, our philosophy is that analysis of the five threat factors described in Section 4(a)(1) of the ESA can be undertaken within an integrated quantitative framework. The basis of this threats analysis is a comparative population viability analysis. This involves forecasting the Florida manatee population under different scenarios regarding the presence of threats, while accounting for process variation (environmental, demographic, and catastrophic stochasticity) as well as parametric and structural uncertainty. We used the manatee core biological model (CBM) for this viability analysis, and considered the role of five threats: watercraft-related mortality, loss of warm-water habitat in winter, mortality in water-control structures, entanglement, and red tide. All scenarios were run with an underlying parallel structure that allowed a more powerful estimation of the effects of the various threats. The results reflect our understanding of manatee ecology (as captured in the structure of the CBM), our estimates of manatee demography (as described by the parameters in the model), and our characterization of the mechanisms by which the threats act on manatees. 
As an

  14. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  15. Cluster analysis of bone microarchitecture from high resolution peripheral quantitative computed tomography demonstrates two separate phenotypes associated with high fracture risk in men and women.

    PubMed

    Edwards, M H; Robinson, D E; Ward, K A; Javaid, M K; Walker-Bone, K; Cooper, C; Dennison, E M

    2016-07-01

    Osteoporosis is a major healthcare problem that is conventionally assessed by dual energy X-ray absorptiometry (DXA). New technologies such as high resolution peripheral quantitative computed tomography (HRpQCT) also predict fracture risk. HRpQCT measures a number of bone characteristics that may inform specific patterns of bone deficits. We used cluster analysis to define different bone phenotypes and their relationships to fracture prevalence and areal bone mineral density (BMD). 177 men and 159 women, in whom fracture history was determined by self-report and vertebral fracture assessment, underwent HRpQCT of the distal radius and DXA of the femoral neck. Five clusters were derived, with two clusters associated with elevated fracture risk. "Cluster 1" contained 26 women (50.0% fractured) and 30 men (50.0% fractured) with a lower mean cortical thickness and cortical volumetric BMD, and in men only, a mean total and trabecular area greater than the sex-specific cohort mean. "Cluster 2" contained 20 women (50.0% fractured) and 14 men (35.7% fractured) with a lower mean trabecular density and trabecular number than the sex-specific cohort mean. Logistic regression showed fracture rates in these clusters to be significantly higher than in the lowest fracture-risk cluster, cluster 5 (p<0.05). Mean femoral neck areal BMD was significantly lower than in cluster 5 for women in clusters 1 and 2 (p<0.001 for both), and for men in cluster 2 (p<0.001) but not cluster 1 (p=0.220). In conclusion, this study demonstrates two distinct high-risk clusters in both men and women, which may differ in etiology and response to treatment. As cluster 1 in men does not have low areal BMD, these men may not be identified as high risk by conventional DXA alone.

  16. Education and Risk of Dementia: Dose-Response Meta-Analysis of Prospective Cohort Studies.

    PubMed

    Xu, Wei; Tan, Lan; Wang, Hui-Fu; Tan, Meng-Shan; Tan, Lin; Li, Jie-Qiong; Zhao, Qing-Fei; Yu, Jin-Tai

    2016-07-01

    Educational level has been regarded as one of the most widely accepted risk factors for dementia in epidemiological studies, albeit with discordant results. However, the dose-response relation between education and incident dementia remained unknown. To quantitatively evaluate the association between the level of exposure to high and low education and the risk of dementia, we searched PubMed, EMBASE, and the Cochrane Library up to November 2014, as well as the references of retrieved literature. Prospective cohort studies in which educational attainment was categorized into at least three levels were included. The Newcastle-Ottawa scale was used to assess the quality of included studies. Fifteen prospective cohort studies with 55,655 participants for low education and eight prospective cohort studies with 20,172 participants for high education were included. In the qualitative analysis, both low and high education showed a dose-response trend with risk of dementia and Alzheimer's disease (AD). In the quantitative analysis, the dementia risk was reduced by 7% per year increase in education (RR, 0.93; 95% CI, 0.92-0.94; p for overall trend = 0.000; p for nonlinearity = 0.0643). Nonetheless, we did not find a statistically significant association between per year decrease in education and dementia (RR, 1.03; 95% CI, 0.96-1.10; p for overall trend = 0.283; p for nonlinearity = 0.0041) or AD (RR, 1.03; 95% CI, 0.97-1.10; p for overall trend = 0.357; p for nonlinearity = 0.0022). Both low and high education showed a trend of dose-response relation with risk of dementia and AD. The dementia risk was reduced by 7% per year increase in education.
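    The pooled per-year RR reported above comes from combining study-level estimates; the core step is inverse-variance weighting on the log-RR scale. A minimal fixed-effect sketch, using invented study data (not the fifteen cohorts analysed in the paper):

```python
# Inverse-variance (fixed-effect) pooling of relative risks on the log scale.
# Standard errors are recovered from each study's 95% CI assuming normality.
import math

def pool_log_rr(studies):
    """Pool (rr, ci_low, ci_high) tuples; return pooled RR with 95% CI."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # CI width -> SE
        w = 1.0 / se ** 2                                # weight = 1/variance
        num += w * math.log(rr)
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

if __name__ == "__main__":
    # Invented per-year-of-education RRs from three hypothetical cohorts
    rr, lo, hi = pool_log_rr([(0.92, 0.88, 0.96),
                              (0.94, 0.91, 0.97),
                              (0.95, 0.90, 1.00)])
    print(f"Pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A real dose-response meta-analysis would add random-effects variance and trend estimation across exposure categories; this only shows the weighting mechanics.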

  17. Quantitative Microbial Risk Assessment and Infectious Disease Transmission Modeling of Waterborne Enteric Pathogens.

    PubMed

    Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S

    2018-04-20

    Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominantly LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.
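    At the core of QMRA sits a dose-response model relating ingested dose to infection probability. A minimal sketch of the two standard forms, the exponential and approximate beta-Poisson models; the parameter values below are placeholders, not pathogen-specific fits from the literature:

```python
# Standard QMRA dose-response models (illustrative parameters only).
import math

def p_infection_exponential(dose, r):
    """Exponential model: each organism independently infects with prob r."""
    return 1.0 - math.exp(-r * dose)

def p_infection_beta_poisson(dose, alpha, n50):
    """Approximate beta-Poisson model, parameterised by alpha and the
    median infectious dose N50 (P = 0.5 when dose = N50)."""
    return 1.0 - (1.0 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)

if __name__ == "__main__":
    dose = 100  # ingested organisms (illustrative)
    print(f"Exponential:  {p_infection_exponential(dose, r=0.005):.3f}")
    print(f"Beta-Poisson: {p_infection_beta_poisson(dose, alpha=0.25, n50=500):.3f}")
```

    IDTM would then feed such per-exposure probabilities into a population-level transmission model; the integration of the two is exactly what the review argues for.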

  18. Quantitative risk assessment of CO2 transport by pipelines--a review of uncertainties and their impacts.

    PubMed

    Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André

    2010-05-15

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainty that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO2. A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO2 pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on the assumptions made. In existing studies this range is found to be between <1 m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO2 pipelines can be improved by validation of release and dispersion models for high-pressure CO2 releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO2 pipelines.

  19. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost:benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. The distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost:benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162
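    The "average, discounted, predicted coral habitat suitability" metric can be sketched in a few lines: suitability in each grid cell is discounted by the fraction of that cell's seabed already swept by trawling, then averaged over a candidate closure. This is a hypothetical simplification of the paper's approach; the cell values and the linear discounting are assumptions for illustration:

```python
# Toy "discounted habitat suitability" calculation for a candidate closure.
def discounted_suitability(suitability, swept_fraction):
    """Reduce predicted VME likelihood in proportion to area already trawled."""
    if not 0.0 <= swept_fraction <= 1.0:
        raise ValueError("swept_fraction must be in [0, 1]")
    return suitability * (1.0 - swept_fraction)

def closure_vme_likelihood(cells):
    """Average discounted suitability across a closure's (suitability, swept) cells."""
    vals = [discounted_suitability(s, f) for s, f in cells]
    return sum(vals) / len(vals)

if __name__ == "__main__":
    closure = [(0.9, 0.5), (0.7, 0.0), (0.4, 0.9)]  # invented cell values
    print(f"Mean discounted VME likelihood: {closure_vme_likelihood(closure):.3f}")
```

    Ranking alternative closures by this score against their historical-catch cost gives the cost:benefit comparison the abstract describes.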

  20. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  1. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures with models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to the prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  2. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  3. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  4. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  5. Quantitative analysis on PUVA-induced skin photodamages using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Zhai, Juan; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Zeng, Changchun; Jin, Ying

    2009-08-01

    Psoralen plus ultraviolet A radiation (PUVA) therapy is a very important clinical treatment for skin diseases such as vitiligo and psoriasis, but it is associated with an increased risk of skin photodamage, especially photoaging. Since skin biopsy alters the original skin morphology and always involves an iatrogenic trauma, optical coherence tomography (OCT) appears to be a promising technique for studying skin damage in vivo. In this study, Balb/c mice given 8-methoxypsoralen (8-MOP) treatment prior to UVA radiation were used as the PUVA-induced photodamage model. OCT images of the dorsal skin of the photodamaged group (model) and the normal group (control) were obtained in vivo at 0, 24, 48 and 72 hours after irradiation. The results were then quantitatively analyzed in combination with histological information. The experimental results showed that PUVA-induced photodamaged skin had an increase in epidermal thickness (ET), a reduction of the attenuation coefficient of the OCT signal, and an increase in brightness of the epidermis layer compared with the control group. In conclusion, noninvasive high-resolution imaging techniques such as OCT may be a promising tool for photobiological studies aimed at assessing photodamage and repair processes in vivo. OCT can be used for quantitative analysis of changes in photodamaged skin, such as the ET and collagen in the dermis, and provides a theoretical basis for the treatment and prevention of skin photodamage.

  6. Quantitative High-Resolution Genomic Analysis of Single Cancer Cells

    PubMed Central

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A.; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics. PMID:22140428

  7. Quantitative high-resolution genomic analysis of single cancer cells.

    PubMed

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  8. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers--specific application to Listeria monocytogenes and ready-to-eat meat products.

    PubMed

    Mataragas, M; Zwietering, M H; Skandamis, P N; Drosinos, E H

    2010-07-31

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support growth of pathogens (high initial pH = 6.2-6.4 and water activity = 0.98-0.99) and have a relatively long period of chilled storage, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in case of post-process contamination, even with low numbers of cells, the microorganism is able to reach unacceptable levels at the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk from L. monocytogenes presence in RTE meat products. This may help risk managers to make decisions and apply control measures, with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk management strategies based on the results obtained from the QMRA model specifically developed for this pathogen/food product combination.
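    The abstract's key point, that a low post-process contamination can grow to unacceptable levels over a 60-day chilled shelf life, can be illustrated with a minimal growth projection. The growth rate and stationary-phase cap below are assumptions for illustration, not the paper's fitted values:

```python
# Project L. monocytogenes levels at consumption from an initial post-process
# contamination, assuming simple log-linear growth capped at stationary phase.
def log10_count_at_consumption(initial_log10, mu_per_day, days, max_log10=8.0):
    """Return log10 CFU/g after `days` of storage, capped at `max_log10`."""
    return min(initial_log10 + mu_per_day * days, max_log10)

if __name__ == "__main__":
    # A very low contamination of 0.01 CFU/g (-2 log10) growing at an
    # assumed 0.12 log10/day at chill temperature over a 60-day shelf life:
    final = log10_count_at_consumption(-2.0, 0.12, 60)
    print(f"Projected level at consumption: 10^{final:.1f} CFU/g")
```

    Even under these modest assumptions the projected level far exceeds typical regulatory limits for RTE foods, which is why shelf-life length dominates this product's risk profile.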

  9. Lycopene Consumption and Risk of Colorectal Cancer: A Meta-Analysis of Observational Studies.

    PubMed

    Wang, Xin; Yang, Hui-Hui; Liu, Yan; Zhou, Quan; Chen, Zi-Hua

    2016-10-01

    A number of epidemiological studies have explored the association between lycopene or lycopene-rich food intake and the risk of colorectal cancer, but the results of these studies have not been consistent. We conducted a systematic review and meta-analysis of studies published in the PubMed and EMBASE databases to quantitatively assess the association between lycopene consumption and the risk of colorectal cancer. A total of 15 studies were included in the meta-analysis, and the summary relative risk (RR) for the highest versus lowest category indicated no significant association between lycopene consumption and the risk of colorectal cancer [RR = 0.94, 95% confidence interval (CI): 0.80-1.10]. However, a significant inverse association was observed between lycopene consumption and cancer sited in the colon (RR = 0.88, 95% CI: 0.81-0.96). We also found that colon cancer incidence and lycopene intake did not exhibit a dose-response relationship. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) quality of evidence in our study was very low. In conclusion, this meta-analysis indicates that lycopene consumption is not associated with the risk of colorectal cancer. Further research will be needed in this area to provide conclusive evidence.

  10. Integrated approach for confidence-enhanced quantitative analysis of herbal medicines, Cistanche salsa as a case.

    PubMed

    Liu, Wenjing; Song, Qingqing; Yan, Yu; Liu, Yao; Li, Peng; Wang, Yitao; Tu, Pengfei; Song, Yuelin; Li, Jun

    2018-08-03

    Although far from perfect, it is practical to assess the quality of a given herbal medicine (HM) through simultaneous determination of a panel of components. However, the confidence of quantitative outcomes from an LC-MS/MS platform is threatened by several technical barriers, such as chemical degradation, polarity range, concentration span, and identity misrecognition. Herein, we attempted to circumvent these obstacles by integrating several fit-for-purpose techniques, including online extraction (OLE), serially coupled reversed phase LC-hydrophilic interaction liquid chromatography (RPLC-HILIC), tailored multiple reaction monitoring (MRM), and relative response vs. collision energy curve (RRCEC) matching. Confidence-enhanced quantitative analysis of Cistanche salsa (Csa), a well-known psammophytic species and tonic herbal medicine, was conducted as a proof of concept. The OLE module was deployed to prevent chemical degradation, in particular E/Z-configuration transformation of phenylethanoid glycosides. Satisfactory retention was achieved for each analyte regardless of polarity, owing to successive passage through the RPLC and HILIC columns. Optimum parameters for the minor components, alongside deliberately detuned ones for the abundant ingredients, ensured that all analytes fell within their linear ranges. The unequivocal assignment of the captured signals was achieved by matching retention times, ion transitions, and, more importantly, RRCECs between authentic compounds and suspect peaks. Diverse validation assays demonstrated the newly developed method to be reliable. In particular, the distribution of mannitol rather than galactitol was disclosed, although these isomers show identical retention times and ion transitions. The contents of 21 compounds of interest were definitively determined in Csa as well as in two analogous species, and the quantitative patterns varied greatly not only among species but also among Csa samples. Together, the

  11. Through ARIPAR-GIS the quantified area risk analysis supports land-use planning activities.

    PubMed

    Spadoni, G; Egidi, D; Contini, S

    2000-01-07

    The paper first summarises the main aspects of the ARIPAR methodology, whose steps can be applied to quantify the impact on a territory of major accident risks due to processing, storing and transporting dangerous substances. The capabilities of the new decision support tool ARIPAR-GIS, which implements the mentioned procedure, are then described, together with its main features and types of results. These are clearly shown through a short description of the updated ARIPAR study (reference year 1994), in which the impact of changes due to industrial and transportation dynamics on the Ravenna territory in Italy was evaluated. A brief explanation of how the results have been used by local administrations offers the opportunity to discuss the advantages of the quantitative area risk analysis tool in supporting risk management, risk control and land-use planning activities.

  12. Using Formative Scenario Analysis approach for landslide risk analysis in a relatively scarce data environment: preliminary results

    NASA Astrophysics Data System (ADS)

    Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai

    2014-05-01

    It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and which they manage, and that help them make decisions aimed at minimizing the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform, because the numerous data required for the analysis are often unavailable. In addition, it has been shown that in scenario analysis deterministic approaches are often unable to detect features of the system that reveal unexpected behaviors, resulting in the underestimation or omission of some impact factors. Here we present some preliminary results obtained by applying Formative Scenario Analysis, which can be considered a possible solution for landslide risk analysis in cases where the needed data, even if they exist, are not available. This method is an expert-based approach that integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determine (by a rating procedure) quantitative relations between these factors, and then, through mathematical operations, scenarios describing a certain state of the system are obtained. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement that can range from slide to flow.
The material involved can be either soil, debris or a mixture of both, in Romanian
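The expert impact-rating step at the heart of Formative Scenario Analysis can be sketched as a simple activity/passivity (system grid) computation. The factor names and ratings below are purely illustrative assumptions, not values from the study:

```python
def system_grid(factors, impact):
    """Activity/passivity scores from an expert impact matrix, where
    impact[i][j] is the rated influence of factor i on factor j
    (0 = none, 1 = weak, 2 = strong). Factor names and ratings here
    are illustrative assumptions, not values from the study."""
    n = len(factors)
    scores = {}
    for i, name in enumerate(factors):
        activity = sum(impact[i])                        # how strongly i drives the system
        passivity = sum(impact[k][i] for k in range(n))  # how strongly i is driven
        scores[name] = (activity, passivity)
    return scores

factors = ["rainfall", "land_use", "slope_instability"]
impact = [[0, 1, 2],   # rainfall weakly affects land use, strongly affects instability
          [1, 0, 2],
          [0, 0, 0]]
print(system_grid(factors, impact))
```

Factors with high activity and low passivity (here, rainfall and land use) are the natural drivers to vary when constructing scenarios.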

  13. [Quantitative risk model for verocytotoxigenic Escherichia coli cross-contamination during homemade hamburger preparation].

    PubMed

    Signorini, M L; Frizzo, L S

    2009-01-01

    The objective of this study was to develop a quantitative risk model for verocytotoxigenic Escherichia coli (VTEC) cross-contamination during hamburger preparation at home. Published scientific information about the disease was considered in the elaboration of the model, which included a number of routines performed during food preparation in kitchens. The probabilities of bacterial transfer between food items and kitchen utensils which best described each stage of the process were incorporated into the model using @Risk software. Handling raw meat before preparing ready-to-eat foods (odds ratio, OR = 6.57), as well as hand (OR = 12.02) and cutting-board (OR = 5.02) washing habits, were the major risk factors for VTEC cross-contamination from meat to vegetables. The information provided by this model should be considered when designing public information campaigns on hemolytic uremic syndrome risk directed at food handlers, in order to stress the importance of the above-mentioned factors in disease transmission.
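Cross-contamination models of this kind are typically built as a Monte Carlo chain of transfer and washing events. The sketch below, with entirely hypothetical probabilities (not the published model's values), illustrates the structure:

```python
import random

def simulate_cross_contamination(n=100_000, p_meat_to_hands=0.3,
                                 p_wash_hands=0.7, p_wash_effective=0.9,
                                 p_hands_to_food=0.3, seed=1):
    """Toy Monte Carlo of VTEC transfer from raw meat to ready-to-eat food
    via the handler's hands. Every probability here is an illustrative
    assumption, not a value from the published model."""
    rng = random.Random(seed)
    contaminated = 0
    for _ in range(n):
        on_hands = rng.random() < p_meat_to_hands       # transfer meat -> hands
        if on_hands and rng.random() < p_wash_hands:    # handler washes hands
            if rng.random() < p_wash_effective:         # washing removes VTEC
                on_hands = False
        if on_hands and rng.random() < p_hands_to_food:  # transfer hands -> food
            contaminated += 1
    return contaminated / n

print(f"P(cross-contamination) ~ {simulate_cross_contamination():.4f}")
```

Varying the washing probabilities in such a chain is what exposes hand and cutting-board hygiene as dominant risk factors.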

  14. A quantitative risk assessment for the safety of carcase storage systems for scrapie infected farms.

    PubMed

    Adkin, A; Jones, D L; Eckford, R L; Edwards-Jones, G; Williams, A P

    2014-10-01

    To determine the risk associated with the use of carcase storage vessels on a scrapie infected farm, a stochastic quantitative risk assessment was developed to determine the rate of accumulation and fate of scrapie in a novel low-input storage system. For an example farm infected with classical scrapie, a mean of 10^3.6 Ovine Oral ID50s was estimated to accumulate annually. Research indicates that the degradation of any prions present may range from insignificant to one or two orders of magnitude over several months of storage. For infected farms, the likely partitioning of remaining prion into the sludge phase would necessitate the safe operation and removal of resulting materials from these systems. If complete mixing could be assumed, on average, the concentrations of infectivity are estimated to be slightly lower than those measured in placenta from infected sheep at lambing. This is the first quantitative assessment of the scrapie risk associated with fallen stock on farm, and it provides guidance to policy makers on the safety of one type of storage system and the relative risk compared to other materials present on an infected farm. © 2014 Crown Copyright. Journal of Applied Microbiology © 2014 Society for Applied Microbiology. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.

  15. Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes*

    PubMed Central

    Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.

    2011-01-01

    The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state in various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and we show the need for advanced computational models to predict peptide detection probabilities. In order to address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP associated proteins including the RNAPII elongation factor complexes DSIF and TFIIF. Our data shows that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure low level associated protein abundance across biological replicates. In addition, our quantitative data supports a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion in agreement with previously published reports. 

  16. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    PubMed

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
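The combination of utility scoring and Monte Carlo simulation described above can be illustrated with a minimal weighted-sum MCDA sketch; the criteria, weights, and score intervals below are hypothetical, not values from the article:

```python
import random

def monte_carlo_mcda(alternatives, weights, n=10000, seed=0):
    """Estimate the mean weighted-sum utility of each alternative by
    Monte Carlo, sampling each criterion score uniformly from its
    (low, high) uncertainty interval."""
    rng = random.Random(seed)
    means = {}
    for name, intervals in alternatives.items():
        total = 0.0
        for _ in range(n):
            total += sum(w * rng.uniform(lo, hi)
                         for w, (lo, hi) in zip(weights, intervals))
        means[name] = total / n
    return means

# Hypothetical comparison of two CO2 storage sites on three criteria
# (cost, leakage risk, capacity), with scores normalized to [0, 1].
weights = [0.5, 0.3, 0.2]
sites = {
    "site_A": [(0.6, 0.8), (0.4, 0.9), (0.7, 0.9)],
    "site_B": [(0.5, 0.7), (0.6, 0.8), (0.3, 0.6)],
}
scores = monte_carlo_mcda(sites, weights)
print(scores)
```

A full model would replace the uniform intervals with elicited distributions and pass each sampled score through a utility curve before weighting.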

  17. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    PubMed

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
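Fingerprint similarity scores of this kind are commonly computed as the cosine (congruence) coefficient between chromatograms sampled on a common retention-time grid. A minimal sketch, with made-up intensity vectors:

```python
import math

def fingerprint_similarity(a, b):
    """Cosine (congruence) similarity between two chromatographic
    fingerprints sampled on a common retention-time grid."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Two hypothetical fingerprints: batch 2 closely tracks batch 1 with a
# small perturbation, so the similarity is just below 1.
batch1 = [0.0, 1.2, 5.0, 0.8, 3.1, 0.2]
batch2 = [0.1, 1.1, 4.8, 0.9, 3.0, 0.2]
print(f"{fingerprint_similarity(batch1, batch2):.4f}")
```

Because the measure is scale-invariant, batches differing only in injected concentration still score near 1, which is why a similarity threshold (here, 0.981) can discriminate authenticity rather than amount.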

  18. Comparative study of standard space and real space analysis of quantitative MR brain data.

    PubMed

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  19. Scenario analysis of freight vehicle accident risks in Taiwan.

    PubMed

    Tsai, Ming-Chih; Su, Chien-Chih

    2004-07-01

    This study develops a quantitative risk model utilizing the Generalized Linear Interactive Model (GLIM) to analyze major freight vehicle accidents in Taiwan. Eight scenarios are established by interacting three categorical variables, driver age, vehicle type and road type, each of which contains two levels. A database of 2043 major accidents occurring between 1994 and 1998 in Taiwan is used to fit and calibrate the model parameters. The empirical results indicate that accident rates of freight vehicles in Taiwan were high in the scenarios involving trucks and non-freeway systems, while accident consequences were severe in the scenarios involving mature drivers or non-freeway systems. The empirical evidence also shows no significant relationship between accident rates and accident consequences. This stresses that safety studies which describe risk merely as accident rates, rather than as the combination of accident rates and consequences by which risk is defined, may lead to biased risk perceptions. Finally, the study recommends using the number of vehicles as an alternative measure of traffic exposure in commercial vehicle risk analysis. The merit of this measure is that it is simple and thus reliable; meanwhile, the resulting risk, expressed as fatalities per vehicle, provides clear and direct policy implications for insurance practices and safety regulations.

  20. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    PubMed

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or probability of failure. While the consequence of failure is often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  2. Characterizing the risk of infection from Mycobacterium tuberculosis in commercial passenger aircraft using quantitative microbial risk assessment.

    PubMed

    Jones, Rachael M; Masago, Yoshifumi; Bartrand, Timothy; Haas, Charles N; Nicas, Mark; Rose, Joan B

    2009-03-01

    Quantitative microbial risk assessment was used to predict the likelihood and spatial organization of Mycobacterium tuberculosis (Mtb) transmission in a commercial aircraft. Passenger exposure was predicted via a multizone Markov model in four scenarios: seated or moving infectious passengers and with or without filtration of recirculated cabin air. The traditional exponential (k = 1) and a new exponential (k = 0.0218) dose-response function were used to compute infection risk. Emission variability was included by Monte Carlo simulation. Infection risks were higher nearer and aft of the source; steady-state airborne concentration levels were not attained. Expected incidence was low to moderate, with the central 95% ranging from 10^-6 to 10^-1 per 169 passengers in the four scenarios. Emission rates used were low compared to measurements from active TB patients in wards; thus a "superspreader" emitting 44 quanta/h could produce 6.2 cases or more under these scenarios. Use of respiratory protection by the infectious source and/or susceptible passengers reduced infection incidence up to one order of magnitude.
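The exponential dose-response model used here has the form P(infection) = 1 − exp(−k·d). A minimal sketch comparing the two parameterizations cited in the abstract (the dose value is illustrative):

```python
import math

def exponential_dose_response(dose, k):
    """Exponential dose-response model: probability of infection for an
    inhaled dose (in quanta or organisms) and fit parameter k."""
    return 1.0 - math.exp(-k * dose)

# Compare the traditional (k = 1) and new (k = 0.0218) parameterizations
# for an illustrative inhaled dose of 1 unit.
dose = 1.0
print(f"k = 1:      P = {exponential_dose_response(dose, 1.0):.4f}")
print(f"k = 0.0218: P = {exponential_dose_response(dose, 0.0218):.4f}")
```

The smaller k flattens the curve dramatically at low doses, which is why the choice of dose-response parameter dominates the predicted incidence range.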

  3. Role Of Social Networks In Resilience Of Naval Recruits: A Quantitative Analysis

    DTIC Science & Technology

    2016-06-01

    comprises 1,297 total surveys from a total of eight divisions of recruits at two different time periods. Quantitative analyses using surveys and network data examine the effects... Role of Social Networks in Resilience of Naval Recruits: A Quantitative Analysis, by Andrea M. Watling, June 2016. Thesis Advisor: Edward H. Powley.

  4. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0109] NUREG/CR-XXXX, Development of Quantitative Software..., ``Development of Quantitative Software Reliability Models for Digital Protection Systems of Nuclear Power Plants... of Risk Analysis, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission...

  5. Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.

    PubMed

    Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N

    2013-11-05

    Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting.

  6. Her-2/neu expression in node-negative breast cancer: direct tissue quantitation by computerized image analysis and association of overexpression with increased risk of recurrent disease.

    PubMed

    Press, M F; Pike, M C; Chazin, V R; Hung, G; Udove, J A; Markowicz, M; Danyluk, J; Godolphin, W; Sliwkowski, M; Akita, R

    1993-10-15

    The HER-2/neu proto-oncogene (also known as c-erb B-2) is homologous with, but distinct from, the epidermal growth factor receptor. Amplification of this gene in node-positive breast cancers has been shown to correlate with both earlier relapse and shorter overall survival. In node-negative breast cancer patients, the subgroup for which accurate prognostic data could make a significant contribution to treatment decisions, the prognostic utility of HER-2/neu amplification and/or overexpression has been controversial. The purpose of this report is to address the issues surrounding this controversy and to evaluate the prognostic utility of overexpression in a carefully followed group of patients using appropriately characterized reagents and methods. In this report we present data from a study of HER-2/neu expression designed specifically to test whether or not overexpression is associated with an increased risk of recurrence in node-negative breast cancers. From a cohort of 704 women with node-negative breast cancer, 105 who experienced recurrent disease (relapsed cases) were matched with 105 women with no recurrence (disease-free controls) after an equivalent follow-up period. Immunohistochemistry was used to assess HER-2/neu expression in archival tissue blocks from both relapsed cases and their matched disease-free controls. Importantly, a series of molecularly characterized breast cancer specimens was used to confirm that the antibody used was of sufficient sensitivity and specificity to identify those cancers overexpressing the HER-2/neu protein in this formalin-fixed, paraffin-embedded tissue cohort. In addition, a quantitative approach was developed to more accurately assess the amount of HER-2/neu protein identified by immunostaining tumor tissue.
This was done using a purified HER-2/neu protein synthesized in a bacterial expression vector and protein lysates derived from a series of cell lines, engineered to express a defined range of HER-2/neu oncoprotein

  7. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    PubMed

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest, because it is difficult to narrow down the target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline this integrated approach and discuss its usefulness using my studies as an example.

  8. Quantitative analysis of diffusion tensor orientation: theoretical framework.

    PubMed

    Wu, Yu-Chien; Field, Aaron S; Chung, Moo K; Badie, Benham; Alexander, Andrew L

    2004-11-01

    Diffusion-tensor MRI (DT-MRI) yields information about the magnitude, anisotropy, and orientation of water diffusion of brain tissues. Although white matter tractography and eigenvector color maps provide visually appealing displays of white matter tract organization, they do not easily lend themselves to quantitative and statistical analysis. In this study, a set of visual and quantitative tools for the investigation of tensor orientations in the human brain was developed. Visual tools included rose diagrams, which are spherical coordinate histograms of the major eigenvector directions, and 3D scatterplots of the major eigenvector angles. A scatter matrix of major eigenvector directions was used to describe the distribution of major eigenvectors in a defined anatomic region. A measure of eigenvector dispersion was developed to describe the degree of eigenvector coherence in the selected region. These tools were used to evaluate directional organization and the interhemispheric symmetry of DT-MRI data in five healthy human brains and two patients with infiltrative diseases of the white matter tracts. In normal anatomical white matter tracts, a high degree of directional coherence and interhemispheric symmetry was observed. The infiltrative diseases appeared to alter the eigenvector properties of affected white matter tracts, showing decreased eigenvector coherence and interhemispheric symmetry. This novel approach distills the rich, 3D information available from the diffusion tensor into a form that lends itself to quantitative analysis and statistical hypothesis testing. (c) 2004 Wiley-Liss, Inc.

  9. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage to reduce the number of pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third party damage are identified; then the weights of the factors are determined via an improved AHP; finally, the importance of each factor is calculated by the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
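The AHP-plus-FCE pipeline described above can be sketched as follows, using the row geometric-mean approximation to the AHP principal eigenvector. The comparison matrix and membership degrees below are hypothetical, not values from the paper:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights by the row geometric-mean method,
    a standard approximation to the principal-eigenvector weights."""
    n = len(pairwise)
    gms = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

def fuzzy_evaluate(weights, membership):
    """Weighted-average fuzzy comprehensive evaluation B = W . R, where
    membership[i][j] is factor i's membership degree in risk grade j."""
    grades = len(membership[0])
    b = [sum(w * row[j] for w, row in zip(weights, membership))
         for j in range(grades)]
    total = sum(b)
    return [x / total for x in b]

# Hypothetical pairwise comparisons (Saaty 1-9 scale) of three third-party
# damage factors, e.g. excavation activity, pipeline marking, patrol frequency.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)

# Hypothetical membership degrees of each factor over three risk grades
# (low, medium, high); each row sums to 1.
R = [[0.1, 0.3, 0.6],
     [0.3, 0.5, 0.2],
     [0.6, 0.3, 0.1]]
print(fuzzy_evaluate(w, R))
```

The grade with the largest aggregated membership is taken as the overall risk level; an "improved AHP" as in the paper would additionally check the consistency ratio of the comparison matrix.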

  11. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As the region-based shape features of a grayscale image, Zernike moments with inherently invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R²) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of established models. The analytical results suggest that the Zernike moment selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.

  12. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  13. A quantitative risk-assessment system (QR-AS) evaluating operation safety of Organic Rankine Cycle using flammable mixture working fluid.

    PubMed

    Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan

    2017-09-15

    Mixtures of hydrocarbon and carbon dioxide show excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical application is a safety threat due to their flammability. In this work, a quantitative risk-assessment system (QR-AS) is established, aiming to provide a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosive risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. Based on the assessment results, a proper ventilation speed, a safe mixture ratio and locations for gas-detecting devices are proposed to guarantee safety in case of leakage. The results revealed that the presented QR-AS is reliable for practical application and that the evaluation results can provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
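The TNT equivalent method referenced above converts the combustion energy of the released fuel into an equivalent mass of TNT, W_TNT = α·m·ΔHc/E_TNT, where α is an empirical blast-yield factor. A minimal sketch with illustrative values (10 kg of propane, ΔHc ≈ 46.3 MJ/kg, E_TNT ≈ 4.68 MJ/kg, α = 0.05); these numbers are assumptions for demonstration, not the paper's case data:

```python
def tnt_equivalent_mass(fuel_mass_kg, heat_of_combustion_mj_per_kg,
                        yield_factor=0.05, e_tnt_mj_per_kg=4.68):
    """TNT-equivalent mass of a flammable release:
    W_TNT = yield_factor * m * dHc / E_TNT. The yield factor is an
    empirical assumption (typically only a few percent of the combustion
    energy contributes to the blast)."""
    return (yield_factor * fuel_mass_kg *
            heat_of_combustion_mj_per_kg / e_tnt_mj_per_kg)

# Illustrative 10 kg propane release, dHc ~ 46.3 MJ/kg.
print(f"TNT equivalent: {tnt_equivalent_mass(10.0, 46.3):.2f} kg")
```

The resulting TNT mass is then fed into standard blast scaling laws to estimate overpressure at a given distance, which is what drives the placement of gas detectors and ventilation requirements.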

  14. Pitfalls and Precautions When Using Predicted Failure Data for Quantitative Analysis of Safety Risk for Human Rated Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Hatfield, Glen S.; Hark, Frank; Stott, James

    2016-01-01

    Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly. These sources often dominate component level risk. While consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system level test data or operational data. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.

  15. Using multiscale texture and density features for near-term breast cancer risk analysis

    PubMed Central

    Sun, Wenqing; Tseng, Tzu-Liang (Bill); Qian, Wei; Zhang, Jianying; Saltzstein, Edward C.; Zheng, Bin; Lure, Fleming; Yu, Hui; Zhou, Shi

    2015-01-01

    Purpose: To help improve efficacy of screening mammography by eventually establishing a new optimal personalized screening paradigm, the authors investigated the potential of using the quantitative multiscale texture and density feature analysis of digital mammograms to predict near-term breast cancer risk. Methods: The authors’ dataset includes digital mammograms acquired from 340 women. Among them, 141 were positive and 199 were negative/benign cases. The negative digital mammograms acquired from the “prior” screening examinations were used in the study. Based on the intensity value distributions, five subregions at different scales were extracted from each mammogram. Five groups of features, including density and texture features, were developed and calculated on every one of the subregions. Sequential forward floating selection was used to search for the effective combinations. Using the selected features, a support vector machine (SVM) was optimized using a tenfold validation method to predict the risk of each woman having image-detectable cancer in the next sequential mammography screening. The area under the receiver operating characteristic curve (AUC) was used as the performance assessment index. Results: From a total number of 765 features computed from multiscale subregions, an optimal feature set of 12 features was selected. Applying this feature set, a SVM classifier yielded performance of AUC = 0.729 ± 0.021. The positive predictive value was 0.657 (92 of 140) and the negative predictive value was 0.755 (151 of 200). Conclusions: The study results demonstrated a moderately high positive association between risk prediction scores generated by the quantitative multiscale mammographic image feature analysis and the actual risk of a woman having an image-detectable breast cancer in the next subsequent examinations. PMID:26127038
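The AUC used as the performance index above equals the Mann-Whitney probability that a randomly chosen positive case receives a higher risk score than a randomly chosen negative case. A minimal sketch with made-up scores:

```python
def auc_from_scores(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case outscores a
    randomly chosen negative case (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical SVM risk scores for cancer (positive) and benign (negative) cases.
pos = [0.9, 0.8, 0.6, 0.55]
neg = [0.7, 0.5, 0.4, 0.3]
print(f"AUC = {auc_from_scores(pos, neg):.3f}")
```

This O(n·m) form is fine for small validation folds; production code would sort the pooled scores once and compute the statistic from ranks.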

  16. [Correspondence analysis between traditional commercial specifications and quantitative quality indices of Notopterygii Rhizoma et Radix].

    PubMed

    Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi

    2016-03-01

    This paper explores a comprehensive assessment method that combines traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected from markets and producing areas. Traditional commercial specifications were described and assigned scores, and 10 chemical components and the volatile oil content were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to relate the traditional qualitative commercial specifications to the quantitative chemical indices, in order to comprehensively evaluate the quality of the medicinal materials and to classify commercial grade and quality grade quantitatively. A herb quality index (HQI) combining traditional commercial specifications and chemical components was established for quantitative grade classification, and corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determining chemical quality, and their threshold values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. Accordingly, the essential relationship between traditional medicinal indicators, qualitative commercial specifications and quantitative chemical composition indices can be examined by k-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality that integrates traditional commodity specifications with modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.

  17. Dietary magnesium intake and risk of metabolic syndrome: a meta-analysis

    PubMed Central

    Dibaba, D. T.; Xun, P.; Fly, A. D.; Yokota, K.; He, K.

    2014-01-01

    Aims To estimate quantitatively the association between dietary magnesium intake and risk of metabolic syndrome by combining the relevant published articles using meta-analysis. Methods We reviewed the relevant literature in PubMed and EMBASE published up until August 2013 and obtained additional information through Google or a hand search of the references in relevant articles. A random-effects or fixed-effects model, as appropriate, was used to pool the effect sizes on metabolic syndrome comparing individuals with the highest dietary magnesium intake with those having the lowest intake. The dose–response relationship was assessed for every 100-mg/day increment in magnesium intake and risk of metabolic syndrome. Results Six cross-sectional studies, including a total of 24 473 individuals and 6311 cases of metabolic syndrome, were identified as eligible for the meta-analysis. A weighted inverse association was found between dietary magnesium intake and the risk of metabolic syndrome (odds ratio 0.69, 95% CI 0.59, 0.81) comparing the highest with the lowest group. For every 100-mg/day increment in magnesium intake, the overall risk of having metabolic syndrome was lowered by 17% (odds ratio 0.83, 95% CI 0.77, 0.89). Conclusion Findings from the present meta-analysis suggest that dietary magnesium intake is inversely associated with the prevalence of metabolic syndrome. Further studies, in particular well-designed longitudinal cohort studies and randomized placebo-controlled clinical trials, are warranted to provide solid evidence and to establish causal inference. PMID:24975384
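
    The random-effects pooling used above can be sketched with the standard DerSimonian–Laird estimator. The per-study odds ratios and confidence intervals below are toy numbers, not the six studies in the meta-analysis.

```python
# DerSimonian-Laird random-effects pooling of study-level odds ratios.
import numpy as np

# Hypothetical per-study ORs and 95% CIs (illustrative only).
or_ = np.array([0.62, 0.75, 0.58, 0.81, 0.70, 0.66])
lo  = np.array([0.45, 0.55, 0.40, 0.60, 0.50, 0.48])
hi  = np.array([0.85, 1.02, 0.84, 1.09, 0.98, 0.91])

y = np.log(or_)                              # effects on the log scale
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
w = 1 / se**2                                # fixed-effect inverse-variance weights

# Between-study variance tau^2 (DerSimonian-Laird moment estimator).
q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = 1 / (se**2 + tau2)                    # random-effects weights
pooled = np.exp(np.sum(w_re * y) / np.sum(w_re))
print(f"pooled OR = {pooled:.2f}")
```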

  18. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    PubMed

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < 0.035). Indistinct margins and heterogeneous enhancement were independent predictors (AUC, 0.822). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < 0.05; AUC, 0.682-0.716). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass correlation coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
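
    First-order entropy, one of the texture metrics named above, is the Shannon entropy of the intensity histogram inside the volume of interest: a heterogeneous tumor spreads its intensities over more histogram bins and so scores higher. A minimal sketch on synthetic intensities (bin count and range are assumptions, not the study's settings):

```python
# First-order (histogram) entropy as a texture metric.
import numpy as np

def first_order_entropy(values, bins=64, vrange=(0, 200)):
    """Shannon entropy (bits) of the intensity histogram over a fixed range."""
    counts, _ = np.histogram(values, bins=bins, range=vrange)
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log(0) taken as 0
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
homogeneous = rng.normal(100, 2, 5000)     # tight intensity spread
heterogeneous = rng.normal(100, 25, 5000)  # broad spread, e.g. a type 2 lesion

e_hom = first_order_entropy(homogeneous)
e_het = first_order_entropy(heterogeneous)
print(e_hom, e_het)
```

    The fixed histogram range matters: with per-lesion ranges, both samples would fill the histogram similarly and the entropy difference would vanish.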

  19. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
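
    The cross-validation-based Q² quoted above is 1 − PRESS/SS_total, where PRESS accumulates squared errors of out-of-fold predictions. A minimal sketch with a ridge-regression stand-in for the QSAR model (synthetic descriptors and toxicity values; the paper's actual models and descriptors differ):

```python
# Cross-validated Q^2 for a regression model: 1 - PRESS / total sum of squares.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical descriptor matrix X and log10 toxicity values y.
X = rng.normal(size=(120, 10))
beta_true = rng.normal(size=10)
y = X @ beta_true + rng.normal(scale=0.5, size=120)

def cv_q2(X, y, k=5, lam=1.0):
    """k-fold cross-validated Q^2 using ridge regression as the model."""
    n = len(y)
    idx = np.arange(n)
    press = 0.0
    for fold in range(k):
        test = idx % k == fold
        train = ~test
        A = X[train].T @ X[train] + lam * np.eye(X.shape[1])
        beta = np.linalg.solve(A, X[train].T @ y[train])
        press += np.sum((y[test] - X[test] @ beta) ** 2)
    return 1 - press / np.sum((y - y.mean()) ** 2)

q2 = cv_q2(X, y)
print(f"Q^2 = {q2:.2f}")
```

    Unlike in-sample R², Q² penalizes overfitting, which is why the paper reports it alongside mean model error.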

  20. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B-Cell Responses

    DTIC Science & Technology

    2017-05-10

    repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared and ... cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype ... Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses. Ilja V. Khavrutskii, Sidhartha Chaudhury

  1. Note of the methodological flaws in the paper entitled "Polymorphisms in IL-4/IL-13 pathway genes and glioma risk: an updated meta-analysis".

    PubMed

    Wang, Ting-Ting; Li, Jin-Mei; Zhou, Dong

    2016-01-01

    With great interest, we read the paper "Polymorphisms in IL-4/IL-13 pathway genes and glioma risk: an updated meta-analysis" (by Chen PQ et al.) [1], which has reached important conclusions about the relationship between polymorphisms in interleukin (IL)-4/IL-13 pathway genes and glioma risk. Through quantitative analysis, the meta-analysis found no association between IL-4/IL-13 pathway genetic polymorphisms and glioma risk (Chen et al. in Tumor Biol 36:121-127, 2015). The meta-analysis is the most comprehensive study of polymorphisms in the IL-4/IL-13 pathway and glioma risk. Nevertheless, some deficiencies still exist in this meta-analysis that we would like to raise.

  2. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    PubMed

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. Areas under the receiver operating characteristic curve (AUROC) were calculated to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI: 0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI: 0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI: 0.615, 0.731; P < 0.001), and SWS-SD (0.768; 95% CI: 0.712, 0.817; P = 0.074). The qualitative analysis obtained the best diagnostic performance in this study (AUROC 0.871; 95% CI: 0.825, 0.909; P = 0.011 vs SWS-max, the best quantitative parameter). The new qualitative analysis of shear wave travel time thus showed superior diagnostic performance in the differentiation of breast lesions compared with conventional quantitative analysis.

  3. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described.
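
    Amperometric flow-injection signals like these are typically converted to concentrations through a linear calibration curve fitted to standards. A sketch with hypothetical peak currents (the numbers are invented, not data from the paper):

```python
# Linear calibration for flow-injection amperometry: fit standards,
# then invert the line to quantitate an unknown.
import numpy as np

# Hypothetical standards: glycerophosphorylcholine concentration (uM)
# vs amperometric peak current (nA).
conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
current = np.array([0.4, 13.1, 25.8, 51.2, 101.9])

slope, intercept = np.polyfit(conc, current, 1)   # least-squares line

unknown_current = 37.5                            # measured peak for a sample
unknown_conc = (unknown_current - intercept) / slope
print(f"{unknown_conc:.1f} uM")
```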

  4. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  5. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  6. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data

  7. Dietary fiber intake reduces risk of inflammatory bowel disease: result from a meta-analysis.

    PubMed

    Liu, Xiaoqin; Wu, Yili; Li, Fang; Zhang, Dongfeng

    2015-09-01

    Several epidemiological investigations have been conducted to evaluate the relationship between dietary fiber intake and inflammatory bowel diseases, but the results are inconsistent. This meta-analysis was performed to quantitatively summarize the evidence from observational studies. PubMed, Embase, and Web of Knowledge were searched for relevant articles published up to November 2014. The combined relative risks were calculated with a fixed- or random-effects model. The dose-response relationship was assessed using a restricted cubic spline model. We hypothesized that the meta-analysis could yield a summary effect, which would indicate that dietary fiber intake could decrease the risk of ulcerative colitis and Crohn disease (CD). Overall, 8 articles involving 2 cohort studies, 1 nested case-control study, and 5 case-control studies were finally included in this study. The pooled relative risks with 95% confidence intervals of ulcerative colitis and CD for the highest vs lowest categories of dietary fiber intake were 0.80 (0.64-1.00) and 0.44 (0.29-0.69), respectively. A linear dose-response relationship was found between dietary fiber and CD risk, and the risk of CD decreased by 13% (P < .05) for every 10 g/d increment in fiber intake. The results from this meta-analysis indicated that the intake of dietary fiber was significantly associated with a decreased risk of inflammatory bowel disease. Copyright © 2015 Elsevier Inc. All rights reserved.
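
    Under the log-linear dose-response model reported above, the 13% risk reduction per 10 g/d (RR = 0.87) rescales to any other daily increment as RR(x) = exp(ln(0.87) · x/10). A short worked example:

```python
# Rescaling a log-linear dose-response relative risk to a different increment.
import math

rr_per_10g = 0.87   # 13% reduction per 10 g/day, as in the abstract

def rr_for_increment(grams, rr_ref=rr_per_10g, ref=10.0):
    """RR for an arbitrary daily increment under a log-linear dose-response."""
    return math.exp(math.log(rr_ref) * grams / ref)

rr_25g = rr_for_increment(25.0)   # RR for a 25 g/day increment
print(f"RR per 25 g/day = {rr_25g:.3f}")
```

    Equivalently, RR(25) = 0.87^2.5, since the model is multiplicative in the dose.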

  8. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis

    PubMed Central

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Introduction/Background: Fuhrman nuclear grade is the most important histological parameter to predict prognosis in a patient of renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is the measure of aggressiveness of a tumour and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Material and Methods: n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, Mouse monoclonal antibody, Dako) and MCM-2 (Mouse monoclonal antibody, Thermo) was performed on the paraffin embedded blocks in the department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine a correlation of proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. Results: The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related with grade (p=0.00; Kruskal-Wallis test). LI of MCM-2 was found to correlate significantly with LI of Ki-67 (r=0.0934; p=0.01 with Pearson's correlation). Results of semi-quantitative analysis correlated well with quantitative analysis. Conclusion: Both Ki-67 and MCM-2 are

  9. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis.

    PubMed

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter to predict prognosis in a patient of renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is the measure of aggressiveness of a tumour and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, Mouse monoclonal antibody, Dako) and MCM-2 (Mouse monoclonal antibody, Thermo) was performed on the paraffin embedded blocks in the department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine a correlation of proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related with grade (p=0.00; Kruskal-Wallis test). LI of MCM-2 was found to correlate significantly with LI of Ki-67 (r=0.0934; p=0.01 with Pearson's correlation). Results of semi-quantitative analysis correlated well with quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they
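
    The statistical workflow in this record (Kruskal-Wallis test of labeling indices across grades, Pearson correlation between the two markers) can be sketched with SciPy. The labeling indices below are hypothetical, not the study data:

```python
# Kruskal-Wallis across tumour grades, then Pearson correlation between markers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical Ki-67 labeling indices (%) for three Fuhrman grades.
grade1 = rng.normal(8, 2, 15)
grade2 = rng.normal(15, 3, 20)
grade3 = rng.normal(26, 4, 15)

# Nonparametric test: do labeling indices differ across grades?
h, p_grade = stats.kruskal(grade1, grade2, grade3)

# Simulated MCM-2 indices that track Ki-67 with noise, as the study reports.
ki67 = np.concatenate([grade1, grade2, grade3])
mcm2 = ki67 * 1.8 + rng.normal(0, 2, ki67.size)
r, p_corr = stats.pearsonr(ki67, mcm2)
print(f"Kruskal-Wallis p = {p_grade:.2e}, Pearson r = {r:.2f}")
```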

  10. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  11. Meta-analysis on night shift work and risk of metabolic syndrome.

    PubMed

    Wang, F; Zhang, L; Zhang, Y; Zhang, B; He, Y; Xie, S; Li, M; Miao, X; Chan, E Y Y; Tang, J L; Wong, M C S; Li, Z; Yu, I T S; Tse, L A

    2014-09-01

    This study aims to quantitatively summarize the association between night shift work and the risk of metabolic syndrome (MetS), with special reference to the dose-response relationship with years of night shift work. We systematically searched all observational studies published in English on PubMed and Embase from 1971 to 2013. We extracted effect measures (relative risk, RR; or odds ratio, OR) with 95% confidence interval (CI) from individual studies to generate pooled results using a meta-analysis approach. Pooled RR was calculated using a random- or fixed-effects model. The Downs and Black scale was applied to assess the methodological quality of included studies. A total of 13 studies were included. The pooled RR for the association between 'ever exposed to night shift work' and MetS risk was 1.57 (95% CI = 1.24-1.98, p-heterogeneity = 0.001), while a higher risk was indicated in workers with longer exposure to night shifts (RR = 1.77, 95% CI = 1.32-2.36, p-heterogeneity = 0.936). Further stratification analysis demonstrated a higher pooled effect of 1.84 (95% CI = 1.45-2.34) for studies using the NCEP-ATPIII criteria, among female workers (RR = 1.61, 95% CI = 1.10-2.34) and in countries outside Asia (RR = 1.65, 95% CI = 1.39-1.95). Sensitivity analysis confirmed the robustness of the results. No evidence of publication bias was detected. The present meta-analysis suggested that night shift work is significantly associated with the risk of MetS, and a positive dose-response relationship with duration of exposure was indicated. © 2014 The Authors. Obesity Reviews © 2014 World Obesity.
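
    The p-heterogeneity values above come from Cochran's Q test, and the related I² statistic is what usually guides the choice between fixed- and random-effects pooling. A sketch with hypothetical study effects:

```python
# Cochran's Q and I^2: quantifying between-study heterogeneity.
import numpy as np

# Hypothetical study-level log relative risks and standard errors.
y = np.array([0.10, 0.60, 0.85, 0.05, 0.45])
se = np.array([0.12, 0.20, 0.15, 0.25, 0.18])

w = 1 / se**2                            # inverse-variance weights
pooled = np.sum(w * y) / np.sum(w)       # fixed-effect pooled log RR
q = np.sum(w * (y - pooled) ** 2)        # Cochran's Q statistic
df = len(y) - 1
i2 = max(0.0, (q - df) / q) * 100        # % of variation due to heterogeneity
print(f"Q = {q:.1f} on {df} df, I^2 = {i2:.0f}%")
```

    A large Q relative to its degrees of freedom (high I²) argues for the random-effects model, as used for the heterogeneous pooled estimates in the abstract.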

  12. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  13. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.
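
    The basic culturomics quantity behind both records is the relative frequency of an n-gram within each year's corpus. A toy sketch with invented miniature corpora (not the Google Books data):

```python
# Relative n-gram frequency per year: the core culturomics measurement.
from collections import Counter

# Toy yearly corpora standing in for the scanned-book token streams.
corpus = {
    1900: "the wireless telegraph was a marvel of the age".split(),
    1950: "the television and the radio filled the home".split(),
    2000: "the internet changed how the world reads and writes".split(),
}

def relative_frequency(word, year):
    """Occurrences of `word` divided by total tokens in that year's corpus."""
    counts = Counter(corpus[year])
    return counts[word] / sum(counts.values())

trend = {year: relative_frequency("the", year) for year in corpus}
print(trend)
```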

  14. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as in qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with the protein glycosylation-targeting enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  15. Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.

    PubMed

    Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse

    2018-05-01

    Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator in quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates with sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated using spectrograms estimated on a commercial ultrasound scanner.
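
    The Capon (minimum-variance) power spectrum referred to above is P(f) = 1 / (aᴴ R⁻¹ a), where R is a sample covariance of slow-time snapshots and a(f) is the steering vector. A minimal sketch on simulated Doppler data; note the paper averages spatially over CFI packets, whereas this sketch uses temporal smoothing and diagonal loading as stand-ins:

```python
# Capon (MVDR) spectral estimation of a simulated Doppler signal.
import numpy as np

rng = np.random.default_rng(3)
n, m = 64, 16              # slow-time samples, snapshot (subvector) length
f_true = 0.2               # normalized Doppler frequency of the flow signal
x = np.exp(2j * np.pi * f_true * np.arange(n)) + 0.1 * (
    rng.normal(size=n) + 1j * rng.normal(size=n))

# Sample covariance from overlapping length-m snapshots (temporal smoothing).
snaps = np.array([x[i:i + m] for i in range(n - m + 1)])
R = snaps.T @ snaps.conj() / snaps.shape[0]
R += 1e-3 * np.trace(R).real / m * np.eye(m)       # diagonal loading
Ri = np.linalg.inv(R)

def capon_power(f):
    """P(f) = 1 / (a^H R^-1 a) for steering vector a(f)."""
    a = np.exp(2j * np.pi * f * np.arange(m))
    return 1.0 / np.real(a.conj() @ Ri @ a)

freqs = np.linspace(-0.5, 0.5, 201)
spectrum = np.array([capon_power(f) for f in freqs])
f_peak = freqs[np.argmax(spectrum)]
print(f"spectral peak at f = {f_peak:.3f}")
```

    Compared with the periodogram, the Capon estimator adapts its filter to the data, giving narrower spectral peaks from the short packets available in CFI.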

  16. High-throughput quantitative analysis by desorption electrospray ionization mass spectrometry.

    PubMed

    Manicke, Nicholas E; Kistler, Thomas; Ifa, Demian R; Cooks, R Graham; Ouyang, Zheng

    2009-02-01

    A newly developed high-throughput desorption electrospray ionization (DESI) source was characterized in terms of its performance in quantitative analysis. A 96-sample array, containing pharmaceuticals in various matrices, was analyzed in a single run with a total analysis time of 3 min. These solution-phase samples were examined from a hydrophobic PTFE ink printed on glass. The quantitative accuracy, precision, and limit of detection (LOD) were characterized. Chemical background-free samples of propranolol (PRN) with PRN-d(7) as internal standard (IS) and carbamazepine (CBZ) with CBZ-d(10) as IS were examined. So were two other sample sets consisting of PRN/PRN-d(7) at varying concentration in a biological milieu of 10% urine or porcine brain total lipid extract, total lipid concentration 250 ng/microL. The background-free samples, examined in a total analysis time of 1.5 s/sample, showed good quantitative accuracy and precision, with a relative error (RE) and relative standard deviation (RSD) generally less than 3% and 5%, respectively. The samples in urine and the lipid extract required a longer analysis time (2.5 s/sample) and showed RSD values of around 10% for the samples in urine and 4% for the lipid extract samples and RE values of less than 3% for both sets. The LOD for PRN and CBZ when analyzed without chemical background was 10 and 30 fmol, respectively. The LOD of PRN increased to 400 fmol analyzed in 10% urine, and 200 fmol when analyzed in the brain lipid extract.
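
    Quantitation with a deuterated internal standard, as above, rests on a linear fit of the analyte/IS intensity ratio against analyte concentration; the IS ratio cancels run-to-run ionization variability. A sketch with hypothetical intensities (invented numbers, not data from the paper):

```python
# Isotope-dilution calibration: analyte/internal-standard ratio vs concentration.
import numpy as np

# Hypothetical calibration standards: PRN amount (fmol) vs I(PRN)/I(PRN-d7),
# with the deuterated internal standard spiked at a fixed level.
conc = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
ratio = np.array([0.021, 0.098, 0.205, 1.01, 1.98])

slope, intercept = np.polyfit(conc, ratio, 1)

sample_ratio = 0.62                     # measured ratio for an unknown sample
sample_conc = (sample_ratio - intercept) / slope
print(f"{sample_conc:.0f} fmol")
```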

  17. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    PubMed

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to consider the validity of the proposed work or research findings by paying close attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Discriminative capacity of calcaneal quantitative ultrasound and of osteoporosis and fracture risk factors in postmenopausal women with osteoporotic fractures.

    PubMed

    Hernández, J L; Marin, F; González-Macías, J; Díez-Pérez, A; Vila, J; Giménez, S; Galán, B; Arenas, M S; Suárez, F; Gayola, L; Guillén, G; Sagredo, T; Belenguer, R; Moron, A; Arriaza, E

    2004-04-01

    Bone fragility fractures constitute the principal complication of osteoporosis. The identification of individuals at high risk of sustaining osteoporotic fractures is important for implementing preventive measures. The purpose of this study is to analyze the discriminative capacity of a series of osteoporosis and fracture risk factors, and of calcaneal quantitative ultrasound (QUS), in a population of postmenopausal women with a history of osteoporotic fracture. A cross-sectional analysis was made of a cohort of 5195 women aged 65 or older (mean +/- SD: 72.3 +/- 5.4 years) seen in 58 primary care centers in Spain. A total of 1042 women (20.1%) presented with a history of osteoporotic fracture. Most fractures (93%) were non-vertebral. Age-adjusted odds ratios corresponding to each decrease of one standard deviation in the different QUS parameters ranged from 1.47 to 1.55 (P < 0.001) for fractures. The age-adjusted multivariate analysis yielded the following risk factors independently associated with a history of osteoporotic fracture: number of fertile years, a family history of fracture, falls in the previous year, a history of chronic obstructive airway disease, the use of antiarrhythmic drugs, and a low value for any of the QUS parameters. The area under the receiver operating characteristic curve of the best model was 0.656. In summary, a series of easily assessable osteoporotic fracture risk factors has been identified. QUS was shown to discriminate between women with and without a history of fracture, and constitutes a useful tool for assessing fracture risk. Several of the vertebral and hip fracture risk factors frequently cited in North American and British populations showed no discriminative capacity in our series--thus suggesting that such factors may not be fully applicable to our population and/or to the predominant type of fractures included in the present study.
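The "odds ratio per one-SD decrease" reported above is conventionally obtained by scaling a fitted logistic-regression coefficient by the parameter's standard deviation and exponentiating. A sketch in Python; the coefficient and SD values are hypothetical, chosen only so the result lands in the reported 1.47-1.55 range:

```python
import math

def or_per_sd_decrease(beta_per_unit, sd):
    """Odds ratio per one-SD *decrease* of a predictor, given the
    logistic-regression coefficient per unit *increase*: OR = exp(-beta * SD)."""
    return math.exp(-beta_per_unit * sd)

# Hypothetical fit: fracture log-odds fall as the QUS parameter rises
beta = -0.025  # log-odds change per unit increase of the QUS parameter
sd = 16.0      # population standard deviation of the parameter
odds_ratio = or_per_sd_decrease(beta, sd)  # exp(0.4), about 1.49
```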

  19. Quantitative and Qualitative Assessment of Soil Erosion Risk in Małopolska (Poland), Supported by an Object-Based Analysis of High-Resolution Satellite Images

    NASA Astrophysics Data System (ADS)

    Drzewiecki, Wojciech; Wężyk, Piotr; Pierzchalski, Marcin; Szafrańska, Beata

    2014-06-01

    In 2011 the Marshal Office of Małopolska Voivodeship decided to evaluate the vulnerability of soils to water erosion for the entire region. The quantitative and qualitative assessment of the erosion risk for the soils of the Małopolska region was done based on the USLE approach. A special workflow of geoinformation technologies was used to fulfil this goal. A high-resolution soil map, together with rainfall data, a detailed digital elevation model and statistical information about areas sown with particular crops, created the input information for erosion modelling in a GIS environment. Satellite remote sensing technology and the object-based image analysis (OBIA) approach gave valuable support to this study. RapidEye satellite images were used to obtain the essential up-to-date data about land use and vegetation cover for the entire region (15,000 km2). The application of OBIA also led to defining the direction of field cultivation and the mapping of contour tillage areas. As a result, spatially differentiated values of the erosion control practice factor were used. Both the potential and the actual soil erosion risk were assessed quantitatively and qualitatively. The results of the erosion assessment in the Małopolska Voivodeship reveal that the majority of its agricultural land is characterized by moderate or low erosion risk levels. However, high-resolution erosion risk maps show its substantial spatial diversity. According to our study, average or higher actual erosion intensity levels occur for 10.6 % of agricultural land, i.e. 3.6 % of the entire voivodeship area. In 20 % of the municipalities there is a very urgent demand for erosion control. In the next 23 % an urgent erosion control is needed. Our study showed that even a slight improvement of P-factor estimation may have an influence on modeling results. In our case, despite a marginal change of erosion assessment figures on a regional scale, the influence on the final prioritization of
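The USLE computation behind this assessment is a product of five empirical factors; the OBIA-derived contour-tillage mapping enters through the support practice factor P. A minimal sketch with hypothetical factor values:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Average annual soil loss A (e.g. t/ha/yr) from the USLE:
    A = R * K * LS * C * P
    R: rainfall erosivity, K: soil erodibility, LS: combined slope
    length/steepness, C: cover-management, P: support practice factor."""
    return R * K * LS * C * P

# Hypothetical values for one raster cell, with and without contour tillage
a_no_practice = usle_soil_loss(R=70.0, K=0.35, LS=1.8, C=0.25, P=1.0)
a_contour = usle_soil_loss(R=70.0, K=0.35, LS=1.8, C=0.25, P=0.6)
# P < 1 scales the predicted loss down proportionally
```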

  20. Quantitative analysis of professionally trained versus untrained voices.

    PubMed

    Siupsinskiene, Nora

    2003-01-01

    The aim of this study was to compare healthy trained and untrained voices as well as healthy and dysphonic trained voices in adults using combined voice range profile and aerodynamic tests, to define the normal range limiting values of quantitative voice parameters and to select the most informative quantitative voice parameters for separating healthy from dysphonic trained voices. Three groups of persons were evaluated. One hundred eighty-six healthy volunteers were divided into two groups according to voice training: the non-professional speakers group consisted of 106 persons with untrained voices (36 males and 70 females) and the professional speakers group of 80 persons with trained voices (21 males and 59 females). The clinical group consisted of 103 dysphonic professional speakers (23 males and 80 females) with various voice disorders. Eighteen quantitative voice parameters from the combined voice range profile (VRP) test were analyzed: 8 of voice range profile, 8 of speaking voice, overall vocal dysfunction degree and coefficient of sound, plus the aerodynamic maximum phonation time. Analysis showed that healthy professional speakers demonstrated expanded vocal abilities in comparison to healthy non-professional speakers. Quantitative voice range profile parameters--pitch range, high frequency limit, area of high frequencies and coefficient of sound--differed significantly between healthy professional and non-professional voices, and were more informative than speaking voice or aerodynamic parameters in showing the effect of voice training. Logistic stepwise regression revealed that VRP area in high frequencies was sufficient to discriminate between healthy and dysphonic professional speakers for male subjects (overall discrimination accuracy--81.8%) and a combination of three quantitative parameters (VRP high frequency limit, maximum voice intensity and slope of speaking curve) for female subjects (overall model discrimination accuracy--75.4%). We concluded that quantitative voice assessment

  1. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    ERIC Educational Resources Information Center

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  2. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  3. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.

  4. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 test subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.

  5. Electroencephalography reactivity for prognostication of post-anoxic coma after cardiopulmonary resuscitation: A comparison of quantitative analysis and visual analysis.

    PubMed

    Liu, Gang; Su, Yingying; Jiang, Mengdi; Chen, Weibi; Zhang, Yan; Zhang, Yunzhou; Gao, Daiquan

    2016-07-28

    Electroencephalogram reactivity (EEG-R) is a positive predictive factor for assessing outcomes in comatose patients. Most studies assess the prognostic value of EEG-R utilizing visual analysis; however, this method is prone to subjectivity. We sought to categorize EEG-R with a quantitative approach. We retrospectively studied consecutive comatose patients who had an EEG-R recording performed 1-3 days after cardiopulmonary resuscitation (CPR) or during normothermia after therapeutic hypothermia. EEG-R was assessed via visual analysis and quantitative analysis separately. Clinical outcomes were assessed at 3 months and dichotomized as recovery of awareness or no recovery of awareness. A total of 96 patients met the inclusion criteria, and 38 (40%) patients had recovered awareness at the 3-month follow-up. Of 27 patients with EEG-R measured by visual analysis, 22 patients recovered awareness; of the 69 patients who did not demonstrate EEG-R, 16 patients recovered awareness. The sensitivity and specificity of visually measured EEG-R were 58% and 91%, respectively. The area under the receiver operating characteristic curve for the quantitative analysis was 0.92 (95% confidence interval, 0.87-0.97), with a best cut-off value of 0.10. EEG-R through quantitative analysis might be a good method for predicting the recovery of awareness in patients with post-anoxic coma after CPR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
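The reported 58% sensitivity and 91% specificity of visually assessed EEG-R can be reproduced directly from the counts in the abstract (96 patients, 38 recovered, 22 of the 27 EEG-R-positive patients recovered):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

tp = 22              # EEG-R present, recovered awareness
fn = 38 - 22         # EEG-R absent, recovered awareness
fp = 27 - 22         # EEG-R present, did not recover
tn = (96 - 38) - fp  # EEG-R absent, did not recover

sens, spec = sensitivity_specificity(tp, fn, tn, fp)
# round(sens * 100) -> 58 and round(spec * 100) -> 91, matching the paper
```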

  6. Oral contraceptives and colorectal cancer risk: a meta-analysis

    PubMed Central

    Fernandez, E; Vecchia, C La; Balducci, A; Chatenoud, L; Franceschi, S; Negri, E

    2001-01-01

    Several studies have suggested an inverse association between use of combined oral contraceptives (OC) and the risk of colorectal cancer, and here we present a meta-analysis of published studies. Articles considered were epidemiological studies published as full papers in English up to June 2000 that included quantitative information on OC use. The pooled relative risk (RR) of colorectal cancer for ever OC use from the 8 case-control studies was 0.81 (95% confidence interval (CI): 0.69–0.94), and the pooled estimate from the 4 cohort studies was 0.84 (95% CI: 0.72–0.97). The pooled estimate from all studies combined was 0.82 (95% CI: 0.74–0.92), without apparent heterogeneity. Duration of use was not associated with a decrease in risk, but there was some indication that the apparent protection was stronger for women who had used OCs more recently (RR = 0.46; 95% CI: 0.30–0.71). A better understanding of this potential relation may help informed choice of contraception. © 2001 Cancer Research Campaign http://www.bjcancer.com PMID:11237397
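Pooled RRs such as those above are conventionally computed by inverse-variance weighting on the log scale, with each estimate's standard error recovered from its 95% CI. A fixed-effect sketch; feeding it the two pooled summaries from the abstract is for illustration only (the paper pools the individual studies):

```python
import math

def pooled_rr(estimates):
    """Fixed-effect inverse-variance pooling of relative risks.
    `estimates` is a list of (RR, lower 95% CI, upper 95% CI) tuples;
    SE(log RR) is recovered from the CI width as (ln(hi) - ln(lo)) / (2 * 1.96)."""
    num = den = 0.0
    for rr, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_pooled, se_pooled = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Illustrative inputs: the case-control and cohort summary estimates
rr, lo, hi = pooled_rr([(0.81, 0.69, 0.94), (0.84, 0.72, 0.97)])
```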

  7. Status of risk-benefit analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Horn, A.J.; Wilson, R.

    1976-12-01

    The benefits and deficiencies of cost-benefit analysis are reviewed. It is pointed out that, if decision making involving risks and benefits is to improve, more attention must be paid to the clear presentation of the assumptions, values, and results. Reports need to present concise summaries which convey the uncertainties and limitations of the analysis in addition to the matrix of costs, risks, and benefits. As the field of risk-benefit analysis advances, the estimation of risks and benefits will become more precise and implicit valuations will be made more explicit. Corresponding improvements must also be made to enhance communications between the risk-benefit analyst and the accountable decision maker.

  8. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Agreement between quantitative microbial risk assessment and epidemiology at low doses during waterborne outbreaks of protozoan disease

    USDA-ARS?s Scientific Manuscript database

    Quantitative microbial risk assessment (QMRA) is a valuable complement to epidemiology for understanding the health impacts of waterborne pathogens. The approach works by extrapolating available data in two ways. First, dose-response data are typically extrapolated from feeding studies, which use ...

  10. An Analysis of Enterprise Risk Management and IT Effectiveness Constructs

    ERIC Educational Resources Information Center

    Waithe, Errol

    2016-01-01

    One major problem many organizations are facing is balancing the risk-management practices of the organization with overall information technology (IT) effectiveness. The purpose of this non-experimental quantitative correlational study was to assess the constructs and correlations associated with enterprise risk management and IT effectiveness.…

  11. Quantitative analysis of eyes and other optical systems in linear optics.

    PubMed

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogeneous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  12. Quantitative 3D breast magnetic resonance imaging fibroglandular tissue analysis and correlation with qualitative assessments: a feasibility study.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-04-01

    The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented volume of FGT (mm(3)) / segmented volume of whole breast (mm(3))] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT correlated with calculated mean quantitative FGT (%) values of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer generated standardized
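The FGT fraction defined above is a straightforward volume ratio; a trivial Python sketch with hypothetical segmentation volumes:

```python
def fgt_percent(fgt_volume_mm3, breast_volume_mm3):
    """FGT (%) = segmented FGT volume / segmented whole-breast volume * 100."""
    return fgt_volume_mm3 / breast_volume_mm3 * 100

# Hypothetical segmented volumes for one breast MRI
pct = fgt_percent(fgt_volume_mm3=152_000.0, breast_volume_mm3=840_000.0)
```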

  13. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  14. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    PubMed

    Singh, Iqbal

    2008-01-01

    To prospectively determine the precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to determine an ideal method for stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup of all patients with urolithiasis, stone samples from 50 patients satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis after adequate sample homogenization at a single testing center. A calcium oxalate monohydrate and dihydrate stone mixture was most commonly encountered, in 35 patients (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium hexahydrate and xanthine stones. Fourier transform infrared spectroscopy provides an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities. This may prevent and/or delay stone recurrence.

  15. Quantitative Method for Analyzing the Allocation of Risks in Transportation Construction

    DOT National Transportation Integrated Search

    1979-04-01

    The report presents a conceptual model of risk that was developed to analyze the impact on owner's cost of alternate allocations of risk among owner and contractor in mass transit construction. A model and analysis procedure are developed, based on d...

  16. Quantitative Polymerase Chain Reaction to Assess Response to Treatment of Bacterial Vaginosis and Risk of Preterm Birth.

    PubMed

    Abramovici, Adi; Lobashevsky, Elena; Cliver, Suzanne P; Edwards, Rodney K; Hauth, John C; Biggio, Joseph R

    2015-10-01

    The aim of this study was to determine whether quantitative polymerase chain reaction (qPCR) bacterial load measurement is a valid method to assess response to treatment of bacterial vaginosis and risk of preterm birth in pregnant women. This was a secondary analysis utilizing stored vaginal samples obtained during a previous randomized controlled trial studying the effect of antibiotics on preterm birth (PTB). All women had risk factors for PTB: (1) positive fetal fibronectin (n=146), (2) bacterial vaginosis (BV) and a prior PTB (n=43), or (3) BV and a prepregnancy weight <50 kg (n=54). Total and several individual BV-related bacterial loads were measured using qPCR for 16S rRNA. Loads were correlated with Nugent scores (Spearman correlation coefficients). Loads were compared pre- and posttreatment with the Wilcoxon rank-sum test. Individual patient differences were examined with the Wilcoxon signed-rank test. A total of 243 paired vaginal samples were available for analysis: 123 antibiotics and 120 placebo. Groups did not differ by risk factors for PTB. For all samples, bacterial loads were correlated with Nugent score and each of its specific bacterial components (all p<0.01). Baseline total bacterial load did not differ by treatment group (p=0.87). Posttreatment total bacterial load was significantly lower in the antibiotics group than the placebo group (p<0.01). Individual patient total bacterial load decreased significantly posttreatment in the antibiotics group (p<0.01), but not in the placebo group (p=0.12). The rate of PTB did not differ between groups (p=0.24). PTB relative risks calculated for BV positive versus BV negative women and women with the highest quartile total and individual bacterial loads were not statistically significant. qPCR correlates with Nugent score and demonstrates decreased bacterial load after antibiotic treatment. Therefore, it is a valid method of vaginal flora assessment in pregnant women who are at high risk for PTB.
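The load-Nugent correlations above are Spearman rank correlations, i.e. Pearson correlations computed on tie-averaged ranks. A dependency-free sketch; the paired load/score values are hypothetical:

```python
def _ranks(xs):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical paired data: qPCR total bacterial load vs. Nugent score
loads = [2.1e5, 8.4e6, 3.0e4, 5.5e5, 1.2e7]
nugent = [4, 9, 1, 6, 8]
rho = spearman(loads, nugent)
```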

  17. Quantitative landslide risk assessment and mapping on the basis of recent occurrences

    NASA Astrophysics Data System (ADS)

    Remondo, Juan; Bonachea, Jaime; Cendrero, Antonio

    A quantitative procedure for mapping landslide risk is developed from considerations of hazard, vulnerability and valuation of exposed elements. The approach, based on former work by the authors, is applied in the Bajo Deba area (northern Spain), where a detailed study of landslide occurrence and damage in the recent past (last 50 years) was carried out. Analyses and mapping are implemented in a Geographic Information System (GIS). The method is based on a susceptibility model developed previously from statistical relationships between past landslides and terrain parameters related to instability. Extrapolations based on past landslide behaviour were used to calculate failure frequency for the next 50 years. A detailed inventory of direct damage due to landslides during the study period was carried out and the main elements at risk in the area identified and mapped. Past direct (monetary) losses per type of element were estimated and expressed as an average 'specific loss' for events of a given magnitude (corresponding to a specified scenario). Vulnerability was assessed by comparing losses with the actual value of the elements affected and expressed as a fraction of that value (0-1). From hazard, vulnerability and monetary value, risk was computed for each element considered. Direct risk maps (€/pixel/year) were obtained and indirect losses from the disruption of economic activities due to landslides assessed. The final result is a risk map and table combining all losses per pixel for a 50-year period. Total monetary value at risk for the Bajo Deba area in the next 50 years is about 2.4 × 10^6 Euros.
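The per-pixel direct risk (€/pixel/year) described above multiplies hazard (expected failure frequency), vulnerability (fraction of the element's value lost per event) and the monetary value of the exposed element. A minimal sketch with hypothetical cell values:

```python
def pixel_risk(hazard, vulnerability, value_eur):
    """Expected annual loss per pixel (EUR/yr):
    risk = hazard (events/yr) * vulnerability (0-1 fraction of value lost)
         * monetary value of the exposed element (EUR)."""
    return hazard * vulnerability * value_eur

# Hypothetical raster cells: (annual failure frequency, vulnerability, value)
cells = [(0.002, 0.30, 120_000.0),
         (0.010, 0.05, 500_000.0),
         (0.000, 0.80,  75_000.0)]
annual = [pixel_risk(h, v, val) for h, v, val in cells]
total_50yr = 50 * sum(annual)  # aggregate loss over the 50-year horizon
```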

  18. Enabling More than Moore: Accelerated Reliability Testing and Risk Analysis for Advanced Electronics Packaging

    NASA Technical Reports Server (NTRS)

    Ghaffarian, Reza; Evans, John W.

    2014-01-01

    For five decades, the semiconductor industry has distinguished itself by the rapid pace of improvement in the miniaturization of electronics products--Moore's Law. Now scaling hits a brick wall, a paradigm shift. The industry roadmaps recognize the scaling limitation and project that packaging technologies will meet further miniaturization needs, a.k.a. "More than Moore". This paper presents packaging technology trends and the accelerated reliability testing methods currently being practiced. Then, it presents industry status on key advanced electronic packages, factors affecting accelerated solder joint reliability of area array packages, and IPC/JEDEC/Mil specifications for characterization of assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how accelerated testing and analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests need to be designed in consideration of the desired risk posture and schedule for a particular project. Such assessments relieve risks without imposing additional costs and constraints that are not value-added for a particular mission. Furthermore, in the course of development of complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.

  19. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

    One aspect to consider when reporting results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, the authors are required to provide reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 and investigating the relationships between quantitative exposures (or risk factors) and outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies, and reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to decide category boundaries were based on established guidelines such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles. 
The existing STROBE guidelines could provide stronger

  20. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    PubMed Central

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contributions of this work are a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential to increase the accuracy of perfusion analysis. Evaluation of quantitative perfusion analysis diagnostic accuracy with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  1. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  2. Quantitative analysis of cardiovascular MR images.

    PubMed

    van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H

    1997-06-01

    The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.

  3. Quantitative CT analysis of honeycombing area in idiopathic pulmonary fibrosis: Correlations with pulmonary function tests.

    PubMed

    Nakagawa, Hiroaki; Nagatani, Yukihiro; Takahashi, Masashi; Ogawa, Emiko; Tho, Nguyen Van; Ryujin, Yasushi; Nagao, Taishi; Nakano, Yasutaka

    2016-01-01

    The 2011 official statement on idiopathic pulmonary fibrosis (IPF) mentions that the extent of honeycombing and the worsening of fibrosis on high-resolution computed tomography (HRCT) in IPF are associated with an increased risk of mortality. However, there are few reports on the quantitative computed tomography (CT) analysis of honeycombing area. In this study, we first proposed a computer-aided method for quantitative CT analysis of the honeycombing area in patients with IPF. We then evaluated the correlations between the honeycombing area measured by the proposed method and that estimated by radiologists or parameters of pulmonary function tests (PFTs). Chest HRCTs and PFTs of 36 IPF patients, who were diagnosed using HRCT alone, were retrospectively evaluated. Two thoracic radiologists independently estimated the honeycombing area as Identified Area (IA) and the percentage of honeycombing area to total lung area as Percent Area (PA) on 3 axial CT slices for each patient. We also developed a computer-aided method to measure the honeycombing area on CT images of those patients. The total honeycombing area as CT honeycombing area (HA) and the percentage of honeycombing area to total lung area as CT %honeycombing area (%HA) were derived from the computer-aided method for each patient. HA derived from three CT slices was significantly correlated with IA (ρ=0.65 for Radiologist 1 and ρ=0.68 for Radiologist 2). %HA derived from three CT slices was also significantly correlated with PA (ρ=0.68 for Radiologist 1 and ρ=0.70 for Radiologist 2). HA and %HA derived from all CT slices were significantly correlated with FVC (%pred.), DLCO (%pred.), and the composite physiologic index (CPI) (HA: ρ=-0.43, ρ=-0.56, ρ=0.63 and %HA: ρ=-0.60, ρ=-0.49, ρ=0.69, respectively). The honeycombing area measured by the proposed computer-aided method was correlated with that estimated by expert radiologists and with parameters of PFTs. This quantitative CT analysis of

  4. Correlation of genetic risk and messenger RNA expression in a Th17/IL23 pathway analysis in inflammatory bowel disease.

    PubMed

    Fransen, Karin; van Sommeren, Suzanne; Westra, Harm-Jan; Veenstra, Monique; Lamberts, Letitia E; Modderman, Rutger; Dijkstra, Gerard; Fu, Jingyuan; Wijmenga, Cisca; Franke, Lude; Weersma, Rinse K; van Diemen, Cleo C

    2014-05-01

    The Th17/IL23 pathway has been implicated both genetically and biologically in the pathogenesis of the inflammatory bowel diseases (IBD), Crohn's disease and ulcerative colitis. So far, it is unknown whether and how associated risk variants affect expression of the genes encoding the Th17/IL23 pathway proteins. Ten IBD-associated SNPs residing near Th17/IL23 genes were used to construct a genetic risk model in 753 Dutch IBD cases and 1045 controls. In an independent cohort of 40 Crohn's disease patients, 40 ulcerative colitis patients, and 40 controls, the genetic risk load and the presence of IBD were correlated with quantitative PCR-generated messenger RNA (mRNA) expression of 9 representative Th17/IL23 genes in both unstimulated and PMA/CaLo-stimulated peripheral blood mononuclear cells. In 1240 individuals with various immunological diseases, with whole genome genotype and mRNA-expression data, we also assessed the correlation between genetic risk load and differential mRNA expression and searched for SNPs affecting expression of all currently known Th17/IL23 pathway genes (cis-expression quantitative trait loci). The presence of IBD, but not the genetic risk load, was correlated with differential mRNA expression of IL6 in unstimulated peripheral blood mononuclear cells and of IL23A and RORC in response to stimulation. The cis-expression quantitative trait locus analysis showed little evidence for a correlation between genetic risk load and mRNA expression of Th17/IL23 genes, as we identified a cis-expression quantitative trait locus SNP that is also associated with IBD for only 2 of the 22 Th17/IL23 genes (STAT3 and CCR6). Our results suggest that only the presence of IBD, and not the genetic risk load, alters mRNA expression levels of IBD-associated Th17/IL23 genes.

  5. Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.

    PubMed

    Coleman, Priscilla K

    2011-09-01

    Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent more accurately the published literature and to provide clarity to clinicians. To measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877 181 participants (163 831 experienced an abortion). Random effects pooled odds ratios were computed using adjusted odds ratios from the original studies and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
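    The population-attributable risk (PAR) statistic reported above is commonly computed with Levin's formula from an exposure prevalence and a relative risk, with the pooled odds ratio standing in for the relative risk. A minimal sketch, using a hypothetical 15% exposure prevalence rather than the study's adjusted inputs:

```python
def par(prevalence, relative_risk):
    """Levin's population-attributable risk: the fraction of outcome
    incidence attributable to the exposure, given exposure prevalence
    and a relative risk (here an odds ratio used as an approximation)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical 15% prevalence combined with the abstract's pooled OR of 1.81
print(round(par(0.15, 1.81), 3))  # -> 0.108
```

    The published PAR of nearly 10% depends on the study's own prevalence and adjusted estimates, which this sketch does not reproduce.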

  6. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important both for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Mass Spectrom. Rev. 34:148–165, 2015. PMID:24889823

  7. Use of antipsychotics increases the risk of fracture: a systematic review and meta-analysis.

    PubMed

    Lee, S-H; Hsu, W-T; Lai, C-C; Esmaily-Fard, A; Tsai, Y-W; Chiu, C-C; Wang, J; Chang, S-S; Lee, C C

    2017-04-01

    Our systematic review and meta-analysis of observational studies indicated that the use of antipsychotics was associated with a nearly 1.5-fold increase in the risk of fracture. First-generation antipsychotics (FGAs) appeared to carry a higher risk of fracture than second-generation antipsychotics (SGAs). Evidence on the risk of fractures associated with different classes of antipsychotic medications is inconsistent. A systematic review and meta-analysis was conducted to evaluate whether there is an association between the use of antipsychotic drugs and fractures. Searches were conducted through the PubMed and EMBASE databases to identify observational studies that reported a quantitative estimate of the association between use of antipsychotics and fractures. The summary risk was derived from random effects meta-analysis. The search yielded 19 observational studies (n = 544,811 participants) with 80,835 fracture cases. Compared with nonuse, use of FGAs was associated with a significantly higher risk for hip fractures (OR 1.67, 95% CI 1.45-1.93), and use of SGAs was associated with an attenuated but still significant risk for hip fractures (OR 1.33, 95% CI 1.11-1.58). The risk of fractures associated with individual antipsychotic drugs was heterogeneous, with odds ratios ranging from 1.24 to 2.01. Chlorpromazine was associated with the highest risk (OR 2.01, 95% CI 1.43-2.83), while risperidone was associated with the lowest risk of fracture (OR 1.24, 95% CI 0.95-1.83). FGA users were at a higher risk of hip fracture than SGA users. Both FGAs and SGAs were associated with an increased risk of fractures, especially among the older population. Therefore, the benefit of the off-label use of antipsychotics in elderly patients should be weighed against any risk of fracture.
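    Random-effects pooling of odds ratios, as used above, is typically a DerSimonian-Laird computation on the log scale. The sketch below recovers standard errors from reported 95% CI bounds and pools two of the abstract's estimates purely as a demonstration; it does not reproduce the published analysis, which pooled adjusted estimates from 19 studies with dedicated software.

```python
import math

def pool_random_effects(odds_ratios, ci_upper, ci_lower):
    """DerSimonian-Laird random-effects pooling of odds ratios,
    with standard errors recovered from 95% CI bounds."""
    logs = [math.log(o) for o in odds_ratios]
    # SE recovered from the CI width: (ln upper - ln lower) / (2 * 1.96)
    ses = [(math.log(u) - math.log(lo)) / (2 * 1.96)
           for u, lo in zip(ci_upper, ci_lower)]
    w = [1.0 / s ** 2 for s in ses]                        # fixed-effect weights
    mu = sum(wi * x for wi, x in zip(w, logs)) / sum(w)
    q = sum(wi * (x - mu) ** 2 for wi, x in zip(w, logs))  # Cochran's Q
    k = len(odds_ratios)
    denom = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)                 # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in ses]            # random-effects weights
    mu_re = sum(wi * x for wi, x in zip(w_re, logs)) / sum(w_re)
    return math.exp(mu_re)

# Demo only: pool the abstract's FGA and SGA hip-fracture estimates
pooled = pool_random_effects([1.67, 1.33], [1.93, 1.58], [1.45, 1.11])
print(f"pooled OR: {pooled:.2f}")
```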

  8. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    PubMed

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, data analysis has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data, we developed the Graphical Proteomics Data Explorer (GProX). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups, as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms, and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available, and most analysis functions in GProX create customizable high-quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis, providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  10. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... on the sources of L. monocytogenes contamination, the effects of individual manufacturing and/or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-1182] Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

  11. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    PubMed

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted using probabilistic techniques from a national perspective. Published monitoring data on PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the acceptable value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, warrant more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
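    A Monte Carlo risk calculation of this kind draws exposure parameters from distributions and tallies the fraction of iterations whose lifetime cancer risk exceeds the 1.00E-05 benchmark. The sketch below is a toy version: the distributions, parameter values, and the use of a single BaP slope factor are assumptions for illustration, not the published Chinese monitoring data.

```python
import random

random.seed(1)

SLOPE_FACTOR = 7.3   # commonly cited BaP oral slope factor, (mg/kg-day)^-1
N = 100_000

def one_draw():
    # All distributions below are illustrative assumptions
    conc = random.lognormvariate(-12.4, 1.0)       # BaP-eq in water, mg/L
    intake = random.uniform(1.0, 3.0)              # drinking water, L/day
    body_weight = random.gauss(60.0, 8.0)          # kg
    dose = conc * intake / max(body_weight, 30.0)  # mg/kg-day
    return dose * SLOPE_FACTOR                     # lifetime cancer risk

risks = [one_draw() for _ in range(N)]
exceed = sum(r > 1e-5 for r in risks) / N
print(f"fraction of simulated risks above 1.00E-05: {exceed:.3f}")
```

    A sensitivity analysis would then vary each input distribution in turn to see which drives the exceedance probability.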

  12. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  13. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
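    Differential evolution itself is compact enough to sketch. The loop below implements the common DE/rand/1/bin variant (mutation, binomial crossover, greedy selection) on a toy sphere function; the paper's application would replace the objective with a calibration-error measure over candidate wavelengths, which is not reproduced here.

```python
import random

def differential_evolution(f, dim, bounds=(-5.0, 5.0), pop_size=20,
                           weight=0.8, cr=0.9, generations=200, seed=0):
    """Minimize f over a box using DE/rand/1/bin (illustrative sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct partners, none equal to the target index
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [
                min(max(pop[a][d] + weight * (pop[b][d] - pop[c][d]), lo), hi)
                if (rng.random() < cr or d == j_rand) else pop[i][d]
                for d in range(dim)
            ]
            trial_cost = f(trial)
            if trial_cost <= cost[i]:    # greedy selection
                pop[i], cost[i] = trial, trial_cost
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

def sphere(x):
    return sum(v * v for v in x)

x_best, y_best = differential_evolution(sphere, dim=3)
print(f"best cost: {y_best:.2e}")
```

    For wavelength selection, the decision vector would encode which spectral points are kept and the objective would score the resulting quantitative model's error.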

  14. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.

  15. Human-Associated Fecal Quantitative Polymerase Chain Reaction Measurements and Simulated Risk of Gastrointestinal Illness in Recreational Waters Contaminated with Raw Sewage

    EPA Science Inventory

    We used quantitative microbial risk assessment (QMRA) to estimate the risk of gastrointestinal (GI) illness associated with swimming in recreational waters containing different concentrations of human-associated fecal qPCR markers from raw sewage: HF183 and HumM2. The volume/volu...

  16. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194

  17. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  18. Revised Risk Priority Number in Failure Mode and Effects Analysis Model from the Perspective of Healthcare System

    PubMed Central

    Rezaei, Fatemeh; Yarmohammadian, Mohmmad H.; Haghshenas, Abbas; Fallah, Ali; Ferdosi, Masoud

    2018-01-01

    Background: The methodology of Failure Mode and Effects Analysis (FMEA) is known as an important risk assessment tool and an accreditation requirement of many organizations. For prioritizing failures, the "risk priority number (RPN)" index is used, valued especially for its ease of use, based on subjective evaluations of the occurrence, severity, and detectability of each failure. In this study, we have tried to make the FMEA model more compatible with health-care systems by redefining the RPN index to be closer to reality. Methods: We used a combined quantitative and qualitative approach in this research. In the qualitative domain, focus group discussions were used to collect data. A quantitative approach was used to calculate the RPN score. Results: We studied the patient's journey in the surgery ward from the holding area to the operating room. The highest-priority failures were determined based on (1) defining inclusion criteria in terms of severity of the incident (clinical effect, claim consequence, waste of time, and financial loss), occurrence of the incident (time-unit occurrence and degree of exposure to risk), and preventability (degree of preventability and defensive barriers); and (2) quantifying the risk priority criteria using the RPN index (361 for the highest-rated failure). Reassessment of the improved RPN scores by root cause analysis showed some variation. Conclusions: We concluded that standard criteria should be developed consistent with clinical terminology and specialty scientific fields. Therefore, cooperation and partnership between technical and clinical groups are necessary to modify these models. PMID:29441184
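    The conventional RPN index that the article sets out to revise is simply the product of three ordinal ratings. A minimal sketch with a hypothetical failure mode (the article's revised index adds further criteria not shown here):

```python
def rpn(severity, occurrence, detectability):
    """Classic risk priority number: the product of three 1-10 ratings."""
    for rating in (severity, occurrence, detectability):
        if not 1 <= rating <= 10:
            raise ValueError("each rating must lie in 1-10")
    return severity * occurrence * detectability

# Hypothetical failure mode in the holding-area-to-OR pathway
print(rpn(9, 5, 8))  # -> 360
```

    Failure modes are then ranked by RPN, with the highest products addressed first.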

  19. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    PubMed

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of the joint association of a genetic marker with both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are available only for the cases owing to data sharing and outcome-dependent sampling. Data sharing has become common in genetic association studies, and outcome-dependent sampling is a consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and the F-test are often used to analyze the binary and quantitative traits, respectively. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test that also incorporates the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed via Fisher's combination of their P-values as a joint analysis. Because the two analyses are correlated, we propose using a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution of the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single-trait association (either binary or quantitative). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. An application to a real dataset on rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
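    Fisher's combination, the starting point of the proposed joint analysis, can be written down directly. For two independent tests, T = -2(ln p1 + ln p2) follows a chi-squared distribution with 4 degrees of freedom, whose survival function has the closed form exp(-t/2)(1 + t/2). The sketch below implements only this independent-case combination; the article's Gamma (scaled chi-squared) fit, which absorbs the correlation between the trend test and the F-test, is not reproduced.

```python
import math

def fisher_combined_p(p1, p2):
    """Combined p-value for two independent tests via Fisher's method.

    T = -2(ln p1 + ln p2) is chi-squared with 4 df; its survival
    function is exp(-t/2) * (1 + t/2) in closed form.
    """
    t = -2.0 * (math.log(p1) + math.log(p2))
    return math.exp(-t / 2.0) * (1.0 + t / 2.0)

print(round(fisher_combined_p(0.05, 0.05), 4))  # -> 0.0175
```

    When the component tests are correlated, the same statistic T is referred to a moment-matched Gamma null instead of the chi-squared, which is the adjustment the article proposes.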

  20. Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents

    ERIC Educational Resources Information Center

    Cochran, Beverly; Lunday, Deborah; Miskevich, Frank

    2008-01-01

    Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…

  1. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its functions, and disruption of one infrastructure system may have severe consequences. Norwegian municipalities have a statutory duty to carry out risk and vulnerability analyses and to plan and prepare for emergencies in both a short- and a long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this work. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, a semi-quantitative analysis whose purpose is to screen the natural-hazard scenarios threatening the infrastructures identified in the level 1 analysis and to investigate the need for further, level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and several aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on it. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, its importance, and the interdependencies between society and infrastructure that create the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate combines the semi-quantitative vulnerability indicators with quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, where risk to primary road, water supply and power network threatened by storm
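
    The level 2 aggregation can be sketched as follows; the indicator names, values, and the multiplicative aggregation rule are illustrative assumptions, not the paper's exact model:

```python
# Level-2 semi-quantitative screening sketch: vulnerability indicators
# ranked 1-5 are aggregated with the hazard frequency and the number of
# infrastructure users. All names, values and the aggregation rule are
# illustrative assumptions.
def level2_risk(indicators, annual_frequency, users):
    if not all(1 <= v <= 5 for v in indicators.values()):
        raise ValueError("indicators are ranked on a 1-5 scale")
    vulnerability = sum(indicators.values()) / len(indicators)  # mean rank
    return annual_frequency * vulnerability * users             # relative score

road = level2_risk(
    {"robustness": 4, "importance": 5, "interdependency": 3},
    annual_frequency=0.1,   # one damaging storm per decade (assumed)
    users=20_000,
)
print(road)   # used only to rank scenarios for possible level-3 analysis
```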

  2. Towards a better reliability of risk assessment: development of a qualitative & quantitative risk evaluation model (Q2REM) for different trades of construction works in Hong Kong.

    PubMed

    Fung, Ivan W H; Lo, Tommy Y; Tung, Karen C F

    2012-09-01

    Because safety professionals are the key decision makers dealing with project safety and risk assessment in the construction industry, their perceptions of safety risk directly affect the reliability of risk assessment. Safety professionals generally rely heavily on their own past experience to make subjective risk-assessment decisions without systematic decision making, so understanding the underlying principles of risk assessment is important. In Stage 1 of this study, a qualitative analysis explores the safety professionals' beliefs about risk assessment and their perceptions of it, including their recognition of possible accident causes, the degree to which they differentiate the risk levels of different trades of work, their recognition of the occurrence of different types of accidents, and the inter-relationships of these factors with safety performance in terms of accident rates. In the second stage, the deficiencies of current general risk-assessment practice are first identified. Based on the Stage 1 findings and three-year average historical accident data from 15 large-scale construction projects, a risk evaluation model is then developed quantitatively, prioritizing the risk levels of the different trades of work that lead to different types of site accident through various accident causes. With the suggested systematic accident-recording techniques, this model can be implemented in the construction industry at both the project and the organizational level. The model (Q(2)REM) not only acts as a useful supplementary guideline of risk assessment for construction safety professionals, but also helps them pinpoint, through safety training and education, the potential on-site risks facing construction workers in the respective trades, in turn raising the workers' awareness of safety risk. As the Q(2)REM can clearly show the potential accident causes leading to

  3. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    PubMed

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  4. A meta-analysis including dose-response relationship between night shift work and the risk of colorectal cancer.

    PubMed

    Wang, Xiao; Ji, Alin; Zhu, Yi; Liang, Zhen; Wu, Jian; Li, Shiqi; Meng, Shuai; Zheng, Xiangyi; Xie, Liping

    2015-09-22

    A meta-analysis was conducted to quantitatively evaluate the correlation between night shift work and the risk of colorectal cancer. We searched for publications up to March 2015 using PubMed, Web of Science, Cochrane Library, EMBASE and the Chinese National Knowledge Infrastructure databases, and the references of the retrieved articles and relevant reviews were also checked. ORs and 95% CIs were used to assess the degree of correlation between night shift work and risk of colorectal cancer via fixed- or random-effects models. A dose-response meta-analysis was performed as well. The pooled OR estimates of the included studies illustrated that night shift work was correlated with an increased risk of colorectal cancer (OR = 1.318, 95% CI 1.121-1.551). No evidence of publication bias was detected. In the dose-response analysis, the rate of colorectal cancer increased by 11% for every 5-year increase in night shift work (OR = 1.11, 95% CI 1.03-1.20). In conclusion, this meta-analysis indicated that night shift work was associated with an increased risk of colorectal cancer. Further research should be conducted to confirm our findings and clarify the potential biological mechanisms.
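
    The pooling step behind such an OR estimate can be sketched with fixed-effect inverse-variance weighting of log odds ratios; the three study values below are hypothetical, not the meta-analysis data:

```python
import math

def pooled_or(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of odds ratios. Each study is
    (OR, 95% CI lower, 95% CI upper); the SE of log(OR) is recovered from
    the CI width. Hypothetical study values, not the meta-analysis data."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1.0 / se ** 2                  # inverse-variance weight
        num += w * math.log(or_)
        den += w
    log_or, se_pool = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_or),
            math.exp(log_or - z * se_pool),
            math.exp(log_or + z * se_pool))

or_, lo, hi = pooled_or([(1.40, 1.05, 1.87), (1.15, 0.90, 1.47), (1.32, 1.01, 1.73)])
print(round(or_, 2), round(lo, 2), round(hi, 2))
```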

  5. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper describes project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessing the potential impact of project risks and their probabilities. The paper reviews the most popular methods of risk probability assessment and seeks to show the advantages of the robust approach over traditional methods. Within the project risk management model, the robust approach of P. Huber is applied and extended to the regression analysis of project data. The suggested algorithms for estimating the parameters of the statistical models yield reliable estimates. The theoretical problems of developing robust models built on the methodology of minimax estimates are reviewed, and an algorithm for the situation of asymmetric "contamination" is developed.
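
    The robust-regression idea can be sketched with iteratively reweighted least squares using Huber weights; this is a generic M-estimation sketch in the spirit of the approach, not the paper's exact algorithm, and the data are synthetic:

```python
import numpy as np

def huber_line_fit(x, y, k=1.345, iters=50):
    """Fit y = a + b*x by iteratively reweighted least squares with Huber
    weights w = min(1, k / |r/s|), s being a robust scale (MAD). A sketch
    of M-estimation, not the paper's exact algorithm."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS starting point
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale estimate
        u = np.abs(r / max(s, 1e-12))
        w = np.minimum(1.0, k / np.maximum(u, 1e-12))     # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta                                           # (intercept, slope)

rng = np.random.default_rng(0)
x = np.arange(20.0)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.1, 20)
y[3] += 25.0                                              # one gross outlier
a, b = huber_line_fit(x, y)
print(round(a, 2), round(b, 2))
```

    The bounded weights cap the outlier's influence, so the fitted slope stays near the true 0.5 where ordinary least squares would be pulled off badly.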

  6. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    PubMed

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.

  7. A decision analysis approach for risk management of near-earth objects

    NASA Astrophysics Data System (ADS)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  8. A clustering approach to segmenting users of internet-based risk calculators.

    PubMed

    Harle, C A; Downs, J S; Padman, R

    2011-01-01

    Risk calculators are widely available Internet applications that deliver quantitative health risk estimates to consumers. Although these tools are known to have varying effects on risk perceptions, little is known about who is more likely to accept objective risk estimates. To identify clusters of online health consumers that help explain variation in individual improvement in risk perceptions from web-based quantitative disease risk information. A secondary analysis was performed on data collected in a field experiment that measured people's pre-diabetes risk perceptions before and after visiting a realistic health promotion website that provided quantitative risk information. K-means clustering was performed on numerous candidate variable sets, and the different segmentations were evaluated based on between-cluster variation in risk perception improvement. Variation in responses to risk information was best explained by clustering on pre-intervention absolute pre-diabetes risk perceptions and an objective estimate of personal risk. Members of a high-risk overestimator cluster showed large improvements in their risk perceptions, but clusters of both moderate-risk and high-risk underestimators were far more muted in improving their optimistically biased perceptions. Cluster analysis provided a unique approach for segmenting health consumers and predicting their acceptance of quantitative disease risk information. These clusters suggest that health consumers were very responsive to good news, but tended not to incorporate bad news into their self-perceptions. These findings help to quantify variation among online health consumers and may inform the targeted marketing of and improvements to risk communication tools on the Internet.
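
    The segmentation step can be sketched with plain k-means on (perceived risk, objective risk) pairs; the consumers below are synthetic, since the study clustered real pre-intervention perceptions:

```python
import numpy as np

def kmeans(points, k=2, iters=100):
    """Plain Lloyd's k-means. Initial centers are k points spread evenly
    through the array; a sketch, not the study's exact configuration."""
    idx = np.linspace(0, len(points) - 1, k).astype(int)
    centers = points[idx].astype(float)
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([points[labels == j].mean(axis=0)
                        if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Synthetic consumers: columns are (perceived risk %, objective risk %)
over = np.random.default_rng(1).normal([70.0, 30.0], 3.0, (30, 2))   # overestimators
under = np.random.default_rng(2).normal([20.0, 60.0], 3.0, (30, 2))  # underestimators
pts = np.vstack([over, under])
labels, centers = kmeans(pts, k=2)
print(sorted(centers[:, 0].round(1)))   # perceived-risk cluster centers
```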

  9. Quantitative Criteria to Screen for Cannabis Use Disorder.

    PubMed

    Casajuana, Cristina; López-Pelayo, Hugo; Miquel, Laia; Balcells-Oliveró, María Mercedes; Colom, Joan; Gual, Antoni

    2018-06-27

    The Standard Joint Unit (1 SJU = 7 mg of Δ9-tetrahydrocannabinol) simplifies the exploration of risky patterns of cannabis use. This study proposes a preliminary quantitative cutoff criterion to screen for cannabis use disorder (CUD). Sociodemographic data and information on cannabis quantities, frequency of use, and risk for CUD (measured with the Cannabis Abuse Screening Test, CAST) were collected from cannabis users recruited in Barcelona (from February 2015 to June 2016). CAST scores were categorized into low, moderate, and high risk for CUD, based on the SJU consumed and the frequency of use. Receiver operating characteristic (ROC) analysis related daily SJU to CUD. Participants (n = 473) were on average 29 years old (SD = 10), men (77.1%), and single (74.6%). With an average of 4 joints per smoking day, 82.5% consumed cannabis almost every day. Risk for CUD (9.40% low, 23.72% moderate, 66.88% high) increased significantly with higher frequency and quantities consumed. The ROC analyses suggest 1.2 SJU per day as a cutoff criterion to screen for at least moderate risk for CUD (sensitivity 69.4%, specificity 63.6%). Frequency and quantity should both be considered when exploring cannabis risks. One SJU per day is proposed as a preliminary quantitative criterion to screen for users with at least a moderate risk for CUD. © 2018 S. Karger AG, Basel.
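
    Choosing such a cutoff from a ROC analysis is commonly done by maximizing Youden's J; a sketch on synthetic SJU values and CUD labels (the study derived its 1.2 SJU/day cutoff from real CAST data):

```python
def youden_cutoff(values, labels):
    """Choose the screening cutoff maximizing Youden's J = sens + spec - 1,
    scoring v >= cutoff as positive. Synthetic data, not the study's."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 0)
        fn = sum(1 for v, y in zip(values, labels) if v < cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

sju = [0.2, 0.5, 0.8, 0.9, 1.0, 1.3, 1.5, 2.0, 2.5, 3.0]  # daily SJU
cud = [0,   0,   0,   0,   0,   1,   1,   1,   1,   1  ]  # 1 = moderate+ CUD risk
print(youden_cutoff(sju, cud))
```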

  10. Advancing effects analysis for integrated, large-scale wildfire risk assessment

    Treesearch

    Matthew P. Thompson; David E. Calkin; Julie W. Gilbertson-Day; Alan A. Ager

    2011-01-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both...

  11. Using a quantitative risk register to promote learning from a patient safety reporting system.

    PubMed

    Mansfield, James G; Caplan, Robert A; Campos, John S; Dreis, David F; Furman, Cathie

    2015-02-01

    Patient safety reporting systems are now used in most health care delivery organizations. These systems, such as the one in use at Virginia Mason (Seattle) since 2002, can provide valuable reports of risk and harm from the front lines of patient care. In response to the challenge of how to quantify and prioritize safety opportunities, a risk register system was developed and implemented. Basic risk register concepts were refined to provide a systematic way to understand risks reported by staff. The risk register uses a comprehensive taxonomy of patient risk and algorithmically assigns each patient safety report to 1 of 27 risk categories in three major domains (Evaluation, Treatment, and Critical Interactions). For each category, a composite score was calculated on the basis of event rate, harm, and cost. The composite scores were used to identify the "top five" risk categories, and patient safety reports in these categories were analyzed in greater depth to find recurrent patterns of risk and associated opportunities for improvement. The top five categories of risk were easy to identify and had distinctive "profiles" of rate, harm, and cost. The ability to categorize and rank risks across multiple dimensions yielded insights not previously available. These results were shared with leadership and served as input for planning quality and safety initiatives. This approach provided actionable input for the strategic planning process, while at the same time strengthening the Virginia Mason culture of safety. The quantitative patient safety risk register serves as one solution to the challenge of extracting valuable safety lessons from large numbers of incident reports and could profitably be adopted by other organizations.
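
    The composite scoring and "top five" ranking can be sketched as follows; the categories, scales, and the multiplicative aggregation are illustrative assumptions (the actual register used 27 categories in three domains):

```python
# Composite risk-register scoring sketch: each category gets a score from
# its event rate, harm, and cost, and categories are ranked by that score.
# Category names, values and the aggregation rule are illustrative only.
categories = {
    "medication dosing":      {"rate": 40, "harm": 3, "cost": 2},
    "patient identification": {"rate": 15, "harm": 4, "cost": 3},
    "handoff communication":  {"rate": 60, "harm": 2, "cost": 1},
    "diagnostic delay":       {"rate": 10, "harm": 5, "cost": 5},
}

def composite(c):
    return c["rate"] * c["harm"] * c["cost"]   # reports/yr x harm rank x cost rank

ranked = sorted(categories, key=lambda name: composite(categories[name]), reverse=True)
for name in ranked:
    print(name, composite(categories[name]))
```

    Ranking on the composite rather than on raw report counts is what lets a low-frequency, high-harm category outrank a noisy high-frequency one.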

  12. Coffee drinking and pancreatic cancer risk: a meta-analysis of cohort studies.

    PubMed

    Dong, Jie; Zou, Jian; Yu, Xiao-Feng

    2011-03-07

    To quantitatively assess the relationship between coffee consumption and incidence of pancreatic cancer in a meta-analysis of cohort studies. We searched MEDLINE, EMBASE, Science Citation Index Expanded and the bibliographies of retrieved articles. Studies were included if they reported relative risks (RRs) and corresponding 95% CIs of pancreatic cancer with respect to frequency of coffee intake. We performed random-effects meta-analyses and meta-regressions of study-specific incremental estimates to determine the risk of pancreatic cancer associated with a 1 cup/d increment in coffee consumption. Fourteen studies met the inclusion criteria, comprising 671,080 individuals (1496 cancer events) with an average follow-up of 14.9 years. Compared with individuals who did not drink or seldom drank coffee, the pooled RR of pancreatic cancer was 0.82 (95% CI: 0.69-0.95) for regular coffee drinkers, 0.86 (0.76-0.96) for low to moderate coffee drinkers, and 0.68 (0.51-0.84) for high drinkers. In subgroup analyses, we noted that coffee drinking was associated with a reduced risk of pancreatic cancer in men, while this association was not seen in women. These associations were also similar in studies from North America, Europe, and the Asia-Pacific region. Findings from this meta-analysis suggest an inverse relationship between coffee drinking and risk of pancreatic cancer.

  13. Quantitative risk assessment of the introduction of rabies into Japan through the importation of dogs and cats worldwide.

    PubMed

    Kwan, N C L; Sugiura, K; Hosoi, Y; Yamada, A; Snary, E L

    2017-04-01

    Japan has been free from rabies since 1958. A strict import regimen has been in place since 2004, consisting of identification of the animal with a microchip, two rabies vaccinations, a neutralizing antibody titration test and a waiting period of 180 days. The present study aims to quantitatively assess the risk of rabies introduction into Japan through the international importation of dogs and cats, and hence to provide evidence-based recommendations to strengthen the current rabies prevention system. A stochastic scenario tree model was developed and simulations were run using @RISK. The probability of infection in a single dog or cat imported into Japan is estimated to be 2.16 × 10^-9 (90% prediction interval (PI) 6.65 × 10^-11 to 6.48 × 10^-9). The number of years until the introduction of a rabies case is estimated to be 49,444 (90% PI 19,170-94,641) years. The current import regimen is effective in maintaining the very low risk of rabies introduction into Japan and in responding to future changes, including increases in import volume and in rabies prevalence worldwide. However, non-compliance or smuggling could substantially increase the risk of rabies introduction; policy amendments that promote compliance are therefore highly recommended. Scenario analysis demonstrated that the waiting period could be reduced to 90 days and the vaccination requirement to a single vaccination, but serological testing should not be stopped.
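
    The link between a per-animal probability and an expected time to first introduction is simple arithmetic; the inputs below are assumed round numbers, not the study's estimates (which came from the stochastic scenario tree over vaccination, titre testing, and the waiting period):

```python
# Back-of-envelope waiting-time calculation. Both inputs are assumed
# round numbers for illustration, not the study's fitted values.
p_infected = 2.0e-9          # P(an imported animal is infected and undetected)
imports_per_year = 10_000    # annual dog/cat imports (assumed)

p_year = 1 - (1 - p_infected) ** imports_per_year   # P(>= 1 case in a year)
expected_years = 1 / p_year                         # mean geometric waiting time
print(expected_years)
```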

  14. Risk manager formula for success: Influencing decision making.

    PubMed

    Midgley, Mike

    2017-10-01

    Providing the ultimate decision makers with a quantitative risk analysis based on thoughtful assessment by the organization's experts enables an efficient decision. © 2017 American Society for Healthcare Risk Management of the American Hospital Association.

  15. Poultry and fish intake and risk of esophageal cancer: A meta-analysis of observational studies.

    PubMed

    Jiang, Gengxi; Li, Bailing; Liao, Xiaohong; Zhong, Chongjun

    2016-03-01

    Mixed results regarding the association between white meat (including poultry and fish) intake and the risk of esophageal cancer (EC) have been reported. We performed a meta-analysis to provide a quantitative assessment of this association. Relevant studies were identified in MEDLINE until December 31, 2012. Summary relative risks (SRRs) with 95% confidence intervals (CIs) were pooled with a random-effects model. A total of 20 articles, including 3990 cases with EC, were included in this meta-analysis. Compared to individuals with the lowest level of fish intake, individuals with the highest fish intake were found to have reduced risk of EC (SRRs = 0.69; 95% CIs: 0.57-0.85), while poultry intake was not associated with EC (SRRs = 0.83; 95% CIs: 0.62-1.12). Total fish consumption is associated with reduced esophageal squamous cell carcinoma (ESCC) risk, while poultry consumption was not associated with ESCC risk. Additionally, neither poultry nor fish consumption was associated with esophageal adenocarcinoma risk. Our results suggest that fish consumption may have a potential role in EC prevention, while poultry intake has no effect. However, because the majority of data was from case-control studies, further well-designed prospective studies are warranted. © 2013 Wiley Publishing Asia Pty Ltd.

  16. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheter sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. To ensure patient safety and comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these four antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare the results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115 cm(-1) and 208-224 nm), vancomycin (1222-1240 cm(-1) and 276-280 nm), and teicoplanin (1226-1230 cm(-1) and 278-282 nm). Gentamicin was quantified with FTIR only (1045-1169 cm(-1) and 2715-2850 cm(-1)) because of UV-domain interference from parabens, the preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R(2) = 0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method was tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for quantifying antibiotics in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantitation of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  17. Quantitative genetic tools for insecticide resistance risk assessment: estimating the heritability of resistance

    Treesearch

    Michael J. Firko; Jane Leslie Hayes

    1990-01-01

    Quantitative genetic studies of resistance can provide estimates of genetic parameters not available with other types of genetic analyses. Three methods are discussed for estimating the amount of additive genetic variation in resistance to individual insecticides and subsequent estimation of heritability (h2) of resistance. Sibling analysis and...

  18. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    NASA Astrophysics Data System (ADS)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information among the various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to randomness in fiber orientation, while regions with variations in the maximum spatial frequency but relatively constant orientation suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple yet powerful tool for extracting information from images that is not obvious in real space. The technique can be used as a quantitative biomarker to assess the structure of collagen fibers, which may change due to damage from disease or physical injury.

  19. Quantitative analysis of [99mTc]C2A-GST distribution in the area at risk after myocardial ischemia and reperfusion using a compartmental model.

    PubMed

    Audi, Said; Poellmann, Michael; Zhu, Xiaoguang; Li, Zhixin; Zhao, Ming

    2007-11-01

    It was recently demonstrated that the radiolabeled C2A domain of synaptotagmin I accumulates avidly in the area at risk after ischemia and reperfusion. The objective was to quantitatively characterize the dynamic uptake of radiolabeled C2A in normal and ischemically injured myocardium using a compartmental model. To induce acute myocardial infarction, the left descending coronary artery was ligated for 18 min, followed by reperfusion. [99mTc]C2A-GST or its inactivated form, [99mTc]C2A-GST-NHS, was injected intravenously 2 h after reperfusion. Groups of four rats were sacrificed at 10, 30, 60 and 180 min after injection. Uptake of [99mTc]C2A-GST and [99mTc]C2A-GST-NHS in the area at risk and in the normal myocardium was determined by gamma counting. A compartmental model was developed to quantitatively interpret the myocardial uptake kinetics. The model consists of two physical spaces (vascular and tissue), with plasma activity as input. It allows [99mTc]C2A-GST and [99mTc]C2A-GST-NHS to diffuse between the vascular and tissue spaces, and [99mTc]C2A-GST to be sequestered in both spaces via specific binding. [99mTc]C2A-GST uptake in the area at risk was significantly higher than that of [99mTc]C2A-GST-NHS at all time points. The compartmental model separated [99mTc]C2A-GST uptake in the area at risk due to passive retention from that due to specific binding. The maximum amount of [99mTc]C2A-GST that could be sequestered in the area at risk via specific binding was estimated at 0.048 nmol/g tissue, and the rate of sequestration within the tissue space of the area at risk was 0.012 ml/min. Modeling also revealed that the diffusion rate of the radiotracer between the vascular and tissue spaces is the limiting factor for [99mTc]C2A-GST sequestration within the tissue space of the area at risk. [99mTc]C2A-GST is sequestered in the ischemically injured myocardium in a well-defined dynamic profile. Model
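
    A minimal version of such a two-space model can be sketched with forward-Euler integration; the rate constants and plasma input below are illustrative assumptions, and only the saturation level (0.048 nmol/g) echoes the abstract:

```python
import math

# Minimal two-space compartmental sketch: tracer enters a vascular space from
# plasma, diffuses into a tissue space, and is sequestered there by saturable
# specific binding. Rate constants and the plasma input are assumed values.
def simulate(t_end=180.0, dt=0.01, k_in=0.1, k_diff=0.012, k_bind=0.05, b_max=0.048):
    cv = ct = bound = 0.0                       # vascular, tissue, bound (nmol/g)
    t = 0.0
    while t < t_end:
        cp = math.exp(-t / 60.0)                # decaying plasma input (assumed)
        d_cv = k_in * (cp - cv) - k_diff * (cv - ct)
        d_ct = k_diff * (cv - ct) - k_bind * ct * (1 - bound / b_max)
        d_bound = k_bind * ct * (1 - bound / b_max)
        cv, ct, bound = cv + d_cv * dt, ct + d_ct * dt, bound + d_bound * dt
        t += dt
    return bound

print(simulate())   # bound tracer approaches, but never exceeds, b_max
```

    The saturating factor (1 - bound/b_max) is what lets a fit separate passive retention from specific binding with a finite capacity.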

  20. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  1. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis showed that the background spectrum and the characteristic spectrum follow approximately the same trend as the temperature changes, so measuring the signal-to-background ratio (S/B) and performing regression analysis on it can compensate for spectral line intensity changes caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method improved the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error on the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectrum variations, etc., and provides a data-processing reference for real-time online quantitative LIBS analysis.
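    The regression step described above can be sketched with scikit-learn's SVR. The concentrations and signal-to-background ratios below are synthetic stand-ins for the study's measured heavy-metal emission lines, so the numbers are illustrative only.

    ```python
    # Hypothetical sketch: SVM regression on LIBS signal-to-background ratios.
    # The calibration data are synthetic, not from the study.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic calibration set: S/B ratio of a heavy-metal emission line
    # versus known concentration (mg/L), with a mildly nonlinear response.
    conc = rng.uniform(10, 100, 60)
    sb_ratio = 0.05 * conc**0.9 + rng.normal(0, 0.05, conc.size)

    X_train, X_test, y_train, y_test = train_test_split(
        sb_ratio.reshape(-1, 1), conc, test_size=0.25, random_state=0)

    model = SVR(kernel="rbf", C=100.0, epsilon=0.5).fit(X_train, y_train)
    pred = model.predict(X_test)

    # Average relative error, the accuracy metric reported in the abstract.
    are = np.mean(np.abs(pred - y_test) / y_test)
    print(f"average relative error: {are:.1%}")
    ```

    The nonlinear RBF kernel plays the role the abstract assigns to SVM: fitting a limited, nonlinear calibration set without assuming a functional form.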

  2. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modelling techniques such as Bayesian Networks because of their capability of (1) combining, within a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we compared Bayesian Networks with other classical quantitative risk assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of quantitative risk assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be very useful in informing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretation and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.

  3. A model of pathways to artificial superintelligence catastrophe for risk and decision analysis

    NASA Astrophysics Data System (ADS)

    Barrett, Anthony M.; Baum, Seth D.

    2017-03-01

    An artificial superintelligence (ASI) is an artificial intelligence that is significantly more intelligent than humans in all respects. Whilst ASI does not currently exist, some scholars propose that it could be created sometime in the future, and furthermore that its creation could cause a severe global catastrophe, possibly even resulting in human extinction. Given the high stakes, it is important to analyze ASI risk and factor the risk into decisions related to ASI research and development. This paper presents a graphical model of major pathways to ASI catastrophe, focusing on ASI created via recursive self-improvement. The model uses the established risk and decision analysis modelling paradigms of fault trees and influence diagrams in order to depict combinations of events and conditions that could lead to AI catastrophe, as well as intervention options that could decrease risks. The events and conditions include select aspects of the ASI itself as well as the human process of ASI research, development and management. Model structure is derived from published literature on ASI risk. The model offers a foundation for rigorous quantitative evaluation and decision-making on the long-term risk of ASI catastrophe.

  4. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis was performed for dimers of two gold nanospheres, selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light, and of the particle gap size, on the Raman signal intensity were observed. Additionally, calculations were performed to simulate the electric near-field enhancement, based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated from near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  5. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
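    As a concrete illustration of interval-valued likelihoods propagating through a tree, the AND/OR gate arithmetic can be sketched as below. This is a simplification of the article's fuzzy/evidence-theoretic formulations: it assumes independent events (plain interval bounds, no dependency coefficient), which is exactly the kind of alpha-cut computation fuzzy probabilities reduce to.

    ```python
    # Illustrative sketch (not the authors' exact formulation): propagating
    # interval probabilities [a, b] through fault-tree gates, assuming
    # independence of the input events.

    def and_gate(*intervals):
        """Output occurs only if all inputs occur: product of the bounds."""
        lo = hi = 1.0
        for a, b in intervals:
            lo *= a
            hi *= b
        return (lo, hi)

    def or_gate(*intervals):
        """Output occurs if any input occurs: 1 - product of complements."""
        prod_lo = prod_hi = 1.0
        for a, b in intervals:
            prod_lo *= (1.0 - a)   # complements of the lower bounds
            prod_hi *= (1.0 - b)   # complements of the upper bounds
        return (1.0 - prod_lo, 1.0 - prod_hi)

    # Top event: system fails if (e1 AND e2) occurs, or e3 occurs.
    top = or_gate(and_gate((0.1, 0.2), (0.3, 0.4)), (0.01, 0.05))
    print(top)  # interval bounds on the top-event probability
    ```

    The output interval widens as the input intervals widen, which is how partial ignorance about basic-event likelihoods shows up in the top-event estimate.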

  6. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has a long-run equilibrium relationship with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  7. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    Several techniques are used to analyze microplastics, often based on a combination of visual and spectroscopic methods. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a marked effect on the qualitative and quantitative performance of the DSC signals: both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is particularly required for this analytical approach.

  8. Quantitative Trait Locus Analysis of SIX1-SIX6 with Retinal Nerve Fiber Layer Thickness in Individuals of European Descent

    PubMed Central

    Kuo, Jane Z.; Zangwill, Linda M.; Medeiros, Felipe A.; Liebmann, Jeffery M.; Girkin, Christopher A.; Hammel, Na’ama; Rotter, Jerome I.; Weinreb, Robert N.

    2015-01-01

    Purpose To perform a quantitative trait locus (QTL) analysis and evaluate whether a locus between SIX1 and SIX6 is associated with retinal nerve fiber layer (RNFL) thickness in individuals of European descent. Design Observational, multi-center, cross-sectional study. Methods 231 participants were recruited from the Diagnostic Innovations in Glaucoma Study and the African Descent and Glaucoma Evaluation Study. Association of rs10483727 in SIX1-SIX6 with global and sectoral RNFL thickness was performed. Quantitative trait analysis with the additive model of inheritance was analyzed using linear regression. Trend analysis was performed to evaluate the mean global and sectoral RNFL thickness with 3 genotypes of interest (T/T, C/T, C/C). All models were adjusted for age and gender. Results Direction of association between T allele and RNFL thickness was consistent in the global and different sectoral RNFL regions. Each copy of the T risk allele in rs10483727 was associated with −0.16 μm thinner global RNFL thickness (β=−0.16, 95% CI: −0.28 to −0.03; P=0.01). Similar patterns were found for the sectoral regions, including inferior (P=0.03), inferior-nasal (P=0.017), superior-nasal (P=0.0025), superior (P=0.002) and superior-temporal (P=0.008). The greatest differences were observed in the superior and inferior quadrants, supporting clinical observations for RNFL thinning in glaucoma. Thinner global RNFL was found in subjects with T/T genotypes compared to subjects with C/T and C/C genotypes (P=0.044). Conclusions Each copy of the T risk allele has an additive effect and was associated with thinner global and sectoral RNFL. Findings from this QTL analysis further support a genetic contribution to glaucoma pathophysiology. PMID:25849520

  9. Association between physical activity and risk of nonalcoholic fatty liver disease: a meta-analysis.

    PubMed

    Qiu, Shanhu; Cai, Xue; Sun, Zilin; Li, Ling; Zügel, Martina; Steinacker, Jürgen Michael; Schumann, Uwe

    2017-09-01

    Increased physical activity (PA) is a key element in the management of patients with nonalcoholic fatty liver disease (NAFLD); however, its association with NAFLD risk has not been systematically assessed. This meta-analysis of observational studies aimed to quantify this association with a dose-response analysis. Electronic databases were searched to January 2017 for studies of adults reporting the risk of NAFLD in relation to PA with cohort or case-control designs. Studies that reported sex-specific data were included as separate studies. The overall risk estimates were pooled using a random-effects model, and a dose-response analysis was conducted to shape the quantitative relationship. A total of 6 cohort studies from 5 articles with 32,657 incident NAFLD cases from 142,781 participants, and 4 case-control studies from 3 articles with 382 NAFLD cases and 302 controls, were included. Compared with the lowest PA level, the highest PA level was associated with a risk reduction of NAFLD in cohort [RR (risk ratio) 0.79, 95% CI (confidence interval) 0.71-0.89] and case-control studies [OR (odds ratio) 0.43, 95% CI 0.27-0.68]. For cohort studies, both the highest and moderate PA levels were superior to the light one in lowering NAFLD risk (p for interaction = 0.006 and 0.02, respectively), and there was a log-linear dose-response association (p for nonlinearity = 0.10) between PA and NAFLD risk [RR 0.82 (95% CI 0.73-0.91) for every 500 metabolic equivalent (MET)-minutes/week increment in PA]. Increased PA may lead to a reduced risk of NAFLD in a dose-dependent manner, and the current guideline-recommended minimum PA level, which approximates 500 MET-minutes/week, is able to moderately reduce the NAFLD risk.

  10. Consumption of Yogurt and the Incident Risk of Cardiovascular Disease: A Meta-Analysis of Nine Cohort Studies.

    PubMed

    Wu, Lei; Sun, Dali

    2017-03-22

    Previous systematic reviews and meta-analyses have evaluated the association between dairy consumption and the risk of cardiovascular disease (CVD); however, the findings were inconsistent. No quantitative analysis has specifically assessed the effect of yogurt intake on the incident risk of CVD. We searched the PubMed and Embase databases from inception to 10 January 2017. A generic inverse-variance method was used to pool the fully adjusted relative risks (RRs) and the corresponding 95% confidence intervals (CIs) with a random-effects model. A generalized least squares trend estimation model was used to calculate the specific slopes in the dose-response analysis. The present systematic review and meta-analysis identified nine prospective cohort articles involving a total of 291,236 participants. Compared with the lowest category, the highest category of yogurt consumption was not significantly associated with the incident risk of CVD; the RR (95% CI) was 1.01 (0.95, 1.08), with evidence of significant heterogeneity (I² = 52%). However, intake of ≥200 g/day of yogurt was significantly associated with a lower risk of CVD in the subgroup analysis. In the dose-response analysis, there was a trend for higher levels of yogurt consumption to be associated with a lower incident risk of CVD. A daily intake of ≥200 g of yogurt might be associated with a lower incident risk of CVD. Further cohort studies and randomized controlled trials are still needed to establish and confirm the observed association in populations with different characteristics.
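    The "generic inverse-variance method … with a random-effects model" named in the abstract can be sketched with the DerSimonian-Laird estimator (one common choice; the abstract does not name the exact estimator). The per-study log-RRs and standard errors below are purely hypothetical.

    ```python
    # Generic inverse-variance random-effects pooling (DerSimonian-Laird).
    # Study inputs are synthetic, for illustration only.
    import math

    log_rr = [0.05, -0.02, 0.10, 0.00, -0.05]   # hypothetical per-study log RRs
    se     = [0.06,  0.05, 0.08, 0.04,  0.07]   # hypothetical standard errors

    w = [1 / s**2 for s in se]                  # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)               # between-study variance

    w_re = [1 / (s**2 + tau2) for s in se]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))

    rr = math.exp(pooled)
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    print(f"pooled RR {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
    ```

    Pooling happens on the log scale because RRs are ratio measures; exponentiating at the end recovers the pooled RR and its CI.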

  11. Consumption of Yogurt and the Incident Risk of Cardiovascular Disease: A Meta-Analysis of Nine Cohort Studies

    PubMed Central

    Wu, Lei; Sun, Dali

    2017-01-01

    Previous systematic reviews and meta-analyses have evaluated the association between dairy consumption and the risk of cardiovascular disease (CVD); however, the findings were inconsistent. No quantitative analysis has specifically assessed the effect of yogurt intake on the incident risk of CVD. We searched the PubMed and Embase databases from inception to 10 January 2017. A generic inverse-variance method was used to pool the fully adjusted relative risks (RRs) and the corresponding 95% confidence intervals (CIs) with a random-effects model. A generalized least squares trend estimation model was used to calculate the specific slopes in the dose-response analysis. The present systematic review and meta-analysis identified nine prospective cohort articles involving a total of 291,236 participants. Compared with the lowest category, the highest category of yogurt consumption was not significantly associated with the incident risk of CVD; the RR (95% CI) was 1.01 (0.95, 1.08), with evidence of significant heterogeneity (I2 = 52%). However, intake of ≥200 g/day of yogurt was significantly associated with a lower risk of CVD in the subgroup analysis. In the dose-response analysis, there was a trend for higher levels of yogurt consumption to be associated with a lower incident risk of CVD. A daily intake of ≥200 g of yogurt might be associated with a lower incident risk of CVD. Further cohort studies and randomized controlled trials are still needed to establish and confirm the observed association in populations with different characteristics. PMID:28327514

  12. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    PubMed

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
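    A toy version of such a two-stage model makes the structure concrete. This is not the authors' actual methodology, and every parameter below is invented; it only illustrates how perimeter breach probability and the attacker's stock of weaponized local exploits combine.

    ```python
    # Toy sketch, NOT the authors' model: probability of a two-stage
    # untargeted attack, where the attacker first breaches an
    # Internet-facing host and then exploits one of the locally present
    # vulnerabilities he/she has "weaponized".

    def p_two_stage(p_perimeter: float, n_local_vulns: int,
                    weaponized_fraction: float) -> float:
        """P(compromise) = P(breach) * P(at least one local vuln weaponized)."""
        p_no_exploit = (1.0 - weaponized_fraction) ** n_local_vulns
        return p_perimeter * (1.0 - p_no_exploit)

    # A more capable attacker (larger weaponized fraction) raises the risk.
    print(p_two_stage(0.3, 10, 0.05))
    print(p_two_stage(0.3, 10, 0.20))
    ```

    Varying `weaponized_fraction` is one simple way to express "the power of the attacker" as a tunable parameter, in the spirit of the abstract.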

  13. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means of characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. As matters stand, however, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = R̄_L · Ψ, where R̄_L is the mean profile roughness and Ψ is the profile structure factor. This method is based on classical stereological principles and has been verified with the aid of computer simulations for some ruled surfaces. The results are considered applicable to fracture surfaces of arbitrary complexity and anisotropy. To extend the detailed application of this method in quantitative fractography, the authors studied roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.

  14. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  15. Performance of Two Quantitative PCR Methods for Microbial Source Tracking of Human Sewage and Implications for Microbial Risk Assessment in Recreational Waters

    PubMed Central

    Staley, Christopher; Gordon, Katrina V.; Schoen, Mary E.

    2012-01-01

    Before new, rapid quantitative PCR (qPCR) methods for assessment of recreational water quality and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been diluted in environmental waters is needed. This study determined the limits of detection and quantification of the human-associated Bacteroides sp. (HF183) and human polyomavirus (HPyV) qPCR methods for sewage diluted in buffer and in five ambient, Florida water types (estuarine, marine, tannic, lake, and river). HF183 was quantifiable in sewage diluted up to 10⁻⁶ in 500-ml ambient-water samples, but HPyVs were not quantifiable in dilutions of >10⁻⁴. Specificity, which was assessed using fecal composites from dogs, birds, and cattle, was 100% for HPyVs and 81% for HF183. Quantitative microbial risk assessment (QMRA) estimated the possible norovirus levels in sewage and the human health risk at various sewage dilutions. When juxtaposed with the MST marker detection limits, the QMRA analysis revealed that HF183 was detectable when the modeled risk of gastrointestinal (GI) illness was at or below the benchmark of 10 illnesses per 1,000 exposures, but the HPyV method was generally not sensitive enough to detect potential health risks at the 0.01 threshold for frequency of illness. The tradeoff between sensitivity and specificity in the MST methods indicates that HF183 data should be interpreted judiciously, preferably in conjunction with a more host-specific marker, and that better methods of concentrating HPyVs from environmental waters are needed if this method is to be useful in a watershed management or monitoring context. PMID:22885746

  16. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... Evaluation and Research (CBER) and suggestions for further development. The public workshop will include... Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, suite 200N, Rockville... models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public...

  17. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    NASA Astrophysics Data System (ADS)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by the GA, and the GA model performed better than the all-variable model, with R²V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes of wet gluten content (<24%, 24-30%, and >30%) were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
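    The quantitative calibration step can be sketched with scikit-learn's PLSRegression. The spectra, wet gluten values, and the "GA-selected" wavelength subset below are all synthetic placeholders standing in for the study's 54 seed samples and 17 selected variables.

    ```python
    # Hypothetical sketch of the modelling step: partial least squares
    # regression on a subset of NIR wavelength variables, as if selected by
    # a genetic algorithm. No real wheat data are used.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_samples, n_vars = 54, 120            # 54 seed samples, 120 spectral points
    X = rng.normal(size=(n_samples, n_vars))
    true_idx = rng.choice(n_vars, size=17, replace=False)  # 17 informative bands
    y = X[:, true_idx].sum(axis=1) + rng.normal(0, 0.5, n_samples)

    # Stand-in for the GA output: keep only the informative wavelengths.
    X_sel = X[:, true_idx]

    pls = PLSRegression(n_components=5).fit(X_sel, y)
    r2 = pls.score(X_sel, y)
    print(f"calibration R^2 = {r2:.2f}")
    ```

    Dropping uninformative wavelengths before PLS is the same rationale the abstract gives for GA selection: fewer, more relevant variables yield a tighter calibration.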

  18. Quantitative Analysis of Food and Feed Samples with Droplet Digital PCR

    PubMed Central

    Morisset, Dany; Štebih, Dejan; Milavec, Mojca; Gruden, Kristina; Žel, Jana

    2013-01-01

    In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed. PMID:23658750
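    Digital PCR's quantification "without the need for calibration curves" rests on Poisson statistics over the droplet partitions. A minimal sketch follows; the droplet counts and the ~0.85 nL droplet volume are illustrative assumptions, not values from the study.

    ```python
    # From the fraction of positive droplets p, the mean number of target
    # copies per droplet is lambda = -ln(1 - p): this Poisson correction is
    # what makes digital PCR absolute, with no calibration curve.
    import math

    def ddpcr_copies_per_ul(n_positive: int, n_total: int,
                            droplet_volume_nl: float = 0.85) -> float:
        p = n_positive / n_total
        lam = -math.log(1.0 - p)                 # mean copies per droplet
        return lam / (droplet_volume_nl * 1e-3)  # copies per microlitre

    # e.g. 2500 positive droplets out of 15000 accepted droplets
    print(ddpcr_copies_per_ul(2500, 15000))
    ```

    Running the same calculation for a transgene assay and a reference-gene assay (MON810 and hmg in the study) and taking the ratio gives the relative GMO content.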

  19. Communicating quantitative risks and benefits in promotional prescription drug labeling or print advertising.

    PubMed

    West, Suzanne L; Squiers, Linda B; McCormack, Lauren; Southwell, Brian G; Brouwer, Emily S; Ashok, Mahima; Lux, Linda; Boudewyns, Vanessa; O'Donoghue, Amie; Sullivan, Helen W

    2013-05-01

    Under the Food, Drug, and Cosmetic Act, all promotional materials for prescription drugs must strike a fair balance in presentation of risks and benefits. How to best present this information is not clear. We sought to determine if the presentation of quantitative risk and benefit information in drug advertising and labeling influences consumers', patients', and clinicians' information processing, knowledge, and behavior by assessing available empirical evidence. We used PubMed for a literature search, limiting to articles published in English from 1990 forward. Two reviewers independently reviewed the titles and abstracts for inclusion, after which we reviewed the full texts to determine if they communicated risk/benefit information either: (i) numerically (e.g., percent) versus non-numerically (e.g., using text such as "increased risk") or (ii) numerically using different formats (e.g., "25% of patients", "one in four patients", or use of pictographs). We abstracted information from included articles into standardized evidence tables. The research team identified a total of 674 relevant publications, of which 52 met our inclusion criteria. Of these, 37 focused on drugs. Presenting numeric information appears to improve understanding of risks and benefits relative to non-numeric presentation; presenting both numeric and non-numeric information when possible may be best practice. No single specific format or graphical approach emerged as consistently superior. Numeracy and health literacy also deserve more empirical attention as moderators. Copyright © 2013 John Wiley & Sons, Ltd.

  20. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  1. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  2. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  3. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  4. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps, and existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning) and links back to the risk assessment module to calculate the new level of risk if the measure is implemented, and a cost-benefit (or cost-effectiveness/Spatial Multi-Criteria Evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows users to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios itself but uses input maps for the effect of the scenarios on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps, but also in other forms (risk curves, tables, graphs).
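The risk curves produced by the quantitative analysis component can be collapsed into a single comparable number, the expected annual loss, by integrating loss against annual exceedance probability. The sketch below illustrates that step and the kind of comparison the cost-benefit component makes between risk reduction alternatives; the curve points and the halved-loss mitigation alternative are hypothetical, not values from the RiskChanges/RASOR platform.

```python
def expected_annual_loss(probs, losses):
    """Integrate a risk curve (annual exceedance probability vs. loss)
    with the trapezoidal rule to obtain the expected annual loss.
    Points must be ordered by decreasing exceedance probability."""
    eal = 0.0
    for i in range(len(probs) - 1):
        eal += 0.5 * (losses[i] + losses[i + 1]) * (probs[i] - probs[i + 1])
    return eal

# Hypothetical flood-risk curve: 10/50/100/500-year return periods.
probs = [0.1, 0.02, 0.01, 0.002]
losses = [1e6, 5e6, 8e6, 2e7]
baseline = expected_annual_loss(probs, losses)

# A (made-up) mitigation alternative that halves losses at every return
# period halves the integral too; this is the figure a cost-benefit
# analysis would weigh against the cost of the measure.
mitigated = expected_annual_loss(probs, [l / 2 for l in losses])
```

Because the integral is linear in the losses, any proportional loss reduction translates directly into the same proportional reduction in expected annual loss, which is what makes the metric convenient for ranking alternatives.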

  5. Race and Older Mothers’ Differentiation: A Sequential Quantitative and Qualitative Analysis

    PubMed Central

    Sechrist, Jori; Suitor, J. Jill; Riffin, Catherine; Taylor-Watson, Kadari; Pillemer, Karl

    2011-01-01

    The goal of this paper is to demonstrate a process by which qualitative and quantitative approaches are combined to reveal patterns in the data that are unlikely to be detected and confirmed by either method alone. Specifically, we take a sequential approach to combining qualitative and quantitative data to explore race differences in how mothers differentiate among their adult children. We began with a standard multivariate analysis examining race differences in mothers’ differentiation among their adult children regarding emotional closeness and confiding. Finding no race differences in this analysis, we conducted an in-depth comparison of the Black and White mothers’ narratives to determine whether there were underlying patterns that we had been unable to detect in our first analysis. Using this method, we found that Black mothers were substantially more likely than White mothers to emphasize interpersonal relationships within the family when describing differences among their children. In our final step, we developed a measure of familism based on the qualitative data and conducted a multivariate analysis to confirm the patterns revealed by the in-depth comparison of the mothers’ narratives. We conclude that using such a sequential mixed methods approach to data analysis has the potential to shed new light on complex family relations. PMID:21967639

  6. Quantitative microbial risk assessment to estimate the health risk from exposure to noroviruses in polluted surface water in South Africa.

    PubMed

    Van Abel, Nicole; Mans, Janet; Taylor, Maureen B

    2017-10-01

    This study assessed the risks posed by noroviruses (NoVs) in surface water used for drinking, domestic, and recreational purposes in South Africa (SA), using a quantitative microbial risk assessment (QMRA) methodology that took a probabilistic approach coupling an exposure assessment with four dose-response models to account for uncertainty. Water samples from three rivers were found to be contaminated with NoV GI (80-1,900 gc/L) and GII (420-9,760 gc/L), leading to risk estimates that were lower for GI than GII. The volume of water consumed and the probabilities of infection were lower for domestic (2.91 × 10⁻⁸ to 5.19 × 10⁻¹) than drinking water exposures (1.04 × 10⁻⁵ to 7.24 × 10⁻¹). The annual probabilities of illness varied depending on the type of recreational water exposure, with boating (3.91 × 10⁻⁶ to 5.43 × 10⁻¹) and swimming (6.20 × 10⁻⁶ to 6.42 × 10⁻¹) being slightly greater than playing next to/in the river (5.30 × 10⁻⁷ to 5.48 × 10⁻¹). The QMRA was sensitive to the choice of dose-response model. The risk of NoV infection or illness from contaminated surface water is extremely high in SA, especially for lower socioeconomic individuals, but is similar to reported risks from limited international studies.
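The coupling of an exposure assessment with a dose-response model, and the annualization of per-event risks, can be sketched as follows. This is a minimal illustration, not one of the four dose-response models used in the study: it assumes a single-parameter exponential model, and the ingested volume, concentration, parameter `r`, and number of annual exposure events are all made-up values.

```python
import math

def p_infection(dose_gc: float, r: float) -> float:
    """Exponential dose-response model: P(infection) for one exposure.
    `r` is a pathogen-specific fit parameter (illustrative value below)."""
    return 1.0 - math.exp(-r * dose_gc)

def p_annual(p_event: float, events_per_year: int) -> float:
    """Annual probability from independent per-event probabilities."""
    return 1.0 - (1.0 - p_event) ** events_per_year

# Hypothetical exposure: 1 mL of water at 500 gc/L ingested while swimming.
dose = 500 * 0.001                      # genome copies ingested per event
p_single = p_infection(dose, r=0.1)     # per-event infection probability
p_year = p_annual(p_single, events_per_year=20)
```

A probabilistic QMRA such as the one in the study would replace the point values above with distributions and propagate them by Monte Carlo simulation.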

  7. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low‐throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non‐chlorinated organic solvent and is tailored for higher‐throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast‐GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day‐to‐day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  8. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  9. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  10. QuASAR: quantitative allele-specific analysis of reads.

    PubMed

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary Material is available at Bioinformatics online.
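The baseline ASE test the abstract describes, counting reads per allele at a heterozygous site and testing a 1:1 null ratio, can be sketched with an exact binomial test. Note that this deliberately omits everything that makes QuASAR distinctive (joint genotype inference, base-call error and over-dispersion parameters); the read counts are hypothetical.

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """Binomial probability mass function P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binom_test_two_sided(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k under H0."""
    pk = binom_pmf(k, n, p)
    return min(1.0, sum(binom_pmf(i, n, p) for i in range(n + 1)
                        if binom_pmf(i, n, p) <= pk * (1 + 1e-9)))

# Hypothetical heterozygous site: 30 reads support the reference allele,
# 10 the alternate. Under H0 (no ASE) the expected ratio is 1:1.
pval = binom_test_two_sided(30, 40)
```

A small p-value here suggests allelic imbalance at the site; in practice, read counts at RNA-seq sites are over-dispersed relative to the binomial, which is one reason QuASAR models that over-dispersion explicitly.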

  11. QuASAR: quantitative allele-specific analysis of reads

    PubMed Central

    Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-01-01

    Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375

  12. A Systematic Literature Review and Meta-Regression Analysis on Early-Life Energy Restriction and Cancer Risk in Humans.

    PubMed

    Elands, Rachel J J; Simons, Colinda C J M; Dongen, Martien van; Schouten, Leo J; Verhage, Bas A J; van den Brandt, Piet A; Weijenberg, Matty P

    2016-01-01

    In animal models, long-term moderate energy restriction (ER) is reported to decelerate carcinogenesis, whereas the effect of severe ER is inconsistent. The impact of early-life ER on cancer risk has never been reviewed systematically and quantitatively based on observational studies in humans. We conducted a systematic review of observational studies and a meta-(regression) analysis on cohort studies to clarify the association between early-life ER and organ site-specific cancer risk. PubMed and EMBASE (1982-August 2015) were searched for observational studies. Summary relative risks (RRs) were estimated using a random effects model when ≥3 studies were available. Twenty-four studies were included. Eleven publications, emanating from seven prospective cohort studies and some reporting on multiple cancer endpoints, met the inclusion criteria for quantitative analysis. Women exposed to early-life ER (ranging from 220-1660 kcal/day) had a higher breast cancer risk than those not exposed (RR(RE) all ages = 1.28, 95% CI: 1.05-1.56; RR(RE) for 10-20 years of age = 1.21, 95% CI: 1.09-1.34). Men exposed to early-life ER (ranging from 220-800 kcal/day) had a higher prostate cancer risk than those not exposed (RR(RE) = 1.16, 95% CI: 1.03-1.30). Summary relative risks were not computed for colorectal cancer, because of heterogeneity, and for stomach, pancreas, ovarian, and respiratory cancer because there were <3 available studies. Longer duration of exposure to ER, after adjustment for severity, was positively associated with overall cancer risk in women (p = 0.02). Ecological studies suggest that less severe ER is generally associated with a reduced risk of cancer. Early-life transient severe ER seems to be associated with increased cancer risk in the breast (particularly ER exposure at adolescent age) and prostate. The duration, rather than severity, of exposure to ER seems to positively influence relative risk estimates. This result should be interpreted with caution due to the

  13. Quantitative analysis of NMR spectra with chemometrics

    NASA Astrophysics Data System (ADS)

    Winning, H.; Larsen, F. H.; Bro, R.; Engelsen, S. B.

    2008-01-01

    The number of applications of chemometrics to series of NMR spectra is rapidly increasing due to an emerging interest in quantitative NMR spectroscopy, e.g. in the pharmaceutical and food industries. This paper analyses the advantages and limitations of applying the two most common chemometric procedures, Principal Component Analysis (PCA) and Multivariate Curve Resolution (MCR), to a designed set of 231 ¹H 400 MHz spectra of simple alcohol mixtures (propanol, butanol and pentanol). The study clearly demonstrates that the major advantage of chemometrics is the visualisation of larger data structures, which adds a new exploratory dimension to NMR research. While robustness and powerful data visualisation and exploration are the main qualities of the PCA method, the study demonstrates that the bilinear MCR method is an even more powerful method for resolving pure component NMR spectra from mixtures when certain conditions are met.

  14. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes.

  15. A meta-analysis including dose-response relationship between night shift work and the risk of colorectal cancer

    PubMed Central

    Wang, Xiao; Ji, Alin; Zhu, Yi; Liang, Zhen; Wu, Jian; Li, Shiqi; Meng, Shuai; Zheng, Xiangyi; Xie, Liping

    2015-01-01

    A meta-analysis was conducted to quantitatively evaluate the correlation between night shift work and the risk of colorectal cancer. We searched for publications up to March 2015 using PubMed, Web of Science, Cochrane Library, EMBASE and the Chinese National Knowledge Infrastructure databases, and the references of the retrieved articles and relevant reviews were also checked. OR and 95% CI were used to assess the degree of the correlation between night shift work and risk of colorectal cancer via fixed- or random-effect models. A dose-response meta-analysis was performed as well. The pooled OR estimates of the included studies illustrated that night shift work was correlated with an increased risk of colorectal cancer (OR = 1.318, 95% CI 1.121–1.551). No evidence of publication bias was detected. In the dose-response analysis, the risk of colorectal cancer increased by 11% for every 5-year increase in night shift work (OR = 1.11, 95% CI 1.03–1.20). In conclusion, this meta-analysis indicated that night shift work was associated with an increased risk of colorectal cancer. Further research should be conducted to confirm our findings and clarify the potential biological mechanisms. PMID:26208480
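The pooled OR reported above comes from combining per-study estimates; a minimal fixed-effect inverse-variance sketch is shown below, recovering each standard error from the reported 95% CI. The three studies are hypothetical, and a random-effects model (which the paper also used) would additionally estimate a between-study variance, omitted here.

```python
import math

def pooled_or_fixed(ors, cis):
    """Fixed-effect inverse-variance pooling of odds ratios.

    Each CI is a (lower, upper) 95% interval; the SE of the log-OR is
    recovered as (ln(upper) - ln(lower)) / (2 * 1.96). Returns the
    pooled OR with its own 95% CI. Study values below are hypothetical."""
    weights, weighted_logs = [], []
    for or_, (lo_i, hi_i) in zip(ors, cis):
        se = (math.log(hi_i) - math.log(lo_i)) / (2 * 1.96)
        w = 1.0 / se**2                    # weight = inverse variance
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    log_pooled = sum(weighted_logs) / sum(weights)
    se_pooled = 1.0 / math.sqrt(sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Three hypothetical studies.
or_pooled, lo, hi = pooled_or_fixed(
    [1.25, 1.40, 1.10],
    [(1.05, 1.49), (1.10, 1.78), (0.90, 1.34)])
```

The pooled interval is narrower than any single study's, which is the point of pooling: precision accumulates with the inverse-variance weights.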

  16. A concept analysis of forensic risk.

    PubMed

    Kettles, A M

    2004-08-01

    Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.

  17. IWGT report on quantitative approaches to genotoxicity risk assessment II. Use of point-of-departure (PoD) metrics in defining acceptable exposure limits and assessing human risk

    EPA Science Inventory

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the ne...

  18. Quantitative Risk Assessment of Antimicrobial-Resistant Foodborne Infections in Humans Due to Recombinant Bovine Somatotropin Usage in Dairy Cows.

    PubMed

    Singer, Randall S; Ruegg, Pamela L; Bauman, Dale E

    2017-07-01

    Recombinant bovine somatotropin (rbST) is a production-enhancing technology that allows the dairy industry to produce milk more efficiently. Concern has been raised that cows supplemented with rbST are at an increased risk of developing clinical mastitis, which would potentially increase the use of antimicrobial agents and increase human illnesses associated with antimicrobial-resistant bacterial pathogens delivered through the dairy beef supply. The purpose of this study was to conduct a quantitative risk assessment to estimate the potential increased risk of human infection with antimicrobial-resistant bacteria and subsequent adverse health outcomes as a result of rbST usage in dairy cattle. The quantitative risk assessment included the following steps: (i) release of antimicrobial-resistant organisms from the farm, (ii) exposure of humans via consumption of contaminated beef products, and (iii) consequence of the antimicrobial-resistant infection. The model focused on ceftiofur (parenteral and intramammary) and oxytetracycline (parenteral) treatment of clinical mastitis in dairy cattle and tracked the bacteria Campylobacter spp., Salmonella enterica subsp. enterica, and Escherichia coli in the gastrointestinal tract of the cow. Parameter estimates were developed to represent maximum risk, so as to overestimate the risk to humans. The excess number of cows in the U.S. dairy herd predicted to carry resistant bacteria at slaughter due to rbST administration was negligible. The total number of excess human illnesses caused by resistant bacteria due to rbST administration was also predicted to be negligible, with all risks considerably less than one event per 1 billion people at risk per year for all bacteria. The results indicate a high probability that the use of rbST according to label instructions presents a negligible risk for increasing the number of human illnesses and subsequent adverse outcomes associated with antimicrobial-resistant Campylobacter, Salmonella, or
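The three-step structure of the assessment (release, exposure, consequence) can be sketched as a product of conditional probabilities scaled by the population at risk. The stage probabilities below are illustrative placeholders chosen only so the product lands at the "one event per 1 billion people per year" order of magnitude quoted above; they are not the study's estimates.

```python
# Illustrative per-stage annual probabilities (NOT the study's values):
P_RELEASE = 1e-4    # resistant organisms released from the farm
P_EXPOSURE = 1e-3   # a person is exposed via contaminated beef
P_ILLNESS = 1e-2    # exposure results in a resistant infection/illness

def risk_per_person_year() -> float:
    """Chain the three QRA steps as conditional probabilities."""
    return P_RELEASE * P_EXPOSURE * P_ILLNESS

def expected_illnesses(population: int) -> float:
    """Expected annual excess illnesses in a population at risk."""
    return population * risk_per_person_year()
```

With these placeholder values the per-person annual risk is 10⁻⁹, i.e. roughly one expected event per year in a population of one billion; a maximum-risk assessment picks each stage's parameter at its conservative extreme so that even this product overstates the true risk.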

  19. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.

  20. Maternal vitamin D status during pregnancy and risk of childhood asthma: A meta-analysis of prospective studies.

    PubMed

    Song, Huihui; Yang, Lei; Jia, Chongqi

    2017-05-01

    Mounting evidence suggests that maternal vitamin D status during pregnancy may be associated with development of childhood asthma, but the results are still inconsistent. A dose-response meta-analysis was performed to quantitatively summarize evidence on the association of maternal vitamin D status during pregnancy with the risk of childhood asthma. A systematic search was conducted to identify all studies assessing the association of maternal 25-hydroxyvitamin D (25(OH)D) during pregnancy with risk of childhood asthma. The fixed- or random-effect model was selected based on the heterogeneity test among studies. The nonlinear dose-response relationship was assessed by a restricted cubic spline model. Fifteen prospective studies with 12 758 participants and 1795 cases were included in the meta-analysis. The pooled relative risk of childhood asthma comparing the highest versus lowest category of maternal 25(OH)D levels was 0.87 (95% confidence interval, CI, 0.75-1.02). In the dose-response analysis, evidence of a U-shaped relationship was found between maternal 25(OH)D levels and risk of childhood asthma (P for nonlinearity = 0.02), with the lowest risk at approximately 70 nmol/L of 25(OH)D. This dose-response meta-analysis suggested a U-shaped relationship between maternal blood 25(OH)D levels and risk of childhood asthma. Further studies are needed to confirm the association.

  1. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  2. Fasting insulin, insulin resistance and risk of hypertension in the general population: A meta-analysis.

    PubMed

    Wang, Feng; Han, Lili; Hu, Dayi

    2017-01-01

    Studies on the association of fasting insulin concentrations or insulin resistance with subsequent risk of hypertension have yielded conflicting results. We aimed to quantitatively assess the association of fasting insulin concentrations or homeostasis model assessment insulin resistance (HOMA-IR) with incident hypertension in a general population by performing a meta-analysis. We searched the PubMed and Embase databases until August 31, 2016 for prospective observational studies investigating the association of elevated fasting insulin concentrations or HOMA-IR with subsequent risk of hypertension in the general population. The pooled risk ratio (RR) and 95% confidence interval (CI) of hypertension were calculated for the highest versus the lowest category of fasting insulin or HOMA-IR. Eleven studies involving 10,230 hypertension cases were identified from 55,059 participants. Meta-analysis showed that the pooled adjusted RR of hypertension was 1.54 (95% CI 1.34-1.76) for fasting insulin concentrations and 1.43 (95% CI 1.27-1.62) for HOMA-IR, comparing the highest to the lowest category. Subgroup analysis showed that the association of fasting insulin concentrations with subsequent risk of hypertension seemed more pronounced in women (RR 2.07; 95% CI 1.19-3.60) than in men (RR 1.48; 95% CI 1.17-1.88). This meta-analysis suggests that elevated fasting insulin concentrations or insulin resistance, as estimated by homeostasis model assessment, is independently associated with an increased risk of hypertension in the general population. Early intervention for hyperinsulinemia or insulin resistance may help clinicians identify individuals at high risk of hypertension.
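HOMA-IR, the exposure metric pooled above, is computed from two fasting measurements by the standard formula: insulin (µU/mL) × glucose (mmol/L) / 22.5. The example values below are illustrative.

```python
def homa_ir(fasting_insulin_uU_ml: float, fasting_glucose_mmol_l: float) -> float:
    """Homeostasis model assessment of insulin resistance (HOMA-IR):
    fasting insulin (µU/mL) × fasting glucose (mmol/L) / 22.5."""
    return fasting_insulin_uU_ml * fasting_glucose_mmol_l / 22.5

# Example: fasting insulin 10 µU/mL, fasting glucose 5.0 mmol/L.
score = homa_ir(10.0, 5.0)   # ≈ 2.22
```

Note that cut-offs for "high" HOMA-IR vary between the pooled studies, which is one reason such meta-analyses compare the highest versus the lowest category rather than a fixed threshold.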

  3. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    PubMed

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to bring scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry makes it possible to compare the efficiency of different risk reduction measures, and more precisely different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; this schematically includes three important phases. First, the model structure has to be defined, i.e. the connections between different operational processing steps. An important step in the food industry is thermal processing leading to microbial inactivation. Growth of heat-treated surviving microorganisms and/or post-process contamination during the storage phase must also be taken into account. Second, mathematical equations are determined to estimate the change in microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting expert opinion. Finally, the model outputs are obtained by simulation procedures; they have to be interpreted and communicated to targeted stakeholders. In this latter phase, tools such as what-if scenarios provide an essential added value. These different MRA phases are illustrated through two examples covering important issues in industry. The first covers process optimization in a food safety context; the second covers shelf-life determination in a food quality context. Although both contexts require the same methodology, they do not have the same endpoint: up to human health in the foie gras case-study, illustrating a safety application, and up to the food portion in the

  4. Dietary patterns and risk of colorectal adenoma: a systematic review and meta-analysis of observational studies.

    PubMed

    Godos, J; Bella, F; Torrisi, A; Sciacca, S; Galvano, F; Grosso, G

    2016-12-01

    Current evidence suggests that dietary patterns may play an important role in colorectal cancer risk. The present study aimed to perform a systematic review and meta-analysis of observational studies exploring the association between dietary patterns and colorectal adenomas (a precancerous condition). Pubmed and EMBASE electronic databases were systematically searched to retrieve eligible studies. Only studies exploring the risk or association with colorectal adenomas for the highest versus lowest category of exposure to a posteriori dietary patterns were included in the quantitative analysis. Random-effects models were applied to calculate relative risks (RRs) of colorectal adenomas for high adherence to healthy or unhealthy dietary patterns. Statistical heterogeneity and publication bias were explored. Twelve studies were reviewed. Three studies explored a priori dietary patterns using scores identifying adherence to the Mediterranean, Paleolithic and Dietary Approaches to Stop Hypertension (DASH) diets and reported an association with decreased colorectal adenoma risk. Two studies tested the association between a posteriori dietary patterns and colorectal adenomas, showing lower odds of disease related to plant-based compared to meat-based dietary patterns. Seven studies identified 23 a posteriori dietary patterns, and the analysis revealed that higher adherence to healthy and unhealthy dietary patterns was significantly associated with risk of colorectal adenomas (RR = 0.81, 95% confidence interval = 0.71, 0.94 and RR = 1.24, 95% confidence interval = 1.13, 1.35, respectively) with no evidence of heterogeneity or publication bias. The results of this systematic review and meta-analysis indicate that dietary patterns may be associated with the risk of colorectal adenomas. © 2016 The British Dietetic Association Ltd.
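
    The pooled RRs above come from a random-effects model. As a minimal sketch of how such pooling works, the following implements the DerSimonian-Laird estimator; the study-level log-RRs and standard errors below are illustrative assumptions, not the actual data analyzed by Godos et al.

```python
import math

def dersimonian_laird(log_rrs, ses):
    """Pool study-level log relative risks with a DerSimonian-Laird
    random-effects model; returns (pooled RR, 95% CI)."""
    w = [1.0 / se**2 for se in ses]                        # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1.0 / (se**2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-study estimates on the log scale (not from the review).
log_rrs = [math.log(0.75), math.log(0.85), math.log(0.80)]
ses = [0.10, 0.12, 0.15]
rr, ci = dersimonian_laird(log_rrs, ses)
```

    With homogeneous inputs like these, tau-squared collapses to zero and the result matches a fixed-effect pooled estimate.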

  5. Oufti: An integrated software package for high-accuracy, high-throughput quantitative microscopy analysis

    PubMed Central

    Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine

    2016-01-01

    With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279

  6. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    PubMed

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2017-06-01

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis with Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this zero-mean noise dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effects of noise and bias error on CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error significantly affected the error, such that the performance of CLS is better than that of WLS. However, for wavenumbers with high absorbance, the noise significantly affected the error, and WLS proves to be better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., lower or higher than an absorbance threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These studies reported that: (1) the concentration and the analyte type had minimal effect on the OTV; and (2) the major factor that influences the OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene mixture gas spectra as measured using FT-IR spectrometry with CLS, WLS, and SWLS. The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors
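
    The CLS/WLS contrast can be sketched for a single-component model y = c·a + noise, where the noise standard deviation grows with absorbance (heteroscedastic, as in FT-IR spectra). The pure-component spectrum, noise model, and names below are illustrative assumptions, not the paper's data:

```python
import math
import random

random.seed(42)
c_true = 2.0
# Synthetic pure-component absorbance spectrum at 50 wavenumbers.
a = [0.1 + 0.9 * abs(math.sin(0.3 * k)) for k in range(50)]
# Heteroscedastic noise: standard deviation increases with absorbance.
sigma = [0.01 + 0.05 * ai for ai in a]
y = [c_true * ai + random.gauss(0.0, si) for ai, si in zip(a, sigma)]

def ls_estimate(a, y, w=None):
    """Single-component (weighted) least squares: minimize sum w*(y - c*a)^2."""
    if w is None:
        w = [1.0] * len(a)                                 # CLS: equal weights
    return sum(wi * ai * yi for wi, ai, yi in zip(w, a, y)) / \
           sum(wi * ai * ai for wi, ai in zip(w, a))

c_cls = ls_estimate(a, y)                                  # classical LS
c_wls = ls_estimate(a, y, w=[1.0 / s**2 for s in sigma])   # weighted LS
```

    Both estimators recover the true concentration here; the paper's point is that their relative error depends on whether noise or bias dominates at a given absorbance, which motivates the SWLS threshold.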

  7. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  8. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  9. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.

  10. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.
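
    The universal-normalization idea (one iTRAQ channel shared across all runs) can be sketched as expressing every sample channel as a log2 ratio to the reference channel, which cancels run-to-run intensity differences. The channel names and intensities below are hypothetical, not from the study:

```python
import math

def normalize_to_reference(channels, ref_key="ref"):
    """Express each iTRAQ channel as a log2 ratio to the universal reference
    channel so that values are comparable across LC-MS/MS runs.
    Channel names and intensities are hypothetical."""
    ref = channels[ref_key]
    return {k: math.log2(v / ref) for k, v in channels.items() if k != ref_key}

# Two runs of the same peptide; run2 is 4x brighter overall, but the ratios
# to the shared reference channel are identical after normalization.
run1 = {"ref": 1000.0, "basal": 2000.0, "luminal": 500.0}
run2 = {"ref": 4000.0, "basal": 8000.0, "luminal": 2000.0}
n1 = normalize_to_reference(run1)
n2 = normalize_to_reference(run2)
```

    After normalization, both runs report the same log2 ratios, which is what makes a 16-month acquisition series comparable.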

  11. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2018-02-01

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.

  12. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGES

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  13. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    NASA Astrophysics Data System (ADS)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method to quantitatively analyze the concentration of elements in a sample under local thermodynamic equilibrium conditions without using a matrix-matched calibration. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. TiO2 powder was mixed with polyvinyl alcohol and formed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The plasma spectrum was recorded using a spectrophotometer and compared to NIST spectral line data to determine energy levels and other parameters. The plasma temperature obtained using a Boltzmann plot is 8127.29 K and the calculated electron density is 2.49×10^16 cm^-3. Finally, the concentration of Ti in the TiO2 sample from this study is 97%, in close agreement with the sample certificate.
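
    The Boltzmann-plot step mentioned above fits ln(Iλ/(gA)) against the upper-level energy; the slope is -1/(kT). A minimal sketch follows, using synthetic emission lines generated at a known temperature so the fit can be checked against it (the line parameters are invented, not the actual Ti spectrum):

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_plot_temperature(lines):
    """Estimate plasma temperature from (E_upper [eV], intensity,
    wavelength [nm], g, A) tuples via a linear fit of ln(I*lambda/(g*A))
    versus E_upper; slope = -1/(k*T)."""
    xs = [e for e, *_ in lines]
    ys = [math.log(i * lam / (g * a)) for e, i, lam, g, a in lines]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -1.0 / (K_B_EV * slope)

# Synthetic lines generated at T = 8000 K; the fit should recover that value.
T_true = 8000.0
lines = []
for e_up, lam, g, a in [(2.0, 400.0, 5, 1e7), (3.0, 420.0, 7, 5e6),
                        (4.0, 450.0, 9, 2e7), (5.0, 470.0, 3, 8e6)]:
    intensity = (g * a / lam) * math.exp(-e_up / (K_B_EV * T_true))
    lines.append((e_up, intensity, lam, g, a))
T_est = boltzmann_plot_temperature(lines)
```

    With real spectra the points scatter around the fitted line, and the quality of the fit is itself a check on the local-thermodynamic-equilibrium assumption.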

  14. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, B.D.; Toole, A.P.; Callahan, B.G.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption is predicted for alkylphenols and aspirin, based on estimates of hydrophobicity and the fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.

  15. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  16. Quantitative optical coherence tomography analysis for late in-stent restenotic lesions.

    PubMed

    Fu, Qiang; Suzuki, Nobuaki; Kozuma, Ken; Miyagawa, Mutsuki; Nomura, Takahiro; Kawashima, Hideyuki; Shiratori, Yoshitaka; Ishikawa, Shuichi; Kyono, Hiroyuki; Isshiki, Takaaki

    2015-01-01

    Coronary optical coherence tomography (OCT) has the potential to identify in-stent neoatherosclerosis, which is a possible risk factor for late acute coronary events after drug-eluting stent implantation. The purpose of this study was to investigate differences between mid-term and late in-stent restenosis after stent implantation by quantitative and semiautomated tissue property analysis using OCT. In total, 1063 OCT image frames of 16 lesions in 15 patients were analyzed. This included 346 frames of 6 lesions in late in-stent restenosis, which was defined as restenosis that was not detected at 6 to 12 months but ≥ 12 months after follow-up coronary angiography. Signal attenuation was circumferentially analyzed using a dedicated semiautomated software. Attenuation was assessed along 200 lines delineated radially for analysis of the in-stent restenotic lesions (between the lumen and stent contours). All lines were anchored by the image wire to avoid artifacts resulting from wire location. Stronger signal attenuation at the frame level (2.46 ± 0.78 versus 1.47 ± 0.32, P < 0.001) and higher maximum signal intensity at the lesion level (9.19 ± 0.19 versus 8.84 ± 0.32, P = 0.018) were observed in late in-stent restenotic lesions than in mid-term in-stent restenotic lesions. OCT demonstrated stronger signal attenuation and higher maximum signal intensity in late in-stent restenotic lesions than in mid-term in-stent restenotic lesions, indicating the possibility of neoatherosclerosis.

  17. Quantitative Experimental Determination of Primer-Dimer Formation Risk by Free-Solution Conjugate Electrophoresis

    PubMed Central

    Desmarais, Samantha M.; Leitner, Thomas; Barron, Annelise E.

    2012-01-01

    DNA barcodes are short, unique ssDNA primers that “mark” individual biomolecules. To gain better understanding of biophysical parameters constraining primer-dimer formation between primers that incorporate barcode sequences, we have developed a capillary electrophoresis method that utilizes drag-tag-DNA conjugates to quantify dimerization risk between primer-barcode pairs. Results obtained with this unique free-solution conjugate electrophoresis (FSCE) approach are useful as quantitatively precise input data to parameterize computation models of dimerization risk. A set of fluorescently labeled, model primer-barcode conjugates were designed with complementary regions of differing lengths to quantify heterodimerization as a function of temperature. Primer-dimer cases comprised two 30-mer primers, one of which was covalently conjugated to a lab-made, chemically synthesized poly-N-methoxyethylglycine drag-tag, which reduced electrophoretic mobility of ssDNA to distinguish it from ds primer-dimers. The drag-tags also provided a shift in mobility for the dsDNA species, which allowed us to quantitate primer-dimer formation. In the experimental studies, pairs of oligonucleotide primer-barcodes with fully or partially complementary sequences were annealed, and then separated by free-solution conjugate CE at different temperatures, to assess effects on primer-dimer formation. When less than 30 out of 30 basepairs were bonded, dimerization was inversely correlated to temperature. Dimerization occurred when more than 15 consecutive basepairs formed, yet non-consecutive basepairs did not create stable dimers even when 20 out of 30 possible basepairs bonded. The use of free-solution electrophoresis in combination with a peptoid drag-tag and different fluorophores enabled precise separation of short DNA fragments to establish a new mobility shift assay for detection of primer-dimer formation. PMID:22331820

  18. Bridging the two cultures of risk analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasanoff, S.

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  19. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    PubMed

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to menstrual cycle and compared the results with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) scans from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software written in MATLAB. For each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing significant differences (p = .009 for minimal vs. mild, p < 0.001 for the other comparisons). Spearman's correlation test showed a strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not in the other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
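
    The voxelwise equation quoted in the abstract is straightforward to compute; the sketch below applies it to a few hypothetical baseline and 90-second post-contrast signal intensities (not real MRI data) and averages the result, as the in-house software does over the whole breast:

```python
def voxel_bpe(si_post, si_base):
    """Percent background parenchymal enhancement for one voxel, following
    the equation in the abstract:
    [(post-contrast SI - baseline SI) / baseline SI] x 100%."""
    return (si_post - si_base) / si_base * 100.0

# Hypothetical signal intensities for four fibroglandular voxels.
baseline = [100.0, 120.0, 110.0, 95.0]
post_90s = [140.0, 150.0, 160.0, 133.0]
bpe_map = [voxel_bpe(p, b) for p, b in zip(post_90s, baseline)]
mean_bpe = sum(bpe_map) / len(bpe_map)
```

    The per-voxel map supports summaries beyond the mean, which is how the study also reports median and percentile BPE values.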

  20. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    PubMed

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four constructs: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of the diagram and classifying respiration into two states: regular and irregular. Quantitation was done by measuring the standard deviation of the x and v coordinates of the Poincaré map (SDx, SDv), measuring the height of the fundamental peak (A1) in the Fourier spectrum, or calculating the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
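
    A greatly simplified version of the SDx/SDv idea can be sketched from a 1-D respiratory amplitude trace: reconstruct phase-space coordinates (amplitude x and finite-difference velocity v) and measure their scatter. The traces, sampling rate, and scale below are illustrative assumptions, not the paper's data or thresholds:

```python
import math

def stdev(vals):
    """Population standard deviation of a list."""
    m = sum(vals) / len(vals)
    return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

def respiratory_stability(trace, dt=0.1):
    """Simplified stability indices: scatter of amplitude (x) and of its
    finite-difference velocity (v) over the phase-space trajectory."""
    x = trace
    v = [(x[i + 1] - x[i]) / dt for i in range(len(x) - 1)]
    return stdev(x[:-1]), stdev(v)

# Regular breathing: a pure 0.25 Hz sinusoid sampled at 10 Hz for 20 s.
t = [0.1 * i for i in range(200)]
regular = [math.sin(2 * math.pi * 0.25 * ti) for ti in t]
# Irregular breathing: the same sinusoid with slow amplitude modulation.
irregular = [(1 + 0.8 * math.sin(0.3 * ti)) * math.sin(2 * math.pi * 0.25 * ti)
             for ti in t]
sdx_reg, sdv_reg = respiratory_stability(regular)
sdx_irr, sdv_irr = respiratory_stability(irregular)
```

    The amplitude-modulated trace produces a larger amplitude scatter than the pure sinusoid, which is the direction of the SDx cutoff reported in the study.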

  1. A comparison of two prospective risk analysis methods: Traditional FMEA and a modified healthcare FMEA.

    PubMed

    Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya

    2016-12-01

    To examine the abilities of a traditional failure mode and effects analysis (FMEA) and a modified healthcare FMEA (m-HFMEA) scoring method by comparing their degree of congruence in identifying high-risk failures. The authors applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine if the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, when the risk score is determined, either it follows the established HFMEA Decision Tree™ or the failure mode should be more thoroughly investigated. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of high risks and mitigation measures. It is therefore a useful prospective risk analysis method for radiotherapy.
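
    The two scoring schemes compared above can be sketched in a few lines: the traditional RPN as the product of its three indices, and the m-HFMEA severity-by-frequency matrix as a binning step. The failure-mode names, index values, and matrix bin edges below are illustrative assumptions, not the paper's:

```python
def rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number: product of the three indices."""
    return occurrence * severity * detectability

def m_hfmea_category(severity, frequency):
    """Modified healthcare FMEA: place severity x frequency on a four-level
    risk matrix. The bin edges here are illustrative, not the paper's."""
    score = severity * frequency
    if score >= 60:
        return "very high"
    if score >= 30:
        return "high"
    if score >= 10:
        return "low"
    return "very low"

# Hypothetical SIG-RS failure modes:
# (name, occurrence, severity, detectability, frequency)
modes = [("camera occlusion", 4, 8, 6, 5),
         ("wrong reference surface", 2, 9, 7, 2),
         ("couch shift undetected", 3, 7, 3, 4)]
ranked = sorted(modes, key=lambda m: rpn(m[1], m[2], m[3]), reverse=True)
```

    Ranking by RPN and binning by severity-times-frequency need not agree, which is exactly the discordance the study observed for some failure modes.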

  2. Quantitative Risk - Phase 1

    DTIC Science & Technology

    2013-09-03


  3. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, is home to an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables, and the models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  4. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.

    2004-04-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real-time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.

  5. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method that uses multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when the multivariate method is applied independently to five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
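
    The PLS-regression idea behind this record can be sketched with scikit-learn on synthetic data; the band weights and noise level below are illustrative assumptions, not the paper's calibration.

    ```python
    # Sketch: PLS maps internal-reference band intensities to laser power.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    powers = rng.uniform(5, 65, size=40)  # mW, as in the study's range
    # Fake "spectra": four reference-band intensities scaling with power
    # (stand-ins for the sapphire 380/417/646/751 cm-1 peaks).
    band_weights = np.array([1.0, 0.8, 0.5, 0.3])
    X = np.outer(powers, band_weights) + rng.normal(0, 0.5, (40, 4))

    pls = PLSRegression(n_components=2)
    pls.fit(X, powers)
    pred = pls.predict(X).ravel()
    rmse = np.sqrt(np.mean((pred - powers) ** 2))
    print(f"RMSE = {rmse:.2f} mW")
    ```

    The multivariate fit pools information across all reference bands, which is why it outperforms a single-band (univariate) normalization when individual bands are noisy.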

  6. Detection of hemoplasma infection of goats by use of a quantitative polymerase chain reaction assay and risk factor analysis for infection.

    PubMed

    Johnson, Kathy A; do Nascimento, Naíla C; Bauer, Amy E; Weng, Hsin-Yi; Hammac, G Kenitra; Messick, Joanne B

    2016-08-01

    OBJECTIVE To develop and validate a real-time quantitative PCR (qPCR) assay for the detection and quantification of Mycoplasma ovis in goats and investigate the prevalence and risk factors for hemoplasma infection of goats located in Indiana. ANIMALS 362 adult female goats on 61 farms. PROCEDURES Primers were designed for amplification of a fragment of the dnaK gene of M ovis by use of a qPCR assay. Blood samples were collected into EDTA-containing tubes for use in total DNA extraction, blood film evaluation, and determination of PCV. Limit of detection, intra-assay variability, interassay variability, and specificity of the assay were determined. RESULTS Reaction efficiency of the qPCR assay was 94.45% (R(2), 0.99; slope, -3.4623), and the assay consistently detected as few as 10 copies of plasmid/reaction. Prevalence of infection in goats on the basis of results for the qPCR assay was 18.0% (95% confidence interval, 14% to 22%), with infected goats ranging from 1 to 14 years old, whereby 61% (95% confidence interval, 47% to 73%) of the farms had at least 1 infected goat. Bacterial load in goats infected with M ovis ranged from 1.05 × 10(3) target copies/mL of blood to 1.85 × 10(5) target copies/mL of blood; however, no bacteria were observed on blood films. Production use of a goat was the only risk factor significantly associated with hemoplasma infection. CONCLUSIONS AND CLINICAL RELEVANCE The qPCR assay was more sensitive for detecting hemoplasma infection than was evaluation of a blood film, and production use of a goat was a risk factor for infection.
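
    The standard-curve arithmetic behind the reported reaction efficiency can be sketched as follows; the intercept and Cq value are hypothetical, only the slope is taken from the record.

    ```python
    # qPCR standard-curve math: efficiency from slope, copies from Cq.
    import math

    slope = -3.4623  # slope reported in the record
    efficiency = (10 ** (-1.0 / slope) - 1) * 100  # percent, ~94.45

    # Quantify an unknown from its Cq via Cq = slope*log10(copies) + intercept.
    intercept = 38.0   # hypothetical standard-curve intercept
    cq_unknown = 27.6  # hypothetical measured Cq
    copies = 10 ** ((cq_unknown - intercept) / slope)

    print(f"efficiency = {efficiency:.2f}%")
    print(f"copies/reaction = {copies:.0f}")
    ```

    A slope of -3.32 would correspond to 100% efficiency (perfect doubling each cycle); the reported -3.4623 yields the 94.45% figure quoted above.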

  7. Development of quantitative exposure data for a pooled exposure-response analysis of 10 silica cohorts.

    PubMed

    Mannetje, Andrea 't; Steenland, Kyle; Checkoway, Harvey; Koskela, Riitta-Sisko; Koponen, Matti; Attfield, Michael; Chen, Jingqiong; Hnizdo, Eva; DeKlerk, Nicholas; Dosemeci, Mustafa

    2002-08-01

    Comprehensive quantitative silica exposure estimates over time, measured in the same units across a number of cohorts, would make possible a pooled exposure-response analysis for lung cancer. Such an analysis would help clarify the continuing controversy regarding whether silica causes lung cancer. Existing quantitative exposure data for 10 silica-exposed cohorts were retrieved from the original investigators. Occupation- and time-specific exposure estimates were either adopted/adapted or developed for each cohort, and converted to milligram per cubic meter (mg/m(3)) respirable crystalline silica. Quantitative exposure assignments were typically based on a large number (thousands) of raw measurements, or otherwise consisted of exposure estimates by experts (for two cohorts). Median exposure level of the cohorts ranged between 0.04 and 0.59 mg/m(3) respirable crystalline silica. Exposure estimates were partially validated via their successful prediction of silicosis in these cohorts. Existing data were successfully adopted or modified to create comparable quantitative exposure estimates over time for 10 silica-exposed cohorts, permitting a pooled exposure-response analysis. The difficulties encountered in deriving common exposure estimates across cohorts are discussed. Copyright 2002 Wiley-Liss, Inc.

  8. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  9. A quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sex among HIV-seropositive men.

    PubMed

    Gerbi, Gemechu B; Habtemariam, Tsegaye; Tameru, Berhanu; Nganwa, David; Robnett, Vinaida

    2012-01-01

    The objective of this study is to conduct a quantitative risk assessment of multiple factors influencing HIV/AIDS transmission through unprotected sexual practices among HIV-seropositive men. A knowledgebase was developed by reviewing different published sources. The data were collected from different sources including Centers for Disease Control and Prevention, selected journals, and reports. The risk pathway scenario tree was developed based on a comprehensive review of published literature. The variables are organized into nine major parameter categories. Monte Carlo simulations for the quantitative risk assessment of HIV/AIDS transmission was executed with the software @Risk 4.0 (Palisade Corporation). Results show that the value for the likelihood of unprotected sex due to having less knowledge about HIV/AIDS and negative attitude toward condom use and safer sex ranged from 1.24 × 10(-5) to 8.47 × 10(-4) with the mean and standard deviation of 1.83 × 10(-4) and 8.63 × 10(-5), respectively. The likelihood of unprotected sex due to having greater anger-hostility, anxiety, less satisfied with aspects of life, and greater depressive symptoms ranged from 2.76 × 10(-9) to 5.34 × 10(-7) with the mean and standard deviation of 5.23 × 10(-8) and 3.58 × 10(-8), respectively. The findings suggest that HIV/AIDS research and intervention programs must be focused on behavior, and the broader setting within which individual risky behaviors occur.
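
    The Monte Carlo step of such a risk-pathway assessment can be sketched in plain NumPy (the study used @Risk); the factor distributions below are hypothetical placeholders, not the study's parameters.

    ```python
    # Sketch: sample uncertain conditional probabilities along one branch of
    # a risk-pathway scenario tree and multiply them per iteration.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    # Hypothetical pathway factors, each sampled from an assumed range.
    p_low_knowledge = rng.uniform(0.10, 0.30, n)
    p_negative_attitude = rng.uniform(0.05, 0.20, n)
    p_unprotected_sex = rng.uniform(0.01, 0.05, n)

    likelihood = p_low_knowledge * p_negative_attitude * p_unprotected_sex
    print(f"mean = {likelihood.mean():.2e}, sd = {likelihood.std():.2e}")
    print(f"range = {likelihood.min():.2e} to {likelihood.max():.2e}")
    ```

    Reporting the simulated mean, standard deviation and range of the product mirrors how the study summarizes each pathway's likelihood.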

  10. Advances in Risk Analysis with Big Data.

    PubMed

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  11. Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel

    NASA Astrophysics Data System (ADS)

    Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying

    2018-05-01

    Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.

  12. Characterization of breast lesion using T1-perfusion magnetic resonance imaging: Qualitative vs. quantitative analysis.

    PubMed

    Thakran, S; Gupta, P K; Kabra, V; Saha, I; Jain, P; Gupta, R K; Singh, A

    2018-06-14

    The objective of this study was to quantify the hemodynamic parameters using first pass analysis of T1-perfusion magnetic resonance imaging (MRI) data of the human breast and to compare these parameters with the existing tracer kinetic parameters, semi-quantitative and qualitative T1-perfusion analysis in terms of lesion characterization. MRI of the breast was performed in 50 women (mean age, 44 ± 11 [SD] years; range: 26-75 years) with a total of 15 benign and 35 malignant breast lesions. After pre-processing, T1-perfusion MRI data were analyzed using a qualitative approach by two radiologists (visual inspection of the kinetic curve into types I, II or III), a semi-quantitative approach (characterization of kinetic curve types using empirical parameters), a generalized tracer kinetic model (tracer kinetic parameters) and first pass analysis (hemodynamic parameters). The chi-squared test, t-test, one-way analysis of variance (ANOVA) with Bonferroni post-hoc test and receiver operating characteristic (ROC) curves were used for statistical analysis. All quantitative parameters except leakage volume (Ve), qualitative (type-I and III) and semi-quantitative curves (type-I and III) provided significant differences (P<0.05) between benign and malignant lesions. Kinetic parameters, particularly the volume transfer coefficient (Ktrans), provided a significant difference (P<0.05) between all grades except grade-II vs III. The hemodynamic parameter (relative leakage-corrected breast blood volume [rBBVcorr]) provided a statistically significant difference (P<0.05) between all grades. It also provided the highest sensitivity and specificity among all parameters in differentiation between different grades of malignant breast lesions. Quantitative parameters, particularly rBBVcorr and Ktrans, provided similar sensitivity and specificity in differentiating benign from malignant breast lesions for this cohort. Moreover, rBBVcorr provided better differentiation between different grades of malignant breast

  13. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    PubMed

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  14. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
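
    The two agreement statistics used in this record can be sketched with SciPy and NumPy; the reviewer rankings below are invented, not the study's data.

    ```python
    # Kendall's tau for pairwise rank agreement; Kendall's W (coefficient of
    # concordance, no-ties formula) for agreement among several reviewers.
    import numpy as np
    from scipy.stats import kendalltau

    # Rows = reviewers, columns = images (each row ranks 8 images).
    rankings = np.array([
        [1, 2, 3, 4, 5, 6, 7, 8],
        [1, 3, 2, 4, 5, 7, 6, 8],
        [2, 1, 3, 4, 6, 5, 7, 8],
    ])

    tau, _ = kendalltau(rankings[0], rankings[1])  # pairwise agreement

    m, n = rankings.shape                # m raters, n items
    r_sum = rankings.sum(axis=0)         # total rank per item
    s = ((r_sum - r_sum.mean()) ** 2).sum()
    w = 12 * s / (m ** 2 * (n ** 3 - n))  # Kendall's W, assuming no ties
    print(f"tau = {tau:.2f}, W = {w:.2f}")
    ```

    Values of W near 1, like the 0.95-0.96 reported above, indicate that reviewers order the images almost identically.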

  15. A quantitative assessment of the risk for highly pathogenic avian influenza introduction into Spain via legal trade of live poultry.

    PubMed

    Sánchez-Vizcaíno, Fernando; Perez, Andrés; Lainez, Manuel; Sánchez-Vizcaíno, José Manuel

    2010-05-01

    Highly pathogenic avian influenza (HPAI) is considered one of the most important diseases of poultry. During the last 9 years, HPAI epidemics have been reported in Asia, the Americas, Africa, and in 18 countries of the European Union (EU). For that reason, it is possible that the risk for HPAI virus (HPAIV) introduction into Spain may have recently increased. Because of the EU free-trade policy and because legal trade of live poultry was considered an important route for HPAI spread in certain regions of the world, there are fears that Spain may become HPAIV-infected as a consequence of the legal introduction of live poultry. However, no quantitative assessment of the risk for HPAIV introduction into Spain or into any other EU member state via the trade of poultry has been published in the peer-reviewed literature. This article presents the results of the first quantitative assessment of the risk for HPAIV introduction into a free country via legal trade of live poultry, along with estimates of the geographical variation of the risk and of the relative contribution of exporting countries and susceptible poultry species to the risk. The annual mean risk for HPAI introduction into Spain was estimated to be as low as 1.36 x 10(-3), suggesting that under prevailing conditions, introduction of HPAIV into Spain through the trade of live poultry is unlikely to occur. Moreover, these results support the hypothesis that legal trade of live poultry does not impose a significant risk for the spread of HPAI into EU member states.
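
    The aggregation step of such an import-risk model can be sketched as follows; the consignment count and the Beta uncertainty distribution are hypothetical assumptions, not the study's inputs.

    ```python
    # Sketch: annual probability that at least one infected consignment
    # enters, given an uncertain per-consignment release probability.
    import numpy as np

    rng = np.random.default_rng(1)
    n_sims = 50_000
    consignments = 120                            # live-poultry consignments/yr
    p_infected = rng.beta(2, 5000, size=n_sims)   # uncertain per-consignment risk

    annual_risk = 1 - (1 - p_infected) ** consignments
    print(f"mean annual risk = {annual_risk.mean():.2e}")
    ```

    The complement-product form is the standard way to combine independent per-consignment risks into an annual introduction risk.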

  16. Quantitative analysis of dinuclear manganese(II) EPR spectra

    NASA Astrophysics Data System (ADS)

    Golombek, Adina P.; Hendrich, Michael P.

    2003-11-01

    A quantitative method for the analysis of EPR spectra from dinuclear Mn(II) complexes is presented. The complex [(Me3TACN)2Mn(II)2(μ-OAc)3]BPh4 (1) (Me3TACN = N,N',N''-trimethyl-1,4,7-triazacyclononane; OAc = acetate(1-); BPh4 = tetraphenylborate(1-)) was studied with EPR spectroscopy at X- and Q-band frequencies, for both perpendicular and parallel polarizations of the microwave field, and with variable temperature (2-50 K). Complex 1 is an antiferromagnetically coupled dimer which shows signals from all excited spin manifolds, S = 1 to 5. The spectra were simulated with diagonalization of the full spin Hamiltonian, which includes the Zeeman and zero-field splittings of the individual manganese sites within the dimer, the exchange and dipolar coupling between the two manganese sites of the dimer, and the nuclear hyperfine coupling for each manganese ion. All possible transitions for all spin manifolds were simulated, with the intensities determined from the calculated probability of each transition. In addition, the non-uniform broadening of all resonances was quantitatively predicted using a lineshape model based on D- and r-strain. As the temperature is increased from 2 K, an 11-line hyperfine pattern characteristic of dinuclear Mn(II) is first observed from the S = 3 manifold. D- and r-strain are the dominant broadening effects that determine where the hyperfine pattern will be resolved. A single unique parameter set was found to simulate all spectra arising for all temperatures, microwave frequencies, and microwave modes. The simulations are quantitative, allowing for the first time the determination of species concentrations directly from EPR spectra. Thus, this work describes the first method for the quantitative characterization of EPR spectra of dinuclear manganese centers in model complexes and proteins. The exchange coupling parameter J for complex 1 was determined (J = -1.5 ± 0.3 cm(-1); Hex = -2J S1·S2) and found to be in agreement with a previous
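
    The full spin Hamiltonian described in this record, with site Zeeman, zero-field, and hyperfine terms plus exchange (in the -2J convention quoted above) and dipolar coupling, is conventionally written as (a standard textbook form, not reproduced from the paper):

    ```latex
    \hat{H} = \sum_{i=1}^{2}\left[
        \mu_B \mathbf{B}\cdot\mathbf{g}_i\cdot\hat{\mathbf{S}}_i
      + \hat{\mathbf{S}}_i\cdot\mathbf{D}_i\cdot\hat{\mathbf{S}}_i
      + \hat{\mathbf{S}}_i\cdot\mathbf{A}_i\cdot\hat{\mathbf{I}}_i
    \right]
      - 2J\,\hat{\mathbf{S}}_1\cdot\hat{\mathbf{S}}_2
      + \hat{\mathbf{S}}_1\cdot\mathbf{D}_{12}\cdot\hat{\mathbf{S}}_2
    ```

    Here the g, D (zero-field) and A (55Mn hyperfine) tensors are per-site parameters, J is the isotropic exchange coupling, and D12 collects the dipolar interaction between the two Mn(II) ions.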

  17. Quantitative approach for incorporating methylmercury risks and omega-3 fatty acid benefits in developing species-specific fish consumption advice.

    PubMed

    Ginsberg, Gary L; Toal, Brian F

    2009-02-01

    Despite general agreement about the toxicity of methylmercury (MeHg), fish consumption advice remains controversial. Concerns have been raised that negative messages will steer people away from fish and omega-3 fatty acid (FA) benefits. One approach is to provide advice for individual species that highlights beneficial fish while cautioning against riskier fish. Our goal in this study was to develop a method to quantitatively analyze the net risk/benefit of individual fish species based on their MeHg and omega-3 FA content. We identified dose-response relationships for MeHg and omega-3 FA effects on coronary heart disease (CHD) and neurodevelopment. We used the MeHg and omega-3 FA content of 16 commonly consumed species to calculate the net risk/benefit for each species. Estimated omega-3 FA benefits outweigh MeHg risks for some species (e.g., farmed salmon, herring, trout); however, the opposite was true for others (swordfish, shark). Other species were associated with a small net benefit (e.g., flounder, canned light tuna) or a small net risk (e.g., canned white tuna, halibut). These results were used to place fish into one of four meal frequency categories, with the advice tentative because of limitations in the underlying dose-response information. Separate advice appears warranted for the neurodevelopmental risk group versus the cardiovascular risk group because we found a greater net benefit from fish consumption for the cardiovascular risk group. This research illustrates a framework for risk/benefit analysis that can be used to develop categories of consumption advice ranging from "do not eat" to "unlimited," with the caveat that unlimited may need to be tempered for certain fish (e.g., farm-raised salmon) because of other contaminants and end points (e.g., cancer risk). Uncertainties exist in the underlying dose-response relationships, pointing in particular to the need for more research on the adverse effects of MeHg on cardiovascular end points.

  18. Quantitative Approach for Incorporating Methylmercury Risks and Omega-3 Fatty Acid Benefits in Developing Species-Specific Fish Consumption Advice

    PubMed Central

    Ginsberg, Gary L.; Toal, Brian F.

    2009-01-01

    Background Despite general agreement about the toxicity of methylmercury (MeHg), fish consumption advice remains controversial. Concerns have been raised that negative messages will steer people away from fish and omega-3 fatty acid (FA) benefits. One approach is to provide advice for individual species that highlights beneficial fish while cautioning against riskier fish. Objectives Our goal in this study was to develop a method to quantitatively analyze the net risk/benefit of individual fish species based on their MeHg and omega-3 FA content. Methods We identified dose–response relationships for MeHg and omega-3 FA effects on coronary heart disease (CHD) and neurodevelopment. We used the MeHg and omega-3 FA content of 16 commonly consumed species to calculate the net risk/benefit for each species. Results Estimated omega-3 FA benefits outweigh MeHg risks for some species (e.g., farmed salmon, herring, trout); however, the opposite was true for others (swordfish, shark). Other species were associated with a small net benefit (e.g., flounder, canned light tuna) or a small net risk (e.g., canned white tuna, halibut). These results were used to place fish into one of four meal frequency categories, with the advice tentative because of limitations in the underlying dose–response information. Separate advice appears warranted for the neurodevelopmental risk group versus the cardiovascular risk group because we found a greater net benefit from fish consumption for the cardiovascular risk group. Conclusions This research illustrates a framework for risk/benefit analysis that can be used to develop categories of consumption advice ranging from “do not eat” to “unlimited,” with the caveat that unlimited may need to be tempered for certain fish (e.g., farm-raised salmon) because of other contaminants and end points (e.g., cancer risk). Uncertainties exist in the underlying dose–response relationships, pointing in particular to the need for more research on

  19. Occurrence and quantitative microbial risk assessment of Cryptosporidium and Giardia in soil and air samples.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Mondaca-Fernández, Iram; Balderas-Cortés, José de Jesús; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2014-09-01

    Cryptosporidium oocysts and Giardia cysts can be transmitted by the fecal-oral route and may cause gastrointestinal parasitic zoonoses. These zoonoses are common in rural zones due to the parasites being harbored in fecally contaminated soil. This study assessed the risk of illness (giardiasis and cryptosporidiosis) from inhaling and/or ingesting soil and/or airborne dust in Potam, Mexico. To assess the risk of infection, Quantitative Microbial Risk Assessment (QMRA) was employed, with the following steps: (1) hazard identification, (2) hazard exposure, (3) dose-response, and (4) risk characterization. Cryptosporidium oocysts and Giardia cysts were observed in 52% and 57%, respectively, of total soil samples (n=21), and in 60% and 80%, respectively, of air samples (n=12). The calculated annual risks were higher than 9.9 × 10(-1) for both parasites in both types of sample. Soil and air inhalation and/or ingestion are important vehicles for these parasites. To our knowledge, the results obtained in the present study represent the first QMRAs for cryptosporidiosis and giardiasis due to soil and air inhalation/ingestion in Mexico. In addition, this is the first evidence of the microbial air quality around these parasites in rural zones. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
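
    The dose-response and annualization steps of a QMRA like this one can be sketched with the exponential model; the r parameter, dose and exposure frequency below are illustrative assumptions, not the study's values.

    ```python
    # QMRA sketch: exponential dose-response, then annualized risk.
    import math

    r = 0.0199            # exponential dose-response parameter (illustrative)
    dose_per_event = 2.0  # (oo)cysts ingested/inhaled per exposure event
    events_per_year = 365

    p_single = 1 - math.exp(-r * dose_per_event)      # per-event infection risk
    p_annual = 1 - (1 - p_single) ** events_per_year  # annual risk
    print(f"per-event risk = {p_single:.3f}, annual risk = {p_annual:.3f}")
    ```

    Even a modest per-event risk compounds over daily exposure to an annual risk near 1, which is how calculated annual risks above 9.9 × 10(-1) arise.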

  20. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2018-05-01

    To solve real-life problems-such as those related to technology, health, security, or climate change-and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  1. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    NASA Astrophysics Data System (ADS)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  2. [The urgent problems of the improvement of the environment management system based on the analysis of health risk assessment].

    PubMed

    Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L

    2014-01-01

    The lack of an adequate legislative and regulatory framework for ensuring minimization of health risks in the field of environmental protection is an obstacle to the application of the risk analysis methodology as a leading tool for administrative activity in Russia. The "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 1 November 2013, No. Pr-2573, are aimed at legal support for the health risk analysis methodology. The article proposes the main stages of operative control of environmental quality that lead to the reduction of health risk to an acceptable level. The further improvement of the health risk analysis methodology in Russia should contribute to the implementation of the state policy in the sphere of chemical and biological safety through the introduction of complex measures to neutralize chemical and biological threats to human health and the environment, as well as the evaluation of the economic effectiveness of these measures. The primary step should be the legislative securing of a quantitative value for the term "acceptable risk".

  3. Risk-based cost-benefit analysis for evaluating microbial risk mitigation in a drinking water system.

    PubMed

    Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars

    2018-04-01

    Waterborne outbreaks of gastrointestinal diseases can impose large costs on society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects in terms of quality adjusted life years (QALYs), resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guidelines of 10⁻⁴ infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool. It is flexible enough to be tailored and applied to local
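The core of the decision model above is a net present value per alternative: monetised QALY gains minus investment and running costs, discounted over the study horizon. A minimal sketch of that calculation follows; every number (investment, running cost, QALY gain, unit cost per QALY, discount rate, horizon) is an illustrative assumption, not a value from the study.

```python
# Sketch of the risk-based cost-benefit calculation: NPV of a mitigation
# alternative = discounted monetised health benefits minus discounted
# investment and running costs. All figures below are invented.

def npv(cash_flows, rate):
    """Net present value of a sequence of annual cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def alternative_npv(investment, annual_cost, annual_qaly_gain,
                    cost_per_qaly, horizon, rate):
    """NPV of one mitigation alternative over `horizon` years."""
    flows = [-investment]  # up-front investment in year 0
    flows += [annual_qaly_gain * cost_per_qaly - annual_cost] * horizon
    return npv(flows, rate)

# Hypothetical UV-disinfection alternative, 1% discount rate, 30 years:
uv = alternative_npv(investment=5e6, annual_cost=2e5, annual_qaly_gain=0.3,
                     cost_per_qaly=5e5, horizon=30, rate=0.01)
```

With these toy numbers the NPV comes out negative, mirroring the paper's finding that no alternative pays for itself on monetised health benefits alone unless non-monetised benefits are added.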

  4. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features were chosen based on their ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.

  5. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E₀ ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental mineral monazite sample from Malaysia the sensitivity of REIXE is compared to well established methods of trace-element analysis like proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application the investigation of a sample prepared from manganese nodules — picked up from the Pacific deep sea — is discussed, which showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for aliquots of Pt did not show any measurable content within an upper limit of 250 ppm.

  6. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. Here we demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows a specification of an ALOP that, with defined confidence, corresponds to current industry practices.

  7. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
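The similarity analysis step above compares chromatographic fingerprints represented as vectors of the selected peak areas. A common metric for HPLC fingerprints is cosine (congruence) similarity; the abstract does not name the exact metric used, so both the metric choice and the toy peak areas below are assumptions for illustration.

```python
# Hedged sketch of fingerprint similarity analysis: each chromatogram is a
# vector of peak areas and batches are compared by cosine similarity.
# The metric and the toy data are assumptions, not taken from the paper.

def cosine_similarity(a, b):
    """Cosine of the angle between two fingerprint vectors (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

batch_a = [12.1, 3.4, 0.8, 7.7, 5.0]   # toy peak areas (5 of the 18 peaks)
batch_b = [11.8, 3.6, 0.9, 7.5, 4.8]   # similar profile: likely same origin
batch_c = [0.5, 9.9, 6.1, 0.4, 1.2]    # different profile: different origin
same_origin = cosine_similarity(batch_a, batch_b)
diff_origin = cosine_similarity(batch_a, batch_c)
```

Hierarchical clustering or PCA, as in the paper, would then group batches by applying such a distance over all fingerprint vectors.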

  8. Direct oral anticoagulants for extended thromboprophylaxis in medically ill patients: meta-analysis and risk/benefit assessment.

    PubMed

    Al Yami, Majed S; Kurdi, Sawsan; Abraham, Ivo

    2018-01-01

    Standard-duration (7-10 days) thromboprophylaxis with low molecular weight heparin, low dose unfractionated heparin, or fondaparinux in hospitalized medically ill patients is associated with ~50% reduction in venous thromboembolism (VTE) risk. However, these patients remain at high risk for VTE post-discharge. The direct oral anticoagulants (DOACs) apixaban, rivaroxaban and betrixaban have been evaluated for extended-duration (30-42 days) thromboprophylaxis in this population. We review the efficacy and safety results from the three pivotal trials of extended-duration DOAC thromboprophylaxis in medically ill patients. We performed a meta-analysis of these pivotal trials focusing on six VTE (efficacy) and three bleeding (safety) outcomes. These results were integrated into a quantitative risk/benefit assessment. The trials evaluating extended-duration DOAC thromboprophylaxis in medically ill patients failed to establish clear efficacy and/or safety signals for each agent. Our meta-analysis shows that, as a class, DOACs have selective and partial extended-duration prophylactic activity in preventing VTE events. However, this is associated with a marked increase in the risk of various bleeding events. The risk/benefit analyses fail to show a consistent net clinical benefit of extended-duration DOAC prophylaxis in medically ill patients. At this time, the evidence of safe and effective extended-duration thromboprophylaxis with DOACs in this population is inconclusive.

  9. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens.

    PubMed

    Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B

    2003-05-25

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30 times by introducing a 2 log reduction of the number of Campylobacter on the chicken carcasses. To obtain a similar reduction of the incidence, the flock prevalence should be reduced approximately 30 times or the kitchen hygiene improved approximately 30 times. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will only have a minor influence on the risk. Finally, the
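The risk characterization step above uses a Beta-Poisson dose-response model, whose approximate form is P(infection | dose d) = 1 − (1 + d/β)^(−α). A minimal sketch follows; the α and β values are the ones commonly cited for Campylobacter jejuni from human feeding-trial data, used here purely for illustration since the abstract does not reproduce the paper's fitted parameters, and the toy dose is invented.

```python
# Approximate Beta-Poisson dose-response model:
#   P(infection | dose d) = 1 - (1 + d/beta) ** (-alpha)
# alpha/beta are commonly cited Campylobacter values, illustrative only.

def beta_poisson(dose, alpha=0.145, beta=7.59):
    """Probability of infection for an ingested dose (CFU)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Effect of a 2-log (100-fold) reduction in carcass counts on a toy dose:
dose = 500.0                           # hypothetical CFU ingested in one meal
risk_before = beta_poisson(dose)
risk_after = beta_poisson(dose / 100.0)
```

Because the model is strongly concave, a 100-fold dose reduction cuts the per-meal risk by much less than 100-fold, which is consistent with the abstract's result that a 2-log reduction yields roughly a 30-fold (not 100-fold) drop in incidence.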

  10. Comparative study of contrast-enhanced ultrasound qualitative and quantitative analysis for identifying benign and malignant breast tumor lumps.

    PubMed

    Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting

    2014-01-01

    To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation only contained one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than with quantitative analysis.
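The ROC-area comparison above can be sketched with the Mann-Whitney formulation of AUC: the probability that a randomly chosen malignant case scores higher than a randomly chosen benign one. The labels and scores below are synthetic stand-ins, since the study's per-patient predictions are not available in the abstract.

```python
# Sketch of comparing two classifiers (e.g. qualitative vs. quantitative
# CEUS regressions) by ROC AUC, Mann-Whitney formulation. Synthetic data.

def roc_auc(labels, scores):
    """P(score of a random positive > score of a random negative); ties count 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 * (p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels  = [1, 1, 1, 1, 0, 0, 0, 0]                    # 1 = malignant
model_a = [0.9, 0.8, 0.7, 0.4, 0.5, 0.3, 0.2, 0.1]    # stronger model
model_b = [0.6, 0.9, 0.3, 0.4, 0.8, 0.5, 0.2, 0.7]    # weaker model
auc_a = roc_auc(labels, model_a)
auc_b = roc_auc(labels, model_b)
```

A formal significance test between two correlated AUCs on the same patients (the Z test mentioned in the abstract) needs the covariance of the two curves, e.g. DeLong's method, which this sketch omits.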

  11. A novel eQTL-based analysis reveals the biology of breast cancer risk loci

    PubMed Central

    Li, Qiyuan; Seo, Ji-Heui; Stranger, Barbara; McKenna, Aaron; Pe'er, Itsik; LaFramboise, Thomas; Brown, Myles; Tyekucheva, Svitlana; Freedman, Matthew L.

    2014-01-01

    Germline determinants of gene expression in tumors are less studied due to the complexity of transcript regulation caused by somatically acquired alterations. We performed expression quantitative trait locus (eQTL)-based analyses using the multi-level information provided in The Cancer Genome Atlas (TCGA). Of the factors we measured, cis-acting eQTLs accounted for 1.2% of the total variation of tumor gene expression, while somatic copy number alteration and CpG methylation accounted for 7.3% and 3.3%, respectively. eQTL analyses of 15 previously reported breast cancer risk loci resulted in the discovery of three variants that are significantly associated with transcript levels (FDR<0.1). In a novel trans-based analysis, an additional three risk loci were identified to act through ESR1, MYC, and KLF4. These findings provide a more comprehensive picture of gene expression determinants in breast cancer, as well as insights into the underlying biology of breast cancer risk loci. PMID:23374354

  12. Novel quantitative analysis of autofluorescence images for oral cancer screening.

    PubMed

    Huang, Tze-Ta; Huang, Jehn-Shyun; Wang, Yen-Yun; Chen, Ken-Chung; Wong, Tung-Yiu; Chen, Yi-Chun; Wu, Che-Wei; Chan, Leong-Perng; Lin, Yi-Chu; Kao, Yu-Hsun; Nioka, Shoko; Yuan, Shyng-Shiou F; Chung, Pau-Choo

    2017-05-01

    VELscope® was developed to inspect oral mucosa autofluorescence. However, its accuracy is heavily dependent on the examining physician's experience. This study was aimed toward the development of a novel quantitative analysis of autofluorescence images for oral cancer screening. Patients with either oral cancer or precancerous lesions and a control group with normal oral mucosa were enrolled in this study. White light images and VELscope® autofluorescence images of the lesions were taken with a digital camera. The lesion in the image was chosen as the region of interest (ROI). The average intensity and heterogeneity of the ROI were calculated. A quadratic discriminant analysis (QDA) was utilized to compute boundaries based on sensitivity and specificity. In total, 47 oral cancer lesions, 54 precancerous lesions, and 39 normal oral mucosa controls were analyzed. A classification boundary with a specificity of 0.923 and a sensitivity of 0.979 between the oral cancer lesions and normal oral mucosae was validated. The oral cancer and precancerous lesions could also be differentiated from normal oral mucosae with a specificity of 0.923 and a sensitivity of 0.970. The novel quantitative analysis of the intensity and heterogeneity of VELscope® autofluorescence images used in this study, in combination with a QDA classifier, can be used to differentiate oral cancer and precancerous lesions from normal oral mucosae. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    PubMed

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
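The intensity-profile statistics described above (polarization at cell ends, skewness, kurtosis) can be sketched by treating the 1-D profile along a myocyte's principal axis as a distribution of signal over position. The toy profile and the 10% end-region width below are illustrative assumptions, not parameters from the study.

```python
# Sketch of profile-based gap junction metrics: statistical moments of an
# intensity profile over position, plus an end-polarization score.
# The toy profile and the 10% end-region width are assumptions.

def profile_moments(intensity):
    """Mean position, spread, skewness, kurtosis of an intensity profile."""
    total = sum(intensity)
    mean = sum(x * w for x, w in enumerate(intensity)) / total
    var = sum(w * (x - mean) ** 2 for x, w in enumerate(intensity)) / total
    sd = var ** 0.5
    skew = sum(w * (x - mean) ** 3 for x, w in enumerate(intensity)) / (total * sd ** 3)
    kurt = sum(w * (x - mean) ** 4 for x, w in enumerate(intensity)) / (total * var ** 2)
    return mean, sd, skew, kurt

def end_polarization(intensity, end_fraction=0.1):
    """Fraction of total intensity falling in the two cell-end regions."""
    k = max(1, int(len(intensity) * end_fraction))
    return (sum(intensity[:k]) + sum(intensity[-k:])) / sum(intensity)

# A profile with gap junction signal concentrated at the cell ends:
polarized = [9.0, 8.0, 1.0, 0.5, 0.5, 0.5, 0.5, 1.0, 8.0, 9.0]
m, sd, sk, ku = profile_moments(polarized)
pol = end_polarization(polarized)
```

A symmetric profile like this one has near-zero skewness; disease-related remodeling that lateralizes gap junctions would show up as lower end polarization and nonzero skewness.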

  14. Accuracy improvement of quantitative analysis by spatial confinement in laser-induced breakdown spectroscopy.

    PubMed

    Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F

    2013-07-29

    To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy, the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements with low concentrations, such as vanadium (V), chromium (Cr), and manganese (Mn), in the steel samples were investigated. After the optimization of the cavity dimension and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm² using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the V I 440.85/Fe I 438.35 nm line pair was increased from 0.946 (without the cavity) to 0.981 (with the cavity); similar results for Cr I 425.43/Fe I 425.08 nm and Mn I 476.64/Fe I 492.05 nm were also obtained. It was therefore demonstrated that the accuracy of quantitative analysis of low-concentration elements in steel samples was improved, because the plasma becomes more uniform under spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative LIBS analysis.

  15. Qualitative and quantitative analysis of monomers in polyesters for food contact materials.

    PubMed

    Brenz, Fabrian; Linke, Susanne; Simat, Thomas

    2017-02-01

    Polyesters (PESs) are gaining more importance on the food contact material (FCM) market and the variety of properties and applications is expected to be wide. In order to acquire the desired properties manufacturers can combine several FCM-approved polyvalent carboxylic acids (PCAs) and polyols as monomers. However, information about the qualitative and quantitative composition of FCM articles is often limited. The method presented here describes the analysis of PESs with the identification and quantification of 25 PES monomers (10 PCA, 15 polyols) by HPLC with diode array detection (HPLC-DAD) and GC-MS after alkaline hydrolysis. Accurate identification and quantification were demonstrated by the analysis of seven different FCM articles made of PESs. The results explained between 97.2% and 103.4% w/w of the polymer composition whilst showing equal molar amounts of PCA and polyols. Quantification proved to be precise and sensitive with coefficients of variation (CVs) below 6.0% for PES samples with monomer concentrations typically ranging from 0.02% to 75% w/w. The analysis of 15 PES samples for the FCM market revealed the presence of five different PCAs and 11 different polyols (main monomers, co-monomers, non-intentionally added substances (NIAS)) showing the wide variety of monomers in modern PESs. The presented method provides a useful tool for commercial, state and research laboratories as well as for producers and distributors facing the task of FCM risk assessment. It can be applied for the identification and quantification of migrating monomers and the prediction of oligomer compositions from the identified monomers, respectively.

  16. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even when the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of such events and to request design revisions or contingency plans in good time, before any decision is made. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  17. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  18. Comparison of recreational health risks associated with surfing and swimming in dry weather and post-storm conditions at Southern California beaches using quantitative microbial risk assessment (QMRA).

    PubMed

    Tseng, Linda Y; Jiang, Sunny C

    2012-05-01

    Southern California is an increasingly urbanized hotspot for surfing, thus it is of great interest to assess the human illness risks associated with this popular ocean recreational water sport from exposure to fecal bacteria contaminated coastal waters. Quantitative microbial risk assessments were applied to eight popular Southern California beaches using readily available enterococcus and fecal coliform data and dose-response models to compare health risks associated with surfing during dry weather and storm conditions. The results showed that the level of gastrointestinal illness risks from surfing post-storm events was elevated, with the probability of exceeding the US EPA health risk guideline up to 28% of the time. The surfing risk was also elevated in comparison with swimming at the same beach due to ingestion of greater volume of water. The study suggests that refinement of dose-response model, improving monitoring practice and better surfer behavior surveillance will improve the risk estimation. Copyright © 2012 Elsevier Ltd. All rights reserved.
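The elevated surfing risk above comes largely from the larger volume of water ingested per session. A minimal per-event sketch using an exponential dose-response model follows; the enterococci concentration, the two ingestion volumes, and the dose-response parameter r are all illustrative assumptions, not values from the study.

```python
import math

# Per-event QMRA sketch: dose = concentration x ingested volume, fed into
# an exponential dose-response model P = 1 - exp(-r * dose).
# Concentration, volumes and r below are illustrative assumptions.

def infection_risk(conc_per_100ml, volume_ml, r):
    """Probability of infection from one recreation event."""
    dose = conc_per_100ml * (volume_ml / 100.0)
    return 1.0 - math.exp(-r * dose)

CONC = 104.0   # toy enterococci count per 100 mL, e.g. a post-storm sample
R = 0.001      # toy dose-response parameter

surfer = infection_risk(CONC, volume_ml=30.0, r=R)   # surfers ingest more water
swimmer = infection_risk(CONC, volume_ml=15.0, r=R)
```

Doubling the ingested volume slightly less than doubles the risk because the exponential model saturates, but the ordering (surfer risk above swimmer risk at the same beach) matches the study's finding.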

  19. Potential application of quantitative microbiological risk assessment techniques to an aseptic-UHT process in the food industry.

    PubMed

    Pujol, Laure; Albert, Isabelle; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2013-04-01

    Aseptic ultra-high-temperature (UHT)-type processed food products (e.g., milk or soup) are ready-to-eat products which are consumed extensively globally due to a combination of their comparative high quality and long shelf life, with no cold chain or other preservation requirements. Due to the inherent microbial vulnerability of aseptic-UHT product formulations, the safety and stability-related performance objectives (POs) required at the end of the manufacturing process are the most demanding found in the food industry. The key determinants to achieving sterility, which also differentiate aseptic-UHT from in-pack sterilised products, are the challenges associated with the processes of aseptic filling and sealing. This is a complex process that has traditionally been run using deterministic or empirical process settings. Quantifying the risk of microbial contamination and recontamination along the aseptic-UHT process, using the scientifically based process quantitative microbial risk assessment (QMRA), offers the possibility to improve on the currently tolerable sterility failure rate (i.e., 1 defect per 10,000 units). In addition, the benefits of applying QMRA are (i) to implement process settings in a transparent and scientific manner; (ii) to develop a uniform common structure, whatever the production line, leading to a harmonisation of these process settings; and (iii) to bring elements of a cost-benefit analysis to the management measures. The objective of this article is to explore how QMRA techniques and risk management metrics may be applied to aseptic-UHT-type processed food products. In particular, the aseptic-UHT process should benefit from a number of novel mathematical and statistical concepts that have been developed in the field of QMRA. Probabilistic techniques such as Monte Carlo simulation, Bayesian inference and sensitivity analysis should help in assessing compliance with the safety and stability-related POs set at the end of the manufacturing
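As a toy illustration of checking process settings against the tolerable sterility failure rate mentioned above (1 defect per 10,000 units), per-unit defect probabilities from independent contamination routes can be combined analytically and compared with the performance objective. Every step probability below is an invented assumption.

```python
# Sketch of checking an aseptic-UHT sterility performance objective (PO):
# combine independent per-step contamination probabilities into a per-unit
# defect probability and compare with 1e-4. Step probabilities are invented.

steps = {
    "sterilisation survival": 2e-6,
    "aseptic filling recontamination": 5e-5,
    "sealing failure": 1e-5,
}

# Probability a unit is defective via at least one independent route:
p_ok = 1.0
for p in steps.values():
    p_ok *= (1.0 - p)
p_defect = 1.0 - p_ok

PO = 1e-4  # tolerable sterility failure rate: 1 defect per 10,000 units
meets_po = p_defect < PO
```

A full QMRA, as the article proposes, would put uncertainty distributions on each step probability (e.g. by Monte Carlo simulation or Bayesian inference from end-product testing) and report the probability of meeting the PO rather than this point check.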

  20. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    PubMed Central

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  1. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  2. Accurate quantitation of D+ fetomaternal hemorrhage by flow cytometry using a novel reagent to eliminate granulocytes from analysis.

    PubMed

    Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond

    2014-05-01

    Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Maternal blood samples (69) were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. Reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, blended two-color reagents, BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L), and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients. Red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred with 71% samples having less than 15 mL of D+ FMH (RBCs) and insufficient RhIG calculated for 9%. Using two-color reagents and anti-HbF-FITC, approximately 30% patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% patients had more than 15 mL of D+ FMH (RBCs) requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis. The latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.
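The dosing arithmetic behind "more than 15 mL of D+ FMH (RBCs) requiring more than 300 μg of RhIG" can be sketched as below. The 15 mL-of-fetal-RBCs-per-300-μg coverage figure is taken from the abstract; the round-up rule is an assumption (real protocols typically add a safety margin), and none of this is clinical guidance.

```python
import math

# Illustrative RhIG dose calculation: each 300-microgram dose is assumed to
# cover up to 15 mL of D+ fetal red cells (per the abstract). The round-up
# rule is an assumption; this is not clinical guidance.

DOSE_UG = 300.0
COVERAGE_ML = 15.0   # D+ fetal RBC volume covered per dose (from abstract)

def rhig_dose_ug(fmh_rbc_ml):
    """Total RhIG (micrograms) for a given fetal RBC volume; minimum one dose."""
    doses = max(1, math.ceil(fmh_rbc_ml / COVERAGE_ML))
    return doses * DOSE_UG

small_fmh = rhig_dose_ug(4.0)    # within a single 300 ug dose
large_fmh = rhig_dose_ug(22.0)   # exceeds 15 mL, needs more than one dose
```

This makes concrete why accurate FMH quantitation matters: an underestimate of the fetal RBC volume directly translates into an inadequate total dose.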

  3. Quantitative analysis of glycerophospholipids by LC-MS: acquisition, data handling, and interpretation

    PubMed Central

    Myers, David S.; Ivanova, Pavlina T.; Milne, Stephen B.; Brown, H. Alex

    2012-01-01

    As technology expands what can be accurately measured, the challenges faced by modern mass spectrometry applications expand as well. A high level of accuracy in lipid quantitation across thousands of chemical species simultaneously is demanded. While relative changes in lipid amounts with varying conditions may provide initial insights or point to novel targets, many questions require absolute quantitation of lipid analytes. Glycerophospholipids present a significant challenge in this regard, given their headgroup diversity, the large number of possible acyl chain combinations, and the vast range of ionization efficiencies across species. Lipidomic output is increasingly used not just for profiling the masses of species, but also for highly targeted flux-based measurements, which put additional burdens on the quantitation pipeline. These first two challenges bring into sharp focus the need for a robust lipidomics workflow including deisotoping, differentiation from background noise, use of multiple internal standards per lipid class, and a scriptable environment that gives maximum user flexibility and maintains metadata on the parameters of the data analysis as it occurs. As lipidomics technology develops and delivers more output on a larger number of analytes, the sophistication of statistical post-processing must also continue to advance. High-dimensional data analysis methods involving clustering, lipid pathway analysis, and false discovery rate limitation are becoming standard practices in a maturing field. PMID:21683157

  4. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. To study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
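    The core ranking quantity, a margin of exposure (hazard benchmark divided by estimated exposure, with smaller margins indicating higher priority), can be sketched in a few lines. All numbers below are illustrative placeholders, not values from the paper:

```python
# Hypothetical margin-of-exposure (MoE) ranking: MoE = benchmark dose / exposure.
# Benchmarks and exposures (mg/m^3) are invented for illustration only.
benchmark = {"TiO2": 10.0, "ZnO": 0.5, "Ag": 5.0, "MWCNT": 1.0}
exposure = {"TiO2": 0.02, "ZnO": 0.05, "Ag": 0.001, "MWCNT": 0.01}

moe = {mn: benchmark[mn] / exposure[mn] for mn in benchmark}
# Smallest MoE = highest-priority occupational risk.
priority = sorted(moe, key=moe.get)
```

    In the paper's tool this ranking is embedded in a multi-criteria decision analysis with modelled exposure and dose-response inputs; the sketch only shows the MoE ordering step.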

  5. Survivin -31 G/C polymorphism might contribute to colorectal cancer (CRC) risk: a meta-analysis.

    PubMed

    Yao, Linhua; Hu, Yi; Deng, Zhongmin; Li, Jingjing

    2015-01-01

    Published data have shown inconsistent findings about the association of the survivin -31 G/C polymorphism with the risk of colorectal cancer (CRC). This meta-analysis quantitatively assesses the results from published studies to provide a more precise estimate of the association between the survivin -31 G/C polymorphism and the risk of CRC. We conducted a literature search in the PubMed, Web of Science, and Cochrane Library databases. Stata 12 software was used to calculate the pooled odds ratios (ORs) with 95% confidence intervals (CIs) based on the available data from each article. Six studies, comprising 1840 CRC cases and 1804 controls, were included. The survivin -31 G/C polymorphism was associated with a significantly increased risk of CRC (OR = 1.78; 95% CI, 1.53-2.07; I(2) = 0%). In the race subgroup analysis, both Asians (OR = 1.72; 95% CI, 1.44-2.05; I(2) = 0%) and Caucasians (OR = 1.93; 95% CI, 1.46-2.55; I(2) = 0%) with the polymorphism had increased CRC risk. In the subgroup analysis according to site of CRC, the polymorphism was not associated with colon cancer risk (OR = 2.02; 95% CI, 0.79-5.22; I(2) = 82%); however, it was significantly associated with rectal cancer risk (OR = 1.98; 95% CI, 1.42-2.74; I(2) = 0%). In the subgroup analysis by clinical stage, both early stage (I+II) and advanced stage (III+IV) were associated with the polymorphism (OR = 1.61; 95% CI, 1.20-2.16; I(2) = 0% and OR = 2.30; 95% CI, 1.70-3.13; I(2) = 0%, respectively). In the subgroup analysis by smoking status, both smokers and non-smokers with the polymorphism showed increased CRC risk (OR = 1.47; 95% CI, 1.01-2.13; I(2) = 60% and OR = 1.71; 95% CI, 1.28-2.30; I(2) = 0%, respectively). In the subgroup analysis by drinking status, both drinkers and non-drinkers with the polymorphism showed increased CRC risk (OR = 1.58; 95% CI, 1.06-2.37; I(2) = 8% and OR = 1.61; 95% CI, 1
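    The pooling behind results like these (the abstract cites Stata 12) is typically inverse-variance weighting on the log-OR scale, with DerSimonian-Laird estimation of between-study variance for a random-effects model. A self-contained sketch with invented study inputs, not the six included studies:

```python
import math

# DerSimonian-Laird random-effects pooling of odds ratios.
# (or_i, se_i): illustrative odds ratios and standard errors of log(OR).
studies = [(1.6, 0.20), (1.9, 0.25), (1.7, 0.15), (2.1, 0.30)]

y = [math.log(o) for o, _ in studies]        # effects on the log scale
v = [se * se for _, se in studies]           # within-study variances
w = [1.0 / vi for vi in v]                   # fixed-effect (inverse-variance) weights

y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
k = len(studies)
c = sum(w) - sum(wi * wi for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / c)           # between-study variance estimate

w_re = [1.0 / (vi + tau2) for vi in v]       # random-effects weights
y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_re = math.sqrt(1.0 / sum(w_re))

pooled_or = math.exp(y_re)
ci = (math.exp(y_re - 1.96 * se_re), math.exp(y_re + 1.96 * se_re))
```

    With I(2) = 0% in most subgroups above, tau2 would be zero and the random-effects result would coincide with the fixed-effect one.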

  6. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    Direct analysis in real time (DART) is a powerful ionising technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capacity limits its extensive use in phytochemical analysis. The objective was to devise a new system that utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate, and three major lignan compounds were quantitated by three different methods, UV densitometry, TLC-DART-MS and HPLC-UV, to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from a calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate, and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.
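    The calibration-curve step described above (fit signal versus known standard concentrations, then invert the line for unknowns) can be sketched generically; the concentrations and signal values below are invented, not DART-MS data:

```python
import numpy as np

# Linear calibration: fit signal = a*conc + b on standards, then invert for
# unknowns. All numbers are illustrative placeholders.
conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])     # standard conc. (ug/mL)
signal_std = np.array([1.1, 2.0, 4.1, 8.2, 16.3])  # integrated ion current (arb.)

a, b = np.polyfit(conc_std, signal_std, 1)         # least-squares line

def quantify(signal):
    """Back-calculate concentration from the fitted calibration line."""
    return (signal - b) / a

unknown = quantify(6.0)                            # conc. for a measured signal
```

    Linearity of the fit (e.g., the correlation coefficient over the working range) is what the abstract compares across the three methods.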

  7. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in the Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles of 0.5 milliradians produced by refractive index gradients. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS-51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell, and the data gathered with the quantitative schlieren analysis technique are consistent with a diffusion-limited growth process.

  8. Bounding Analysis of Drinking Water Health Risks from a Spill of Hydraulic Fracturing Flowback Water.

    PubMed

    Rish, William R; Pfau, Edward J

    2018-04-01

    A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000-gallon release of flowback water from horizontal fracturing of Marcellus Shale. The water is assumed to be spilled on the ground, infiltrates into groundwater that is a source of drinking water, and an adult and child located downgradient drink the groundwater. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds between 10^-10 and 10^-6 (for both adult and child receptors) for excess lifetime cancer risk. Cumulative hazard indices (HI_cumulative) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater. Hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to improve the ability to estimate health risks from drinking water impacted by a flowback water spill (i.e., reducing uncertainty). © 2017 Society for Risk Analysis.
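    The Monte Carlo structure of such an analysis (sample uncertain concentrations, convert to dose, sum hazard quotients, report percentile bounds) can be sketched as below. The distributions, reference doses, and intake defaults are illustrative assumptions, not the paper's inputs:

```python
import random

random.seed(0)

# Monte Carlo sketch of a cumulative hazard index (HI) for two constituents.
# Reference doses (RfD, mg/kg-day) and lognormal concentration parameters are
# invented for illustration.
RFD = {"barium": 0.2, "lithium": 0.002}
CONC_PARAMS = {"barium": (0.0, 1.0), "lithium": (-4.0, 1.2)}  # (mu, sigma) of ln conc
INTAKE_L, BODY_KG = 2.0, 70.0          # assumed adult drinking-water defaults

def one_trial():
    hi = 0.0
    for chem, (mu, sigma) in CONC_PARAMS.items():
        conc = random.lognormvariate(mu, sigma)   # sampled concentration, mg/L
        dose = conc * INTAKE_L / BODY_KG          # mg/kg-day
        hi += dose / RFD[chem]                    # hazard quotient for this chemical
    return hi

trials = sorted(one_trial() for _ in range(10_000))
p5, p95 = trials[500], trials[9500]    # 5th/95th percentile bounds on HI
```

    The paper reports exactly this kind of percentile bound (e.g., HI between 0.02 and 35 for adults), driven there by barium and lithium.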

  9. ITS risk analysis.

    DOT National Transportation Integrated Search

    1996-06-01

    Risk analysis plays a key role in the implementation of an architecture. Early definition of the situations, processes, or events that have the potential for impeding the implementation of key elements of the ITS National Architecture is a critical e...

  10. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. The algorithm enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
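    The kind of automated, reproducible measurement the paper implements in Matlab can be illustrated in a few lines; this sketch uses a synthetic temperature map and an assumed rectangular region of interest rather than real camera data and segmentation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 64x64 "thermal image" of skin temperatures in deg C; a real
# analysis would load camera data and segment the skin region automatically.
image = 30.0 + 2.0 * rng.standard_normal((64, 64))
roi = image[16:48, 16:48]              # assumed region of interest

# Reproducible quantitative parameters of the ROI.
params = {
    "mean_C": float(roi.mean()),
    "max_C": float(roi.max()),
    "range_C": float(roi.max() - roi.min()),
}
```

    Fixing the random seed (or, in practice, the segmentation and parameter definitions) is what makes repeated measurements reproducible.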

  11. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  12. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  13. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies, including study designs, markers genotyped, and covariates. The effects of genetic variants may also differ from population to population, i.e., heterogeneity. Meta-analysis combining data from multiple studies is therefore difficult, and novel statistical methods for it are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing method, the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT, and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
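    The likelihood-ratio test idea, comparing a null model of a quantitative trait against a model that adds genetic predictors, can be shown in a deliberately simplified single-variant Gaussian setting (a toy stand-in for the paper's functional linear models, with invented data):

```python
import math
import numpy as np

rng = np.random.default_rng(7)

# Toy quantitative-trait data: one covariate plus one variant (0/1/2 alleles).
n = 200
covariate = rng.standard_normal(n)
genotype = rng.integers(0, 3, n).astype(float)
trait = 1.0 + 0.5 * covariate + 0.4 * genotype + rng.standard_normal(n)

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

X0 = np.column_stack([np.ones(n), covariate])             # null model
X1 = np.column_stack([np.ones(n), covariate, genotype])   # null + variant

# Gaussian LRT statistic: 2*(ll_full - ll_null) = n*log(RSS0/RSS1).
lrt = n * math.log(rss(X0, trait) / rss(X1, trait))
# Chi-square survival function with 1 df: P(X > x) = erfc(sqrt(x/2)).
p = math.erfc(math.sqrt(lrt / 2.0))
```

    The paper's tests generalize this to many variants per region (with df equal to the number of added basis functions) and to combining multiple studies.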

  14. Dietary Nitrates, Nitrites, and Nitrosamines Intake and the Risk of Gastric Cancer: A Meta-Analysis.

    PubMed

    Song, Peng; Wu, Lei; Guan, Wenxian

    2015-12-01

    The potential associations between dietary consumption of nitrates, nitrites, and nitrosamines and gastric cancer risk have been investigated by several studies but have yielded inconclusive results. We conducted a meta-analysis to provide a quantitative assessment of these relationships. Relevant articles were identified by a systematic literature search of the PubMed and Embase databases prior to August 2015. Random-effects models were employed to pool the relative risks. A total of 22 articles comprising 49 studies (19 for nitrates, 19 for nitrites, and 11 for N-nitrosodimethylamine (NDMA)) were included. The summary relative risk of stomach cancer for the highest categories, compared with the lowest, was 0.80 (95% confidence interval (CI), 0.69-0.93) for dietary nitrates intake, 1.31 (95% CI, 1.13-1.52) for nitrites, and 1.34 (95% CI, 1.02-1.76) for NDMA (p for heterogeneity 0.015, 0.013 and <0.001, respectively). Study type was found to be the main source of heterogeneity for nitrates and nitrites; the heterogeneity for NDMA could not be eliminated completely through stratified analysis. Although the significant associations were all observed in case-control studies, the cohort studies still showed a slight trend, and the dose-response analysis indicated similar results. High nitrates intake was associated with a weak but statistically significant reduced risk of gastric cancer, whereas increased consumption of nitrites and NDMA appeared to be risk factors. Because of the lack of uniformity in exposure assessment across studies, further prospective research is warranted to verify these findings.

  15. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have long tried to understand the beauty of painting art in their own languages. As digital image acquisition of paintings has made rapid progress, researchers have reached a point where statistical analysis of a large-scale database of artistic paintings can build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety during the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
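    Two of the three measures can be illustrated on a synthetic RGB image. These are plausible stand-ins, not the paper's exact definitions: color variety as the Shannon entropy of a coarse color histogram, and brightness roughness as the RMS of local brightness increments:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 128x128 RGB "painting" with values in 0..255.
img = rng.integers(0, 256, (128, 128, 3)).astype(float)

# Color variety: entropy of a 4x4x4 quantized RGB histogram (64 bins).
bins = (img // 64).astype(int)
codes = bins[..., 0] * 16 + bins[..., 1] * 4 + bins[..., 2]
counts = np.bincount(codes.ravel(), minlength=64)
p = counts[counts > 0] / counts.sum()
color_entropy = float(-(p * np.log2(p)).sum())    # at most 6 bits for 64 bins

# Brightness roughness: RMS of horizontal brightness differences. A roughness
# *exponent* would come from how this statistic scales with distance.
brightness = img.mean(axis=2)
roughness = float(np.sqrt(np.mean(np.diff(brightness, axis=1) ** 2)))
```

    On real paintings these measures would be computed from calibrated scans, and the roughness exponent fitted across multiple length scales.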

  16. A Systematic Literature Review and Meta-Regression Analysis on Early-Life Energy Restriction and Cancer Risk in Humans

    PubMed Central

    Elands, Rachel J. J.; Simons, Colinda C. J. M.; van Dongen, Martien; Schouten, Leo J.; Verhage, Bas A. J.; van den Brandt, Piet A.; Weijenberg, Matty P.

    2016-01-01

    Background: In animal models, long-term moderate energy restriction (ER) is reported to decelerate carcinogenesis, whereas the effect of severe ER is inconsistent. The impact of early-life ER on cancer risk has never been reviewed systematically and quantitatively based on observational studies in humans. Objective: We conducted a systematic review of observational studies and a meta-(regression) analysis of cohort studies to clarify the association between early-life ER and organ site-specific cancer risk. Methods: PubMed and EMBASE (1982–August 2015) were searched for observational studies. Summary relative risks (RRs) were estimated using a random-effects (RE) model when ≥3 studies were available. Results: Twenty-four studies were included. Eleven publications, emanating from seven prospective cohort studies and some reporting on multiple cancer endpoints, met the inclusion criteria for quantitative analysis. Women exposed to early-life ER (ranging from 220–1660 kcal/day) had a higher breast cancer risk than those not exposed (RR-RE for all ages = 1.28, 95% CI: 1.05–1.56; RR-RE for 10–20 years of age = 1.21, 95% CI: 1.09–1.34). Men exposed to early-life ER (ranging from 220–800 kcal/day) had a higher prostate cancer risk than those not exposed (RR-RE = 1.16, 95% CI: 1.03–1.30). Summary relative risks were not computed for colorectal cancer, because of heterogeneity, or for stomach, pancreas, ovarian, and respiratory cancers, because <3 studies were available. Longer duration of exposure to ER, after adjustment for severity, was positively associated with overall cancer risk in women (p = 0.02). Ecological studies suggest that less severe ER is generally associated with a reduced risk of cancer. Conclusions: Early-life transient severe ER seems to be associated with increased cancer risk in the breast (particularly for ER exposure at adolescent age) and prostate. The duration, rather than the severity, of exposure to ER seems to positively influence relative risk.

  17. Epidemiological survey of quantitative ultrasound in risk assessment of falls in middle-aged and elderly people.

    PubMed

    Ou, Ling-Chun; Sun, Zih-Jie; Chang, Yin-Fan; Chang, Chin-Sung; Chao, Ting-Hsing; Kuo, Po-Hsiu; Lin, Ruey-Mo; Wu, Chih-Hsing

    2013-01-01

    The risk assessment of falls is important but still unsatisfactory and time-consuming. Our objective was to assess quantitative ultrasound (QUS) in the risk assessment of falls. The study was designed as an epidemiological cross-sectional community survey conducted from March 2009 to February 2010 at a medical center. Participants were a systematic sample of 1,200 community-dwelling people (male/female = 524/676) aged 40 years and over in Yunlin County, Mid-Taiwan. Structured questionnaires covering socioeconomic status, living status, smoking and drinking habits, exercise and medical history were completed. Quantitative ultrasound was measured at the non-dominant distal radial area (QUS-R) and the left calcaneal area (QUS-C). The overall prevalence of falls was 19.8%. In men, the factors independently associated with falls were age (OR: 1.04; 95% CI: 1.01~1.06), fracture history (OR: 1.89; 95% CI: 1.12~3.19), osteoarthritis history (OR: 3.66; 95% CI: 1.15~11.64) and speed of sound (OR: 0.99; 95% CI: 0.99~1.00; p<0.05) by QUS-R. In women, the factors independently associated with falls were current drinking (OR: 3.54; 95% CI: 1.35~9.31) and broadband ultrasound attenuation (OR: 0.98; 95% CI: 0.97~0.99; p<0.01) by QUS-C. The cutoffs at -2.5<T-score<-1 derived using QUS-R (OR: 2.85; 95% CI: 1.64~4.96; p<0.01) in men and T-score ≤-2.5 derived using QUS-C (OR: 2.72; 95% CI: 1.42~5.21; p<0.01) in women showed an independent association with falls. The lowest T-score derived using either QUS-R or QUS-C was also an independent factor for falls in both men (OR: 2.13; 95% CI: 1.03~4.43; p<0.05) and women (OR: 2.36; 95% CI: 1.13~4.91; p<0.05). Quantitative ultrasound, measured either at the radial or the calcaneal area, is a convenient tool for assessing the risk of falls in middle-aged and elderly people.
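    Odds ratios of this form come from logistic-regression coefficients: OR = exp(beta), with a Wald 95% CI of exp(beta ± 1.96·SE). A minimal sketch; the beta and SE below are back-calculated illustrative values chosen so the output approximately matches one reported result (OR 2.85, 95% CI 1.64~4.96), not fitted model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """OR = exp(beta); Wald 95% CI = exp(beta +/- z*SE)."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Illustrative coefficient and standard error for a T-score category effect.
or_, (lo, hi) = odds_ratio_ci(beta=1.047, se=0.283)
```

    The same transformation applies to every OR/CI pair quoted in the abstract.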

  18. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance on large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias and other metrics are presented, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  19. Quantitative local analysis of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Topcu, Ufuk

    This thesis investigates quantitative methods for local robustness and performance analysis of nonlinear dynamical systems with polynomial vector fields. We propose measures to quantify a system's robustness against uncertainties in initial conditions (regions-of-attraction) and external disturbances (local reachability/gain analysis). S-procedure and sum-of-squares relaxations are used to translate Lyapunov-type characterizations into sum-of-squares optimization problems. These problems are typically bilinear/nonconvex (because the analysis is local rather than global), and their size grows rapidly with the dimension of the state/uncertainty space. Our approach is based on exploiting system-theoretic interpretations of these optimization problems to reduce their complexity. We propose a methodology that incorporates simulation data in formal proof construction, enabling a more reliable and efficient search for robustness and performance certificates compared to the direct use of general-purpose solvers. This technique is adapted both to region-of-attraction and to reachability analysis. We extend the analysis to uncertain systems by taking an intentionally simplistic and potentially conservative route, namely employing parameter-independent rather than parameter-dependent certificates. The conservatism is reduced by a branch-and-bound type refinement procedure. The main strength of these methods is their suitability for parallel computing, achieved by decomposing otherwise challenging problems into relatively tractable smaller ones. We demonstrate the proposed methods on several small/medium-size examples in each chapter and apply each method to a benchmark example with an uncertain short-period pitch-axis model of an aircraft. Additional practical issues leading to a more rigorous basis for the proposed methodology, as well as promising further research topics, are also addressed. We show that stability of linearized dynamics is not only necessary but also sufficient for the feasibility of the
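    The region-of-attraction idea can be illustrated on a one-dimensional example. This sampling check is a stand-in for the thesis's sum-of-squares certificates (sampling suggests but does not prove the certificate): for xdot = -x + x^3 with Lyapunov candidate V(x) = x^2/2, find the largest level c such that Vdot < 0 on {0 < V(x) <= c}; analytically, any c < 0.5 works, since Vdot = x^2(x^2 - 1) < 0 for 0 < |x| < 1.

```python
import numpy as np

def vdot(x):
    """Vdot = dV/dx * xdot = x * (-x + x**3) for V(x) = x**2 / 2."""
    return x * (-x + x ** 3)

xs = np.linspace(-2.0, 2.0, 4001)
xs = xs[np.abs(xs) > 1e-6]            # exclude the equilibrium x = 0 itself

certified = 0.0
for c in np.linspace(0.01, 1.0, 100):
    inside = xs[xs ** 2 / 2.0 <= c]   # sampled points in the sublevel set
    if np.all(vdot(inside) < 0):
        certified = float(c)          # level passes the sampled check
    else:
        break                         # first failing level; stop growing c
```

    The thesis replaces this pointwise check with sum-of-squares programs that certify the sign condition over the whole sublevel set, and uses simulation data like these samples to seed the search.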

  20. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    PubMed

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market for cham-cham, a traditional Indian dairy product, is expected in the coming years with organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in the sensory properties of market samples of cham-cham collected from four locations known for their excellence in cham-cham production, and to find the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores on each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancidity and firmness, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
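    The PCA step (center the samples-by-attributes score matrix, decompose it, and read off variance explained, factor scores, and attribute loadings) can be sketched with synthetic panel data; the matrix below is a random stand-in for real 9-point sensory ratings:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic sensory panel: 20 market samples x 8 attributes, scores in 1..9.
scores = rng.uniform(1, 9, size=(20, 8))
X = scores - scores.mean(axis=0)          # center each attribute

# PCA via singular value decomposition of the centered matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
var_explained = s ** 2 / np.sum(s ** 2)   # fraction of variance per component
pc_scores = U * s                         # sample coordinates (factor scores)
loadings = Vt                             # attribute weights per component
```

    In the study, the first four components of such a decomposition captured 72.4% of the variance, and the loadings identified which attributes each component represents.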