Sample records for risk analysis techniques

  1. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selecting the most appropriate technique. All program risk assessment techniques are shown to be based on eliciting and encoding subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, twelve distinct approaches to risk assessment are presented; for each, the steps involved, strengths and weaknesses, time required, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
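    The encoding step described above lends itself to a short illustration. A minimal sketch, assuming a triangular distribution as the elicited shape (one common encoding of expert low/most-likely/high estimates; the handbook's five specific distribution types are not enumerated in this record):

```python
import numpy as np

def encode_triangular(low, mode, high, n_samples=10_000, seed=0):
    """Encode an expert's low/most-likely/high estimates as a
    triangular distribution and draw Monte Carlo samples."""
    rng = np.random.default_rng(seed)
    return rng.triangular(low, mode, high, size=n_samples)

# Hypothetical elicitation: a schedule expert judges a task will take
# between 2 and 9 months, most likely 4.
samples = encode_triangular(2.0, 4.0, 9.0)
print(f"mean = {samples.mean():.2f} months, 95th pct = {np.percentile(samples, 95):.2f}")
```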

  2. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new technique based on hierarchical Bayesian modeling is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique is demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
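    The record does not give the paper's full hierarchical model; as a rough stand-in for the pooling idea behind source-to-source variability, here is a minimal empirical-Bayes beta-binomial sketch (the failure counts are hypothetical):

```python
import numpy as np

# Hypothetical failure counts from four data sources: k failures in n demands.
k = np.array([1, 0, 3, 2])
n = np.array([120, 80, 150, 100])

# Moment-match a Beta(a, b) prior to the source-level rates, a crude
# empirical-Bayes stand-in for the paper's full hierarchical model.
rates = k / n
m, v = rates.mean(), rates.var(ddof=1)          # requires v > 0 and 0 < m < 1
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

# Each source's posterior mean shrinks its raw rate toward the pooled prior,
# which is how source-to-source variability gets represented.
post_mean = (a + k) / (a + b + n)
for i, (raw, pm) in enumerate(zip(rates, post_mean)):
    print(f"source {i}: raw rate = {raw:.4f}, shrunken posterior mean = {pm:.4f}")
```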

  3. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas; for example, it can be used to analyze the failure times of crashed aircraft. Another survival analysis tool is competing risks, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study compared the inference of the models using root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurement.
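    The RMSE criterion used to rank the two techniques is simply the root of the mean squared deviation of the estimates from the simulation truth; a minimal sketch (the estimates below are hypothetical, not the study's):

```python
import numpy as np

def rmse(estimates, true_value):
    """Root mean square error of repeated estimates around the simulation truth."""
    estimates = np.asarray(estimates, dtype=float)
    return np.sqrt(np.mean((estimates - true_value) ** 2))

# Hypothetical estimates of one cause-specific hazard coefficient
# from two fitting techniques across five simulation replicates.
beta_true = 0.5
modified_lunn_mcneil = [0.48, 0.55, 0.47, 0.52, 0.50]
kalbfleisch_prentice = [0.40, 0.62, 0.44, 0.58, 0.51]
print(rmse(modified_lunn_mcneil, beta_true), rmse(kalbfleisch_prentice, beta_true))
```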

  4. A review of risk management process in construction projects of developing countries

    NASA Astrophysics Data System (ADS)

    Bahamid, R. A.; Doh, S. I.

    2017-11-01

    In the construction industry, risk management remains a relatively little-used technique. The systematic approach to risk management in the construction industry comprises three main stages: a) risk identification; b) risk analysis and evaluation; and c) risk response. The high risk associated with the construction business affects each of its participants, while operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, focusing specifically on the risk management process. The literature lacks an adequate risk management process approach capable of capturing risk impact on diverse project objectives. This review aims to discover the techniques most frequently used in risk identification and analysis, to clarify the different classifications of risk sources in the existing literature on developing countries, and to identify future research directions on project risks in the area of construction in developing countries.

  5. Adapting risk management and computational intelligence network optimization techniques to improve traffic throughput and tail risk analysis.

    DOT National Transportation Integrated Search

    2014-04-01

    Risk management techniques are used to analyze fluctuations in uncontrollable variables and keep those fluctuations from impeding the core function of a system or business. Examples of this are making sure that volatility in copper and aluminum pri...

  6. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

    After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. The outcome measures were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  7. Methodological issues in volumetric magnetic resonance imaging of the brain in the Edinburgh High Risk Project.

    PubMed

    Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M

    1999-07-30

    The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.

  8. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. It also demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  9. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, following the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  10. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation describes a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
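    The record names RRET's inputs but not its formula, so the sketch below is only a hypothetical formulation: residual risks are treated as independent series elements whose reliabilities multiply down the baseline, and the indicator is the fractional reduction:

```python
def reliability_impact(baseline, residual_risk_reliabilities):
    """Hypothetical RRET-style indicator: treat each residual risk as an
    independent series element that multiplies down the system reliability,
    and report the fractional reduction from baseline. (The abstract names
    the inputs but not the formula, so this formulation is an assumption.)"""
    adjusted = baseline
    for r in residual_risk_reliabilities:
        adjusted *= r
    return (baseline - adjusted) / baseline

# Hypothetical mission: baseline reliability 0.98, two identified residual risks.
print(f"reliability impact indicator = {reliability_impact(0.98, [0.999, 0.995]):.4%}")
```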

  11. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis, and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment and risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  12. NEW APPROACHES IN RISK ANALYSIS OF ENVIRONMENTAL STRESSORS TO HUMAN AND ECOLOGICAL SYSTEMS

    EPA Science Inventory

    We explore the application of novel techniques for improving and integrating risk analysis of environmental stressors to human and ecological systems. Environmental protection decisions are guided by risk assessments serving as tools to develop regulatory policy and other relate...

  13. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review

    PubMed Central

    Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat

    2017-01-01

    Abstract: Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their root causes. The Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study is a review of risk analysis using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers which did not meet the inclusion criteria, 22 papers were finally selected and their full texts were thoroughly examined. At the end, the results were obtained. Results: The examined papers had focused mostly on processes, had been conducted in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to express almost all the steps of model implementation; after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique’s effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The study revealed that a small number of studies failed to show the effects of the FMEA technique. In general, however, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688

  14. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated quality risk management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values and greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
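    The RPN scale mentioned above is the standard FMEA-style product of severity, occurrence, and detectability scores; a minimal sketch using the hazards named in this record (the scores themselves are hypothetical):

```python
# Standard risk priority number: RPN = severity x occurrence x detectability,
# each scored on a quantitative scale (commonly 1 to 10).
failure_modes = {                    # (S, O, D), hypothetical scores
    "loss of dose":                  (9, 4, 5),
    "loss of tracking":              (8, 4, 6),
    "manual transcription of data":  (7, 5, 4),
}
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{mode:30s} RPN = {score}")
```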

  15. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr; Filleron, Thomas; Graff, Pierre

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analyses identified independent predictors of bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved with loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  16. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
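    The record does not reproduce the paper's formulation; a common grey relational analysis step used in improved FMEA looks roughly like the sketch below, where each failure mode's (S, O, D) vector is compared against a worst-case reference series (the scores and the distinguishing coefficient zeta = 0.5 are assumptions):

```python
import numpy as np

def grey_relational_grade(scores, reference, zeta=0.5, weights=None):
    """Grey relational analysis as commonly used in improved FMEA: compare each
    failure mode's (S, O, D) vector with a reference series; with a worst-case
    reference, a higher grade means a riskier failure mode."""
    scores = np.asarray(scores, dtype=float)
    delta = np.abs(scores - np.asarray(reference, dtype=float))
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + zeta * d_max) / (delta + zeta * d_max)
    w = np.full(scores.shape[1], 1 / scores.shape[1]) if weights is None else np.asarray(weights)
    return coeff @ w

# Hypothetical failure modes scored on S, O, D (1-10 scales).
modes = [[8, 3, 6], [5, 7, 4], [9, 2, 8]]
print(grey_relational_grade(modes, reference=[10, 10, 10]))
```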

  17. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT), and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions, and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms add expert knowledge of the investigated system or environment to the risk analysis.
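    As a rough illustration of the membership-function idea (the fuzzy sets below are hypothetical, not the paper's):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy sets mapping a susceptibility probability in [0, 1]
# onto linguistic terms.
terms = {
    "low":    lambda x: tri(x, 0.0, 0.0, 0.4),
    "medium": lambda x: tri(x, 0.2, 0.5, 0.8),
    "high":   lambda x: tri(x, 0.6, 1.0, 1.0),
}
p = 0.65
print({term: round(f(p), 3) for term, f in terms.items()})
```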

  18. TH-EF-BRC-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2016-06-15

    This hands-on workshop is focused on providing participants with experience with the principal tools of TG 100 and hence starts to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of process mapping, failure modes and effects analysis, and fault tree analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants.

  19. Risk Management Technique for design and operation of facilities and equipment

    NASA Technical Reports Server (NTRS)

    Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.

    1975-01-01

    The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.

  20. Use of the Generating Options for Active Risk Control (GO-ARC) Technique can lead to more robust risk control options.

    PubMed

    Card, Alan J; Simsekler, Mecit Can Emre; Clark, Michael; Ward, James R; Clarkson, P John

    2014-01-01

    Risk assessment is widely used to improve patient safety, but healthcare workers are not trained to design robust solutions to the risks they uncover. This leads to an overreliance on the weakest category of risk control recommendations: administrative controls. Increasing the proportion of non-administrative risk control options (NARCOs) generated would enable (though not ensure) the adoption of more robust solutions. The aim was to experimentally assess a method for generating stronger risk controls: the Generating Options for Active Risk Control (GO-ARC) Technique. Participants generated risk control options in response to two patient safety scenarios. Scenario 1 (baseline): all participants used current practice (unstructured brainstorming). Scenario 2: the control group used current practice; the intervention group used the GO-ARC Technique. To control for individual differences between participants, analysis focused on the change in the proportion of NARCOs for each group. Control group: the proportion of NARCOs decreased from 0.18 at baseline to 0.12. Intervention group: the proportion increased from 0.10 at baseline to 0.29 using the GO-ARC Technique. The results were statistically significant. There was no decrease in the number of administrative controls generated by the intervention group. The Generating Options for Active Risk Control (GO-ARC) Technique appears to lead to more robust risk control options.

  1. TH-EF-BRC-04: Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yorke, E.

    2016-06-15


  2. TH-EF-BRC-00: TG-100 Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15


  3. TH-EF-BRC-02: FMEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2016-06-15


  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.


  5. Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

    PubMed Central

    Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin

    2005-01-01

    The Medical Quality Improvement Consortium (MQIC) data warehouse contains de-identified data on more than 3.6 million patients, including their problem lists, test results, procedures, and medication lists. This study uses reconstructability analysis (RA), an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes, including myocardial infarction and microalbuminuria. The risk factors identified match those reported in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research and of RA as a technique for mining clinical data warehouses. PMID:16779156
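    Reconstructability analysis builds on information-theoretic dependence measures; as a minimal stand-in, the sketch below computes the mutual information between a hypothetical risk factor and an outcome from a 2x2 table (RA proper extends this to multivariate decompositions):

```python
import numpy as np

def mutual_information(table):
    """Mutual information (bits) of a joint frequency table."""
    p = np.asarray(table, dtype=float)
    p = p / p.sum()
    px = p.sum(axis=1, keepdims=True)      # marginal over rows
    py = p.sum(axis=0, keepdims=True)      # marginal over columns
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Hypothetical counts: rows = risk factor absent/present,
# columns = complication absent/present.
print(f"{mutual_information([[800, 50], [100, 50]]):.4f} bits")
```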

  6. Expert systems in civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostem, C.N.; Maher, M.L.

    1986-01-01

    This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.

  7. Comparison of point intercept and image analysis for monitoring rangeland transects

    USDA-ARS?s Scientific Manuscript database

    Amidst increasing workloads and static or declining budgets, both public and private land management agencies face the need to adapt resource-monitoring techniques or risk falling behind in resource-monitoring volume and quality with old techniques. Image analysis of nadir plot images, acquired with...

  8. Risk assessment techniques with applicability in marine engineering

    NASA Astrophysics Data System (ADS)

    Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.

    2015-11-01

    Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is necessary first to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; that is, risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider fault tree analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to weigh FTA's advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of this analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the top event. These results can be used for further risk assessment analyses.
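    For independent basic events, the top-event calculation described above reduces to multiplying probabilities through AND gates and combining complements through OR gates; a minimal sketch with hypothetical event probabilities:

```python
from functools import reduce

def and_gate(*probs):
    """Output event occurs only if all inputs occur (independence assumed)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(*probs):
    """Output event occurs if at least one input occurs (independence assumed)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical tree: TOP = (pump fails OR valve fails) AND alarm fails.
p_pump, p_valve, p_alarm = 0.01, 0.02, 0.05
p_top = and_gate(or_gate(p_pump, p_valve), p_alarm)
print(f"P(top event) = {p_top:.6f}")
```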

  9. A review on risk assessment techniques for hydraulic fracturing water and produced water management implemented in onshore unconventional oil and gas production.

    PubMed

    Torres, Luisa; Yadav, Om Prakash; Khan, Eakalak

    2016-01-01

    The objective of this paper is to review different risk assessment techniques applicable to onshore unconventional oil and gas production to determine the risks to water quantity and quality associated with hydraulic fracturing and produced water management. Water resources could be at risk without proper management of water, chemicals, and produced water. Previous risk assessments in the oil and gas industry were performed from an engineering perspective leaving aside important social factors. Different risk assessment methods and techniques are reviewed and summarized to select the most appropriate one to perform a holistic and integrated analysis of risks at every stage of the water life cycle. Constraints to performing risk assessment are identified including gaps in databases, which require more advanced techniques such as modeling. Discussions on each risk associated with water and produced water management, mitigation strategies, and future research direction are presented. Further research on risks in onshore unconventional oil and gas will benefit not only the U.S. but also other countries with shale oil and gas resources. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  11. The use of cluster analysis techniques in spaceflight project cost risk estimation

    NASA Technical Reports Server (NTRS)

    Fox, G.; Ebbeler, D.; Jorgensen, E.

    2003-01-01

    Project cost risk is the uncertainty in final project cost, contingent on initial budget, requirements and schedule. For a proposed mission, a dynamic simulation model relying for some of its input on a simple risk elicitation is used to identify and quantify systemic cost risk.

  12. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.

  13. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system. PMID:24467813

  14. Risk of urinary retention after nerve-sparing surgery for deep infiltrating endometriosis: A systematic review and meta-analysis.

    PubMed

    de Resende, José Anacleto Dutra; Cavalini, Luciana Tricai; Crispi, Claudio Peixoto; de Freitas Fonseca, Marlon

    2017-01-01

    Recently, nerve-sparing (NS) techniques have been incorporated in surgeries for deep infiltrating endometriosis (DIE) to prevent urinary complications. Our aim was to perform a systematic review and meta-analysis to assess the risk of urinary retention after NS surgery for DIE compared with classical (non-NS) techniques. Following the MOOSE guidelines for systematic reviews of observational studies, data were collected from published research articles that compared NS techniques with non-NS techniques in DIE surgery with regard to post-operative urinary complications. Inclusion criteria: randomized clinical trials and intervention or observational (cohort and case-control) studies assessing women who underwent surgery for painful DIE. Exclusion criteria: cancer surgery and women submitted to bladder or ureteral resections. The respective relative risks (RR) and 95% confidence intervals (CI) were extracted, and a forest plot was generated to show individual and combined estimates. Preliminarily, 1,270 potentially relevant studies were identified, from which four studies were selected. A meta-analysis was performed to assess the risk of urinary retention at discharge and 90 days after surgery. We found a common RR of 0.19 [95%CI: 0.03-1.17; (I² = 50.20%; P = 0.09)] for the need of self-catheterization at discharge in the NS group relative to the conventional technique. Based on two studies, the common RR for persistent urinary retention (after 90 days) was 0.16 [95%CI: 0.03-0.84]. Our results suggest significant advantages of the NS technique when considering the RR of persistent urinary retention. Controlled studies evaluating the best approach to manage the urinary tract after complex surgery for DIE are needed. Neurourol. Urodynam. 36:57-61, 2017. © 2015 Wiley Periodicals, Inc.
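    Pooled estimates such as the common RR of 0.19 [0.03-1.17] are typically obtained by inverse-variance weighting on the log scale; a minimal fixed-effect sketch (the study-level values below are hypothetical, not those of the reviewed studies):

```python
import math

def pool_rr(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of relative risks,
    each given as (rr, ci_low, ci_high) with 95% confidence intervals."""
    logs, weights = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)   # back out the SE from the CI
        logs.append(math.log(rr))
        weights.append(1.0 / se ** 2)
    mean = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(mean),
            math.exp(mean - z * se_pooled),
            math.exp(mean + z * se_pooled))

# Hypothetical per-study RRs for urinary retention (NS vs. non-NS surgery).
pooled, lo, hi = pool_rr([(0.25, 0.04, 1.60), (0.12, 0.02, 0.90)])
print(f"pooled RR = {pooled:.2f} [95% CI {lo:.2f}-{hi:.2f}]")
```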

  15. Endoscopic versus traditional saphenous vein harvesting: a prospective, randomized trial.

    PubMed

    Allen, K B; Griffith, G L; Heimansohn, D A; Robison, R J; Matheny, R G; Schier, J J; Fitzgerald, E B; Shaar, C J

    1998-07-01

    Saphenous vein harvested with a traditional longitudinal technique often results in leg wound complications. An alternative endoscopic harvest technique may decrease these complications. One hundred twelve patients scheduled for elective coronary artery bypass grafting were prospectively randomized to have vein harvested using either an endoscopic (group A, n = 54) or traditional technique (group B, n = 58). Groups A and B, respectively, were similar with regard to length of vein harvested (41 +/- 8 cm versus 40 +/- 14 cm), bypasses done (4.1 +/- 1.1 versus 4.2 +/- 1.4), age, preoperative risk stratification, and risks for wound complication (diabetes, sex, obesity, preoperative anemia, hypoalbuminemia, and peripheral vascular disease). Leg wound complications were significantly (p < or = 0.02) reduced in group A (4% [2 of 51] versus 19% [11 of 58]). Univariate analysis identified traditional incision (p < or = 0.02) and diabetes (p < or = 0.05) as wound complication risk factors. Multiple logistic regression analysis identified only the traditional harvest technique as a risk factor for leg wound complications with no significant interaction between harvest technique and any preoperative risk factor (p < or = 0.03). Harvest rate (0.9 +/- 0.4 cm/min versus 1.2 +/- 0.5 cm/min) was slower for group A (p < or = 0.02) and conversion from endoscopic to a traditional harvest occurred in 5.6% (3 of 54) of patients. In a prospective, randomized trial, saphenous vein harvested endoscopically was associated with fewer wound complications than the traditional longitudinal method.

  16. Cognitive mapping tools: review and risk management needs.

    PubMed

    Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor

    2012-08-01

    Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.

  17. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  18. INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT

    EPA Science Inventory

    A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...

  19. A new multicriteria risk mapping approach based on a multiattribute frontier concept

    Treesearch

    Denys Yemshanov; Frank H. Koch; Yakov Ben-Haim; Marla Downing; Frank Sapio; Marty Siltanen

    2013-01-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that...

  20. Risk Benefit Analysis of Health Promotion: Opportunities and Threats for Physical Education.

    ERIC Educational Resources Information Center

    Vertinsky, Patricia

    1985-01-01

    The increasing popularity of health promotion and lifestyle management techniques calls for a careful look at the misuse and costs of suasion, the imposition of values as science, social inequities and individual consequences, and biases in the communication of health risk information. The application of more systematic cost-benefit analysis is…

  1. Modeling Payload Stowage Impacts on Fire Risks On-Board the International Space Station

    NASA Technical Reports Server (NTRS)

    Anton, Kellie E.; Brown, Patrick F.

    2010-01-01

    The purpose of this presentation is to determine the risks of fire on-board the ISS due to non-standard stowage. ISS stowage is constantly being reexamined for optimality. Non-standard stowage involves stowing items outside of rack drawers, and fire risk is a key concern that is heavily mitigated. A methodology is needed to capture the fire risk due to non-standard stowage. The contents include: 1) Fire Risk Background; 2) General Assumptions; 3) Modeling Techniques; 4) Event Sequence Diagram (ESD); 5) Qualitative Fire Analysis; 6) Sample Qualitative Results for Fire Risk; 7) Qualitative Stowage Analysis; 8) Sample Qualitative Results for Non-Standard Stowage; and 9) Quantitative Analysis Basic Event Data.

  2. The physical and empirical basis for a specific clear-air turbulence risk index

    NASA Technical Reports Server (NTRS)

    Keller, J. L.

    1985-01-01

    An improved operational CAT detection and forecasting technique, the specific clear air turbulence risk (SCATR) index, is developed and detailed. This index shows some promising results. The improvements seen using hand-analyzed data, a result of the more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into even more sophisticated objective analysis schemes.

  3. [Bus drivers' biomechanical risk assessment in two different contexts].

    PubMed

    Baracco, A; Coggiola, M; Perrelli, F; Banchio, M; Martignone, S; Gullino, A; Romano, C

    2012-01-01

    The application of standardized methods for biomechanical risk assessment in non-industrial cyclic activity is not always possible. A typical case is the public transport sector, where workers complain of shoulder pain more than elbow and wrist pain. The authors present the results of two studies involving two public transport companies and the risk of biomechanical overload of the upper limbs for bus and tram drivers. The analysis was made using three different approaches: focus groups; static analysis using anthropometric manikins; and a work sampling technique monitoring the worker's activity and posture each minute, for two hours and for each vehicle-route pairing, considering P5F and P95M drivers and assessing perceived effort through the Borg CR10 scale. The conclusive results show that an ergonomic analysis managed by multiple non-standardized techniques can reach consistent and repeatable results in accordance with the epidemiological evidence.

  4. Risk Management for Weapon Systems Acquisition: A Decision Support System

    DTIC Science & Technology

    1985-02-28

    includes the program evaluation and review technique (PERT) for network analysis, the PMRM for quantifying risk, an optimization package for generating... Despite the inclusion of uncertainty in time, PERT can at best be considered as a tool for quantifying risk with regard to the time element only. Moreover

  5. Use of direct versus indirect preparation data for assessing risk associated with airborne exposures at asbestos-contaminated sites.

    PubMed

    Goldade, Mary Patricia; O'Brien, Wendy Pott

    2014-01-01

    At asbestos-contaminated sites, exposure assessment requires measurement of airborne asbestos concentrations; however, the choice of preparation steps employed in the analysis has been debated vigorously among members of the asbestos exposure and risk assessment communities for many years. This study finds that the choice of preparation technique used in estimating airborne amphibole asbestos exposures for risk assessment is generally not a significant source of uncertainty. Conventionally, the indirect preparation method has been less preferred by some because it is purported to result in false elevations in airborne asbestos concentrations, when compared to direct analysis of air filters. However, airborne asbestos sampling in non-occupational settings is challenging because non-asbestos particles can interfere with the asbestos measurements, sometimes necessitating analysis via indirect preparation. To evaluate whether exposure concentrations derived from direct versus indirect preparation techniques differed significantly, paired measurements of airborne Libby-type amphibole, prepared using both techniques, were compared. For the evaluation, 31 paired direct and indirect preparations originating from the same air filters were analyzed for Libby-type amphibole using transmission electron microscopy. On average, the total Libby-type amphibole airborne exposure concentration was 3.3 times higher for indirect preparation analysis than for its paired direct preparation analysis (standard deviation = 4.1), a difference which is not statistically significant (p = 0.12, two-tailed, Wilcoxon signed rank test). The results suggest that the magnitude of the difference may be larger for shorter particles. Overall, neither preparation technique (direct or indirect) preferentially generates more precise and unbiased data for airborne Libby-type amphibole concentration estimates. The indirect preparation method is reasonable for estimating Libby-type amphibole exposure and may be necessary given the challenges of sampling in environmental settings. Relative to the larger context of uncertainties inherent in the risk assessment process, uncertainties associated with the use of airborne Libby-type amphibole exposure measurements derived from indirect preparation analysis are low. Use of exposure measurements generated by either direct or indirect preparation analyses is reasonable to estimate Libby-type Amphibole exposures in a risk assessment.
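    The paired comparison reported above (a two-tailed Wilcoxon signed rank test on 31 filter pairs) can be sketched as follows; the simulated concentrations are hypothetical stand-ins for the study's measurements:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)

# Hypothetical paired airborne-concentration estimates (structures/cc) from
# the same 31 filters; the indirect preparation tends to read higher.
direct = rng.lognormal(mean=-3.0, sigma=1.0, size=31)
indirect = direct * rng.lognormal(mean=1.0, sigma=1.0, size=31)

stat, p = wilcoxon(direct, indirect, alternative="two-sided")
print(f"W = {stat:.1f}, two-tailed p = {p:.3f}")
```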

  6. SU-F-T-248: FMEA Risk Analysis Implementation (AAPM TG-100) in Total Skin Electron Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    2016-06-15

    Purpose: Total Skin Electron Irradiation (TSEI) is a radiotherapy treatment which involves irradiating the entire body surface as homogeneously as possible. It is an extensive multi-step technique in which quality management requires a high consumption of resources and fluid communication among the involved staff, both necessary to improve the safety of treatment. TG-100 proposes a new perspective on quality management in radiotherapy, presenting a systematic method of risk analysis throughout the global flow of the stages through which the patient passes. The purpose of this work has been to apply the TG-100 approach to the TSEI procedure in our institution. Methods: A multidisciplinary team specifically targeting the TSEI procedure was formed; it met regularly and jointly developed the process map (PM), following the TG-100 guidelines of the AAPM. This PM is a visual representation of the temporal flow of steps through which the patient passes from the start until the end of the stay in the radiotherapy service. Results: This is the first stage of the full risk analysis being carried out in the center. The PM provides an overview of the process and facilitates the understanding of the team members who will participate in the subsequent analysis. Currently, the team is implementing the failure modes and effects analysis (FMEA): the failure modes of each of the steps have been identified, and assessors are individually assigning values for severity (S), frequency of occurrence (O), and lack of detectability (D). To our knowledge, this is the first PM made for the TSEI. The developed PM can be useful for centers that intend to implement the TSEI technique. Conclusion: The PM of the TSEI technique has been established as the first stage of a full risk analysis performed at a reference center for this treatment.

  7. Managing industrial risk--having a tested and proven system to prevent and assess risk.

    PubMed

    Heller, Stephen

    2006-03-17

    Some relatively easy techniques exist to improve the risk picture/profile and aid in preventing losses. Today, with the advent of computer system resources, risk analysis that focuses on specific aspects of risk through systematic scoring and comparison can be relatively easy to achieve. Techniques like these demonstrate how working experience and common sense can be combined mathematically into a flexible risk management tool or risk model for analyzing risk. The risk assessment methodology provided by companies today is no longer the ideas and practices of one group or even one company; it reflects the practice of many companies, as well as the ideas and expertise of academia and government regulators. The use of multi-criteria decision making (MCDM) techniques for making critical decisions has been recognized for many years for a variety of purposes. In today's computer age, the easy access and user-friendly nature of these techniques make them a favorable choice for use in the risk assessment environment. New users of these methodologies should find many ideas directly applicable to their needs when approaching risk decision making, and should find those ideas readily adapted, with slight modification, to accurately reflect a specific situation using MCDM techniques. This makes them an attractive feature for use in assessment and risk modeling. The main advantage of decision making techniques such as MCDM is that in the early stages of a risk assessment, accurate data on industrial risk and failures are lacking; in most cases, the data are still insufficient to perform a thorough risk assessment using purely statistical concepts. The practical advantages of deviating from a strictly data-driven protocol seem to outweigh the drawbacks. Industry failure data often come at a high cost when a loss occurs; we can benefit from this unfortunate acquisition of data through the continuous refining of our decisions by incorporating the new information into our assessments. MCDM techniques offer flexibility in assessing comparisons within broad data sets to reflect our best estimate of each factor's contribution to the risk picture. This allows the more probable and more consequential issues to be determined accurately and refined later using more intensive risk techniques, while less critical issues are avoided.
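
    A minimal sketch of the weighted scoring idea underlying many MCDM risk models follows; the scenarios, criteria, and weights are all invented for illustration:

```python
# Weighted-sum MCDM sketch: each risk scenario is rated 1-10 against
# several criteria, and criterion weights encode judged importance.
criteria_weights = {"likelihood": 0.40, "severity": 0.35, "detectability": 0.25}

scenarios = {
    "pump seal failure":  {"likelihood": 6, "severity": 4, "detectability": 3},
    "tank overfill":      {"likelihood": 3, "severity": 9, "detectability": 5},
    "relief valve stuck": {"likelihood": 2, "severity": 8, "detectability": 7},
}

scores = {
    name: sum(criteria_weights[c] * rating for c, rating in ratings.items())
    for name, ratings in scenarios.items()
}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:5.2f}  {name}")
```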

  8. Safer Liquid Natural Gas

    NASA Technical Reports Server (NTRS)

    1976-01-01

    After the 1973 Staten Island disaster, in which 40 people were killed while repairing a liquid natural gas storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan covering the design, construction, and operation of liquid natural gas facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining liquid-gas risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria calling for decisions that eliminate or accept certain risks. The second program prepares a liquid gas safety manual (the first of its kind).

  9. Risk-benefit analysis and public policy: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  10. The role of failure modes and effects analysis in showing the benefits of automation in the blood bank.

    PubMed

    Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew

    2013-05-01

    Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.

  11. Arsenic health risk assessment in drinking water and source apportionment using multivariate statistical techniques in Kohistan region, northern Pakistan.

    PubMed

    Muhammad, Said; Tahir Shah, M; Khan, Sardar

    2010-10-01

    The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams, and the Indus river and analyzed for physical parameters, anions, cations, and arsenic (As(3+), As(5+), and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physico-chemical parameters with permissible limits set by the Pakistan Environmental Protection Agency and the World Health Organization. Most of the studied parameters were found within their respective permissible limits. However, in some samples, the iron and arsenic concentrations exceeded their permissible limits. For the health risk assessment of arsenic, the average daily dose, hazard quotient (HQ), and cancer risk were calculated using standard formulas. The values of HQ were found >1 in the samples collected from Jabba and Dubair, while HQ values were <1 in the rest of the samples. This level of contamination should have low chronic risk and medium cancer risk when compared with US EPA guidelines. Furthermore, the inter-dependence of physico-chemical parameters and pollution load was also evaluated using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis, and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
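
    The dose and risk quantities named above follow the standard US EPA ingestion-exposure equations; the sketch below implements them with illustrative defaults rather than the paper's site-specific inputs:

```python
# US EPA-style drinking-water ingestion calculations:
# ADD = (C * IR * EF * ED) / (BW * AT); HQ = ADD / RfD; risk = ADD * CSF.
# Parameter values are illustrative defaults, not the study's inputs.
def average_daily_dose(c_mg_per_l, ir_l_per_day=2.0, ef_days_per_yr=365,
                       ed_years=30, bw_kg=70.0, at_days=30 * 365):
    return (c_mg_per_l * ir_l_per_day * ef_days_per_yr * ed_years) / (bw_kg * at_days)

RFD_AS = 3e-4  # oral reference dose for arsenic, mg/kg/day (US EPA IRIS)
CSF_AS = 1.5   # oral cancer slope factor for arsenic, (mg/kg/day)^-1 (US EPA IRIS)

c = 0.05  # hypothetical arsenic concentration, mg/L
add = average_daily_dose(c)
print(f"ADD = {add:.2e} mg/kg/day")
print(f"HQ  = {add / RFD_AS:.2f}  (HQ > 1 flags potential non-carcinogenic risk)")
print(f"cancer risk = {add * CSF_AS:.2e}  (AT is usually a 70-yr lifetime here)")
```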

  12. Interim reliability evaluation program, Browns Ferry 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1981-01-01

    Probabilistic risk analysis techniques, i.e., event tree and fault tree analysis, were utilized to provide a risk assessment of the Browns Ferry Nuclear Plant Unit 1. Browns Ferry 1 is a General Electric boiling water reactor of the BWR 4 product line with a Mark 1 (drywell and torus) containment. Within the guidelines of the IREP Procedure and Schedule Guide, dominant accident sequences that contribute to public health and safety risks were identified and grouped according to release categories.

  13. The influence of hand positions on biomechanical injury risk factors at the wrist joint during the round-off skills in female gymnastics.

    PubMed

    Farana, Roman; Jandacka, Daniel; Uchytil, Jaroslav; Zahradnik, David; Irwin, Gareth

    2017-01-01

    The aim of this study was to examine the biomechanical injury risk factors at the wrist, including joint kinetics, kinematics, and stiffness in the first and second contact limbs, for the parallel and T-shape round-off (RO) techniques. Seven international-level female gymnasts performed 10 trials of the RO to back handspring with parallel and T-shape hand positions. Synchronised kinematic (3D motion analysis system; 247 Hz) and kinetic (two force plates; 1235 Hz) data were collected for each trial. A two-way repeated-measures analysis of variance (ANOVA) assessed differences in the kinematic and kinetic parameters between the techniques for each contact limb. The main findings highlighted that in both RO techniques, the second contact limb's wrist joint is exposed to higher mechanical loads than the first contact limb's, as demonstrated by increased axial compression force and loading rate. In the parallel technique, the second contact limb's wrist joint is exposed to a higher axial compression load. The differences in wrist joint kinetics indicate that the T-shape technique may potentially reduce these biophysical loads and consequently protect the second contact limb's wrist joint from overload and biological failure. Highlighting these biomechanical risk factors makes the process of technique selection more objective and safe.

  14. Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E.D.

    This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.

  15. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
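
    The contrast between the two treatments of uncertainty can be seen on a single AND gate (both components must fail). The toy sketch below compares a point-probability product with a triangular fuzzy number propagated by interval arithmetic, a common first-order approximation; the failure data are invented, not from the cited engine analyses:

```python
# Probabilistic vs. fuzzy treatment of a two-component AND gate.
p_a, p_b = 1e-3, 2e-3
print(f"probabilistic AND: {p_a * p_b:.2e}")  # product of point estimates

# Triangular fuzzy numbers (low, mode, high) capture vagueness about the
# failure probabilities; componentwise multiplication approximates the gate.
fuzzy_a = (0.5e-3, 1e-3, 2e-3)
fuzzy_b = (1.0e-3, 2e-3, 4e-3)
fuzzy_and = tuple(a * b for a, b in zip(fuzzy_a, fuzzy_b))
print("fuzzy AND (low, mode, high):", ["%.2e" % v for v in fuzzy_and])
```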

  16. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    NASA Astrophysics Data System (ADS)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated through a quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable, and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.

  17. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
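
    A minimal sketch of this Monte Carlo scheme follows: each contributing factor is sampled from an assumed distribution, the samples are combined through an assumed multiplicative relationship, and the consequences are summarized as a CCDF. All distributions and the combining function are placeholders, not the safety analysis report's models:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
source_term = rng.lognormal(mean=-2.0, sigma=1.0, size=n)  # released activity
dispersion = rng.lognormal(mean=0.0, sigma=0.5, size=n)    # atmospheric factor
dose_factor = rng.uniform(0.5, 1.5, size=n)                # exposure factor

consequence = source_term * dispersion * dose_factor  # assumed multiplicative model

# CCDF: probability that the consequence equals or exceeds each level.
levels = np.logspace(-3, 2, 6)
for c in levels:
    print(f"P(consequence >= {c:8.3f}) = {(consequence >= c).mean():.4f}")
```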

  18. A dynamical systems model for nuclear power plant risk

    NASA Astrophysics Data System (ADS)

    Hess, Stephen Michael

    The recent transition to an open access generation marketplace has forced nuclear plant operators to become much more cost conscious and focused on plant performance. Coincidentally, the regulatory perspective also is in a state of transition from a command and control framework to one that is risk-informed and performance-based. Due to these structural changes in the economics and regulatory system associated with commercial nuclear power plant operation, there is an increased need for plant management to explicitly manage nuclear safety risk. Application of probabilistic risk assessment techniques to model plant hardware has provided a significant contribution to understanding the potential initiating events and equipment failures that can lead to core damage accidents. Application of the lessons learned from these analyses has supported improved plant operation and safety over the previous decade. However, this analytical approach has not been nearly as successful in addressing the impact of plant processes and management effectiveness on the risks of plant operation. Thus, the research described in this dissertation presents a different approach to address this issue. Here we propose a dynamical model that describes the interaction of important plant processes among themselves and their overall impact on nuclear safety risk. We first provide a review of the techniques that are applied in a conventional probabilistic risk assessment of commercially operating nuclear power plants and summarize the typical results obtained. The limitations of the conventional approach and the status of research previously performed to address these limitations also are presented. Next, we present the case for the application of an alternative approach using dynamical systems theory. This includes a discussion of previous applications of dynamical models to study other important socio-economic issues. Next, we review the analytical techniques that are applicable to analysis of these models. Details of the development of the mathematical risk model are presented. This includes discussion of the processes included in the model and the identification of significant interprocess interactions. This is followed by analysis of the model that demonstrates that its dynamical evolution displays characteristics that have been observed at commercially operating plants. The model is analyzed using the previously described techniques from dynamical systems theory. From this analysis, several significant insights are obtained with respect to the effective control of nuclear safety risk. Finally, we present conclusions and recommendations for further research.

  19. FMEA: a model for reducing medical errors.

    PubMed

    Chiozza, Maria Laura; Ponzetti, Clemente

    2009-06-01

    Patient safety is a management issue, in view of the fact that clinical risk management has become an important part of hospital management. Failure Mode and Effect Analysis (FMEA) is a proactive technique for error detection and reduction, first introduced in the aerospace industry in the 1960s. Early applications in the health care industry, dating back to the 1990s, included critical systems in the development and manufacture of drugs and in the prevention of medication errors in hospitals. In 2008, the Technical Committee of the International Organization for Standardization (ISO) licensed a technical specification for medical laboratories suggesting FMEA as a method for prospective risk analysis of high-risk processes. Here we describe the main steps of the FMEA process and review the data available on the application of this technique to laboratory medicine. A significant reduction of the risk priority number (RPN) was obtained when applying FMEA to blood cross-matching, to clinical chemistry analytes, as well as to point-of-care testing (POCT).

  20. Femoral head necrosis: A finite element analysis of common and novel surgical techniques.

    PubMed

    Cilla, Myriam; Checa, Sara; Preininger, Bernd; Winkler, Tobias; Perka, Carsten; Duda, Georg N; Pumberger, Matthias

    2017-10-01

    Femoral head necrosis is a common cause of secondary osteoarthritis. At the early stages, treatment strategies are normally based on core decompression techniques, where the number, location, and diameter of the drilling holes vary depending on the selected approach. The purpose of this study was to investigate the risk of femoral head, neck, and subtrochanteric fracture following six different core decompression techniques. Five common techniques and one newly proposed technique were analyzed with respect to their biomechanical consequences using finite element analysis. The geometry of a femur was reconstructed from computed-tomography images. Thereafter, the drilling configurations were simulated. The strains in the intact and drilled femurs were determined under physiological, patient-specific, muscle and joint contact forces. The following results were observed: i) collapse and fracture risk of the femoral head increases with disease progression; ii) for a single-hole approach at the subtrochanteric region, the fracture risk increases with the diameter; iii) the highest fracture risks occur for an 8 mm single-hole drilling at the subtrochanteric region and for approaches with multiple drillings at various entry points; iv) the proposed novel approach resulted in the most physiological strains (closest to those experienced by healthy bone). Our results suggest that all common core decompression methods have a significant impact on the biomechanical competence of the proximal femur and on its mechanical capacity. Fracture risk increases with drilling diameter and with multiple drillings of smaller diameter. We recommend the anterior approach due to its reduced soft-tissue trauma and its biomechanical performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
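
    One widely quoted member of the simplified-equation family approximates the average probability of failure on demand of a single-channel (1oo1) system as PFDavg ≈ lambda_DU * TI / 2, where lambda_DU is the dangerous undetected failure rate and TI the proof-test interval. The sketch below applies it with invented data and maps the result onto the standard SIL bands:

```python
# Simplified-equation SIL verification sketch (1oo1 architecture).
def pfd_avg_1oo1(lambda_du_per_hr, test_interval_hr):
    return lambda_du_per_hr * test_interval_hr / 2.0

lambda_du = 2e-6  # dangerous undetected failure rate, per hour (invented)
ti = 8760.0       # yearly proof test, hours

pfd = pfd_avg_1oo1(lambda_du, ti)
# Standard SIL bands: SIL1 [1e-2, 1e-1), SIL2 [1e-3, 1e-2), SIL3 [1e-4, 1e-3).
sil = 1 if pfd >= 1e-2 else 2 if pfd >= 1e-3 else 3 if pfd >= 1e-4 else 4
print(f"PFDavg = {pfd:.2e} -> SIL {sil}")
```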

  2. A comparative critical study between FMEA and FTA risk analysis methods

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Constantinescu, DM

    2017-10-01

    An overwhelming number of different risk analysis techniques are in use today, with acronyms such as FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. The FMEA is used to capture potential failures/risks and impacts and prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic component events. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and this is why it is often not done. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.

  3. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
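
    As a minimal illustration of the kind of Bayesian data analysis such guidance covers, the sketch below performs a conjugate Beta-Binomial update of a failure-on-demand probability; the prior and the data are hypothetical:

```python
# Conjugate Beta-Binomial update of a failure-on-demand probability.
from scipy import stats

a0, b0 = 0.5, 99.5           # Beta prior (assumed), prior mean ~ 5e-3
failures, demands = 2, 500   # observed data (hypothetical)

a1, b1 = a0 + failures, b0 + (demands - failures)
posterior = stats.beta(a1, b1)
print(f"posterior mean p = {posterior.mean():.3e}")
print(f"90% credible interval: ({posterior.ppf(0.05):.3e}, {posterior.ppf(0.95):.3e})")
```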

  4. SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harry, T; University of California, San Diego, La Jolla, CA; Manger, R

    Purpose: To evaluate the differences between the Veterans Affairs Healthcare Failure Modes and Effect Analysis (HFMEA) and the AAPM Task Group 100 Failure Modes and Effects Analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veterans Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.

  5. Risk Factors of Catheter-Related Thrombosis (CRT) in Cancer Patients: A Patient-Level Data (IPD) Meta-Analysis of Clinical Trials and Prospective Studies

    PubMed Central

    Saber, W.; Moua, T.; Williams, E. C.; Verso, M.; Agnelli, G.; Couban, S.; Young, A.; De Cicco, M.; Biffi, R.; van Rooden, C. J.; Huisman, M. V.; Fagnani, D.; Cimminiello, C.; Moia, M.; Magagnoli, M.; Povoski, S. P.; Malak, S. F.; Lee, A. Y.

    2010-01-01

    Background Knowledge of independent, baseline risk factors for catheter-related thrombosis (CRT) may help select adult cancer patients at high risk to receive thromboprophylaxis. Objectives We conducted a meta-analysis of individual patient-level data to identify these baseline risk factors. Patients/Methods MEDLINE, EMBASE, CINAHL, CENTRAL, DARE, and grey literature databases were searched in all languages from 1995 to 2008. Prospective studies and randomized controlled trials (RCTs) were eligible. Studies were included if original patient-level data were provided by the investigators and if CRT was objectively confirmed with valid imaging. Multivariate logistic regression analysis of 17 prespecified baseline characteristics was conducted. Adjusted odds ratios (OR) and 95% confidence intervals (CI) were estimated. Results A total sample of 5636 subjects from 5 RCTs and 7 prospective studies was included in the analysis. Among these subjects, 425 CRT events were observed. In multivariate logistic regression, the use of implanted ports, as compared with peripherally inserted central venous catheters (PICC), decreased CRT risk (OR = 0.43; 95% CI, 0.23-0.80), whereas a past history of deep vein thrombosis (DVT) (OR = 2.03; 95% CI, 1.05-3.92), the subclavian venipuncture insertion technique (OR = 2.16; 95% CI, 1.07-4.34), and improper catheter tip location (OR = 1.92; 95% CI, 1.22-3.02) increased CRT risk. Conclusions CRT risk is increased with the use of PICC catheters, a previous history of DVT, the subclavian venipuncture insertion technique, and improper positioning of the catheter tip. These factors may be useful for risk-stratifying patients to select those for thromboprophylaxis. Prospective studies are needed to validate these findings. PMID:21040443

  6. Mapping Infected Area after a Flash-Flooding Storm Using Multi Criteria Analysis and Spectral Indices

    NASA Astrophysics Data System (ADS)

    Al-Akad, S.; Akensous, Y.; Hakdaoui, M.

    2017-11-01

    This research article summarizes the applications of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (the Landsat and DEM data were atmospherically and radiometrically corrected, and geometric and topographic distortions were rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is produced by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate the flood-hazard areas. Five factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology, and elevation. The multi-criteria analysis makes it possible to address vulnerability to flooding and to map areas at risk of flooding in the city of Al Mukalla. The main objective of this research is to provide a simple and rapid method to reduce and manage the risks caused by floods in Yemen, taking the city of Al Mukalla as an example.
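
    The NDWI index mentioned above is commonly computed from the green and near-infrared bands as NDWI = (G - NIR) / (G + NIR) (the McFeeters formulation); below is a minimal sketch with synthetic reflectances standing in for the corrected Landsat bands:

```python
import numpy as np

green = np.array([[0.10, 0.12], [0.30, 0.08]])  # green reflectance (synthetic)
nir = np.array([[0.25, 0.30], [0.05, 0.28]])    # near-infrared reflectance

ndwi = (green - nir) / (green + nir)  # NDWI = (G - NIR) / (G + NIR)
water_mask = ndwi > 0                 # positive NDWI commonly flags open water
print(ndwi.round(2))
print(water_mask)
```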

  7. Conceptual Launch Vehicle and Spacecraft Design for Risk Assessment

    NASA Technical Reports Server (NTRS)

    Motiwala, Samira A.; Mathias, Donovan L.; Mattenberger, Christopher J.

    2014-01-01

    One of the most challenging aspects of developing human space launch and exploration systems is minimizing and mitigating the many potential risk factors to ensure the safest possible design while also meeting the required cost, weight, and performance criteria. In order to accomplish this, effective risk analyses and trade studies are needed to identify key risk drivers, dependencies, and sensitivities as the design evolves. The Engineering Risk Assessment (ERA) team at NASA Ames Research Center (ARC) develops advanced risk analysis approaches, models, and tools to provide such meaningful risk and reliability data throughout vehicle development. The goal of the project presented in this memorandum is to design a generic launch vehicle and spacecraft architecture that can be used to develop and demonstrate these new risk analysis techniques without relying on other proprietary or sensitive vehicle designs. To accomplish this, initial spacecraft and launch vehicle (LV) designs were established using historical sizing relationships for a mission delivering four crewmembers and equipment to the International Space Station (ISS). Mass-estimating relationships (MERs) were used to size the crew capsule and launch vehicle, and a combination of optimization techniques and iterative design processes was employed to determine a possible two-stage-to-orbit (TSTO) launch trajectory into a 350-kilometer orbit. Primary subsystems were also designed for the crewed capsule architecture, based on a 24-hour on-orbit mission with a 7-day contingency. Safety analysis was also performed to identify major risks to crew survivability and assess the system's overall reliability. These procedures and analyses validate that the architecture's basic design and performance are reasonable for use in risk trade studies. While the vehicle designs presented are not intended to represent a viable architecture, they will provide a valuable initial platform for developing and demonstrating innovative risk assessment capabilities.

  8. Logistic regression for risk factor modelling in stuttering research.

    PubMed

    Reed, Phil; Wu, Yaqionq

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will be able to: (a) summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) follow the steps in performing a logistic regression analysis; (c) describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
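
    As a minimal sketch of this kind of risk factor modelling, the following fits a logistic regression to simulated data and reports odds ratios; the predictors, effect sizes, and data are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
age_at_onset = rng.normal(3.5, 1.0, n)  # years (invented predictor)
family_history = rng.integers(0, 2, n)  # 0/1 (invented predictor)

# Simulate persistence with an assumed true model, then refit it.
logit = -2.0 + 0.5 * age_at_onset + 1.2 * family_history
persists = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([age_at_onset, family_history])
model = LogisticRegression().fit(X, persists)
odds_ratios = np.exp(model.coef_[0])  # exponentiated coefficients
print(dict(zip(["age_at_onset", "family_history"], odds_ratios.round(2))))
```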

  9. System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian

    In response to the expansion of nuclear fuel cycle (NFC) activities -- and the associated suite of risks -- around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal and multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that the interdependency among safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is independently evaluated. Two novel system-theoretic analysis techniques -- dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA) -- provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need -- and provide a way -- to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.

  10. Assessing social and economic effects of perceived risk: Workshop summary: Draft: BWIP Repository Project. [Basalt Waste Isolation Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nealey, S.M.; Liebow, E.B.

    1988-03-01

    The US Department of Energy sponsored a one-day workshop to discuss the complex dimensions of risk judgment formation and the assessment of social and economic effects of risk perceptions related to the permanent underground storage of highly radioactive waste from commercial nuclear power plants. Affected parties have publicly expressed concerns about potentially significant risk-related effects of this approach to waste management. A selective review of relevant literature in psychology, decision analysis, economics, sociology, and anthropology was completed, along with an examination of decision analysis techniques that might assist in developing suitable responses to public risk-related concerns. The workshop was organized as a forum in which a set of distinguished experts could exchange ideas and observations about the problems of characterizing the effects of risk judgments. Several issues and themes emerged from the exchange: problems with probabilistic risk assessment techniques are evident; differences exist in the way experts and laypersons view risk, and this leads to higher levels of public concern than experts feel are justified; experts, risk managers, and decision-makers sometimes err in assessing risk and in dealing with the public; credibility and trust are important contributing factors in the formation of risk judgments; social and economic consequences of perceived risk should be properly anticipated; improvements can be made in informing the public about risk; the role of the public in risk assessment, risk management, and decisions about risk should be reconsidered; and mitigation and compensation are central to resolving conflicts arising from divergent risk judgments.

  11. Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'

    DOT National Transportation Integrated Search

    2001-02-01

    This research's objective is to present a methodology to supplement the conventional traffic safety analysis techniques. This methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...

  12. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    Various NASA Langley Research Center and other center projects were considered for analysis to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and usage of PRA techniques were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the cold war and the Challenger disaster, it was found that the ingrained anti-PRA culture is hard to overcome. Examples of skepticism against the use of risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques are much broader than the traditional safety-related areas of application. The time has come to begin to apply these techniques uniformly. A risk-based system can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, from pre-phase A through the final report, were carefully stored in a central repository, preferably in electronic format.

  13. Extensions to decision curve analysis, a novel method for evaluating diagnostic tests, prediction models and molecular markers

    PubMed Central

    Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat

    2008-01-01

    Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144

  14. Extensions to decision curve analysis, a novel method for evaluating diagnostic tests, prediction models and molecular markers.

    PubMed

    Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat

    2008-11-26

    Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
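
    The core quantity behind these curves is the net benefit at a threshold probability pt, NB(pt) = TP/n - (FP/n) * pt / (1 - pt). The sketch below evaluates it for a simulated model against the treat-all and treat-none strategies; the predictions and outcomes are simulated, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
p_hat = rng.random(n)            # a model's predicted probabilities (simulated)
y = rng.random(n) < p_hat        # outcomes consistent with those predictions

for pt in (0.1, 0.2, 0.3, 0.4):
    treat = p_hat >= pt
    tp = np.sum(treat & y) / n   # true positives per patient
    fp = np.sum(treat & ~y) / n  # false positives per patient
    nb_model = tp - fp * pt / (1 - pt)
    nb_all = y.mean() - (1 - y.mean()) * pt / (1 - pt)  # treat everyone
    print(f"pt={pt:.1f}  NB(model)={nb_model:+.3f}  NB(all)={nb_all:+.3f}  NB(none)=0")
```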

  15. Fractal mechanisms and heart rate dynamics. Long-range correlations and their breakdown with disease

    NASA Technical Reports Server (NTRS)

    Peng, C. K.; Havlin, S.; Hausdorff, J. M.; Mietus, J. E.; Stanley, H. E.; Goldberger, A. L.

    1995-01-01

    Under healthy conditions, the normal cardiac (sinus) interbeat interval fluctuates in a complex manner. Quantitative analysis using techniques adapted from statistical physics reveals the presence of long-range power-law correlations extending over thousands of heartbeats. This scale-invariant (fractal) behavior suggests that the regulatory system generating these fluctuations is operating far from equilibrium. In contrast, it is found that for subjects at high risk of sudden death (e.g., congestive heart failure patients), these long-range correlations break down. Application of fractal scaling analysis and related techniques provides new approaches to assessing cardiac risk and forecasting sudden cardiac death, as well as motivating development of novel physiologic models of systems that appear to be heterodynamic rather than homeostatic.
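
    Detrended fluctuation analysis, one of the scaling techniques adapted from statistical physics for this work, can be sketched compactly: integrate the mean-subtracted series, detrend it in windows of size n, and fit how the RMS fluctuation F(n) grows with n. The input below is white noise (expected exponent near 0.5) rather than real interbeat intervals:

```python
import numpy as np

def dfa(x, scales):
    """Return the DFA fluctuation F(n) for each window size n in scales."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    fluct = []
    for n in scales:
        m = len(y) // n
        segments = y[:m * n].reshape(m, n)
        t = np.arange(n)
        sq = []
        for seg in segments:
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            sq.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(sq)))
    return np.array(fluct)

rng = np.random.default_rng(3)
x = rng.normal(size=4096)  # white-noise stand-in for interbeat intervals
scales = np.array([8, 16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(f"DFA scaling exponent alpha = {alpha:.2f}")  # ~0.5 for white noise
```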

  16. Health Risk-Based Assessment and Management of Heavy Metals-Contaminated Soil Sites in Taiwan

    PubMed Central

    Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang

    2010-01-01

    Risk-based assessment is a way to evaluate the potential hazards of contaminated sites and is based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying exposure of the receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses one target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated, and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of possible remediation methods to obtain rapid risk reduction is permissible and of increasing interest. This paper discusses previous soil remediation techniques applied to different sites in Taiwan and also clarifies the differences in risk assessment before and after soil remediation obtained by applying different risk assessment models. This paper also includes many case studies on: (1) food safety risk assessment for brown rice growing in a HMs-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied in HMs-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan. PMID:21139851

  17. Design and statistical problems in prevention.

    PubMed

    Gullberg, B

    1996-01-01

    Clinical and epidemiological research in osteoporosis can benefit from using the methods and techniques established in the area of chronic disease epidemiology. However, attention has to be given to special characteristics such as the multifactorial nature of the disease and the fact that the subjects are usually of advanced age. In order to evaluate prevention, it is of course first necessary to detect and confirm reversible risk factors. The advantages and disadvantages of different designs (cross-sectional, cohort and case-control) are well known. The effects of avoidable biases, e.g. selection, observation and confounding, have to be balanced against practical conveniences like time, expenses, recruitment etc. The translation of relative risks into population attributable risks (etiologic fractions, prevented fractions) is complex and is usually performed under unrealistic, simplified assumptions. The consequences of interactions (synergy) between risk factors are often neglected. The multifactorial structure requires application of more advanced multi-level statistical techniques. The common strategy in prevention of targeting a cluster of risk factors in order to avoid the multifactorial nature implies that in the end it is impossible to separate each unique factor. Experimental designs for evaluating prevention, like clinical trials and interventions, have to take into account the distinction between explanatory and pragmatic studies. An explanatory approach is similar to an idealized laboratory trial, while the pragmatic design is more realistic, practical and has a general public health perspective. The statistical techniques to be used in osteoporosis research are implemented in easily available computer packages like SAS, SPSS, BMDP and GLIM. In addition to traditional logistic regression, methods like Cox analysis and Poisson regression, as well as analysis of repeated measurements and cluster analysis, are relevant.
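
    Under the usual simplified assumptions (a single dichotomous exposure, no confounding), the relative-risk-to-attributable-risk translation mentioned above reduces to PAR = p(RR - 1) / (1 + p(RR - 1)), where p is the exposure prevalence; a worked example:

```python
# Population attributable risk from exposure prevalence and relative risk.
def population_attributable_risk(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical: 30% of the population exposed, RR = 2.0 -> PAR ~ 23%.
print(f"PAR = {population_attributable_risk(0.30, 2.0):.1%}")
```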

  18. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks accruing to the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.

  19. WE-B-BRC-02: Risk Analysis and Incident Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fraass, B.

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for the use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: (1) Description of risk assessment methodologies used in healthcare and industry; (2) Discussion of radiation oncology-specific risk assessment strategies and issues; (3) Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.

  20. Holistic environmental assessment and offshore oil field exploration and production.

    PubMed

    Salter, E; Ford, J

    2001-01-01

    According to UK Government surveys, concern for the environment is growing. Environmental regulation of the industry is becoming wider in its scope and tougher in its implementation. Various techniques are available to assess how the industry can drive down its environmental impact and comply with environmental regulation. Environmental Assessments (EA) required by European law do not cover the whole life cycle of the project that they are analysing. Life Cycle Analysis (LCA) was developed to assess the environmental loadings of a product, process or activity over its entire life cycle, and was the first technique used in environmental analysis that adopted what was described as a holistic approach. It falls short of that approach, however, by not assessing accidental emissions or environmental impacts other than those that are direct. Cost Benefit Analysis (CBA) offers the opportunity to value environmental effects and appraise a project on the basis of costs and benefits. Not all environmental effects can be valued, and of those that can, there is considerable uncertainty in their valuation and occurrence. CBA cannot satisfactorily measure the total environmental risk of a project. Consequently there is a need for a technique that overcomes the failures of project-level EA, LCA and CBA, and assesses total environmental risk. Many organizations, such as the British Medical Association, the European Oilfield Speciality Chemicals Association, the Royal Ministry of Petroleum and Energy (Norway) and Shell Expro, now recognize that a holistic approach is an integral part of assessing total risk. The Brent Spar case study highlights the interdisciplinary nature required of any environmental analysis. Holistic Environmental Assessment is recommended as such an environmental analysis.

  1. Analysis of indoor air pollutants checklist using environmetric technique for health risk assessment of sick building complaint in nonindustrial workplace

    PubMed Central

    Syazwan, AI; Rafee, B Mohd; Juahir, Hafizan; Azman, AZF; Nizar, AM; Izwyn, Z; Syahidatussyakirah, K; Muhaimin, AA; Yunos, MA Syafiq; Anita, AR; Hanafiah, J Muhamad; Shaharuddin, MS; Ibthisham, A Mohd; Hasmadi, I Mohd; Azhar, MN Mohamad; Azizan, HS; Zulfadhli, I; Othman, J; Rozalini, M; Kamarul, FT

    2012-01-01

    Purpose To analyze and characterize a multidisciplinary, integrated indoor air quality checklist for evaluating the health risk of building occupants in a nonindustrial workplace setting. Design A cross-sectional study based on a participatory occupational health program conducted by the National Institute of Occupational Safety and Health (Malaysia) and Universiti Putra Malaysia. Method A modified version of the indoor environmental checklist published by the Department of Occupational Health and Safety, based on the literature and discussion with occupational health and safety professionals, was used in the evaluation process. Summated scores were given according to the cluster analysis and principal component analysis in the characterization of risk. Environmetric techniques were used to classify the risk variables in the checklist. Identification of the possible sources of item pollutants was also evaluated using a semiquantitative approach. Result Hierarchical agglomerative cluster analysis resulted in the grouping of factorial components into three clusters (high complaint, moderate-high complaint, moderate complaint), which were further analyzed by discriminant analysis. From this, 15 major variables that influence indoor air quality were determined. Principal component analysis of each cluster revealed that the main factors influencing the high complaint group were fungal-related problems, chemical indoor dispersion, detergent, renovation, thermal comfort, and location of fresh air intake. The moderate-high complaint group showed significantly high loadings on ventilation, air filters, and smoking-related activities. The moderate complaint group showed high loadings on dampness, odor, and thermal comfort. Conclusion This semiquantitative assessment, which graded risk from low to high based on the intensity of the problem, shows promising and reliable results. It should be used as an important tool in the preliminary assessment of indoor air quality and as a categorizing method for further IAQ investigations and complaints procedures. PMID:23055779

  2. Analysis of indoor air pollutants checklist using environmetric technique for health risk assessment of sick building complaint in nonindustrial workplace.

    PubMed

    Syazwan, Ai; Rafee, B Mohd; Juahir, Hafizan; Azman, Azf; Nizar, Am; Izwyn, Z; Syahidatussyakirah, K; Muhaimin, Aa; Yunos, Ma Syafiq; Anita, Ar; Hanafiah, J Muhamad; Shaharuddin, Ms; Ibthisham, A Mohd; Hasmadi, I Mohd; Azhar, Mn Mohamad; Azizan, Hs; Zulfadhli, I; Othman, J; Rozalini, M; Kamarul, Ft

    2012-01-01

    To analyze and characterize a multidisciplinary, integrated indoor air quality checklist for evaluating the health risk of building occupants in a nonindustrial workplace setting. A cross-sectional study based on a participatory occupational health program conducted by the National Institute of Occupational Safety and Health (Malaysia) and Universiti Putra Malaysia. A modified version of the indoor environmental checklist published by the Department of Occupational Health and Safety, based on the literature and discussion with occupational health and safety professionals, was used in the evaluation process. Summated scores were given according to the cluster analysis and principal component analysis in the characterization of risk. Environmetric techniques were used to classify the risk variables in the checklist. Identification of the possible sources of item pollutants was also evaluated using a semiquantitative approach. Hierarchical agglomerative cluster analysis resulted in the grouping of factorial components into three clusters (high complaint, moderate-high complaint, moderate complaint), which were further analyzed by discriminant analysis. From this, 15 major variables that influence indoor air quality were determined. Principal component analysis of each cluster revealed that the main factors influencing the high complaint group were fungal-related problems, chemical indoor dispersion, detergent, renovation, thermal comfort, and location of fresh air intake. The moderate-high complaint group showed significantly high loadings on ventilation, air filters, and smoking-related activities. The moderate complaint group showed high loadings on dampness, odor, and thermal comfort. This semiquantitative assessment, which graded risk from low to high based on the intensity of the problem, shows promising and reliable results. It should be used as an important tool in the preliminary assessment of indoor air quality and as a categorizing method for further IAQ investigations and complaints procedures.
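
    A minimal sketch of the environmetric workflow both versions of this record describe, using a synthetic score matrix in place of the summated checklist items: hierarchical agglomerative clustering followed by principal component analysis within one cluster:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
scores = rng.random((60, 15))  # 60 buildings x 15 checklist items (synthetic)

Z = linkage(scores, method="ward")                 # agglomerative clustering
clusters = fcluster(Z, t=3, criterion="maxclust")  # three complaint groups
print("cluster sizes:", np.bincount(clusters)[1:])

high = scores[clusters == 1]                       # one complaint group
pca = PCA(n_components=min(3, len(high))).fit(high)
print("variance explained:", pca.explained_variance_ratio_.round(2))
```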

  3. Accurate Diabetes Risk Stratification Using Machine Learning: Role of Missing Value and Outliers.

    PubMed

    Maniruzzaman, Md; Rahman, Md Jahanur; Al-MehediHasan, Md; Suri, Harman S; Abedin, Md Menhazul; El-Baz, Ayman; Suri, Jasjit S

    2018-04-10

    Diabetes mellitus is a group of metabolic diseases in which blood sugar levels are too high. About 8.8% of the world's population was diabetic in 2017, and this figure is projected to reach nearly 10% by 2045. The major challenge is that applying machine learning-based classifiers to such data sets for risk stratification leads to lower performance. Thus, our objective is to develop an optimized and robust machine learning (ML) system under the assumption that missing values or outliers, if replaced by a median configuration, will yield higher risk stratification accuracy. This ML-based risk stratification is designed, optimized, and evaluated, where the features are extracted and optimized from six feature selection techniques (random forest, logistic regression, mutual information, principal component analysis, analysis of variance, and Fisher discriminant ratio) and combined with ten different types of classifiers (linear discriminant analysis, quadratic discriminant analysis, naïve Bayes, Gaussian process classification, support vector machine, artificial neural network, Adaboost, logistic regression, decision tree, and random forest) under the hypothesis that both missing values and outliers, when replaced by computed medians, will improve the risk stratification accuracy. The Pima Indian diabetes dataset (768 patients: 268 diabetic and 500 controls) was used. Our results demonstrate that replacing the missing values and outliers by group median and median values, respectively, and further using the combination of random forest feature selection and random forest classification yields an accuracy, sensitivity, specificity, positive predictive value, negative predictive value, and area under the curve of 92.26%, 95.96%, 79.72%, 91.14%, 91.20%, and 0.93, respectively. This is an improvement of 10% over previously developed techniques published in the literature. The system was validated for its stability and reliability. The RF-based model showed the best performance when outliers are replaced by median values.
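
    A hedged sketch of the median-replacement idea follows. It is not the authors' pipeline: the toy data, the z-score outlier rule, and the choice of keeping the top four features are illustrative assumptions; only the overall pattern (median imputation, then random forest feature selection and classification) comes from the abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def median_replace(X, z_thresh=3.0):
        """Replace NaNs and |z| > z_thresh outliers with the column median."""
        X = X.copy()
        med = np.nanmedian(X, axis=0)
        X[np.isnan(X)] = np.take(med, np.where(np.isnan(X))[1])
        z = (X - X.mean(axis=0)) / X.std(axis=0)
        X[np.abs(z) > z_thresh] = np.take(med, np.where(np.abs(z) > z_thresh)[1])
        return X

    rng = np.random.default_rng(1)
    X = rng.normal(size=(768, 8))           # Pima-sized toy matrix (768 patients, 8 features)
    X[rng.random(X.shape) < 0.05] = np.nan  # inject 5% missingness
    y = rng.integers(0, 2, size=768)        # toy labels

    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    Xc = median_replace(X)
    rf.fit(Xc, y)
    keep = np.argsort(rf.feature_importances_)[-4:]         # RF-based feature selection
    acc = cross_val_score(rf, Xc[:, keep], y, cv=5).mean()  # RF classification accuracy
    print(f"cv accuracy: {acc:.3f}")
    ```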

  4. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  5. Hyperspectral remote sensing and GIS techniques application for the evaluation and monitoring of interactions between natural risks and industrial hazards

    NASA Astrophysics Data System (ADS)

    Marino, Alessandra; Ludovisi, Giancarlo; Moccaldi, Antonio; Damiani, Fiorenzo

    2001-02-01

    The aim of this paper is to outline the potential of imaging spectroscopy and GIS techniques as tools for the management of data-rich environments, such as complex fluvial areas, exposed to geological, geomorphological, and hydrogeological risks. The area of study, the Pescara River Basin, is characterized by the presence of important industrial sites and by the occurrence of floods, landslides, and seismic events. Data were collected during a specific flight using a hyperspectral MIVIS sensor. Images have been processed to obtain updated and accurate land-cover and land-use maps, which have been inserted in a specific GIS database and integrated with further information such as lithology, geological structure, geomorphology, hydrogeological features, and the location and characteristics of production plants. The processing of data layers was performed, using dedicated software, through typical GIS operators such as indexing, recoding, matrix analysis, and proximity analysis. The interactions between natural risks, industrial installations, agricultural areas, water resources, and urban settlements have been analyzed. This allowed the creation and processing of thematic layers such as vulnerability, risk, and impact maps.

  6. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation) to a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  7. Low pacemaker incidence with continuous-sutured valves: a retrospective analysis.

    PubMed

    Niclauss, Lars; Delay, Dominique; Pfister, Raymond; Colombier, Sebastien; Kirsch, Matthias; Prêtre, René

    2017-06-01

    Background Permanent pacemaker implantation after surgical aortic valve replacement depends on patient selection and risk factors for conduction disorders. We aimed to identify risk criteria and obtain a selected group comparable to patients assigned to transcatheter aortic valve implantation. Methods Isolated sutured aortic valve replacements in 994 patients treated from 2007 to 2015 were reviewed. Demographics, hospital stay, preexisting conduction disorders, surgical technique, and etiology in patients with and without permanent pacemaker implantation were compared. Reported outcomes after transcatheter aortic valve implantation were compared with those of a subgroup including only degenerative valve disease and first redo. Results The incidence of permanent pacemaker implantation was 2.9%. Longer hospital stay (p = 0.01), preexisting rhythm disorders (p < 0.001), complex prosthetic endocarditis (p = 0.01), and complex redo (p < 0.001) were associated with permanent pacemaker implantation. Although prostheses were sutured with continuous monofilament in the majority of cases (86%), interrupted pledgeted sutures were used more often in the pacemaker group (p = 0.002). In the subgroup analysis, the incidence of permanent pacemaker implantation was 2%; preexisting rhythm disorders and the suture technique were still major risk factors. Conclusion Permanent pacemaker implantation depends on etiology, preexisting rhythm disorders, and suture technique, and the 2% incidence compares favorably with the reported 5- to 10-fold higher incidence after transcatheter aortic valve implantation. Cost analysis should take this into account. Often dismissed as a minor complication, permanent pacemaker implantation increases the risks of endocarditis, impaired myocardial recovery, and higher mortality if associated with prosthesis regurgitation.

  8. Method and system for assigning a confidence metric for automated determination of optic disc location

    DOEpatents

    Karnowski, Thomas P [Knoxville, TN; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya [Knoxville, TN; Chaum, Edward [Memphis, TN

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
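
    The confidence-metric logic lends itself to a very small sketch. The coordinates, the Euclidean disagreement measure, and the cut-off value below are illustrative assumptions; the patent defines the accuracy parameter and risk cut-off more generally.

    ```python
    # Run two independent optic-disc locators, compare their answers, and
    # assign confidence from the disagreement ("accuracy parameter")
    # against a risk cut-off. Values here are invented for illustration.
    import math

    def confidence_metric(loc_a, loc_b, risk_cutoff=25.0):
        """loc_a, loc_b: (x, y) pixel coordinates from two different detectors."""
        accuracy_param = math.dist(loc_a, loc_b)  # disagreement in pixels
        return "high confidence" if accuracy_param < risk_cutoff else "low confidence"

    print(confidence_metric((512, 384), (518, 390)))  # -> high confidence
    ```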

  9. RIPARIAN CHARACTERIZATION USING SUB-PIXEL ANALYSIS OF LANDSAT TM IMAGERY FOR USE IN ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    Land use/land cover and riparian corridor characterization for 7 major watersheds in western Ohio was accomplished using sub-pixel analysis and traditional classification techniques. Areas representing forest, woodland, shrub, and herbaceous vegetation were delineated using a ...

  10. A quality risk management model approach for cell therapy manufacturing.

    PubMed

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed. © 2010 Society for Risk Analysis.

  11. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.

  12. Applying the change vector analysis technique to assess the desertification risk in the south-west of Romania in the period 1984-2011.

    PubMed

    Vorovencii, Iosif

    2017-09-26

    The desertification risk affects around 40% of the agricultural land in various regions of Romania. The purpose of this study is to analyse the risk of desertification in the south-west of Romania in the period 1984-2011 using the change vector analysis (CVA) technique and Landsat thematic mapper (TM) satellite images. CVA was applied to combinations of normalised difference vegetation index (NDVI)-albedo, NDVI-bare soil index (BI) and tasselled cap greenness (TCG)-tasselled cap brightness (TCB). The combination NDVI-albedo proved to be the best in assessing the desertification risk, with an overall accuracy of 87.67%, identifying a desertification risk on 25.16% of the studied area. The classification of the maps was performed for the following classes: desertification risk, re-growing and persistence. Four degrees of desertification risk and re-growing were used: low, medium, high and extreme. Using the combination NDVI-albedo, 0.53% of the analysed surface was assessed as having an extreme degree of desertification risk, 3.93% a high degree, 8.72% a medium degree and 11.98% a low degree. The driving forces behind the risk of desertification are both anthropogenic and climatic. The anthropogenic causes include the destruction of the irrigation system, deforestation, the destruction of the forest shelterbelts, the fragmentation of agricultural land and its inefficient management. The climatic causes are rising temperatures, frequent and prolonged droughts and a decline in the amount of precipitation.
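
    A minimal sketch of CVA in the NDVI-albedo space may help make the method concrete. The arrays, threshold rule, and class definitions below are invented for illustration; the study derives its inputs from Landsat TM imagery and its own accuracy assessment.

    ```python
    # Change vector analysis on a two-date NDVI-albedo feature space:
    # magnitude flags change; direction separates desertification risk
    # (NDVI down, albedo up) from re-growing (NDVI up).
    import numpy as np

    rng = np.random.default_rng(2)
    ndvi_1984, albedo_1984 = rng.random((100, 100)), rng.random((100, 100))
    ndvi_2011, albedo_2011 = rng.random((100, 100)), rng.random((100, 100))

    d_ndvi = ndvi_2011 - ndvi_1984
    d_albedo = albedo_2011 - albedo_1984
    magnitude = np.hypot(d_ndvi, d_albedo)          # change intensity
    threshold = magnitude.mean() + magnitude.std()  # illustrative change threshold

    risk = (magnitude > threshold) & (d_ndvi < 0) & (d_albedo > 0)  # desertification risk
    regrow = (magnitude > threshold) & (d_ndvi > 0)                 # re-growing
    print(f"risk pixels: {risk.sum()}, re-growing pixels: {regrow.sum()}")
    ```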

  13. Risk management in the competitive electric power industry

    NASA Astrophysics Data System (ADS)

    Dahlgren, Robert William

    From 1990 to the present day, the electric power industry has experienced dramatic changes worldwide. This recent evolution of the power industry has included the creation and multiple iterations of competitive wholesale markets in many different forms. The creation of these competitive markets has resulted in increased short-term volatility of power prices. Vertically integrated utilities emerged from years of regulatory controls to now experience the need to perform risk assessment. The goal of this dissertation is to provide background and details of the evolution of market structures, combined with examples of how to apply price risk assessment techniques such as Value-at-Risk (VaR). In Chapter 1, the history and evolution of three selected regional markets, PJM, California, and England and Wales, is presented. A summary of the commonalities and differences is presented to provide an overview of the rate of transformation of the industry in recent years. The broad area of risk management in the power industry is also explored through a state-of-the-art literature survey. In Chapter 2, an illustration of risk assessment applied to power trading is presented. The techniques of Value-at-Risk and Conditional Value-at-Risk (CVaR) are introduced and applied to a common scenario. The advantages and limitations of the techniques are compared through observation of their results against the common example. Volatility in the California power markets is presented in Chapter 3. This analysis explores the California markets in the summer of 2000, including the application of VaR analysis to the extreme volatility observed during this period. In Chapter 4, CVaR is applied to the same California historical data used in Chapter 3. In addition, the unique application of minimizing the risk of a power portfolio by minimizing CVaR is presented. The application relies on recent research into CVaR whereby the portfolio optimization problem can be reduced to a linear programming problem.
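
    As a concrete illustration of the two measures applied in Chapters 2-4, the following historical-simulation sketch computes VaR and CVaR from a synthetic daily P&L sample; the distribution and confidence level are assumptions, not data from the dissertation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    pnl = rng.normal(loc=0.0, scale=1e5, size=10_000)  # simulated daily P&L ($)

    alpha = 0.95
    var = -np.quantile(pnl, 1 - alpha)  # loss not exceeded with 95% confidence
    cvar = -pnl[pnl <= -var].mean()     # expected loss beyond the VaR threshold

    print(f"95% VaR:  ${var:,.0f}")
    print(f"95% CVaR: ${cvar:,.0f}")
    ```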

  14. Beyond FMEA: the structured what-if technique (SWIFT).

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2012-01-01

    Although it is probably the best-known prospective hazard analysis (PHA) tool, failure mode and effects analysis (FMEA) is far from the only option available. This article introduces one of the alternatives: the structured what-if technique (SWIFT). SWIFT is a flexible, high-level risk identification technique that can be used on a stand-alone basis, or as part of a staged approach to make more efficient use of bottom-up methods like FMEA. In this article we describe the method, assess the evidence related to its use in healthcare by means of a systematic literature review, and suggest ways in which it could be better adapted for use in the healthcare industry. Based on the limited evidence available, it appears that healthcare workers find it easy to learn, easy to use, and credible. Especially when used as part of a staged approach, SWIFT appears capable of playing a useful role as a component of the PHA armamentarium. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.

  15. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  16. Empirical performance of interpolation techniques in risk-neutral density (RND) estimation

    NASA Astrophysics Data System (ADS)

    Bahaludin, H.; Abdullah, M. H.

    2017-03-01

    The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial, and smoothing spline. The results of the LOOCV pricing error show that interpolation using a fourth-order polynomial provides the best fit to option prices, having the lowest pricing error.
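
    The LOOCV selection idea can be sketched as follows. Note the simplification: this toy scores the interpolant's fit to a synthetic volatility smile directly, whereas the study scores the pricing error of the reconstructed option prices; the data and degrees compared are invented.

    ```python
    import numpy as np

    strikes = np.linspace(80, 120, 11)
    ivols = 0.2 + 0.05 * np.cosh((strikes - 100) / 20)  # toy, non-polynomial smile

    def loocv_rmse(x, y, degree):
        """Leave-one-out CV: refit without point i, score the prediction at x[i]."""
        errs = []
        for i in range(len(x)):
            mask = np.arange(len(x)) != i
            coef = np.polyfit(x[mask], y[mask], degree)
            errs.append((np.polyval(coef, x[i]) - y[i]) ** 2)
        return float(np.sqrt(np.mean(errs)))

    for deg in (2, 4):
        print(f"degree {deg}: LOOCV RMSE = {loocv_rmse(strikes, ivols, deg):.2e}")
    ```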

  17. Monitoring Socio-Demographic Risk: A Cohort Analysis of Families Using Census Micro-Data

    ERIC Educational Resources Information Center

    Davis, Peter; McPherson, Mervyl; Wheldon, Mark; von Randow, Martin

    2012-01-01

    We apply cohort techniques to monitor four indicators of socio-demographic risk crucial to family wellbeing, namely income, employment, education, and housing. The data were derived from New Zealand's five-yearly Census for the period 1981-2006. This allowed us to track birth cohorts of mothers (and their families) over six successive New Zealand…

  18. Online Graduate Teacher Education: Establishing an EKG for Student Success Intervention

    ERIC Educational Resources Information Center

    Shelton, Brett E.; Hung, Jui-Long; Baughman, Sarah

    2016-01-01

    Predicting which students enrolled in graduate online education are at-risk for failure is an arduous yet important task for teachers and administrators alike. This research reports on a statistical analysis technique using both static and dynamic variables to determine which students are at-risk and when an intervention could be most helpful…

  19. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called the Risk Priority Number (RPN) to quantify the... Table 1 – Time Scaling Factors (excerpt): up to an hour, 16-60 min, factor 1.5; brief interrupt, 0-15 min, factor 1. In the FMEA formulation, RPN is a product of the three categories
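
    A minimal sketch of the RPN calculation the white paper adapts is shown below; the defect list and 1-10 scores are invented, and the time-scaling factors from Table 1 are omitted.

    ```python
    # RPN = Severity x Occurrence x Detectability, used to rank defects.
    defects = {
        # name: (severity, occurrence, detectability), each scored 1-10
        "data loss on save": (9, 3, 4),
        "UI label typo": (2, 8, 1),
        "intermittent crash": (8, 4, 7),
    }

    ranked = sorted(defects.items(),
                    key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                    reverse=True)
    for name, (s, o, d) in ranked:
        print(f"RPN {s * o * d:>3}  {name}")
    ```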

  20. Panel Discussion: New Directions in Human Reliability Analysis for Oil & Gas, Cybersecurity, Nuclear, and Aviation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harold S. Blackman; Ronald Boring; Julie L. Marble

    This panel will discuss what new directions are necessary to maximize the usefulness of HRA techniques across different areas of application. HRA has long been a part of Probabilistic Risk Assessment in the nuclear industry as it offers a superior standard for risk-based decision-making. These techniques are continuing to be adopted by other industries including oil & gas, cybersecurity, nuclear, and aviation. Each participant will present his or her ideas concerning industry needs followed by a discussion about what research is needed and the necessity to achieve cross industry collaboration.

  1. Relative risk analysis in regulating the use of radiation-emitting medical devices. A preliminary application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E.D.; Banks, W.W.; Altenbach, T.J.

    1995-09-01

    This report describes a preliminary application of an analysis approach for assessing relative risks in the use of radiation-emitting medical devices. Results are presented on human-initiated actions and failure modes that are most likely to occur in the use of the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step in a US Nuclear Regulatory Commission (NRC) plan to evaluate the potential role of risk analysis in regulating the use of nuclear medical devices. For this preliminary application of risk assessment, the focus was to develop a basic process using existing techniques for identifying the most likely risk contributors and their relative importance. The approach taken developed relative risk rankings and profiles that incorporated the type and quality of data available and could present results in an easily understood form. This work was performed by the Lawrence Livermore National Laboratory for the NRC.

  2. Evaluation of the Risk Factors for a Rotator Cuff Retear After Repair Surgery.

    PubMed

    Lee, Yeong Seok; Jeong, Jeung Yeol; Park, Chan-Deok; Kang, Seung Gyoon; Yoo, Jae Chul

    2017-07-01

    A retear is a significant clinical problem after rotator cuff repair. However, no study has evaluated the retear rate with regard to the extent of footprint coverage. To evaluate the preoperative and intraoperative factors for a retear after rotator cuff repair, and to confirm the relationship with the extent of footprint coverage. Cohort study; Level of evidence, 3. Data were retrospectively collected from 693 patients who underwent arthroscopic rotator cuff repair between January 2006 and December 2014. All repairs were classified into 4 types of completeness of repair according to the amount of footprint coverage at the end of surgery. All patients underwent magnetic resonance imaging (MRI) after a mean postoperative duration of 5.4 months. Preoperative demographic data, functional scores, range of motion, and global fatty degeneration on preoperative MRI and intraoperative variables including the tear size, completeness of rotator cuff repair, concomitant subscapularis repair, number of suture anchors used, repair technique (single-row or transosseous-equivalent double-row repair), and surgical duration were evaluated. Furthermore, the factors associated with failure using the single-row technique and transosseous-equivalent double-row technique were analyzed separately. The retear rate was 7.22%. Univariate analysis revealed that rotator cuff retears were affected by age; the presence of inflammatory arthritis; the completeness of rotator cuff repair; the initial tear size; the number of suture anchors; mean operative time; functional visual analog scale scores; Simple Shoulder Test findings; American Shoulder and Elbow Surgeons scores; and fatty degeneration of the supraspinatus, infraspinatus, and subscapularis. Multivariate logistic regression analysis revealed patient age, initial tear size, and fatty degeneration of the supraspinatus as independent risk factors for a rotator cuff retear. Multivariate logistic regression analysis of the single-row group revealed patient age and fatty degeneration of the supraspinatus as independent risk factors for a rotator cuff retear. Multivariate logistic regression analysis of the transosseous-equivalent double-row group revealed a frozen shoulder as an independent risk factor for a rotator cuff retear. Our results suggest that patient age, initial tear size, and fatty degeneration of the supraspinatus are independent risk factors for a rotator cuff retear, whereas the completeness of rotator cuff repair based on the extent of footprint coverage and repair technique are not.

  3. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been used to investigate the structural behavior of human bones. As faster computers became available, improved FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA to evaluate accuracy and reliability in human bones, and clinical application studies to assess fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819

  4. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of the more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
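
    For readers unfamiliar with the variance-based techniques named above, the following self-contained sketch estimates Sobol' first-order and total indices with a Saltelli-style estimator on a toy three-input model; the model and sample sizes are illustrative, not the SHEDS testbed.

    ```python
    import numpy as np

    def model(X):  # toy exposure model: x2 dominates, x1-x3 interact weakly
        return X[:, 0] + 3 * X[:, 1] + 0.5 * X[:, 2] * X[:, 0]

    rng = np.random.default_rng(7)
    n, k = 100_000, 3
    A, B = rng.random((n, k)), rng.random((n, k))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # A with column i taken from B
        fABi = model(ABi)
        S1 = np.mean(fB * (fABi - fA)) / var         # first-order (main) effect
        ST = 0.5 * np.mean((fA - fABi) ** 2) / var   # total effect (incl. interactions)
        print(f"x{i + 1}: S1 = {S1:.3f}, ST = {ST:.3f}")
    ```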

  5. Application of multi-criteria decision-making to risk prioritisation in tidal energy developments

    NASA Astrophysics Data System (ADS)

    Kolios, Athanasios; Read, George; Ioannou, Anastasia

    2016-01-01

    This paper presents an analytical multi-criteria analysis for the prioritisation of risks in the development of tidal energy projects. After a basic identification of risks throughout the project and of relevant stakeholders in the UK, classified through a political, economic, social, technological, legal and environmental analysis, questionnaires provided scores for each risk and corresponding weights for each of the different sectors. Employing an extended Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) as well as the weighted sum method on the data obtained, the risks identified are ranked by their criticality, drawing the industry's attention to mitigating those scoring highest. Both methods were modified to take averages at different stages of the analysis in order to observe the effects on the final risk ranking. A sensitivity analysis of the results was also carried out with regard to the weighting factors given to the perceived expertise of participants, with different results being obtained depending on whether a linear, squared or square-root regression is used. Results of the study show that academia and industry have conflicting opinions regarding the perception of the most critical risks.
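
    A plain (unextended) TOPSIS ranking, of the kind the paper builds on, can be sketched briefly; the decision matrix, weights, and the assumption that all criteria are benefit criteria are invented for illustration.

    ```python
    import numpy as np

    X = np.array([[7, 4, 6],       # rows: risks; columns: criteria scores
                  [5, 8, 3],
                  [6, 6, 7]], dtype=float)
    w = np.array([0.5, 0.3, 0.2])  # sector-derived criterion weights

    R = X / np.linalg.norm(X, axis=0)           # vector normalisation
    V = R * w                                   # weighted normalised matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)  # all criteria treated as "benefit" here
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)         # higher = more critical risk

    for rank, i in enumerate(np.argsort(-closeness), 1):
        print(f"rank {rank}: risk {i + 1} (C* = {closeness[i]:.3f})")
    ```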

  6. Fuzzy risk analysis of a modern γ-ray industrial irradiator.

    PubMed

    Castiglia, F; Giardina, M

    2011-06-01

    Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.

  7. WE-B-BRC-01: Current Methodologies in Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rath, F.

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.

  8. WE-B-BRC-03: Risk in the Context of Medical Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samei, E.

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.

  9. WE-B-BRC-00: Concepts in Risk-Based Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.

  10. Introduction to Flight Test Engineering (Introduction aux techniques des essais en vol)

    DTIC Science & Technology

    2005-07-01

    ... or aircraft parameters • Calculations in the frequency domain (Fast Fourier Transform) • Data analysis with dedicated software for: signal ... density, Fast Fourier Transform, transfer function analysis, frequency response analysis, etc. ... envelope by operating the airplane at increasing ranges - representing increasing risk - of engine operation, airspeeds both fast and slow, altitude

  11. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    NASA Astrophysics Data System (ADS)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability, and capacity, using maps to represent the results; Miles and Huberman analytical techniques were used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three villages have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem, and Puspanegara, or 48.68% of the total land area; high-risk areas make up only around 1.74%, which is part of Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with a high risk potential do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is to create safe conditions, which intensifies the movement of disaster risk reduction.
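
    The Wisner-style risk concept used here reduces to simple per-cell raster arithmetic, Risk = Hazard × Vulnerability / Capacity. The sketch below uses random grids as stand-ins for the Citeureup map layers; the tercile class breaks are an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    hazard = rng.random((50, 50))          # e.g. slope / landslide susceptibility
    vulnerability = rng.random((50, 50))   # e.g. population, land use
    capacity = 0.5 + rng.random((50, 50))  # community capacity (kept > 0)

    risk = hazard * vulnerability / capacity
    levels = np.digitize(risk, np.quantile(risk, [0.33, 0.66]))  # 0=low, 1=medium, 2=high
    print("cells per class (low, medium, high):", np.bincount(levels.ravel()))
    ```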

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mossahebi, S; Feigenberg, S; Nichols, E

    Purpose: GammaPod™, the first stereotactic radiotherapy device for early stage breast cancer treatment, has been recently installed and commissioned at our institution. A multidisciplinary working group applied the failure mode and effects analysis (FMEA) approach to perform a risk analysis. Methods: FMEA was applied to the GammaPod™ treatment process by: 1) generating process maps for each stage of treatment; 2) identifying potential failure modes and outlining their causes and effects; 3) scoring the potential failure modes using the risk priority number (RPN) system based on the product of severity, frequency of occurrence, and detectability (each ranging from 1-10). An RPN of higher than 150 was set as the threshold for minimal concern of risk. For these high-risk failure modes, potential quality assurance procedures and risk control techniques have been proposed. A new set of severity, occurrence, and detectability values was re-assessed in the presence of the suggested mitigation strategies. Results: In the single-day image-and-treat workflow, 19, 22, and 27 sub-processes were identified for the stages of simulation, treatment planning, and delivery, respectively. During the simulation stage, 38 potential failure modes were found and scored, in terms of RPN, in the range of 9-392. 34 potential failure modes were analyzed in treatment planning with a score range of 16-200. For the treatment delivery stage, 47 potential failure modes were found with an RPN score range of 16-392. The most critical failure modes consisted of breast-cup pressure loss and incorrect target localization due to patient upper-body alignment inaccuracies. The final RPN scores of these failure modes, based on the recommended actions, were assessed to be below 150. Conclusion: The FMEA risk analysis technique was applied to the treatment process of GammaPod™, a new stereotactic radiotherapy technology. Application of systematic risk analysis methods is projected to lead to improved quality of GammaPod™ treatments. Ying Niu and Cedric Yu are affiliated with Xcision Medical Systems.

  13. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
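
    A toy probabilistic fracture-mechanics life estimate in the spirit of this model is sketched below: Monte Carlo sampling over initial crack size and the Paris-law coefficient, with cycles-to-failure obtained by integrating da/dN = C(ΔK)^m. All material values and distributions are illustrative assumptions, not the program's inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_sim = 5_000
    d_sigma, Y, a_crit = 200.0, 1.12, 0.02  # stress range (MPa), geometry factor, critical crack (m)

    a0 = rng.lognormal(np.log(5e-4), 0.3, n_sim)  # initial crack size (m)
    C = rng.lognormal(np.log(1e-12), 0.2, n_sim)  # Paris-law coefficient
    m = 3.0                                       # Paris-law exponent

    def cycles_to_failure(a0_i, C_i):
        """Integrate dN = da / (C * dK^m) from a0 to a_crit (trapezoid rule)."""
        a = np.linspace(a0_i, a_crit, 400)
        dK = d_sigma * Y * np.sqrt(np.pi * a)  # stress-intensity-factor range
        integrand = 1.0 / (C_i * dK ** m)
        return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(a))

    N = np.array([cycles_to_failure(a, c) for a, c in zip(a0, C)])
    design_life = 1e6  # cycles
    print(f"P(failure before {design_life:.0e} cycles) = {(N < design_life).mean():.3f}")
    ```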

  14. Long-Term Evaluation of Changes in Operative Technique and Hardware-Related Complications With Deep Brain Stimulation.

    PubMed

    Falowski, Steven M; Ooi, Yinn Cher; Bakay, Roy A E

    2015-12-01

    Deep brain stimulation is the most frequent neurosurgical procedure for movement disorders. While this elective procedure carries a low-risk profile, it is not free of complications. As a new procedure, the pattern of complications changed with experience and modification of surgical technique and equipment. This review analyzes the most common hardware-related complications that may occur and techniques to avoid them. It is a retrospective review of 432 patients undergoing 1077 procedures over a 14-year period by one surgeon, with emphasis on the analysis of surgical technique and its changes over time. Comparisons were made pre- and post-implementation of different surgical techniques over different time periods. The epochs relate to the learning curve, new equipment, and new techniques. Overall lead revision was observed at 5.7%, extension revision at 3.2%, infection rate at 1.2%, infarct without intracerebral hemorrhage at 0.8%, and intracerebral hemorrhage at 2.5%, with a permanent deficit of 0.2%. A change in surgical technique that involved isolating the lead from the skin surface at both the cranial and retro-auricular incisions demonstrated a substantial decrease in lead fracture and infection rates. There was no mortality. This large series of patients and long-term follow-up demonstrate that risks are very low in comparison with other neurosurgical procedures, but DBS is still an elective procedure that necessitates extensive care and precision. In a rapidly evolving field, attention to surgical technique is imperative and will keep rates of complications at a minimum. © 2015 International Neuromodulation Society.

  15. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System

    PubMed Central

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie

    2017-01-01

    Background Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Methods Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on their academic background were enrolled. Patients were divided into three groups: middle school or lower (academic years ≤ 9, n = 102), high school (9 < academic years ≤ 12, n = 229), and higher than high school (academic years > 12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. Results A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up for 41 (interquartile range, 20–65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was an independent risk factor for peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10–2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10–3.18; P = 0.038), compared with the higher than high school education group. However, lower education was not associated with increased mortality in either the as-treated (adjusted HR, 1.11; 95% CI, 0.53–2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Conclusions Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD. PMID:28056058

  16. Lower Education Level Is a Risk Factor for Peritonitis and Technique Failure but Not a Risk for Overall Mortality in Peritoneal Dialysis under Comprehensive Training System.

    PubMed

    Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie; Oh, Kook-Hwan

    2017-01-01

    Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on their academic background were enrolled. Patients were divided into three groups: middle school or lower (academic years ≤ 9, n = 102), high school (9 < academic years ≤ 12, n = 229), and higher than high school (academic years > 12, n = 324). Outcomes were analyzed using Cox proportional hazards models and competing risk regression. A total of 655 incident PD patients (60.9% male, age 48.4±14.1 years) were analyzed. During follow-up for 41 (interquartile range, 20-65) months, 255 patients (38.9%) experienced more than one episode of peritonitis, 138 patients (21.1%) underwent technique failure, and 78 patients (11.9%) died. After adjustment, the middle school or lower education group was an independent risk factor for peritonitis (adjusted hazard ratio [HR], 1.61; 95% confidence interval [CI], 1.10-2.36; P = 0.015) and technique failure (adjusted HR, 1.87; 95% CI, 1.10-3.18; P = 0.038), compared with the higher than high school education group. However, lower education was not associated with increased mortality in either the as-treated (adjusted HR, 1.11; 95% CI, 0.53-2.33; P = 0.788) or intent-to-treat analysis (P = 0.726). Although lower education was a significant risk factor for peritonitis and technique failure, it was not associated with increased mortality in PD patients. Comprehensive training and multidisciplinary education may overcome the lower education level in PD.
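
    A cause-specific Cox analysis of the kind described (the event of interest modeled with Cox regression, competing events treated as censored) can be sketched with the third-party lifelines package; the toy data frame, covariates, and column names below are invented and bear no relation to the study's data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 655
    df = pd.DataFrame({
        "months": (rng.exponential(40, n) + 0.1).round(1),  # follow-up time
        "peritonitis": rng.integers(0, 2, n),  # 1 = event of interest; death/transfer coded 0 (censored)
        "edu_low": rng.integers(0, 2, n),      # middle school or lower
        "age": rng.normal(48.4, 14.1, n),
        "male": rng.integers(0, 2, n),
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="months", event_col="peritonitis")
    print(cph.summary[["exp(coef)", "p"]])  # adjusted hazard ratios per covariate
    ```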

  17. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e., risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux, or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of the events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate ratio of thrombosis to bleeding. Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but final results were influenced by the type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
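
    The simulation machinery described (beta distributions for event proportions, Monte Carlo replications, incremental ratios against an acceptability threshold) is easy to sketch; the event counts and threshold below are invented, not the meta-analysis values.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_rep = 100_000

    # Beta(events + 1, non-events + 1) draws for two hypothetical drugs
    benefit_a = rng.beta(30 + 1, 170 + 1, n_rep)  # thrombosis averted, drug A
    bleed_a = rng.beta(8 + 1, 192 + 1, n_rep)     # major bleeding, drug A
    benefit_b = rng.beta(45 + 1, 155 + 1, n_rep)  # drug B
    bleed_b = rng.beta(14 + 1, 186 + 1, n_rep)

    inc_benefit = benefit_b - benefit_a
    inc_risk = bleed_b - bleed_a
    rbr = inc_risk / inc_benefit  # incremental risk-benefit ratio per replication

    threshold = 1.0  # acceptability threshold (willingness to trade risk for benefit)
    p_favour_b = np.mean((inc_benefit > 0) & (rbr < threshold))
    print(f"P(drug B has an acceptable risk-benefit profile) = {p_favour_b:.3f}")
    ```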

  18. New arrows in the quiver for targeting care management: high-risk versus high-opportunity case identification.

    PubMed

    Bernstein, Richard H

    2007-01-01

    "Care management" purposefully obscures the distinctions between disease and case management and stresses their common features: action in the present to prevent adverse future outcomes and costs. It includes identifying a high-need population by referrals, screening, or data analysis, assessing those likely to benefit from interventions, intervening, evaluating the intervention, and adjusting interventions when needed. High-risk individuals can be identified using at least 9 techniques, from referrals and questionnaires to retrospective claims analysis and predictive models. Other than referrals, software based on the risk-adjustment methodology that we have adapted can incorporate all these methodologies. Because the risk adjustment employs extensive case mix and severity adjustment, it provides care managers with 3 innovative ways to identify not only high-risk individuals but also high-opportunity cases.

  19. Decision strategies to reduce teenage and young adult deaths in the United States.

    PubMed

    Keeney, Ralph L; Palley, Asa B

    2013-09-01

    This article uses decision analysis concepts and techniques to address an extremely important problem to any family with children, namely, how to avoid the tragic death of a child during the high-risk ages of 15-24. Descriptively, our analysis indicates that of the 35,000 annual deaths among this age group in the United States, approximately 20,000 could be avoided if individuals chose readily available alternatives for decisions relating to these deaths. Prescriptively, we develop a decision framework for parents and a child to both identify and proactively pursue decisions that can lower that child's exposure to life-threatening risks and positively alter decisions when facing such risks. Applying this framework for parents and the youth themselves, we illustrate the logic and process of generating proactive alternatives with numerous examples that each could pursue to lower these life-threatening risks and possibly avoid a tragic premature death, and discuss some public policy implications of our findings. © 2013 Society for Risk Analysis.

  20. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed, combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only apply to risk analysis at ground level, but will also extend to aerial, submarine, or space risk analyses in the near future.
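
    A toy version of the 3D risk idea follows: scenario frequencies are combined with a 3D probability-of-fatality field into per-voxel individual risk, which can then be thresholded into an iso-surface. The Gaussian consequence field stands in for CFD output, and all numbers are invented.

    ```python
    import numpy as np

    x, y, z = np.meshgrid(np.linspace(-100, 100, 40),
                          np.linspace(-100, 100, 40),
                          np.linspace(0, 50, 20), indexing="ij")

    scenarios = [  # (frequency per year, source location, spread in m)
        (1e-4, (0, 0, 2), 30.0),
        (5e-5, (40, -20, 2), 20.0),
    ]

    risk = np.zeros_like(x)
    for freq, (sx, sy, sz), spread in scenarios:
        r2 = (x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2
        p_fatality = np.exp(-r2 / (2 * spread ** 2))  # stand-in for a CFD consequence field
        risk += freq * p_fatality                     # individual risk per voxel (1/yr)

    iso = 1e-5
    print(f"voxels inside the {iso:.0e}/yr iso-surface: {(risk > iso).sum()}")
    ```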

  1. Validating a Local Failure Risk Stratification for Use in Prospective Studies of Adjuvant Radiation Therapy for Bladder Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumann, Brian C.; He, Jiwei; Hwang, Wei-Ting

    Purpose: To inform prospective trials of adjuvant radiation therapy (adj-RT) for bladder cancer after radical cystectomy, a locoregional failure (LF) risk stratification was proposed. This stratification was developed and validated using surgical databases that may not reflect the outcomes expected in prospective trials. Our purpose was to assess sources of bias that may affect the stratification model's validity or alter the LF risk estimates for each subgroup: time bias due to evolving surgical techniques; trial accrual bias due to inclusion of patients who would be ineligible for adj-RT trials because of early disease progression, death, or loss to follow-up shortly after cystectomy; bias due to different statistical methods to estimate LF; and subgrouping bias due to different definitions of the LF subgroups. Methods and Materials: The LF risk stratification was developed using a single-institution cohort (n=442, 1990-2008) and the multi-institutional SWOG 8710 cohort (n=264, 1987-1998) treated with radical cystectomy with or without chemotherapy. We evaluated the sensitivity of the stratification to sources of bias using Fine-Gray regression and Kaplan-Meier analyses. Results: Year of radical cystectomy was not associated with LF risk on univariate or multivariate analysis after controlling for risk group. By use of more stringent inclusion criteria, 26 SWOG patients (10%) and 60 patients from the single-institution cohort (14%) were excluded. Analysis of the remaining patients confirmed 3 subgroups with significantly different LF risks with 3-year rates of 7%, 17%, and 36%, respectively (P<.01), nearly identical to the rates without correcting for trial accrual bias. Kaplan-Meier techniques estimated higher subgroup LF rates than competing risk analysis. The subgroup definitions used in the NRG-GU001 adj-RT trial were validated. Conclusions: These sources of bias did not invalidate the LF risk stratification or substantially change the model's LF estimates.

  2. Inferior or double joint spaces injection versus superior joint space injection for temporomandibular disorders: a systematic review and meta-analysis.

    PubMed

    Li, Chunjie; Zhang, Yifan; Lv, Jun; Shi, Zongdao

    2012-01-01

    To compare the effect and safety of inferior or double temporomandibular joint space drug injection versus superior temporomandibular joint space injection in the treatment of temporomandibular disorders. MEDLINE (via Ovid, 1948 to March 2011), CENTRAL (Issue 1, 2011), Embase (1984 to March 2011), CBM (1978 to March 2011), and the World Health Organization International Clinical Trials Registry Platform were searched electronically; relevant journals as well as the references of included studies were hand-searched for randomized controlled trials comparing the effect or safety of the inferior or double joint space drug injection technique with those of the superior space injection technique. Risk of bias assessment with the tool recommended by the Cochrane Collaboration, reporting quality assessment with CONSORT, and data extraction were carried out independently by 2 reviewers. Meta-analysis was performed with RevMan 5.0.23. Four trials with 349 participants were included. All the included studies had moderate risk of bias. Meta-analysis showed that the inferior or double spaces injection technique significantly increased maximal mouth opening by an additional 2.88 mm (P = .0001) and reduced pain intensity in the temporomandibular area by an average of 9.01 mm on the visual analog scale (P = .0001) compared with the superior space injection technique, but did not markedly change the synthesized clinical index (P = .05) in the short term; nevertheless, it showed greater benefit in maximal mouth opening (P = .002), pain relief (P < .0001), and the synthesized clinical variable (P < .0001) in the long term than superior space injection. No serious adverse events were reported. The inferior or double temporomandibular joint space drug injection technique shows a better effect than the superior space injection technique, and its safety is acceptable. However, more high-quality studies are still needed to test and verify this evidence. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  3. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for the production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size), because high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for the allocation of erosion microrelief structures, and mapping techniques for microrelief derivatives (e.g., quantitative parameters of the microrelief). Our toolset includes the analysis of the spatial redistribution of soil pollutants and phyto-indication analysis, which complement common DEM modelling and geomorphometric analysis. We use field surveys conducted at the test area, which is arable land with high erosion risk. Our main conclusion at the current stage is that the indicative methods (i.e., the radiocesium and phyto-indication methods) are effective for the allocation of erosion microrelief structures. However, these methods still need to be formalized for convenient use.

  4. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  5. Jump Landing Characteristics Predict Lower Extremity Injuries in Indoor Team Sports.

    PubMed

    van der Does, H T D; Brink, M S; Benjaminse, A; Visscher, C; Lemmink, K A P M

    2016-03-01

    The aim of this study was to investigate the predictive value of landing stability and technique in order to gain insight into risk factors for ankle and knee injuries in indoor team sport players. Seventy-five male and female basketball, volleyball, or korfball players were screened by measuring landing stability after a single-leg jump landing and landing technique during a repeated counter-movement jump, using detailed 3-dimensional kinematics and kinetics. During the season, the teams' physical therapists reported 11 acute ankle injuries, along with 6 acute and 7 overuse knee injuries. Logistic regression analysis showed less landing stability in the forward and diagonal jump directions (OR 1.01-1.10, p≤0.05) in players who sustained an acute ankle injury. Furthermore, a landing technique with a greater ankle dorsiflexion moment increased the risk of acute ankle injury (OR 2.16, p≤0.05). A smaller knee flexion moment and a greater vertical ground reaction force increased the risk of an overuse knee injury (OR 0.29 and 1.13, respectively, p≤0.05). Less one-legged landing stability and suboptimal landing technique were shown in players sustaining an acute ankle or overuse knee injury compared with healthy players. Determining both landing stability and technique may further guide injury prevention programs. © Georg Thieme Verlag KG Stuttgart · New York.
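
    As a rough illustration of how odds ratios like these arise, the sketch below (invented data and variable names, not the study's dataset) fits a logistic regression and converts the coefficients to odds ratios:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Invented data: rows are players, columns are screening measures
        # (e.g., a landing-stability score and an ankle dorsiflexion moment);
        # y marks whether an acute ankle injury occurred during the season.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(75, 2))
        y = (X[:, 1] + rng.normal(scale=1.0, size=75) > 1.0).astype(int)

        model = LogisticRegression().fit(X, y)

        # The odds ratio for each predictor is exp(coefficient): values above 1
        # mean higher measurements increase the odds of injury.
        odds_ratios = np.exp(model.coef_[0])
        print(dict(zip(["stability", "dorsiflexion_moment"], odds_ratios)))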

  6. Industrial and occupational ergonomics in the petrochemical process industry: a regression trees approach.

    PubMed

    Bevilacqua, M; Ciarapica, F E; Giacchetta, G

    2008-07-01

    This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify the important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The Occupational Injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
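
    A minimal sketch of the CART idea on invented accident records (not the refinery data) using scikit-learn; the printed if-then splits are the kind of decision-making rules the study refers to:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Invented accident records: the three predictors might encode shift,
        # task type, and experience level; the label marks high-severity
        # injuries. CART recovers rules such as "night shift AND low
        # experience -> high risk" directly from the data.
        rng = np.random.default_rng(1)
        X = rng.integers(0, 3, size=(200, 3))
        y = ((X[:, 0] == 2) & (X[:, 2] == 0)).astype(int)

        tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
        print(export_text(tree, feature_names=["shift", "task", "experience"]))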

  7. LOX, GOX and Pressure Relief

    NASA Technical Reports Server (NTRS)

    McLeod, Ken; Stoltzfus, Joel

    2006-01-01

    Oxygen relief systems present a serious fire hazard, often with severe consequences. This presentation offers a risk management strategy that encourages minimizing ignition hazards, maximizing the use of the best materials, and following good practices. Additionally, the relief system should be designed for cleanability and ballistic flow. The use of the right metals, softgoods, and lubricants, along with the best assembly techniques, is stressed. Materials should also be tested if data are not available, and a full hazard analysis should be conducted to minimize risk and harm.

  8. Development of More Cost-Effective Methods for Long-Term Monitoring of Soil Vapor Intrusion to Indoor Air Using Quantitative Passive Diffusive-Adsorptive Sampling Techniques

    DTIC Science & Technology

    2015-05-01

    challenging component of assessing human health risks associated with contaminated soil and groundwater since the late 1990s, during which time...and analysis. 1.3 REGULATORY DRIVERS Regulatory guidance for assessment and management of risks associated with VI has been issued by at least 27...requirements to assess potential human health risks, and this possibility exists where VOCs are present in the subsurface near occupied buildings

  9. Minimum risk wavelet shrinkage operator for Poisson image denoising.

    PubMed

    Cheng, Wu; Hirakawa, Keigo

    2015-05-01

    The pixel values of images taken by an image sensor are said to be corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients, with the modeling of the coefficients enabled by Skellam distribution analysis. We extend these results by solving for shrinkage operators for the Skellam distribution that minimize the risk functional in the multiscale Poisson image denoising setting. The minimum risk shrinkage operator of this kind effectively produces denoised wavelet coefficients with minimum attainable L2 error.
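
    For orientation only, the sketch below applies generic single-level Haar shrinkage with soft thresholding to a Poisson-noisy signal; the paper's contribution is the Skellam-based minimum-risk operator, which would replace the simple thresholding rule used here:

        import numpy as np

        # Generic single-level Haar shrinkage on a 1-D signal, shown only to
        # make the transform-shrink-invert pipeline concrete.
        def haar_denoise(signal, threshold):
            s = signal[::2]
            d = signal[1::2]
            approx = (s + d) / np.sqrt(2)          # Haar approximation coefficients
            detail = (s - d) / np.sqrt(2)          # Haar detail coefficients
            # Soft-threshold the detail coefficients (noise lives mostly here).
            detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
            out = np.empty_like(signal, dtype=float)
            out[::2] = (approx + detail) / np.sqrt(2)   # inverse Haar transform
            out[1::2] = (approx - detail) / np.sqrt(2)
            return out

        noisy = np.random.poisson(lam=20.0, size=64).astype(float)
        print(haar_denoise(noisy, threshold=2.0)[:8])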

  10. The new Zero-P implant can effectively reduce the risk of postoperative dysphagia and complications compared with the traditional anterior cage and plate: a systematic review and meta-analysis.

    PubMed

    Yin, Mengchen; Ma, Junming; Huang, Quan; Xia, Ye; Shen, Qixing; Zhao, Chenglong; Tao, Jun; Chen, Ni; Yu, Zhingxing; Ye, Jie; Mo, Wen; Xiao, Jianru

    2016-10-18

    The low-profile angle-stable spacer Zero-P is a new kind of cervical fusion system that is claimed to limit the potential drawbacks and complications. The purpose of this meta-analysis was to compare the clinical and radiological results of the new Zero-P implant with those of the traditional anterior cage and plate in the treatment of symptomatic cervical spondylosis, and to provide clinicians with evidence on which to base their clinical decision making. The following electronic databases were searched: Medline, PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, Evidence Based Medicine Reviews, VIP, and CNKI. Conference posters and abstracts were also electronically searched. Efficacy was evaluated in terms of intraoperative time, intraoperative blood loss, fusion rate, and dysphagia. For intraoperative time and intraoperative blood loss, the meta-analysis revealed that the Zero-P surgical technique is not superior to the cage and plate technique. For fusion rate, both techniques achieved good bone fusion, with no statistically significant difference between them. For the decrease in JOA score and for dysphagia, the pooled data showed that the Zero-P surgical technique is superior to the cage and plate technique. Zero-P interbody fusion can attain good clinical efficacy and a satisfactory fusion rate in the treatment of symptomatic cervical spondylosis. It can also effectively reduce the risk of postoperative dysphagia and its complications. However, owing to the lack of long-term follow-up, its long-term efficacy remains unknown.

  11. Systems Architectures for a Tactical Naval Command and Control System

    DTIC Science & Technology

    2009-03-01

    Supplement TST Time-sensitive Targeting TTP Tactics, Techniques, and Procedures WTP Weapons-target pairing xix GLOSSARY Analysis...engagement options are generated through weapon-target pairings (WTPs) and are presented to OTC [a]. 24. OTC conducts risk assessment of engagement options [a]. 25. OTC orders confirmed surface...

  12. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  13. Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea

    2015-09-01

    The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system aging. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that even though computational power is growing steadily, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the computational cost of RISMC analysis by decreasing the number of simulation runs to perform and by employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
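
    A minimal sketch of the surrogate idea, assuming an invented one-dimensional stand-in for an expensive simulation code (not RELAP-7) and a Gaussian process surrogate from scikit-learn:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        # The surrogate is trained on a handful of expensive simulation runs
        # and then queried thousands of times at negligible cost.
        def expensive_simulation(x):               # placeholder "physics code"
            return np.sin(3 * x) + 0.5 * x

        X_train = np.linspace(0, 2, 8).reshape(-1, 1)   # 8 costly runs
        y_train = expensive_simulation(X_train).ravel()

        surrogate = GaussianProcessRegressor().fit(X_train, y_train)

        X_query = np.random.uniform(0, 2, size=(10000, 1))  # cheap MC sampling
        y_pred = surrogate.predict(X_query)
        print("P(response > 1.2) ~=", float((y_pred > 1.2).mean()))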

  14. USEPA PATHOGEN EQUIVALENCY COMMITTEE RETREAT

    EPA Science Inventory

    The Pathogen Equivalency Committee held its retreat from September 20-21, 2005 at Hueston Woods State Park in College Corner, Ohio. This presentation will update the PEC’s membership on emerging pathogens, analytical methods, disinfection techniques, risk analysis, preparat...

  15. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
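
    For contrast with the patent's efficient cut-set search, the brute-force sketch below computes all-terminal reliability for an invented four-node network by enumerating link up/down states (only link failures are considered, as in the method) and summing the probabilities of the states that leave the graph connected:

        from itertools import product
        import networkx as nx

        edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # invented 4-node network
        p_up = 0.99                                 # per-link availability

        # Sum the probability of every link-state combination in which all
        # nodes remain mutually reachable (all-terminal reliability).
        reliability = 0.0
        for states in product([True, False], repeat=len(edges)):
            g = nx.Graph()
            g.add_nodes_from(range(4))
            g.add_edges_from(e for e, up in zip(edges, states) if up)
            prob = 1.0
            for up in states:
                prob *= p_up if up else (1 - p_up)
            if nx.is_connected(g):
                reliability += prob

        print(f"all-terminal reliability: {reliability:.6f}")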

  16. The effect of technique change on knee loads during sidestep cutting.

    PubMed

    Dempsey, Alasdair R; Lloyd, David G; Elliott, Bruce C; Steele, Julie R; Munro, Bridget J; Russo, Kylie A

    2007-10-01

    To identify the effect of modifying sidestep cutting technique on knee loads and predict what impact such change would have on the risk of noncontact anterior cruciate ligament injury. A force platform and motion-analysis system were used to record ground-reaction forces and track the trajectories of markers on 15 healthy males performing sidestep cutting tasks using their normal technique and nine different imposed techniques. A kinematic and inverse dynamic model was used to calculate the three-dimensional knee postures and moments. The imposed techniques of foot wide and torso leaning in the opposite direction to the cut resulted in increased peak valgus moments experienced in weight acceptance. Higher peak internal rotation moments were found for the foot wide and torso rotation in the opposite direction to the cut techniques. The foot rotated in technique resulted in lower mean flexion/extension moments, whereas the foot wide condition resulted in higher mean flexion/extension moments. The flexed knee, torso rotated in the opposite direction to the cut, and torso leaning in the same direction as the cut techniques had significantly more knee flexion at heel strike. Sidestep cutting technique had a significant effect on loads experienced at the knee. The techniques that produced higher valgus and internal rotation moments at the knee, such as foot wide, torso leaning in the opposite direction to the cut, and torso rotating in the opposite direction to the cut, may place an athlete at higher risk of injury because these knee loads have been shown to increase the strain on the anterior cruciate ligament. Training athletes to avoid such body positions may result in a reduced risk of noncontact anterior cruciate ligament injuries.

  17. Maternal infection rates after cesarean delivery by Pfannenstiel or Joel-Cohen incision: a multicenter surveillance study.

    PubMed

    Dumas, Anne Marie; Girard, Raphaële; Ayzac, Louis; Caillat-Vallet, Emmanuelle; Tissot-Guerraz, Françoise; Vincent-Bouletreau, Agnès; Berland, Michel

    2009-12-01

    Our purpose was to evaluate maternal nosocomial infection rates according to the incision technique used for cesarean delivery, in a routine surveillance study. This was a prospective study of 5123 cesarean deliveries (43.2% Joel-Cohen, 56.8% Pfannenstiel incisions) in 35 maternity units (Mater Sud Est network). Data on routine surveillance variables, operative duration, and three additional variables (manual removal of the placenta, uterine exteriorization, and/or cleaning of the parieto-colic gutter) were collected. Multiple logistic regression analysis was used to identify independent risk factors for infection. The overall nosocomial infection and endometritis rates were higher for the Joel-Cohen than the Pfannenstiel incision (4.5% vs. 3.3% and 0.8% vs. 0.3%, respectively). The higher rate of nosocomial infections with the Joel-Cohen incision was due to a greater proportion of patients presenting risk factors (i.e., emergency delivery, primary cesarean, blood loss ≥800 mL, no manual removal of the placenta, and no uterine exteriorization). However, the Joel-Cohen technique was an independent risk factor for endometritis. The Joel-Cohen technique is faster than the Pfannenstiel technique but is associated with a higher incidence of endometritis.

  18. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation.

  19. Decision modeling for fire incident analysis

    Treesearch

    Donald G. MacGregor; Armando González-Cabán

    2009-01-01

    This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...

  20. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.
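
    A minimal sketch of the scoring step in a failure mode and effects analysis, with invented NICU failure modes for illustration only; severity, occurrence, and detectability scores multiply into a risk priority number (RPN) used to rank redesign priorities:

        # Each failure mode is scored 1-10 for severity, occurrence, and
        # detectability; RPN = sev * occ * det. Entries are invented examples.
        failure_modes = [
            {"mode": "wrong medication dose", "sev": 9, "occ": 3, "det": 4},
            {"mode": "alarm not heard",       "sev": 8, "occ": 2, "det": 6},
            {"mode": "mislabeled sample",     "sev": 6, "occ": 4, "det": 3},
        ]

        for fm in failure_modes:
            fm["rpn"] = fm["sev"] * fm["occ"] * fm["det"]

        # Highest RPN first: these failure modes get redesign attention first.
        for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
            print(f"{fm['mode']:24s} RPN = {fm['rpn']}")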

  1. ASSESSMENT OF DYNAMIC PRA TECHNIQUES WITH INDUSTRY AVERAGE COMPONENT PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yadav, Vaibhav; Agarwal, Vivek; Gribok, Andrei V.

    In the nuclear industry, risk monitors are intended to provide a point-in-time estimate of the system risk given the current plant configuration. Current risk monitors are limited in that they do not properly take into account the deteriorating states of plant equipment, which are unit-specific. Current approaches to computing risk monitors use probabilistic risk assessment (PRA) techniques, but the assessment is typically a snapshot in time. Living PRA models attempt to address the limitations of traditional PRA models in a limited sense by including temporary changes in plant and system configurations. However, information on plant component health is not considered. This often leaves risk monitors using living PRA models incapable of conducting evaluations with dynamic degradation scenarios evolving over time. There is a need to develop enabling approaches that solidify risk monitors to provide time- and condition-dependent risk by integrating traditional PRA models with condition monitoring and prognostic techniques. This paper presents estimation of system risk evolution over time by integrating plant risk monitoring data with dynamic PRA methods incorporating aging and degradation. Several online, nondestructive approaches have been developed for diagnosing plant component conditions in the nuclear industry, such as condition indication indices based on vibration analysis, current signatures, and operational history [1]. In this work, the component performance measures at U.S. commercial nuclear power plants (NPPs) [2] are incorporated within various dynamic PRA methodologies [3] to provide better estimates of the probability of failures. Aging and degradation are modeled within the Level-1 PRA framework and applied to several failure modes of pumps; the approach can be extended to a range of components, viz. valves, generators, batteries, and pipes.
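
    As a rough sketch of the time-dependent idea (not the report's models; all parameter values are invented), a Weibull aging curve can replace a constant failure probability so that the risk of a simple AND-gate system evolves over plant life:

        import numpy as np

        def weibull_failure_prob(t_years, scale=30.0, shape=2.5):
            """Probability the component has failed by time t (Weibull CDF)."""
            return 1.0 - np.exp(-(t_years / scale) ** shape)

        years = np.arange(0, 41, 10)
        p_pump = weibull_failure_prob(years)             # aging pump
        p_valve = weibull_failure_prob(years, scale=45)  # slower-aging valve

        # Minimal "system" in Level-1-style logic: system failure requires
        # both the pump AND the valve to fail (an AND gate).
        p_system = p_pump * p_valve
        for t, p in zip(years, p_system):
            print(f"year {t:2d}: system failure probability = {p:.4f}")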

  2. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for the right decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be readily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as a method for the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to inform the design of disaster management policies and to reduce the risk of natural disasters such as earthquakes in Indonesia.

  3. The association of placenta previa and assisted reproductive techniques: a meta-analysis.

    PubMed

    Karami, Manoochehr; Jenabi, Ensiyeh; Fereidooni, Bita

    2018-07-01

    Several epidemiological studies have determined that assisted reproductive techniques (ART) can increase the risk of placenta previa. To date, only one meta-analysis has been performed assessing the relationship between placenta previa and ART. This meta-analysis was conducted to estimate the association between placenta previa and ART in singleton and twin pregnancies. A literature search was performed in the major databases PubMed, Web of Science, and Scopus from the earliest possible year to April 2017. The heterogeneity across studies was explored with the Q-test and the I² statistic. Publication bias was assessed using Begg's and Egger's tests. The results were reported as odds ratio (OR) and relative risk (RR) estimates with their 95% confidence intervals (CI) using a random-effects model. The literature search yielded 1529 publications up to September 2016, with 1,388,592 participants. The overall estimate of the OR was 2.67 (95% CI: 2.01, 3.34) and of the RR was 3.62 (95% CI: 0.21, 7.03) based on singleton pregnancies. The overall estimate of the OR was 1.50 (95% CI: 1.26, 1.74) based on twin pregnancies. Based on the odds ratios reported in observational studies, we showed that ART procedures are a risk factor for placenta previa.
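
    For orientation, the sketch below pools invented study-level odds ratios with the DerSimonian-Laird random-effects method, the standard machinery behind pooled estimates like those reported here:

        import numpy as np

        # Four invented (OR, CI lower, CI upper) study results, for illustration.
        or_ci = [(2.1, 1.4, 3.2), (3.0, 1.8, 5.1), (2.4, 1.1, 5.0), (3.3, 2.0, 5.6)]

        y = np.log([o for o, lo, hi in or_ci])                  # log odds ratios
        se = (np.log([hi for _, _, hi in or_ci]) -
              np.log([lo for _, lo, _ in or_ci])) / (2 * 1.96)  # SE from CI width
        w = 1 / se**2                                           # fixed-effect weights

        # Between-study variance (tau^2) via the DerSimonian-Laird estimator.
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)
        tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1 / (se**2 + tau2)                  # random-effects weights
        pooled = np.sum(w_re * y) / np.sum(w_re)
        print("pooled OR:", round(float(np.exp(pooled)), 2))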

  4. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    PubMed

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted using probabilistic techniques from a national perspective. Published monitoring data for PAHs were gathered and converted into BaP-equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, warrant closer attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and it might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
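
    A minimal Monte Carlo sketch of an ingestion-route carcinogenic risk calculation of this kind, risk = (C x IR x EF x ED x SF) / (BW x AT); the distributions and parameter values below are placeholders, not the study's fitted inputs:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        C = rng.lognormal(mean=np.log(5e-6), sigma=0.8, size=n)  # BaP-eq conc., mg/L
        IR = rng.normal(2.0, 0.3, size=n)        # water intake, L/day
        EF, ED = 365, 30                         # days/year, years of exposure
        SF = 7.3                                 # oral slope factor, (mg/kg-day)^-1
        BW = rng.normal(60.0, 8.0, size=n)       # body weight, kg
        AT = 70 * 365                            # averaging time, days

        # Chronic daily intake times slope factor gives lifetime cancer risk.
        risk = C * IR * EF * ED * SF / (BW * AT)
        print("P(risk > 1e-5) =", float((risk > 1e-5).mean()))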

  5. A platform for proactive, risk-based slope asset management, phase II.

    DOT National Transportation Integrated Search

    2015-03-01

    The lidar visualization technique developed by this project enables highway managers to understand changes in slope characteristics along highways. This change detection and analysis can be the basis of informed decisions for slope inspection and r...

  6. A platform for proactive, risk-based slope asset management, phase II.

    DOT National Transportation Integrated Search

    2015-08-01

    The lidar visualization technique developed by this project enables highway managers to understand changes in slope characteristics along highways. This change detection and analysis can be the basis of informed decisions for slope inspection and...

  7. Long-term follow-up results of umbilical hernia repair.

    PubMed

    Venclauskas, Linas; Jokubauskas, Mantas; Zilinskas, Justas; Zviniene, Kristina; Kiudelis, Mindaugas

    2017-12-01

    Multiple suture techniques and various mesh repairs are used in open or laparoscopic umbilical hernia (UH) surgery. To compare long-term follow-up results of UH repair in different hernia surgery groups and to identify risk factors for UH recurrence. A retrospective analysis of 216 patients who underwent elective surgery for UH during a 10-year period was performed. The patients were divided into three groups according to surgery technique (suture, mesh and laparoscopic repair). Early and long-term follow-up results including hospital stay, postoperative general and wound complications, recurrence rate and postoperative patient complaints were reviewed. Risk factors for recurrence were also analyzed. One hundred and forty-six patients were operated on using suture repair, 52 using open mesh and 18 using laparoscopic repair technique. 77.8% of patients underwent long-term follow-up. The postoperative wound complication rate and long-term postoperative complaints were significantly higher in the open mesh repair group. The overall hernia recurrence rate was 13.1%. Only 2 (1.7%) patients with small hernias (< 2 cm) had a recurrence in the suture repair group. Logistic regression analysis showed that body mass index (BMI) > 30 kg/m², diabetes and wound infection were independent risk factors for umbilical hernia recurrence. The overall umbilical hernia recurrence rate was 13.1%. Body mass index > 30 kg/m², diabetes and wound infection were independent risk factors for UH recurrence. According to our study results, laparoscopic medium and large umbilical hernia repair has slight advantages over open mesh repair concerning early postoperative complications, long-term postoperative pain and recurrence.

  8. Report of geomagnetic pulsation indices for space weather applications

    USGS Publications Warehouse

    Xu, Z.; Gannon, Jennifer L.; Rigler, Erin J.

    2013-01-01

    The phenomenon of ultra-low frequency geomagnetic pulsations was first observed in the ground-based measurements of the 1859 Carrington Event and has been studied for over 100 years. Pulsation frequency is considered to be “ultra” low when it is lower than the natural frequencies of the plasma, such as the ion gyrofrequency. Ultra-low frequency pulsations are considered a source of noise in some geophysical analysis techniques, such as aeromagnetic surveys and transient electromagnetics, so it is critical to develop near real-time space weather products to monitor these geomagnetic pulsations. The proper spectral analysis of magnetometer data, such as using wavelet analysis techniques, can also be important to Geomagnetically Induced Current risk assessment.

  9. A Framework for Assessment of Aviation Safety Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  10. Hollow-fiber flow field-flow fractionation and multi-angle light scattering investigation of the size, shape and metal-release of silver nanoparticles in aqueous medium for nano-risk assessment.

    PubMed

    Marassi, Valentina; Casolari, Sonia; Roda, Barbara; Zattoni, Andrea; Reschiglian, Pierluigi; Panzavolta, Silvia; Tofail, Syed A M; Ortelli, Simona; Delpivo, Camilla; Blosi, Magda; Costa, Anna Luisa

    2015-03-15

    Due to the increased use of silver nanoparticles in industrial-scale manufacturing, consumer products, and nanomedicine, reliable measurement of properties such as the size, shape, and distribution of these nanoparticles in aqueous medium is critical. These properties affect both functional properties and biological impacts, especially in quantifying associated risks and identifying suitable risk-mitigation strategies. The feasibility of the on-line coupling of a fractionation technique such as hollow-fiber flow field-flow fractionation (HF5) with a light scattering technique such as MALS (multi-angle light scattering) is investigated here for this purpose. Data obtained from such a fractionation technique, and from its combination with MALS, have been compared with those from more conventional but often complementary techniques, e.g., transmission electron microscopy, dynamic light scattering, atomic absorption spectroscopy, and X-ray fluorescence. The combination of fractionation and multi-angle light scattering techniques has been found to offer an ideal, hyphenated methodology for the simultaneous size-separation and characterization of silver nanoparticles. The hydrodynamic radii determined by the fractionation technique can be conveniently correlated with the mean average diameters determined by multi-angle light scattering, and reliable information on particle morphology in aqueous dispersion has been obtained. The ability to separate silver ions (Ag(+)) from silver nanoparticles (AgNPs) via membrane filtration during size analysis is an added advantage in obtaining quantitative insights into the risk potential. Most importantly, the methodology developed in this article can potentially be extended to the similar characterization of other metal-based nanoparticles when studying their functional effectiveness and hazard potential. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Space shuttle/payload interface analysis. Volume 4: Business Risk and Value of Operations in Space (BRAVO). Part 1: Summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Background information is provided which emphasizes the philosophy behind analytical techniques used in the business risk and value of operations in space (BRAVO) study. The focus of the summary is on the general approach, operation of the procedures, and the status of the study. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.

  12. A Spatiotemporal Analysis of Extreme Heat Vulnerability Across the United States using Geospatial Techniques

    NASA Astrophysics Data System (ADS)

    Schoessow, F. S.; Li, Y.; Howe, P. D.

    2016-12-01

    Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves to a population can be more comprehensively assessed by coupling the traditional examination of natural hazards using remote sensing and geospatial analysis techniques with human vulnerability factors and individual perceptions of hazards. By analyzing remote-sensed and empirical survey data alongside national hazards advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat wave risk perceptions remains relatively unexplored, despite the intensification of extreme heat events. The use of "human sensors", georeferenced and timestamped individual response data, provides invaluable contextualized data at a high spatial resolution, which will enable policy-makers to more effectively implement targeted strategies for risk prevention, mitigation, and communication. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.

  13. Lead in a Baltimore shipyard.

    PubMed

    Hall, Francis X

    2006-12-01

    The goal was to monitor the effectiveness of the Coast Guard Yard's lead program by comparing a shipyard period in 1991 to one in 2002-2003. Comparisons of airborne lead levels by paint removal technique, airborne lead levels by welding technique, and blood lead levels of workers were evaluated by chi-square (χ²) analysis. Airborne lead levels in paint removal decreased over time for all methods used. Airborne lead levels in welding decreased over time for all methods used. Blood lead levels of the high-risk group revealed a 2-fold reduction (prevalence rate ratio = 8.3; 95% confidence interval, 3.7-18.6) and those of the low-risk group revealed a 1.6-fold reduction (prevalence rate ratio = 6.2; 95% confidence interval, 0.86-44.7). The Coast Guard Yard runs an effective lead program that exceeds the national Healthy People 2010 goal for lead. The results validate the Coast Guard Yard's use of air-line respirators and lead-free paint on all vessels.
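
    As an illustration of this kind of period comparison, the sketch below runs a chi-square test on an invented 2x2 table of elevated blood lead counts across the two periods (the counts are placeholders, not the study's data):

        import numpy as np
        from scipy.stats import chi2_contingency

        # Rows: period; columns: workers with elevated vs. non-elevated
        # blood lead. Counts are invented for demonstration.
        table = np.array([[30, 70],    # 1991: elevated, not elevated
                          [ 8, 92]])   # 2002-2003
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

        # Prevalence comparison across periods, in the abstract's style.
        prr = (30 / 100) / (8 / 100)
        print(f"prevalence rate ratio = {prr:.1f}")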

  14. Shuttle user analysis (study 2.2). Volume 3: Business risk and value of operations in space (BRAVO). Part 5: Analysis of GSFC Earth Observation Satellite (EOS) system mission model using BRAVO techniques

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Cost comparisons were made between three modes of operation (expend, ground refurbish, and space resupply) for the Earth Observation System (EOS-B) to furnish data to NASA on alternative ways to use the shuttle/EOS. Results of the analysis are presented in tabular form.

  15. [Extemporaneous withdrawal with a mini-spike filter: A low infection risk technique for drawing up bevacizumab for intravitreal injection].

    PubMed

    Le Rouic, J F; Breger, D; Peronnet, P; Hermouet-Leclair, E; Alphandari, A; Pousset-Decré, C; Badat, I; Becquet, F

    2016-05-01

    To describe a technique for extemporaneously drawing up bevacizumab for intravitreal injection (IVT) and to report the rate of post-injection endophthalmitis. Retrospective monocentric analysis (January 2010-December 2014) of all IVTs of bevacizumab drawn up with the following technique: in the operating room (class ISO 7), through a mini-spike with an integrated bacteria-retentive air filter. The surgeon wore sterile gloves and a mask; the assisting nurse wore a mask. The bevacizumab vial was discarded at the end of each session. Six thousand two hundred and thirty-six bevacizumab injections were performed. One case of endophthalmitis was noted (0.016%). During the same period, 4 cases of endophthalmitis were found after IVT of other drugs (4/32,992; 0.012%; P=0.8). Intravitreal injection of bevacizumab after extemporaneous withdrawal through a mini-spike filter is a simple and safe technique. The risk of postoperative endophthalmitis is very low. This simple technique facilitates access to compounded bevacizumab. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  16. High-risk of preterm birth and low birth weight after oocyte donation IVF: analysis of 133,785 live births.

    PubMed

    Kamath, Mohan Shashikant; Antonisamy, Belavendra; Mascarenhas, Mariano; Sunkara, Sesh Kamal

    2017-09-01

    A higher risk of pregnancy complications occurs after assisted reproductive techniques compared with spontaneously conceived pregnancies. This is attributed to the underlying infertility and to the assisted reproduction procedures involved during treatment. It is a matter of interest whether the use of donor oocytes affects perinatal outcomes compared with pregnancies after autologous IVF. Anonymized data were obtained from the Human Fertilization and Embryology Authority. The analysis included 5929 oocyte donation and 127,856 autologous IVF live births. Data from all women who underwent donor oocyte recipient or autologous IVF cycles, both followed by fresh embryo transfer, were analysed to compare the perinatal outcomes of preterm birth (PTB) and low birthweight (LBW) after singleton and multiple live births. The risk of adverse perinatal outcomes after oocyte donation was increased: the adjusted OR (aOR) of 1.56 (99.5% CI 1.34 to 1.80) for PTB and the aOR of 1.43 (99.5% CI 1.24 to 1.66) for LBW were significantly higher after oocyte donation compared with autologous IVF singletons. The adjusted odds of PTB (aOR 1.21, 99.5% CI 1.02 to 1.43) were significantly higher after oocyte donation compared with autologous IVF multiple births. Analysis of this large dataset suggests a significantly higher risk of PTB and LBW after oocyte donation compared with autologous IVF pregnancies. Copyright © 2017 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  17. An optimized posterior axillary boost technique in radiation therapy to supraclavicular and axillary lymph nodes: A comparative study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez, Victor, E-mail: vhernandezmasgrau@gmail.com; Arenas, Meritxell; Müller, Katrin

    2013-01-01

    To assess the advantages of an optimized posterior axillary (AX) boost technique for the irradiation of supraclavicular (SC) and AX lymph nodes. Five techniques for the treatment of SC and levels I, II, and III AX lymph nodes were evaluated for 10 patients selected at random: a direct anterior field (AP); an anterior to posterior parallel pair (AP-PA); an anterior field with a posterior axillary boost (PAB); an anterior field with an anterior axillary boost (AAB); and an optimized PAB technique (OptPAB). The target coverage, hot spots, irradiated volume, and dose to organs at risk were evaluated, and a statistical comparison was performed. The AP technique delivered an insufficient dose to the deeper AX nodes. The AP-PA technique produced larger irradiated volumes and higher mean lung doses than the other techniques. The PAB and AAB techniques produced excessive hot spots in most of the cases. The OptPAB technique produced moderate hot spots while maintaining a similar planning target volume (PTV) coverage, irradiated volume, and dose to organs at risk. This optimized technique combines the advantages of the PAB and AP-PA techniques, with moderate hot spots, sufficient target coverage, and adequate sparing of normal tissues. The presented technique is simple, fast, and easy to implement in routine clinical practice and is superior to the techniques historically used for the treatment of SC and AX lymph nodes.

  18. Development of Improved Caprock Integrity and Risk Assessment Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruno, Michael

    GeoMechanics Technologies has completed a geomechanical caprock integrity analysis and risk assessment study funded through the US Department of Energy. The project included: a detailed review of historical caprock integrity problems experienced in the natural gas storage industry; a theoretical description and documentation of caprock integrity issues; advanced coupled transport flow modelling and geomechanical simulation of three large-scale potential geologic sequestration sites to estimate geomechanical effects from CO₂ injection; development of a quantitative risk and decision analysis tool to assess caprock integrity risks; and, ultimately, the development of recommendations and guidelines for caprock characterization and CO₂ injection operating practices. Historical data from gas storage operations and CO₂ sequestration projects suggest that leakage and containment incident risks are on the order of 10⁻¹ to 10⁻², a higher risk than some previous studies have suggested for CO₂. Geomechanical analysis, as described herein, can be applied to quantify risks and to provide operating guidelines to reduce them. The risk assessment tool developed for this project has been applied to five areas: the Wilmington Graben offshore Southern California, Kevin Dome in Montana, the Louden Field in Illinois, the Sleipner CO₂ sequestration operation in the North Sea, and the In Salah CO₂ sequestration operation in North Africa. Of these five, the Wilmington Graben area represents the highest relative risk, while the Kevin Dome area represents the lowest relative risk.

  19. Risk assessment and optimization (ALARA) analysis for the environmental remediation of Brookhaven National Laboratory`s hazardous waste management facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dionne, B.J.; Morris, S.C. III; Baum, J.W.

    1998-01-01

    The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.

  20. MERINOVA: Meteorological risks as drivers of environmental innovation in agro-ecosystem management

    NASA Astrophysics Data System (ADS)

    Gobin, Anne; Oger, Robert; Marlier, Catherine; Van De Vijver, Hans; Vandermeulen, Valerie; Van Huylenbroeck, Guido; Zamani, Sepideh; Curnel, Yannick; Mettepenningen, Evi

    2013-04-01

    The BELSPO-funded project 'MERINOVA' deals with risks associated with extreme weather phenomena and with risks of biological origin such as pests and diseases. The major objectives of the project are to characterise extreme meteorological events, assess their impact on Belgian agro-ecosystems, characterise the vulnerability and resilience of those agro-ecosystems to these events, and explore innovative adaptation options for agricultural risk management. The project comprises five major parts that reflect the chain of risks: (i) Hazard: assessing the likely frequency and magnitude of extreme meteorological events by means of probability density functions; (ii) Impact: analysing the potential bio-physical and socio-economic impact of extreme weather events on agro-ecosystems in Belgium using process-based modelling techniques commensurate with the regional scale; (iii) Vulnerability: identifying the most vulnerable agro-ecosystems using fuzzy multi-criteria and spatial analysis; (iv) Risk management: uncovering innovative risk management and adaptation options using actor-network theory and fuzzy cognitive mapping techniques; and (v) Communication: communicating to research, policy, and practitioner communities using web-based techniques. The tasks of the MERINOVA project require expertise in several scientific disciplines: meteorology, statistics, spatial database management, agronomy, bio-physical impact modelling, socio-economic modelling, actor-network theory, and fuzzy cognitive mapping. This expertise is shared among the four scientific partners, each of whom leads one work package. The MERINOVA project will concentrate on promoting a robust and flexible framework by demonstrating its performance across Belgian agro-ecosystems and by ensuring its relevance to policy makers and practitioners. Impacts derived from physically based models will not only provide information on the state of damage at any given time, but will also assist in understanding the links between the different factors causing damage and in determining bio-physical vulnerability. Socio-economic impacts will enlarge the basis for vulnerability mapping, risk management, and adaptation options. A strong expert and end-user network will be established to help disseminate and exploit project results to meet user needs.

  1. [The implementation of polymerase chain reaction technique: the real time to reveal and differentiate the viruses of human papilloma of high carcinogenic risk].

    PubMed

    Andosova, L D; Kontorshchikova, K N; Blatova, O L; Kudel'kina, S Iu; Kuznetsova, I A; Belov, A V; Baĭkova, R A

    2011-07-01

    The polymerase chain reaction technique in "real-time" format was applied to evaluate the occurrence rate and infection ratio of various genotypes of human papillomavirus of high carcinogenic risk in virus-positive women and contact persons. The examination sample consisted of 738 women aged 17-50 years. The results established a high rate of infection: 546 patients (74%) were infected with carcinogenic papillomaviruses. The analysis of the detection rates of the various genotypes of high carcinogenic risk established that the 56th and 16th types are revealed more often than the others, in 33% and 15.4% of cases, respectively. In males, the most frequently detected types of human papillomavirus were the 56th, n = 10 (33.3%); 16th, n = 3 (10%); 45th, n = 3 (10%); and 51st, n = 3 (10%). The remaining genotypes were detected in 3-7% of cases.

  2. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  3. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  4. Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments? A systematic review and meta-analysis.

    PubMed

    Al-Moraissi, E A; Elmansi, Y A; Al-Sharaee, Y A; Alrmali, A E; Alkhutari, A S

    2016-03-01

    A systematic review and meta-analysis was conducted to answer the clinical question "Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments?" A systematic and electronic search of several databases with specific key words, a reference search, and a manual search were performed from respective dates of inception through November 2014. The inclusion criteria were clinical human studies, including randomized controlled trials (RCTs), controlled clinical trials (CCTs), and retrospective studies, with the aim of comparing the piezoelectric surgical osteotomy technique to the standard rotary instrument technique in lower third molar surgery. Postoperative sequelae (oedema, trismus, and pain), the total number of analgesics taken, and the duration of surgery were analyzed. A total of nine articles were included, six RCTs, two CCTs, and one retrospective study. Six studies had a low risk of bias and three had a moderate risk of bias. A statistically significant difference was found between piezoelectric surgery and conventional rotary instrument surgery for lower third molar extraction with regard to postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken (P=0.0001, P=0.0001, P<0.00001, and P<0.0001, respectively). However, a statistically significant increased surgery time was required in the piezoelectric osteotomy group (P<0.00001). The results of the meta-analysis showed that piezoelectric surgery significantly reduced the occurrence of postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken compared to the conventional rotary instrument technique in lower third molar surgery, but required a longer surgery time. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
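
    The pooling step behind a meta-analysis like this one can be sketched as inverse-variance (fixed-effect) weighting; the effect sizes and standard errors below are invented placeholders, not the review's data:

```python
import numpy as np

# Hypothetical per-study effects (e.g., mean difference in oedema score)
# and their standard errors.
effects = np.array([-0.8, -1.1, -0.5])
se = np.array([0.30, 0.25, 0.40])

w = 1.0 / se**2                            # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)   # fixed-effect pooled estimate
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```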

  5. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effects of radon exposure on human health has been performed. It concludes that improving these models is both necessary and possible, and a new, improved version of the RDE has been developed. A technique for assessing the human health risk of radon exposure is described, comprising a method for estimating radon exposure doses, the improved RDE model, and an appropriate risk assessment methodology. The methodology is proposed for use in the territory of Russia.

  6. U.S. Air Force Research Laboratory's Need for Flow Physics and Control With Applications Involving Aero-Optics and Weapon Bay Cavities

    NASA Technical Reports Server (NTRS)

    Schmit, Ryan

    2010-01-01

    To develop new flow control techniques: a) knowledge of the flow physics with and without control; b) how flow control affects the flow physics (what works to optimize the design?); c) energy or work efficiency of the control technique (cost - risk - benefit analysis); d) supportability, e.g. size of equipment, computational power, power supply (allows the designer to include flow control in plans).

  7. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Techniques covered include: 1. Risk/Uncertainty Analysis - Project Scoring - Utility Scales - Relevance Tree Techniques (Reverse Factor Analysis); 2. Computer Simulation. [The remainder of this record is extraction residue; the recoverable fragments are a citation on the effectiveness of mathematical models for R&D project selection (Souder, W.E., Management Science, April 1973) and index terms: proficiency test scores (written), radiation effects on aircrew performance in radiation environments, reaction time.]

  8. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System.

    PubMed

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were used to assess the impact of each risk factor included in the RSS on graft failure at 1y. The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis or lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had previous blood transfusion, thus it had no impact. After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y.
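
    A compact sketch of the two analyses named above (Spearman correlation, then multivariate logistic regression), run on synthetic stand-in data rather than the 134-patient cohort:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 134
risk_score = rng.integers(0, 10, n)        # hypothetical RSS scores
fail = (rng.random(n) < 0.05 + 0.03 * risk_score).astype(int)

rho, p = spearmanr(risk_score, fail)       # score vs. 1-year failure
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")

# Multivariate logistic regression over the score plus one more
# hypothetical binary risk factor.
X = sm.add_constant(np.column_stack([risk_score, rng.integers(0, 2, n)]))
fit = sm.Logit(fail, X).fit(disp=0)
print(fit.params, fit.pvalues)
```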

  9. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System

    PubMed Central

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D.; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    AIM To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. METHODS The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were used to assess the impact of each risk factor included in the RSS on graft failure at 1y. RESULTS The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis or lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had previous blood transfusion, thus it had no impact. CONCLUSION After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y. PMID:28393027

  10. Improving default risk prediction using Bayesian model uncertainty techniques.

    PubMed

    Kazemi, Reza; Mosleh, Ali

    2012-11-01

    Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential of exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, with rating agencies dating back to the 19th century. They provide their assessment of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
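
    The flavor of the approach can be sketched as model averaging: each agency's default-probability estimate is weighted by its historical accuracy. The numbers are invented and the weighting is deliberately cruder than the article's full Bayesian treatment:

```python
import numpy as np

estimates = np.array([0.020, 0.035, 0.028])   # agencies' PD estimates
accuracy = np.array([0.90, 0.70, 0.80])       # historical accuracy scores

weights = accuracy / accuracy.sum()           # crude posterior model weights
pd_combined = float(weights @ estimates)
print(f"combined probability of default: {pd_combined:.4f}")
```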

  11. Guidance on individual monitoring programmes for radioisotopic techniques in molecular and cellular biology.

    PubMed

    Macías, M T; Navarro, T; Lavara, A; Robredo, L M; Sierra, I; Lopez, M A

    2003-01-01

    The radioisotope techniques used in molecular and cellular biology involve external and internal irradiation risks. The personal dosemeter may be a reasonable indicator for external irradiation; however, it is necessary to control the possible internal contamination associated with these techniques. The aim of this project is to analyse the most commonly used techniques and to establish programmes of internal monitoring for specific radionuclides (32P, 35S, 14C, 3H, 125I and 131I). To elaborate these programmes it was necessary to analyse the radioisotope techniques. Two models (NRPB and IAEA) were applied to the most significant techniques, according to the physical and chemical nature of the radionuclides, their potential importance in occupational exposure and the possible injury to the genetic material of the cell. The results allowed the identification of the techniques with a possible risk of internal contamination. It was then necessary to identify the groups of workers that require individual monitoring. The risk groups were established among the professionals exposed, according to several parameters: the general characteristics of the receptor, the radionuclides used (the same user can work with one, two or three radionuclides at the same time) and the results of the models applied. A control group was also established. The study of possible intakes in these groups was made by urinalysis and whole-body counting. The theoretical results are coherent with the experimental results and have allowed guidance on individual monitoring to be proposed. Basically, the document shows: (1) the analysis of the radioisotopic techniques, taking into account the special containment equipment; (2) the establishment of the need for individual monitoring; and (3) the required frequency of measurements in a routine programme.

  12. Geotechnical approach for occupational safety risk analysis of critical slope in open pit mining as implication for earthquake hazard

    NASA Astrophysics Data System (ADS)

    Munirwansyah; Irsyam, Masyhur; Munirwan, Reza P.; Yunita, Halida; Zulfan Usrina, M.

    2018-05-01

    Occupational safety and health (OSH) is a planned effort to prevent accidents and work-related diseases. Mining activities often involve work accidents caused by unsafe field conditions. In open-pit mines, landslides frequently occur on unstable slopes, which can disrupt the activities and productivity of mining companies. Building on research on open-pit slope stability by Febrianti [8], which indicated unsafe slope conditions at the Meureubo coal mine in Aceh Barat district, this study investigates OSH for landslides, examining the stability of the excavation slope and the shape of the slope failure. Plaxis software was used for this research. After analysing slope stability and the effect of landslides on OSH with the Job Safety Analysis (JSA) method to identify hazards to work safety, a risk management analysis was conducted to classify the hazard level and its handling techniques. The aim of this research is to determine the level of work accident risk at the company and the efforts needed to prevent it. The risk analysis yields a very high risk value (> 350), meaning that the activity must be stopped until the risk can be reduced below the allowed or accepted limit (< 20).
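
    The decision rule implied by the risk value limits above can be written down directly; the thresholds come from the abstract, while the underlying score (commonly probability x exposure x consequence) is an assumption of this sketch:

```python
def classify_risk(risk_value: float) -> str:
    """Map a risk value to the action implied by the study's limits."""
    if risk_value > 350:
        return "very high: stop activity until risk is reduced"
    if risk_value < 20:
        return "acceptable"
    return "reduce risk before proceeding"

print(classify_risk(400))   # very high: stop activity until risk is reduced
print(classify_risk(12))    # acceptable
```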

  13. Promoting patient safety through prospective risk identification: example from peri-operative care.

    PubMed

    Smith, A; Boult, M; Woods, I; Johnson, S

    2010-02-01

    Investigation of patient safety incidents has focused on retrospective analyses once incidents have occurred. Prospective risk analysis techniques complement this but have not been widely used in healthcare. Prospective risk identification of non-operative risks associated with adult elective surgery under general anaesthesia using a customised structured "what if" checklist and development of risk matrix. Prioritisation of recommendations arising by cost, ease and likely speed of implementation. Groups totalling 20 clinical and administrative healthcare staff involved in peri-operative care and risk experts convened by the UK National Patient Safety Agency. 102 risks were identified and 95 recommendations made. The top 20 recommendations together were judged to encompass about 75% of the total estimated risk attributable to the processes considered. Staffing and organisational issues (21% of total estimated risk) included recommendations for removing distractions from the operating theatre, ensuring the availability of senior anaesthetists and promoting standards and flexible working among theatre staff. Devices and equipment (19% of total estimated risk) could be improved by training and standardisation; airway control and temperature monitoring were identified as two specific areas. Pre-assessment of patients before admission to hospital (12% of estimated risk) could be improved by defining a data set for adequate pre-assessment and making this available throughout the NHS. This technique can be successfully applied by healthcare staff but expert facilitation of groups is advisable. Such wider-ranging processes can potentially lead to more comprehensive risk reduction than "single-issue" risk alerts.
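
    The prioritisation step (a top slice of ranked recommendations covering about 75% of total estimated risk) can be sketched as a cumulative-share computation over invented risk scores:

```python
import numpy as np

rng = np.random.default_rng(1)
scores = np.sort(rng.pareto(2.0, size=102))[::-1]   # 102 risks, heavy-tailed

share = np.cumsum(scores) / scores.sum()
n_top = int(np.searchsorted(share, 0.75) + 1)
print(f"top {n_top} risks cover 75% of total estimated risk")
```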

  14. Speciation evolution of zinc and copper during pyrolysis and hydrothermal carbonization treatments of sewage sludges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Rixiang; Zhang, Bei; Saad, Emily M.

    Thermal and hydrothermal treatments are promising techniques for sewage sludge management that can potentially facilitate safe waste disposal, energy recovery, and nutrient recovery/recycling. Content and speciation of heavy metals in the treatment products affect the potential environmental risks upon sludge disposal and/or application of the treatment products. Therefore, it is important to study the speciation transformation of heavy metals and the effects of treatment conditions. By combining synchrotron X-ray spectroscopy/microscopy analysis and sequential chemical extraction, this study systematically characterized the speciation of Zn and Cu in municipal sewage sludges and their chars derived from pyrolysis (a representative thermal treatment technique) and hydrothermal carbonization (HTC; a representative hydrothermal treatment technique). Spectroscopy analysis revealed enhanced sulfidation of Zn and Cu by anaerobic digestion and HTC treatments, as compared to desulfidation by pyrolysis. Overall, changes in the chemical speciation and matrix properties led to reduced mobility of Zn and Cu in the treatment products. These results provide insights into the reaction mechanisms during pyrolysis and HTC treatments of sludges and can help evaluate the environmental/health risks associated with the metals in the treatment products.

  15. Double-bundle anterior cruciate ligament reconstruction is superior to single-bundle reconstruction in terms of revision frequency: a study of 22,460 patients from the Swedish National Knee Ligament Register.

    PubMed

    Svantesson, Eleonor; Sundemo, David; Hamrin Senorski, Eric; Alentorn-Geli, Eduard; Musahl, Volker; Fu, Freddie H; Desai, Neel; Stålman, Anders; Samuelsson, Kristian

    2017-12-01

    Studies comparing single- and double-bundle anterior cruciate ligament (ACL) reconstructions often include a combined analysis of anatomic and non-anatomic techniques. The purpose of this study was to compare the revision rates between single- and double-bundle ACL reconstructions in the Swedish National Knee Ligament Register with regard to surgical variables as determined by the anatomic ACL reconstruction scoring checklist (AARSC). Patients from the Swedish National Knee Ligament Register who underwent either single- or double-bundle ACL reconstruction with hamstring tendon autograft during the period 2007-2014 were included. The follow-up period started with primary ACL reconstruction, and the outcome measure was set as revision surgery. An online questionnaire based on the items of the AARSC was used to determine the surgical technique implemented in the single-bundle procedures. These were organized into subgroups based on surgical variables, and the revision rates were compared with the double-bundle ACL reconstruction. Hazard ratios (HRs) with 95% confidence intervals (CIs) were calculated and adjusted for confounders by Cox regression. A total of 22,460 patients were included in the study, of which 21,846 were single-bundle and 614 were double-bundle ACL reconstructions. Double-bundle ACL reconstruction had a revision frequency of 2.0% (n = 12) and single-bundle 3.2% (n = 689). Single-bundle reconstruction had an increased risk of revision surgery compared with double-bundle [adjusted HR 1.98 (95% CI 1.12-3.51), p = 0.019]. The subgroup analysis showed a significantly increased risk of revision surgery in patients undergoing single-bundle with anatomic technique using transportal drilling [adjusted HR 2.51 (95% CI 1.39-4.54), p = 0.002] compared with double-bundle ACL reconstruction. Utilizing a more complete anatomic technique according to the AARSC lowered the hazard rate considerably when transportal drilling was performed but still resulted in a significantly increased risk of revision surgery compared with double-bundle ACL reconstruction [adjusted HR 1.87 (95% CI 1.04-3.38), p = 0.037]. Double-bundle ACL reconstruction is associated with a lower risk of revision surgery than single-bundle ACL reconstruction. Single-bundle procedures performed using the transportal femoral drilling technique had a significantly higher risk of revision surgery compared with double-bundle. However, a reference reconstruction with transportal drilling defined as a more complete anatomic reconstruction reduces the risk of revision surgery considerably. Level of evidence: III.
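
    A minimal sketch of the adjusted Cox regression used above, with the lifelines library on a synthetic six-patient frame; the column names are illustrative, not the register's variables:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_to_revision_or_censoring": [1.2, 3.5, 0.8, 5.0, 2.2, 4.1],
    "revised": [1, 0, 1, 0, 1, 0],
    "single_bundle": [1, 0, 1, 1, 0, 1],   # 1 = single-, 0 = double-bundle
    "age": [24, 31, 19, 27, 35, 22],       # a stand-in confounder
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_revision_or_censoring",
        event_col="revised")
cph.print_summary()   # exp(coef) column gives the hazard ratios with CIs
```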

  16. Risk factor analysis of new brain lesions associated with carotid endarterectomy.

    PubMed

    Lee, Jae Hoon; Suh, Bo Yang

    2014-01-01

    Carotid endarterectomy (CEA) is the standard treatment for carotid artery stenosis. New brain ischemia is a major concern associated with CEA, and diffusion weighted imaging (DWI) is a good imaging modality for detecting early ischemic brain lesions. We aimed to investigate the surgical complications and identify the potential risk factors for the incidence of new brain lesions (NBL) on DWI after CEA. From January 2006 to November 2011, 94 patients who had been studied by magnetic resonance imaging including DWI within 1 week after CEA were included in this study. Data were retrospectively investigated by review of the vascular registry protocol. Seven clinical variables and three procedural variables were analyzed as risk factors for NBL after CEA. The incidence of periprocedural NBL on DWI was 27.7%. There were no fatal complications, such as ipsilateral disabling stroke, myocardial infarction or mortality. A significantly higher incidence of NBL was found in ulcer-positive patients than in ulcer-negative patients (P = 0.029). The incidence of NBL after operation was significantly higher in patients treated with the conventional technique than with the eversion technique (P = 0.042). Our data show that CEA has acceptable periprocedural complication rates and that the presence of ulcerative plaque and the conventional endarterectomy technique are risk factors for NBL development after CEA.

  17. Laser power conversion system analysis, volume 1

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-orbit laser energy conversion system analysis established a mission model of satellites with various orbital parameters and average electrical power requirements ranging from 1 to 300 kW. The system analysis evaluated various conversion techniques, power system deployment parameters, power system electrical supplies and other critical subsystems relative to various combinations of the mission model. The analysis shows that the laser power system would not be competitive with current satellite power systems from weight, cost and development risk standpoints.

  18. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spread and control of the disease's spatial pattern and epidemics. Previous studies have shown that multi-factorial causation such as human behaviour, ecology and other infectious risk factors influences the disease outbreaks. Understanding the spatial pattern of the outbreaks and the possible interrelated factors is therefore crucial and warrants in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in the exploratory analysis of the cholera spatial pattern and distribution in a selected district of Sabah. The Spatial Statistics and Pattern tools in ArcGIS and Microsoft Excel software were utilized to map and analyze the reported cholera cases and other data used. A cohort study, an epidemiological technique, was applied to investigate multiple outcomes of the disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from a place or person to others, especially within 1500 metres of infected persons and locations. Although the cholera outbreaks in the district are not critical, the disease could become endemic in crowded areas, unhygienic environments, and places close to contaminated water. It is also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton blooms, since these areas recorded higher case numbers. GIS demonstrates a vital spatial epidemiological technique for determining the distribution pattern of the disease and generating hypotheses about it. Future research will apply advanced geo-analysis methods and other disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  19. Novel sensing technology in fall risk assessment in older adults: a systematic review.

    PubMed

    Sun, Ruopeng; Sosnoff, Jacob J

    2018-01-16

    Falls are a major health problem for older adults with significant physical and psychological consequences. A first step in successful fall prevention is to identify those at risk of falling. Recent advances in sensing technology offer the possibility of objective, low-cost and easy-to-implement fall risk assessment. The objective of this systematic review is to assess the current state of sensing technology in providing objective fall risk assessment in older adults. A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) statement. Twenty-two studies out of 855 articles were systematically identified and included in this review. Pertinent methodological features (sensing technique, assessment activities, outcome variables, and fall discrimination/prediction models) were extracted from each article. Four major sensing technologies (inertial sensors, video/depth cameras, pressure sensing platforms and laser sensing) were reported to provide accurate fall risk diagnostics in older adults. Steady-state walking, static/dynamic balance, and functional mobility were used as the assessment activities. A diverse range of diagnostic accuracy across studies (47.9%-100%) was reported, owing to variation in measured kinematic/kinetic parameters and modelling techniques. A wide range of sensor technologies has been utilized in fall risk assessment in older adults. Overall, these devices have the potential to provide an accurate, inexpensive, and easy-to-implement fall risk assessment. However, the variation in measured parameters, assessment tools, sensor sites, movement tasks, and modelling techniques precludes a firm conclusion on their ability to predict future falls. Future work is needed to determine a clinically meaningful and easy-to-interpret fall risk diagnosis utilizing sensing technology. Additionally, the gap between functional evaluation and users' experience with technology should be addressed.

  20. Spatial analysis on human brucellosis incidence in mainland China: 2004–2010

    PubMed Central

    Zhang, Junhui; Yin, Fei; Zhang, Tao; Yang, Chao; Zhang, Xingyu; Feng, Zijian; Li, Xiaosong

    2014-01-01

    Objectives China has experienced a sharply increasing rate of human brucellosis in recent years. Effective spatial monitoring of human brucellosis incidence is very important for successful implementation of control and prevention programmes. The purpose of this paper is to apply exploratory spatial data analysis (ESDA) methods and the empirical Bayes (EB) smoothing technique to monitor county-level incidence rates for human brucellosis in mainland China from 2004 to 2010 by examining spatial patterns. Methods ESDA methods were used to characterise spatial patterns of EB smoothed incidence rates for human brucellosis based on county-level data obtained from the China Information System for Disease Control and Prevention (CISDCP) in mainland China from 2004 to 2010. Results EB smoothed incidence rates for human brucellosis were spatially dependent during 2004–2010. The local Moran test identified significantly high-risk clusters of human brucellosis (all p values <0.01), which persisted during the 7-year study period. High-risk counties were centred in the Inner Mongolia Autonomous Region and other Northern provinces (ie, Hebei, Shanxi, Jilin and Heilongjiang provinces) around the border with the Inner Mongolia Autonomous Region where animal husbandry was highly developed. The number of high-risk counties increased from 25 in 2004 to 54 in 2010. Conclusions ESDA methods and the EB smoothing technique can assist public health officials in identifying high-risk areas. Allocating more resources to high-risk areas is an effective way to reduce human brucellosis incidence. PMID:24713215
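
    A sketch of the global analogue of the test used above: Moran's I on smoothed rates over a toy 10x10 lattice of counties, using libpysal/esda (synthetic rates, not the CISDCP data):

```python
import numpy as np
from libpysal.weights import lat2W
from esda.moran import Moran

rng = np.random.default_rng(2)
rates = rng.gamma(shape=2.0, scale=1.0, size=100)   # toy smoothed rates

w = lat2W(10, 10)   # rook-contiguity weights on a 10x10 county grid
mi = Moran(rates, w)
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")
```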

  1. Preparing data for analysis using microsoft Excel.

    PubMed

    Elliott, Alan C; Hynan, Linda S; Reisch, Joan S; Smith, Janet P

    2006-09-01

    A critical component essential to good research is the accurate and efficient collection and preparation of data for analysis. Most medical researchers have little or no training in data management, often causing not only excessive time spent cleaning data but also a risk that the data set contains collection or recording errors. The implementation of simple guidelines based on techniques used by professional data management teams will save researchers time and money and result in a data set better suited to answer research questions. Because Microsoft Excel is often used by researchers to collect data, specific techniques that can be implemented in Excel are presented.
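
    A sketch, in Python rather than Excel, of the kind of post-collection checks the guidelines aim at; the file and column names are hypothetical:

```python
import pandas as pd

df = pd.read_excel("study_data.xlsx")   # hypothetical collection sheet

# Flag out-of-range values and duplicated subject IDs before analysis.
bad_age = df[(df["age"] < 0) | (df["age"] > 120)]
dup_ids = df[df["subject_id"].duplicated(keep=False)]
print(f"{len(bad_age)} out-of-range ages, {len(dup_ids)} duplicated IDs")
```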

  2. Statistical innovations in the medical device world sparked by the FDA.

    PubMed

    Campbell, Gregory; Yue, Lilly Q

    2016-01-01

    The world of medical devices while highly diverse is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.
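
    One of the innovations named above, propensity scores for non-randomized studies, reduces to fitting a treatment-assignment model on baseline covariates; a sketch on synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 3))                # baseline covariates
p_treat = 1.0 / (1.0 + np.exp(-X[:, 0]))     # assignment depends on X
treated = (rng.random(200) < p_treat).astype(int)

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
# The scores then feed matching, stratification, or inverse-probability
# weighting in the outcome analysis.
print(ps[:5])
```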

  3. Early detection of coronary artery disease by 64-slice multidetector computed tomography in asymptomatic hypertensive high-risk patients.

    PubMed

    Gaudio, Carlo; Mirabelli, Francesca; Pelliccia, Francesco; Francone, Marco; Tanzilli, Gaetano; Di Michele, Sara; Leonetti, Stefania; De Vincentis, Giuseppe; Carbone, Iacopo; Mangieri, Enrico; Catalano, Carlo; Passariello, Roberto

    2009-07-10

    The 64-slice multidetector-row computed tomography (MDCT) is an accurate noninvasive technique for assessing the degree of luminal narrowing in coronary arteries of patients with chronic ischemic disease. The aim of this study was to determine the value of MDCT in comparison to invasive coronary angiography (ICA) for detecting the presence and extent of coronary atherosclerotic plaques in a population of asymptomatic, hypertensive patients considered to be at high risk for cardiovascular events. We studied 67 asymptomatic, hypertensive patients at high risk (Euro Score >5%). All patients had negative or nondiagnostic findings at exercise stress testing and therefore underwent both MDCT and ICA. In the per-patient analysis, MDCT correctly identified 16/17 (94%) patients with significant coronary artery disease involving at least 1 vessel and 48/50 (96%) normal subjects. In the per-segment analysis, MDCT correctly detected 21/22 (95%) coronary segments with a stenosis ≥50% and 856/868 (98%) normal segments, with a high negative predictive value for normal scans (100%). There was good concordance between MDCT and ICA, with a high Pearson correlation coefficient between the coronary narrowings measured with the two techniques (r=0.84, p<0.01). The mean coronary calcium score was higher for the 17 patients with significant coronary artery disease on ICA than for the 50 patients without (422 ± 223 HU vs 72 ± 21 HU; p<0.001). The ROC curves identified 160 as the best calcium volumetric score cut-off value able to identify ≥1 significant coronary stenosis, with sensitivity 88% and specificity 85%. MDCT is an excellent noninvasive technique for early identification of significant coronary stenoses in high-risk asymptomatic hypertensive patients and might provide unique information for the screening of this broad population.
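
    Choosing a cutoff such as the calcium score of 160 typically comes from maximizing Youden's J on the ROC curve; a sketch on synthetic scores and disease labels:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(4)
disease = rng.integers(0, 2, 67)
calcium = np.where(disease == 1,
                   rng.normal(420, 200, 67),   # diseased: higher scores
                   rng.normal(70, 30, 67))     # non-diseased: lower scores

fpr, tpr, thresholds = roc_curve(disease, calcium)
best = np.argmax(tpr - fpr)                    # Youden's J statistic
print(f"best cutoff = {thresholds[best]:.0f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```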

  4. Long-term follow-up results of umbilical hernia repair

    PubMed Central

    Venclauskas, Linas; Zilinskas, Justas; Zviniene, Kristina; Kiudelis, Mindaugas

    2017-01-01

    Introduction Multiple suture techniques and various mesh repairs are used in open or laparoscopic umbilical hernia (UH) surgery. Aim To compare long-term follow-up results of UH repair in different hernia surgery groups and to identify risk factors for UH recurrence. Material and methods A retrospective analysis of 216 patients who underwent elective surgery for UH during a 10-year period was performed. The patients were divided into three groups according to surgery technique (suture, mesh and laparoscopic repair). Early and long-term follow-up results including hospital stay, postoperative general and wound complications, recurrence rate and postoperative patient complaints were reviewed. Risk factors for recurrence were also analyzed. Results One hundred and forty-six patients were operated on using suture repair, 52 using open mesh and 18 using laparoscopic repair technique. 77.8% of patients underwent long-term follow-up. The postoperative wound complication rate and long-term postoperative complaints were significantly higher in the open mesh repair group. The overall hernia recurrence rate was 13.1%. Only 2 (1.7%) patients with small hernias (< 2 cm) had a recurrence in the suture repair group. Logistic regression analysis showed that body mass index (BMI) > 30 kg/m2, diabetes and wound infection were independent risk factors for umbilical hernia recurrence. Conclusions The overall umbilical hernia recurrence rate was 13.1%. Body mass index > 30 kg/m2, diabetes and wound infection were independent risk factors for UH recurrence. According to our study results, laparoscopic medium and large umbilical hernia repair has slight advantages over open mesh repair concerning early postoperative complications, long-term postoperative pain and recurrence. PMID:29362649

  5. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.
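
    A minimal challenge/response sketch of the kind listed above: the terminal issues a random challenge and the card proves knowledge of a shared key without revealing it. Key provisioning and storage are simplified away here:

```python
import hashlib
import hmac
import os

key = os.urandom(16)        # shared secret provisioned on the card

challenge = os.urandom(8)   # terminal-generated nonce
response = hmac.new(key, challenge, hashlib.sha256).digest()   # card side

# Terminal recomputes the MAC and compares in constant time.
expected = hmac.new(key, challenge, hashlib.sha256).digest()
print("authenticated:", hmac.compare_digest(response, expected))
```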

  6. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    PubMed

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC) on a case-control epidemiologic study that consists of 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between the random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that builds recursively a local graph that includes all the relevant features statistically associated to the NPC, without having to find the whole BN first. The local graph is afterwards directed by the domain expert according to his knowledge. It provides a statistical profile of the recruited population, and meanwhile helps identify the risk factors associated to NPC. Extensive experiments on synthetic data sampled from known BNs show that the HPC outperforms state-of-the-art algorithms that appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances that are used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house made proteins and sheep fat, are a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
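
    Constraint-based procedures such as HPC are built on repeated (conditional) independence tests; the simplest unconditional case is a chi-square test between one exposure and case status, sketched here on an invented 2x2 table:

```python
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[300, 150],    # exposed:   cases, controls
                  [250, 589]])   # unexposed: cases, controls

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # small p keeps the edge candidate
```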

  7. Statistical correlations and risk analyses techniques for a diving dual phase bubble model and data bank using massively parallel supercomputers.

    PubMed

    Wienke, B R; O'Leary, T R

    2008-05-01

    Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), dynamical principles, and correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, and helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, and Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, the USS Perry deep rebreather (RB) exploration dive, a world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, both in recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application.
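
    The fitting step can be sketched with SciPy's Levenberg-Marquardt least-squares routine; the saturating risk form and the data points below are placeholders, not the LANL model or Data Bank:

```python
import numpy as np
from scipy.optimize import least_squares

t = np.array([10.0, 20.0, 40.0, 60.0, 90.0])       # exposure times (min)
observed = np.array([0.001, 0.003, 0.008, 0.015, 0.030])   # observed risk

def residuals(params):
    a, b = params
    return a * (1.0 - np.exp(-b * t)) - observed   # L2 residuals

fit = least_squares(residuals, x0=[0.05, 0.01], method="lm")
print("fitted parameters:", fit.x)
```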

  8. Efficacy of preparation solutions and cleansing techniques on contamination of the skin in foot and ankle surgery: A systematic review and meta-analysis.

    PubMed

    Yammine, K; Harvey, A

    2013-04-01

    We report a systematic review and meta-analysis of published randomised and quasi-randomised trials evaluating the efficacy of pre-operative skin antisepsis and cleansing techniques in reducing foot and ankle skin flora. The post-preparation culture number (Post-PCN) was the primary outcome. The data were evaluated using a modified version of the Cochrane Collaboration’s tool. We identified eight trials (560 participants, 716 feet) that met the inclusion criteria. There was a significant difference in the proportions of Post-PCN between hallux nailfold (HNF) and toe web spaces (TWS) sites: 0.47 vs 0.22, respectively (95% confidence interval (CI) 0.182937 to 0.304097; p < 0.0001). Meta-analyses showed that alcoholic chlorhexidine had better efficacy than alcoholic povidone-iodine (PI) at HNF sites (risk difference 0.19 (95% CI 0.08 to 0.30); p = 0.0005); a two-step intervention using PI scrub and paint (S&P) followed by alcohol showed significantly better efficacy over PI (S&P) alone at TWS sites (risk difference 0.13 (95% CI 0.02 to 0.24); p = 0.0169); and a two-step intervention using chlorhexidine scrub followed by alcohol showed significantly better efficacy over PI (S&P) alone at the combined (HNF with TWS) sites (risk difference 0.27 (95% CI 0.13 to 0.40); p < 0.0001). No significant difference was found between cleansing techniques.
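
    The effect measure pooled above, a risk difference with a Wald-type interval, is simple to compute; the counts are invented for the sketch:

```python
import numpy as np

pos_a, n_a = 30, 100    # positive cultures, preparation A
pos_b, n_b = 17, 100    # positive cultures, preparation B

p_a, p_b = pos_a / n_a, pos_b / n_b
rd = p_a - p_b
se = np.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
print(f"RD = {rd:.2f}, 95% CI = ({rd - 1.96*se:.2f}, {rd + 1.96*se:.2f})")
```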

  9. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    PubMed

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.
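
    One common pre-processing step in fundus-image pipelines of the kind reviewed is contrast enhancement of the green channel with CLAHE; a sketch with OpenCV, where the file name is a placeholder:

```python
import cv2

img = cv2.imread("fundus.png")       # placeholder path to a fundus image
green = img[:, :, 1]                 # the green channel shows vessels best

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(green)        # contrast-limited local equalization
cv2.imwrite("fundus_enhanced.png", enhanced)
```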

  10. Improving the accuracy of effect-directed analysis: the role of bioavailability.

    PubMed

    You, Jing; Li, Huizhen

    2017-12-13

    Aquatic ecosystems have been suffering from contamination by multiple stressors. Traditional chemical-based risk assessment usually fails to explain the toxicity contributions from contaminants that are not regularly monitored or that have an unknown identity. Diagnosing the causes of noted adverse outcomes in the environment is of great importance in ecological risk assessment and in this regard effect-directed analysis (EDA) has been designed to fulfill this purpose. The EDA approach is now increasingly used in aquatic risk assessment owing to its specialty in achieving effect-directed nontarget analysis; however, a lack of environmental relevance makes conventional EDA less favorable. In particular, ignoring the bioavailability in EDA may cause a biased and even erroneous identification of causative toxicants in a mixture. Taking bioavailability into consideration is therefore of great importance to improve the accuracy of EDA diagnosis. The present article reviews the current status and applications of EDA practices that incorporate bioavailability. The use of biological samples is the most obvious way to include bioavailability into EDA applications, but its development is limited due to the small sample size and lack of evidence for metabolizable compounds. Bioavailability/bioaccessibility-based extraction (bioaccessibility-directed and partitioning-based extraction) and passive-dosing techniques are recommended to be used to integrate bioavailability into EDA diagnosis in abiotic samples. Lastly, the future perspectives of expanding and standardizing the use of biological samples and bioavailability-based techniques in EDA are discussed.

  11. Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.

    PubMed

    Khan, F I; Abbasi, S A

    2000-07-10

    Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
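
    The quantitative core of FTA is simple when basic events are independent: probabilities propagate through AND/OR gates to the top event. A minimal sketch with illustrative probabilities:

```python
def gate_or(*p):
    """P(at least one input event occurs), independent inputs."""
    prod = 1.0
    for x in p:
        prod *= (1.0 - x)
    return 1.0 - prod

def gate_and(*p):
    """P(all input events occur), independent inputs."""
    prod = 1.0
    for x in p:
        prod *= x
    return prod

pump, valve, sensor, operator = 1e-3, 5e-4, 1e-2, 1e-1   # base events
top = gate_or(gate_and(sensor, operator),   # detection branch fails
              gate_or(pump, valve))         # hardware branch fails
print(f"top event probability ~ {top:.2e}")
```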

  12. Selected considerations of implementation of the GNSS

    NASA Astrophysics Data System (ADS)

    Cwiklak, Janusz; Fellner, Andrzej; Fellner, Radoslaw; Jafernik, Henryk; Sledzinski, Janusz

    2014-05-01

    The article describes the safety and risk analysis for the implementation of precise approach procedures (Localizer Performance with Vertical guidance, LPV) with a GNSS sensor at the Warsaw and Katowice airports. Several hazard identification techniques (covering controlled flight into terrain, landing accidents, and mid-air collisions) and evaluation methods were applied, based on Fault Tree Analysis, risk probability, a safety risk evaluation matrix and Functional Hazard Assessment. Safety goals were also determined. The research determined the probabilities of the threats occurring and compared them with those of the ILS. As a result of conducting the Preliminary System Safety Assessment (PSSA), the requirements essential to reach the required level of safety were defined. It is worth underlining that the quantitative requirements were defined using FTA.

  13. Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks.

    PubMed

    Saad, E W; Prokhorov, D V; Wunsch, D C

    1998-01-01

    Three networks are compared for low false alarm stock trend predictions. Short-term trends, particularly attractive for neural network analysis, can be used profitably in scenarios such as option trading, but only with significant risk. Therefore, we focus on limiting false alarms, which improves the risk/reward ratio by preventing losses. To predict stock trends, we exploit time delay, recurrent, and probabilistic neural networks (TDNN, RNN, and PNN, respectively), utilizing conjugate gradient and multistream extended Kalman filter training for TDNN and RNN. We also discuss different predictability analysis techniques and perform an analysis of predictability based on a history of daily closing price. Our results indicate that all the networks are feasible, the primary preference being one of convenience.
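
    The time-delay input construction behind a TDNN can be sketched in a few lines: a sliding window of past closes becomes the delayed input taps, and the label is the next-day trend. Prices here are invented:

```python
import numpy as np

closes = np.array([100.0, 101.2, 100.8, 102.0,
                   101.5, 103.1, 102.7, 104.0])
window = 3

X = np.array([closes[i:i + window] for i in range(len(closes) - window)])
y = (closes[window:] > closes[window - 1:-1]).astype(int)   # 1 = up move
print(X.shape, y)   # each row of X feeds the network's delayed taps
```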

  14. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
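
    The simplest dependence-free bound reviewed in this literature is the Fréchet inequality: knowing only the marginals P(A) and P(B), P(A and B) is bounded for every possible dependence. A one-function sketch:

```python
def frechet_and(p_a: float, p_b: float) -> tuple:
    """Bounds on P(A and B) with unknown dependence between A and B."""
    lower = max(0.0, p_a + p_b - 1.0)   # extreme negative dependence
    upper = min(p_a, p_b)               # extreme positive dependence
    return lower, upper

print(frechet_and(0.3, 0.6))   # (0.0, 0.3): an interval, not a point value
```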

  15. DISEASE RISK ANALYSIS--A TOOL FOR POLICY MAKING WHEN EVIDENCE IS LACKING: IMPORT OF RABIES-SUSCEPTIBLE ZOO MAMMALS AS A MODEL.

    PubMed

    Hartley, Matt; Roberts, Helen

    2015-09-01

    Disease control management relies on the development of policy supported by an evidence base. The evidence base for disease in zoo animals is often absent or incomplete. Resources for disease research in these species are limited, and so in order to develop effective policies, novel approaches to extrapolating knowledge and dealing with uncertainty need to be developed. This article demonstrates how qualitative risk analysis techniques can be used to aid decision-making in circumstances in which there is a lack of specific evidence using the import of rabies-susceptible zoo mammals into the United Kingdom as a model.

  16. Usage of information safety requirements in improving tube bending process

    NASA Astrophysics Data System (ADS)

    Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.

    2018-05-01

    This article is devoted to improving the analysis of a technological process through the implementation of information security requirements. The aim of this research is to analyse the competitiveness gains available to aircraft industry enterprises from information technology implementation, using the tube bending technological process as an example. The article analyses the types of tube bending and current techniques. In addition, a potential risk analysis of the tube bending technological process is carried out in terms of information security.

  17. Detection of high-risk atherosclerotic lesions by time-resolved fluorescence spectroscopy based on the Laguerre deconvolution technique

    NASA Astrophysics Data System (ADS)

    Jo, J. A.; Fang, Q.; Papaioannou, T.; Qiao, J. H.; Fishbein, M. C.; Beseth, B.; Dorafshar, A. H.; Reil, T.; Baker, D.; Freischlag, J.; Marcu, L.

    2006-02-01

    This study introduces new methods of time-resolved laser-induced fluorescence spectroscopy (TR-LIFS) data analysis for tissue characterization. These analytical methods were applied for the detection of atherosclerotic vulnerable plaques. Upon pulsed nitrogen laser (337 nm, 1 ns) excitation, TR-LIFS measurements were obtained from carotid atherosclerotic plaque specimens (57 endarterectomy patients) at 492 distinct areas. The emission was both spectrally- (360-600 nm range at 5 nm interval) and temporally- (0.3 ns resolution) resolved using a prototype clinically compatible fiber-optic catheter TR-LIFS apparatus. The TR-LIFS measurements were subsequently analyzed using a standard multiexponential deconvolution and a recently introduced Laguerre deconvolution technique. Based on their histopathology, the lesions were classified as early (thin intima), fibrotic (collagen-rich intima), and high-risk (thin cap over necrotic core and/or inflamed intima). Stepwise linear discriminant analysis (SLDA) was applied for lesion classification. Normalized spectral intensity values and Laguerre expansion coefficients (LEC) at discrete emission wavelengths (390, 450, 500 and 550 nm) were used as features for classification. The Laguerre-based SLDA classifier provided discrimination of high-risk lesions with high sensitivity (SE>81%) and specificity (SP>95%). Based on these findings, we believe that TR-LIFS information derived from the Laguerre expansion coefficients can provide a valuable additional dimension for the diagnosis of high-risk vulnerable atherosclerotic plaques.
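
    The classification stage can be sketched as linear discriminant analysis on per-wavelength expansion coefficients; the features below are synthetic Gaussians, and the real study additionally used stepwise feature selection:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
X_high = rng.normal(0.5, 0.1, size=(40, 8))    # coefficients, high-risk
X_other = rng.normal(0.3, 0.1, size=(60, 8))   # coefficients, early/fibrotic
X = np.vstack([X_high, X_other])
y = np.array([1] * 40 + [0] * 60)

clf = LinearDiscriminantAnalysis().fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```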

  18. Anterior segment sparing to reduce charged particle radiotherapy complications in uveal melanoma

    NASA Technical Reports Server (NTRS)

    Daftari, I. K.; Char, D. H.; Verhey, L. J.; Castro, J. R.; Petti, P. L.; Meecham, W. J.; Kroll, S.; Blakely, E. A.; Chatterjee, A. (Principal Investigator)

    1997-01-01

    PURPOSE: The purpose of this investigation is to delineate the risk factors in the development of neovascular glaucoma (NVG) after helium-ion irradiation of uveal melanoma patients and to propose a treatment technique that may reduce this risk. METHODS AND MATERIALS: 347 uveal melanoma patients were treated with helium ions using a single-port treatment technique. Using univariate and multivariate statistics, the NVG complication rate was analyzed according to the percent of anterior chamber in the radiation field, tumor size, tumor location, sex, age, dose, and other risk factors. Several University of California San Francisco-Lawrence Berkeley National Laboratory (LBNL) patients in each size category (medium, large, and extralarge) were retrospectively replanned using two ports instead of a single port. By using appropriate polar and azimuthal gaze angles or by treating patients with two ports, the maximum dose to the anterior segment of the eye can often be reduced. Although a larger volume of the anterior chamber may receive a lower dose using two ports than with a single-port treatment, we hypothesize that this could reduce the level of complications that result from the irradiation of the anterior chamber of the eye. Dose-volume histograms were calculated for the lens and compared for the single- and two-port techniques. RESULTS: NVG developed in 121 (35%) patients. The risk of NVG peaked between 1 and 2.5 years posttreatment. By univariate and multivariate analysis, the percent of lens in the field was strongly correlated with the development of NVG. Other contributing factors were tumor height, history of diabetes, and vitreous hemorrhage. Dose-volume histogram analysis of single-port vs. two-port techniques demonstrates that for some patients in the medium and large tumor groups, a significant decrease in dose to the structures in the anterior segment of the eye could have been achieved with the use of two ports. CONCLUSION: The development of NVG after helium-ion irradiation is correlated with the amount of lens and anterior chamber in the treatment field, tumor height, proximity to the fovea, history of diabetes, and the development of vitreous hemorrhage. Although the influence of the higher LET deposition of helium ions is unclear, this study suggests that reducing the dose to the anterior segment of the eye may reduce NVG complications. Based on this retrospective analysis of LBNL patients, we have implemented techniques to reduce the amount of the anterior segment receiving a high dose in our new series of patients treated with protons using the cyclotron at the UC Davis Crocker Nuclear Laboratory (CNL).

  19. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
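
    The frontier ranking itself is ordinary non-dominated sorting: peel off the non-dominated cells (higher means riskier on every criterion), assign them rank 1, remove them, and repeat. A sketch on random placeholder criteria:

```python
import numpy as np

def frontier_ranks(points: np.ndarray) -> np.ndarray:
    """Rank rows by successive multiattribute (Pareto) frontiers."""
    ranks = np.zeros(len(points), dtype=int)
    remaining = np.arange(len(points))
    rank = 1
    while remaining.size:
        pts = points[remaining]
        nondom = [i for i in range(len(pts))
                  if not np.any((pts >= pts[i]).all(axis=1) &
                                (pts > pts[i]).any(axis=1))]
        ranks[remaining[nondom]] = rank
        remaining = np.delete(remaining, nondom)
        rank += 1
    return ranks

cells = np.random.default_rng(6).random((10, 3))   # 3 risk criteria
print(frontier_ranks(cells))   # rank 1 = outermost frontier
```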

  20. Analysis of flood modeling through innovative geomatic methods

    NASA Astrophysics Data System (ADS)

    Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo

    2015-05-01

    A suitable assessment and management of the exposure level to natural flood risks necessarily requires exhaustive knowledge of the terrain. This study, primarily aimed at evaluating flood risk, first assesses the suitability of an innovative technique, called Reduced Cost Aerial Precision Photogrammetry (RC-APP), based on an ultra-light motorized aircraft (ULM) combined with hybridized low-cost sensors, for the acquisition of geospatial information. The resulting RC-APP technique is found to be a more accurate and precise, more economical, and less time-consuming geomatic product, and it is applied in river engineering for geometric modeling and flood risk assessment. Through the application of RC-APP, a high-spatial-resolution image (orthophoto of 2.5 cm) and a Digital Elevation Model (DEM) with a 0.10 m mesh size and a high point density (about 100 points/m2), with an altimetric accuracy of -0.02 ± 0.03 m, have been obtained. These products have provided detailed knowledge of the terrain, subsequently used for hydraulic simulation, which has allowed a better definition of the inundated area, with important implications for flood risk assessment and management. In this sense, it should be noted that the achieved DEM spatial resolution of 0.10 m is especially useful in hydraulic simulations with 2D software. According to the results, the developed methodology and technology allow for a more accurate riverbed representation than traditional techniques such as Light Detection and Ranging (LiDAR), which has a Root-Mean-Square Error (RMSE) of ±0.50 m; the comparison revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique arises as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model has allowed a significant increase in the knowledge of hydraulic variables in natural waterways.
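
    The reported altimetric accuracy amounts to a bias-plus-RMSE check of the DEM against surveyed checkpoints; a minimal sketch (with hypothetical checkpoint heights) is:

        import numpy as np

        dem_z = np.array([102.31, 98.74, 101.05, 99.60])   # DEM heights (m)
        gnss_z = np.array([102.33, 98.70, 101.08, 99.63])  # checkpoint heights (m)

        err = dem_z - gnss_z
        print(f"mean error = {err.mean():+.3f} m (bias), "
              f"RMSE = {np.sqrt(np.mean(err ** 2)):.3f} m")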

  1. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  2. An overview of techniques for linking high-dimensional molecular data to time-to-event endpoints by risk prediction models.

    PubMed

    Binder, Harald; Porzelius, Christine; Schumacher, Martin

    2011-03-01

    Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when artificially transforming the response into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
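
    A minimal sketch of regularized estimation for a time-to-event endpoint, here using an elastic-net-penalized Cox model from the lifelines library as a stand-in for the stage-wise/boosting approaches discussed above (the simulated "gene" covariates are hypothetical):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n, p = 200, 20                                   # many covariates, few signals
        X = rng.normal(size=(n, p))
        hazard = np.exp(0.8 * X[:, 0] - 0.5 * X[:, 1])   # only two active covariates
        df = pd.DataFrame(X, columns=[f"g{i}" for i in range(p)])
        df["time"] = rng.exponential(1.0 / hazard)
        df["event"] = (rng.random(n) < 0.7).astype(int)  # roughly 30% censoring

        # The penalty shrinks noise coefficients toward zero.
        cph = CoxPHFitter(penalizer=0.3, l1_ratio=0.5)
        cph.fit(df, duration_col="time", event_col="event")
        print(cph.params_.sort_values(key=np.abs, ascending=False).head())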

  3. A Meta-Analysis and Multisite Time-Series Analysis of the Differential Toxicity of Major Fine Particulate Matter Constituents

    PubMed Central

    Levy, Jonathan I.; Diez, David; Dou, Yiping; Barr, Christopher D.; Dominici, Francesca

    2012-01-01

    Health risk assessments of particulate matter less than 2.5 μm in diameter (PM2.5) often assume that all constituents of PM2.5 are equally toxic. While investigators in previous epidemiologic studies have evaluated health risks from various PM2.5 constituents, few have conducted the analyses needed to directly inform risk assessments. In this study, the authors performed a literature review and conducted a multisite time-series analysis of hospital admissions and exposure to PM2.5 constituents (elemental carbon, organic carbon matter, sulfate, and nitrate) in a population of 12 million US Medicare enrollees for the period 2000–2008. The literature review illustrated a general lack of multiconstituent models or insight about probabilities of differential impacts per unit of concentration change. Consistent with previous results, the multisite time-series analysis found statistically significant associations between short-term changes in elemental carbon and cardiovascular hospital admissions. Posterior probabilities from multiconstituent models provided evidence that some individual constituents were more toxic than others, and posterior parameter estimates coupled with correlations among these estimates provided necessary information for risk assessment. Ratios of constituent toxicities, commonly used in risk assessment to describe differential toxicity, were extremely uncertain for all comparisons. These analyses emphasize the subtlety of the statistical techniques and epidemiologic studies necessary to inform risk assessments of particle constituents. PMID:22510275

  4. Heart failure disease management programs: a cost-effectiveness analysis.

    PubMed

    Chan, David C; Heidenreich, Paul A; Weinstein, Milton C; Fonarow, Gregg C

    2008-02-01

    Heart failure (HF) disease management programs have shown impressive reductions in hospitalizations and mortality, but in studies limited to short time frames and high-risk patient populations. Current guidelines thus only recommend disease management targeted to high-risk patients with HF. This study applied a new technique to infer the degree to which clinical trials have targeted patients by risk based on observed rates of hospitalization and death. A Markov model was used to assess the incremental life expectancy and cost of providing disease management for high-risk to low-risk patients. Sensitivity analyses of various long-term scenarios and of reduced effectiveness in low-risk patients were also considered. The incremental cost-effectiveness ratio of extending coverage to all patients was $9700 per life-year gained in the base case. In aggregate, universal coverage almost quadrupled life-years saved as compared to coverage of only the highest quintile of risk. A worst case analysis with simultaneous conservative assumptions yielded an incremental cost-effectiveness ratio of $110,000 per life-year gained. In a probabilistic sensitivity analysis, 99.74% of possible incremental cost-effectiveness ratios were <$50,000 per life-year gained. Heart failure disease management programs are likely cost-effective in the long-term along the whole spectrum of patient risk. Health gains could be extended by enrolling a broader group of patients with HF in disease management.
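
    A minimal sketch of the kind of Markov cohort calculation behind an incremental cost-effectiveness ratio (ICER); the transition probabilities, costs, and discount rate below are hypothetical, not the study's inputs:

        def life_years_and_cost(p_die, annual_cost, horizon=20, disc_rate=0.03):
            # Two-state (alive/dead) cohort model with discounting.
            alive, ly, cost = 1.0, 0.0, 0.0
            for year in range(horizon):
                disc = (1.0 + disc_rate) ** -year
                ly += alive * disc
                cost += alive * annual_cost * disc
                alive *= 1.0 - p_die
            return ly, cost

        ly_uc, cost_uc = life_years_and_cost(p_die=0.12, annual_cost=8000)   # usual care
        ly_dm, cost_dm = life_years_and_cost(p_die=0.10, annual_cost=9500)   # disease mgmt

        icer = (cost_dm - cost_uc) / (ly_dm - ly_uc)
        print(f"ICER = ${icer:,.0f} per life-year gained")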

  5. Increased pulmonary alveolar-capillary permeability in patients at risk for adult respiratory distress syndrome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tennenberg, S.D.; Jacobs, M.P.; Solomkin, J.S.

    1987-04-01

    Two methods for predicting adult respiratory distress syndrome (ARDS) were evaluated prospectively in a group of 81 multitrauma and sepsis patients considered at clinical high risk. A popular ARDS risk-scoring method, employing discriminant analysis equations (weighted risk criteria and oxygenation characteristics), yielded a predictive accuracy of 59% and a false-negative rate of 22%. Pulmonary alveolar-capillary permeability (PACP) was determined with a radioaerosol lung-scan technique in 23 of these 81 patients, representing a statistically similar subgroup. Lung scanning achieved a predictive accuracy of 71% (after excluding patients with unilateral pulmonary contusion) and gave no false-negatives. We propose a combination of clinical risk identification and functional determination of PACP to assess a patient's risk of developing ARDS.

  6. Magnetic resonance angiography for the nonpalpable testis: a cost and cancer risk analysis.

    PubMed

    Eggener, S E; Lotan, Y; Cheng, E Y

    2005-05-01

    For the unilateral nonpalpable testis, standard management is open surgical or laparoscopic exploration. An ideal imaging technique would reliably identify testicular nubbins and safely allow children to forgo surgical exploration without compromising future health or fertility. Our goal was to perform a cost and risk analysis of magnetic resonance angiography (MRA) for unilateral nonpalpable cryptorchid testes. A search of the English medical literature revealed 3 studies addressing the usefulness of MRA for the nonpalpable testicle. We performed a meta-analysis and applied the results to a hypothetical set of patients using historical testicular localization data. Analysis was then performed using 3 different management protocols: MRA with removal of testicular nubbin tissue, MRA with observation of testicular nubbin tissue, and diagnostic laparoscopy. A cancer risk and cost analysis was then performed. MRA with observation of testicular nubbin tissue results in 29% of patients avoiding surgery without any increased cost of care. Among the 29% of boys with testicular nubbins left in situ and observed, the highest estimated risk was 1 in 300 of cancer developing, and 1 in 5,300 of dying of cancer. A protocol using MRA with observation of inguinal nubbins results in nearly a third of boys avoiding surgical intervention at a cost similar to standard care, without any significant increase in the risk of developing testis cancer.

  7. Rationale and Clinical Techniques for Anterior Cruciate Ligament Injury Prevention Among Female Athletes

    PubMed Central

    Myer, Gregory D; Ford, Kevin R; Hewett, Timothy E

    2004-01-01

    Objective: To present the rationale and detailed techniques for the application of exercises targeted to prevent anterior cruciate ligament (ACL) injury in high-risk female athletes. Background: Female athletes have a 4- to 6-fold increased risk for ACL injury compared with their male counterparts playing at similar levels in the same sports. The increased ACL injury risk coupled with greater sports participation by young women over the last 30 years (9-fold increase in high school and 5-fold increase in collegiate sports) has generated public awareness and fueled several sex-related mechanistic and interventional investigations. These investigations provide the groundwork for the development of neuromuscular training aimed at targeting identified neuromuscular imbalances to decrease ACL injury risk. Description: After the onset of puberty, female athletes may not have a neuromuscular spurt to match their similar, rapid increase in growth and development. The lack of a natural neuromuscular adaptation may facilitate the development of neuromuscular imbalances that increase the risk for ACL injury. Dynamic neuromuscular analysis training provides the methodologic approach for identifying high-risk individuals and the basis of using interventions targeted to their specific needs. Clinical Advantages: Dynamic neuromuscular training applied to the high-risk population may decrease ACL injury risk and help more female athletes enjoy the benefits of sports participation without the long-term disabilities associated with injury. PMID:15592608

  8. Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management.

    PubMed

    Ganin, Alexander A; Quach, Phuoc; Panwar, Mahesh; Collier, Zachary A; Keisler, Jeffrey M; Marchese, Dayton; Linkov, Igor

    2017-09-05

    Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate biases and subjectivity necessary for selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decisionmaker values and technical data. Published 2017. This article is a U.S. Government work and is in the public domain in the U.S.A.
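
    A minimal sketch of the linear additive value model at the core of such a framework: each management alternative is scored on the criteria, and a weighted utility ranks them. The alternatives, scores, and weights below are hypothetical:

        import numpy as np

        # Criteria: threat reduction, vulnerability reduction,
        # consequence mitigation, affordability (all scored 0-1, 1 = best).
        weights = np.array([0.3, 0.3, 0.3, 0.1])  # elicited from stakeholders

        scores = np.array([
            [0.8, 0.6, 0.5, 0.4],   # network segmentation
            [0.5, 0.9, 0.4, 0.7],   # patch management program
            [0.6, 0.5, 0.9, 0.2],   # dedicated incident response team
        ])
        for name, u in zip(["segmentation", "patching", "response team"],
                           scores @ weights):
            print(f"{name}: utility = {u:.2f}")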

  9. Sensitivity analysis for direct and indirect effects in the presence of exposure-induced mediator-outcome confounders

    PubMed Central

    Chiba, Yasutaka

    2014-01-01

    Questions of mediation are often of interest in reasoning about mechanisms, and methods have been developed to address these questions. However, these methods make strong assumptions about the absence of confounding. Even if exposure is randomized, there may be mediator-outcome confounding variables. Inference about direct and indirect effects is particularly challenging if these mediator-outcome confounders are affected by the exposure because in this case these effects are not identified irrespective of whether data is available on these exposure-induced mediator-outcome confounders. In this paper, we provide a sensitivity analysis technique for natural direct and indirect effects that is applicable even if there are mediator-outcome confounders affected by the exposure. We give techniques for both the difference and risk ratio scales and compare the technique to other possible approaches. PMID:25580387

  10. A non-Gaussian approach to risk measures

    NASA Astrophysics Data System (ADS)

    Bormetti, Giacomo; Cisana, Enrica; Montagna, Guido; Nicrosini, Oreste

    2007-03-01

    Reliable calculations of financial risk require that the fat-tailed nature of prices changes is included in risk measures. To this end, a non-Gaussian approach to financial risk management is presented, modelling the power-law tails of the returns distribution in terms of a Student- t distribution. Non-Gaussian closed-form solutions for value-at-risk and expected shortfall are obtained and standard formulae known in the literature under the normality assumption are recovered as a special case. The implications of the approach for risk management are demonstrated through an empirical analysis of financial time series from the Italian stock market and in comparison with the results of the most widely used procedures of quantitative finance. Particular attention is paid to quantify the size of the errors affecting the market risk measures obtained according to different methodologies, by employing a bootstrap technique.
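
    The closed-form Student-t risk measures mentioned above follow directly from the t quantile and density; a minimal sketch with hypothetical fitted parameters is:

        from scipy.stats import t

        mu, s, nu = 0.0005, 0.012, 4.0   # location, scale, degrees of freedom
        alpha = 0.01                     # 1% tail -> 99% confidence level

        q = t.ppf(alpha, nu)             # left-tail quantile of the standard t
        var = -(mu + s * q)              # value-at-risk, reported as a positive loss
        es = -mu + s * t.pdf(q, nu) * (nu + q * q) / ((nu - 1.0) * alpha)

        print(f"99% VaR = {var:.4f}, 99% ES = {es:.4f}")
        # As nu -> infinity both expressions converge to the Gaussian formulas,
        # the special case recovered in the paper.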

  11. Micrometeoroid and Orbital Debris Risk Assessment With Bumper 3

    NASA Technical Reports Server (NTRS)

    Hyde, J.; Bjorkman, M.; Christiansen, E.; Lear, D.

    2017-01-01

    The Bumper 3 computer code is the primary tool used by NASA for micrometeoroid and orbital debris (MMOD) risk analysis. Bumper 3 (and its predecessors) have been used to analyze a variety of manned and unmanned spacecraft. The code uses NASA's latest micrometeoroid (MEM-R2) and orbital debris (ORDEM 3.0) environment definition models and is updated frequently with ballistic limit equations that describe the hypervelocity impact performance of spacecraft materials. The Bumper 3 program uses these inputs along with a finite element representation of spacecraft geometry to provide a deterministic calculation of the expected number of failures. The Bumper 3 software is configuration controlled by the NASA/JSC Hypervelocity Impact Technology (HVIT) Group. This paper will demonstrate MMOD risk assessment techniques with Bumper 3 used by NASA's HVIT Group. The Permanent Multipurpose Module (PMM) was added to the International Space Station in 2011. A Bumper 3 MMOD risk assessment of this module will show techniques used to create the input model and assign the property IDs. The methodology used to optimize the MMOD shielding for minimum mass while still meeting structural penetration requirements will also be demonstrated.
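
    The deterministic failure calculation reduces to an expected number of penetrations N (flux x exposed area x time, summed over surfaces), with impacts treated as a Poisson process; the flux and area values below are hypothetical, not MEM/ORDEM output:

        import math

        # (penetrating flux [impacts/m^2/yr], exposed area [m^2]) per surface
        surfaces = [(1.2e-6, 15.0), (3.0e-7, 22.0), (8.5e-7, 9.0)]
        years = 10.0

        N = sum(flux * area * years for flux, area in surfaces)
        print(f"expected penetrations N = {N:.3e}, "
              f"P(no penetration) = {math.exp(-N):.5f}")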

  12. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
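
    A minimal sketch of one common aggregation step, a weighted linear opinion pool: each expert's encoded distribution is mixed using calibration-based weights. The expert parameters and weights are hypothetical:

        import numpy as np

        def normal_pdf(x, mu, sigma):
            return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

        x = np.linspace(900, 1500, 601)                 # e.g., vehicle dry mass grid (kg)
        experts = [(1100, 60), (1180, 90), (1050, 40)]  # (mean, sd) per expert
        weights = np.array([0.5, 0.2, 0.3])             # from calibration exercises

        pooled = sum(w * normal_pdf(x, m, s) for w, (m, s) in zip(weights, experts))
        mean = (x * pooled).sum() / pooled.sum()        # mean of aggregated distribution
        print(f"pooled mean = {mean:.0f} kg")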

  13. Use of a Survival Analysis Technique in Understanding Game Performance in Instructional Games. CRESST Report 812

    ERIC Educational Resources Information Center

    Kim, Jinok; Chung, Gregory K. W. K.

    2012-01-01

    In this study we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. 137 students were randomly assigned to two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
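
    Discrete-time survival analysis is typically fit as a logistic regression on a person-period data set: each player contributes one row per level attempted, and the outcome is whether they failed to advance at that level. A minimal sketch with simulated data (variable names and effect sizes are hypothetical):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        rows = []
        for player in range(137):
            condition = player % 2                   # 0/1 game design condition
            for level in range(1, 6):                # up to 5 levels
                p_fail = 0.25 - 0.08 * condition + 0.03 * level
                failed = int(rng.random() < p_fail)
                rows.append({"level": level, "condition": condition,
                             "failed": failed})
                if failed:
                    break                            # player leaves the risk set

        pp = pd.DataFrame(rows)                      # person-period data set
        fit = smf.logit("failed ~ C(level) + condition", data=pp).fit(disp=False)
        print(fit.params)        # negative condition coefficient = lower hazard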

  14. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement: Advancing Risk Analysis for Nanoscale Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shatkin, J. A.; Ong, Kimberly J.; Beaudrie, Christian

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article.

  15. Advancing Risk Analysis for Nanoscale Materials: Report from an International Workshop on the Role of Alternative Testing Strategies for Advancement.

    PubMed

    Shatkin, J A; Ong, Kimberly J; Beaudrie, Christian; Clippinger, Amy J; Hendren, Christine Ogilvie; Haber, Lynne T; Hill, Myriam; Holden, Patricia; Kennedy, Alan J; Kim, Baram; MacDonell, Margaret; Powers, Christina M; Sharma, Monita; Sheremeta, Lorraine; Stone, Vicki; Sultan, Yasir; Turley, Audrey; White, Ronald H

    2016-08-01

    The Society for Risk Analysis (SRA) has a history of bringing thought leadership to topics of emerging risk. In September 2014, the SRA Emerging Nanoscale Materials Specialty Group convened an international workshop to examine the use of alternative testing strategies (ATS) for manufactured nanomaterials (NM) from a risk analysis perspective. Experts in NM environmental health and safety, human health, ecotoxicology, regulatory compliance, risk analysis, and ATS evaluated and discussed the state of the science for in vitro and other alternatives to traditional toxicology testing for NM. Based on this review, experts recommended immediate and near-term actions that would advance ATS use in NM risk assessment. Three focal areas-human health, ecological health, and exposure considerations-shaped deliberations about information needs, priorities, and the next steps required to increase confidence in and use of ATS in NM risk assessment. The deliberations revealed that ATS are now being used for screening, and that, in the near term, ATS could be developed for use in read-across or categorization decision making within certain regulatory frameworks. Participants recognized that leadership is required from within the scientific community to address basic challenges, including standardizing materials, protocols, techniques and reporting, and designing experiments relevant to real-world conditions, as well as coordination and sharing of large-scale collaborations and data. Experts agreed that it will be critical to include experimental parameters that can support the development of adverse outcome pathways. Numerous other insightful ideas for investment in ATS emerged throughout the discussions and are further highlighted in this article. © 2016 Society for Risk Analysis.

  16. Water pollution risk associated with natural gas extraction from the Marcellus Shale.

    PubMed

    Rozell, Daniel J; Reaven, Sheldon J

    2012-08-01

    In recent years, shale gas formations have become economically viable through the use of horizontal drilling and hydraulic fracturing. These techniques carry potential environmental risk due to their high water use and substantial risk for water pollution. Using probability bounds analysis, we assessed the likelihood of water contamination from natural gas extraction in the Marcellus Shale. Probability bounds analysis is well suited when data are sparse and parameters highly uncertain. The study model identified five pathways of water contamination: transportation spills, well casing leaks, leaks through fractured rock, drilling site discharge, and wastewater disposal. Probability boxes were generated for each pathway. The potential contamination risk and epistemic uncertainty associated with hydraulic fracturing wastewater disposal was several orders of magnitude larger than the other pathways. Even in a best-case scenario, it was very likely that an individual well would release at least 200 m³ of contaminated fluids. Because the total number of wells in the Marcellus Shale region could range into the tens of thousands, this substantial potential risk suggested that additional steps be taken to reduce the potential for contaminated fluid leaks. To reduce the considerable epistemic uncertainty, more data should be collected on the ability of industrial and municipal wastewater treatment facilities to remove contaminants from used hydraulic fracturing fluid. © 2012 Society for Risk Analysis.
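
    The spirit of probability bounds analysis can be sketched by propagating interval-valued parameters and reporting bounds on the risk instead of a point estimate. The leak-volume model and intervals below are hypothetical, and evaluating only the interval corners is a simplification that assumes monotone dependence:

        import numpy as np
        from scipy.stats import lognorm

        # Release volume per well ~ lognormal with interval-valued parameters.
        mu_interval = (5.0, 6.0)      # bounds on the log-mean (log m^3)
        sigma_interval = (0.5, 1.0)   # bounds on the log-sd
        threshold = 200.0             # m^3

        probs = [lognorm.sf(threshold, s=sig, scale=np.exp(mu))
                 for mu in mu_interval for sig in sigma_interval]
        print(f"P(release >= {threshold:.0f} m^3) in "
              f"[{min(probs):.3f}, {max(probs):.3f}]")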

  17. Fracture risk assessment: improved evaluation of vertebral integrity among metastatic cancer patients to aid in surgical decision-making

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.

    2012-03-01

    Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with analytic techniques that have been previously developed to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester has developed an image analysis computer program that calculates from CT scans, the residual load-bearing capacity in a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple parallel section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and utilization of reliable guidelines for selecting among various treatment option based on fracture risk.

  18. Applicability of the Common Safety Method for Risk Evaluation and Assessment (CSM-RA) to the Space Domain

    NASA Astrophysics Data System (ADS)

    Moreira, Francisco; Silva, Nuno

    2016-08-01

    Safety systems require accident avoidance. This is covered by application standards, processes, techniques, and tools that support the identification, analysis, and elimination of system risks and hazards, or their reduction to an acceptable level. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially in what concerns completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. It highlights when and how to apply these processes, as well as their relation and similarities to industry standards and system life cycles. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.

  19. Design of risk communication strategies based on risk perception among farmers exposed to pesticides in Rio de Janeiro State, Brazil.

    PubMed

    Peres, Frederico; Rodrigues, Karla Meneses; da Silva Peixoto Belo, Mariana Soares; Moreira, Josino Costa; Claudio, Luz

    2013-01-01

    This study aims to assess pesticide exposure risk perception among farmers from three rural areas of Nova Friburgo, Rio de Janeiro State, Brazil. Data were collected through semi-structured interviews with 66 adults and participatory workshops with 27 teenagers and analyzed through content analysis techniques. Systematized results were discussed at local meetings, and two risk communication initiatives were devised. Study results demonstrated the use of defensive strategies by men and a diminished risk perception among women. Teenagers relied on parents to develop their own work practices. These findings supported the importance of cultural and social determinants of farmers' understandings of risk and of the relevance of different pesticide exposure pathways. Risk perceptions and work practices are strongly influenced by local cultural patterns and, therefore, must be taken into account when developing effective intervention strategies, including risk communication initiatives. Copyright © 2012 Wiley Periodicals, Inc.

  20. EXTRACTION TECHNIQUES FOR THE REMOVAL OF ARSENICALS FROM SEAFOOD EXPOSURE MATRICES WITH ICP-MS DETECTION

    EPA Science Inventory

    Most of the existing arsenic dietary databases were developed from the analysis of total arsenic in water and dietary samples. These databases have been used to estimate arsenic exposure and in turn human health risk. However, these dietary databases are becoming obsolete as the ...

  1. Beginning Special Education Teachers: At Risk for Attrition.

    ERIC Educational Resources Information Center

    Karge, Belinda Dunnick; Freiberg, Melissa R.

    Recognizing the importance of early experience to job satisfaction and commitment, this study was conducted to investigate the effect of support from administration on the induction and retention of 457 beginning public school, special education teachers. Secondary analysis techniques were applied to information derived from the 1987-88 cross…

  2. Developing and Assessing E-Learning Techniques for Teaching Forecasting

    ERIC Educational Resources Information Center

    Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian

    2014-01-01

    In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…

  3. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    DOT National Transportation Integrated Search

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  4. Metal Pollutant Exposure and Behavior Disorders: Implications for School Practices.

    ERIC Educational Resources Information Center

    Marlowe, Mike

    1986-01-01

    The article summarizes research on relationships between low (below metal poisoning) metal exposure and childhood behavior disorders. Symptoms, assessment techniques (hair analysis), and environmental and dietary factors that may increase the risk of metal pollutant exposure are described. School programs emphasizing education and the role of…

  5. Zapateado technique as an injury risk in Mexican folkloric and Spanish dance: an analysis of execution, ground reaction force, and muscle strength.

    PubMed

    Echegoyen, Soledad; Aoyama, Takeshi; Rodríguez, Cristina

    2013-06-01

    Zapateado is repetitive percussive footwork in dance. This percussive movement, and differences in technique, may be risk factors for injury. A survey of zapateado dance students found a rate of 1.5 injuries/1,000 exposures, with knee injuries more frequent in Spanish dancers than in folkloric dancers. The aim of this research was to study the relationship between technique and ground reaction force during zapateado in Spanish and Mexican folkloric dancers. Ten female dance students (age 22.4 ± 4 yrs), six Spanish dancers and four Mexican folkloric dancers, participated. Each student performed zapateado with a flat foot, wearing high-heeled shoes, for 5 seconds on a force platform. Videotapes were taken in the lateral plane, and knee and hip angles in each movement phase were measured with Dartfish software. Additionally, knee and ankle flexor and extensor strength was measured with a dynamometer. Ground reaction forces were lower for Spanish dancers than for Mexican folkloric dancers. Spanish dancers had less knee flexion when the foot contacted the ground than did Mexican folkloric dancers. In Spanish dancers, the working leg showed more motion in hip and knee angles than in folkloric dancers. The ankle extensors were stronger in folkloric dancers, and there were no differences for the other muscle groups. Knee flexion at foot contact and muscle strength imbalance could be risk factors for injury. It is suggested that the zapateado technique in Spanish dance in Mexico be reviewed, although more studies are required to define further risk factors.

  6. High Risk Flash Flood Rainstorm Mapping Based on Regional L-moments Approach

    NASA Astrophysics Data System (ADS)

    Ding, Hui; Liao, Yifan; Lin, Bingzhang

    2017-04-01

    Difficulties and complexities in elaborating flash flood early-warning and forecasting systems prompt hydrologists to develop techniques that can substantially reduce the disastrous outcome of a flash flood in advance. This paper proposes an approach for specifying, in terms of rainfall intensity, those areas within a relatively large region that are at high risk of flash flooding. It is accomplished through the design of the High Risk Flash Flood Rainstorm Area (HRFFRA), based on statistical analysis of historical rainfall data, synoptic analysis of prevailing storm rainfalls, and field surveys of historical flash flood events in the region. A HRFFRA is defined as an area potentially subject to high-intensity precipitation of a given duration and return period that may cause a flash flood disaster. This paper presents in detail the development of the HRFFRA through the application of the end-to-end Regional L-moments Approach (RLMA) to precipitation frequency analysis, in combination with spatial interpolation techniques, in Jiangxi Province, South China Mainland. Among others, the concept of a hydrometeorologically homogeneous region, the precision of frequency analysis in terms of parameter estimation, the accuracy of quantiles in terms of uncertainties, and the consistency adjustments of quantiles over durations and space are addressed. At the end of this paper, the mapping of the HRFFRA and an internet-based, visualized, user-friendly data server for the HRFFRA are also introduced. Key words: HRFFRA; flash flood; RLMA; rainfall intensity; hydrometeorologically homogeneous region.
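
    The sample L-moments that underpin the RLMA can be computed directly from probability-weighted moments of the ordered data; a minimal sketch with hypothetical annual-maximum rainfall values is:

        import numpy as np
        from math import comb

        def sample_lmoments(data):
            # Unbiased sample L-moments l1, l2 and L-skewness t3,
            # via probability-weighted moments b0, b1, b2.
            x = np.sort(np.asarray(data, dtype=float))
            n = len(x)
            b = [np.mean([comb(i, r) / comb(n - 1, r) * x[i] for i in range(n)])
                 for r in range(3)]
            l1 = b[0]
            l2 = 2 * b[1] - b[0]
            l3 = 6 * b[2] - 6 * b[1] + b[0]
            return l1, l2, l3 / l2

        rain = [63, 88, 102, 74, 55, 131, 96, 70, 84, 119]  # annual maxima (mm)
        l1, l2, t3 = sample_lmoments(rain)
        print(f"l1 = {l1:.1f} mm, l2 = {l2:.1f} mm, L-skewness = {t3:.3f}")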

  7. Prenatal diagnosis of hemoglobinopathies: evaluation of techniques for analysing globin-chain synthesis in blood samples obtained by fetoscopy.

    PubMed Central

    Congote, L. F.; Hamilton, E. F.; Chow, J. C.; Perry, T. B.

    1982-01-01

    Three techniques for analysing hemoglobin synthesis in blood samples obtained by fetoscopy were evaluated. Of the fetuses studied, 12 were not at risk of genetic disorders, 10 were at risk of beta-thalassemia, 2 were at risk of sickle cell anemia and 1 was at risk of both diseases. The conventional method of prenatal diagnosis of hemoglobinopathies, involving the separation of globin chains labelled with a radioactive isotope on carboxymethyl cellulose (CMC) columns, was compared with a method involving globin-chain separation by high-pressure liquid chromatography (HPLC) and with direct analysis of labelled hemoglobin tetramers obtained from cell lysates by chromatography on ion-exchange columns. The last method is technically the simplest and can be used for diagnosing beta-thalassemia and sickle cell anemia; however, it gives spuriously high levels of adult hemoglobin in samples containing nonlabelled adult hemoglobin. HPLC is the fastest method for prenatal diagnosis of beta-thalassemia and may prove as reliable as the CMC method. Of the 13 fetuses at risk for hemoglobinopathies, 1 was predicted to be affected, and the diagnosis was confirmed in the abortus. Of the 12 predicted to be unaffected, 1 was aborted spontaneously and was unavailable for confirmatory studies, as were 3 of the infants; the diagnosis was confirmed in seven cases and, in one case, awaits confirmation when the infant is 6 months old. Couples at risk of bearing a child with a hemoglobinopathy should be referred for genetic counselling before pregnancy or, at the latest, by the 12th week of gestation so that prenatal diagnosis can be attempted by amniocentesis, a safer procedure, with restriction endonuclease analysis of the amniotic fluid cells. PMID:7139502

  8. Investigating the potential to reduce flood risk through catchment-based land management techniques and interventions in the River Roe catchment, Cumbria,UK

    NASA Astrophysics Data System (ADS)

    Pearson, Callum; Reaney, Sim; Bracken, Louise; Butler, Lucy

    2015-04-01

    Throughout the United Kingdom flood risk is a growing problem, and a significant proportion of the population is at risk from flooding. Across England and Wales over 5 million people are believed to be at risk from fluvial, pluvial or coastal flooding (DEFRA, 2013). Communities that have not dealt with flooding before are increasingly experiencing significant flood events; the communities of Stockdalewath and Highbridge in the Roe catchment, a tributary of the River Eden in Cumbria, UK, are a clear example. The River Roe has a normal flow of less than 5 m3 s-1 97 percent of the time; however, there have been two flash floods, of 98.8 m3 s-1 in January 2005 and 86.9 m3 s-1 in May 2013. These two flash flood events inundated numerous properties within the catchment, and the 2013 event prompted the creation of the Roe Catchment Community Water Management Group, which aims to deliver a sustainable approach to managing the flood risk. Because of the distributed rural population, the community fails the cost-benefit analysis for a centrally funded flood risk mitigation scheme. The at-risk community within the Roe catchment therefore has to look for cost-effective, sustainable techniques and interventions to reduce the potential negative impacts of future events; this has resulted in a focus on natural flood risk management. This research investigates the potential to reduce flood risk through natural catchment-based land management techniques and interventions within the Roe catchment, providing a scientific base from which further action can be enacted. These interventions include changes to land management and land use, such as soil aeration and targeted afforestation, the creation of runoff attenuation features, and the construction of in-channel features such as debris dams. Natural flood management (NFM) has proven effective in reducing flood risk in smaller catchments, and the potential to transfer these benefits to the Roe catchment (~69 km2) has been assessed. Furthermore, these flood mitigation features have the potential to deliver wider environmental improvements throughout the catchment, so the potential for multiple benefits, such as diffuse pollution reduction and habitat creation, is considered. The research explores the impact of NFM techniques, for example flood storage areas or afforestation, with a view to enhancing local-scale habitats. The research combines innovative catchment modelling techniques, both risk-based approaches (SCIMAP Flood) and spatially distributed hydrological simulation modelling (CRUM3), with in-field monitoring and observation of flow pathways and tributary response to rainfall using time-lapse cameras. Additional work with the local community and stakeholders will identify the range and location of the potential catchment-based land management techniques and interventions being assessed; natural flood management implementation requires the participation and cooperation of landowners and the local community to be successful (Howgate and Kenyon, 2009).

  9. The benefits of integrating cost-benefit analysis and risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, K.; Clarke-Whistler, K.

    1995-12-31

    It has increasingly been recognized that knowledge of risks in the absence of benefits and costs cannot dictate appropriate public policy choices. Recent evidence of this recognition includes the proposed EPA Risk Assessment and Cost-Benefit Analysis Act of 1995, a number of legislative changes in Canada and the US, and the increasing demand for field studies combining measures of impacts, risks, costs and benefits. Failure to consider relative environmental and human health risks, benefits, and costs in making public policy decisions has resulted in allocating scarce resources away from areas offering the highest levels of risk reduction and improvements in health and safety. The authors discuss the implications of not taking costs and benefits into account in addressing environmental risks, drawing on examples from both Canada and the US. The authors also present the results of their recent field work demonstrating the advantages of considering costs and benefits in making public policy and site remediation decisions, including a study on the benefits and costs of prevention, remediation and monitoring techniques applied to groundwater contamination; the benefits and costs of banning the use of chlorine; and the benefits and costs of Canada's concept of disposing of high-level nuclear waste. The authors conclude that a properly conducted Cost-Benefit Analysis can provide critical input to a Risk Assessment and can ensure that risk management decisions are efficient, cost-effective and maximize improvement to environmental and human health.

  10. Pros and cons of transcatheter aortic valve implantation (TAVI).

    PubMed

    Terré, Juan A; George, Isaac; Smith, Craig R

    2017-09-01

    Transcatheter aortic valve implantation (TAVI), or replacement (TAVR), was recently approved by the FDA for intermediate-risk patients with severe aortic stenosis (AS). The technique had already been adopted worldwide for inoperable and high-risk patients. Improved device technology, imaging analysis, and operator expertise have reduced the worrisome rate of complications initially associated with TAVR, making it comparable to surgical aortic valve replacement (SAVR). However, many questions need to be answered before adoption in lower-risk patients. This paper highlights the pros and cons of TAVI based mostly on randomized clinical trials involving the two device platforms approved in the United States. We focus our analysis on metrics that will play a key role in expanding the TAVR indication to healthier individuals. We review the significance of, and give a perspective on, paravalvular leak (PVL), valve performance, valve durability, leaflet thrombosis, stroke, and pacemaker requirement.

  11. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.

  12. Risk Factors for Social Networking Site Scam Victimization Among Malaysian Students.

    PubMed

    Kirwan, Gráinne H; Fullwood, Chris; Rooney, Brendan

    2018-02-01

    Social networking sites (SNSs) can provide cybercriminals with various opportunities, including gathering of user data and login credentials to enable fraud, and directing of users toward online locations that may install malware onto their devices. The techniques employed by such cybercriminals can include clickbait (text or video), advertisement of nonexistent but potentially desirable products, and hoax competitions/giveaways. This study aimed to identify risk factors associated with falling victim to these malicious techniques. An online survey was completed by 295 Malaysian undergraduate students, finding that more than one-third had fallen victim to SNS scams. Logistic regression analysis identified several victimization risk factors including having higher scores in impulsivity (specifically cognitive complexity), using fewer devices for SNSs, and having been on an SNS for a longer duration. No reliable model was found for vulnerability to hoax valuable gift giveaways and "friend view application" advertising specifically, but vulnerability to video clickbait was predicted by lower extraversion scores, higher levels of openness to experience, using fewer devices, and being on an SNS for a longer duration. Other personality traits were not associated with either overall victimization susceptibility or increased risk of falling victim to the specific techniques. However, age approached significance within both the video clickbait and overall victimization models. These findings suggest that routine activity theory may be particularly beneficial in understanding and preventing SNSs scam victimization.
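
    A minimal sketch of the kind of logistic regression used for such victimization risk factors, fit on simulated data (the predictors and effect sizes are hypothetical, not the study's):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n = 295
        df = pd.DataFrame({
            "impulsivity": rng.normal(0, 1, n),
            "n_devices": rng.integers(1, 5, n),
            "years_on_sns": rng.uniform(0, 10, n),
        })
        lp = -1.0 + 0.6 * df.impulsivity - 0.3 * df.n_devices + 0.15 * df.years_on_sns
        df["victim"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-lp))).astype(int)

        fit = smf.logit("victim ~ impulsivity + n_devices + years_on_sns",
                        data=df).fit(disp=False)
        print(np.exp(fit.params))  # odds ratios: values > 1 raise victimization odds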

  13. The QUALYOR (QUalité Osseuse LYon Orléans) study: a new cohort for non invasive evaluation of bone quality in postmenopausal osteoporosis. Rationale and study design.

    PubMed

    Chapurlat, Roland; Pialat, Jean-Baptiste; Merle, Blandine; Confavreux, Elisabeth; Duvert, Florence; Fontanges, Elisabeth; Khacef, Farida; Peres, Sylvie Loiseau; Schott, Anne-Marie; Lespessailles, Eric

    2017-12-27

    The diagnostic performance of densitometry is inadequate. New techniques of non-invasive evaluation of bone quality may improve fracture risk prediction. Testing the value of these techniques is the goal of the QUALYOR cohort. The bone mineral density (BMD) of postmenopausal women who sustain osteoporotic fracture is generally above the World Health Organization definition for osteoporosis. Therefore, new approaches to improve the detection of women at high risk for fracture are warranted. We have designed and recruited a new cohort to assess the predictive value of several techniques to assess bone quality, including high-resolution peripheral quantitative computerized tomography (HRpQCT), hip QCT, calcaneus texture analysis, and biochemical markers. We have enrolled 1575 postmenopausal women, aged at least 50, with an areal BMD femoral neck or lumbar spine T-score between -1.0 and -3.0. Clinical risk factors for fracture have been collected along with serum and blood samples. We describe the design of the QUALYOR study. Among these 1575 women, 80% were aged at least 60. The mean femoral neck T-score was -1.6 and the mean lumbar spine T-score was -1.2. This cohort is currently being followed up. QUALYOR will provide important information on the relationship between bone quality variables and fracture risk in women with moderately decreased BMD.

  14. Recurrent tricuspid insufficiency: is the surgical repair technique a risk factor?

    PubMed

    Kara, Ibrahim; Koksal, Cengiz; Cakalagaoglu, Canturk; Sahin, Muslum; Yanartas, Mehmet; Ay, Yasin; Demir, Serdar

    2013-01-01

    This study compares the medium-term results of De Vega, modified De Vega, and ring annuloplasty techniques for the correction of tricuspid insufficiency and investigates the risk factors for recurrent grades 3 and 4 tricuspid insufficiency after repair. In our clinic, 93 patients with functional tricuspid insufficiency underwent surgical tricuspid repair from May 2007 through October 2010. The study was retrospective, and all the data pertaining to the patients were retrieved from hospital records. Functional capacity, recurrent tricuspid insufficiency, and risk factors aggravating the insufficiency were analyzed for each patient. In the medium term (25.4 ± 10.3 mo), the rates of grades 3 and 4 tricuspid insufficiency in the De Vega, modified De Vega, and ring annuloplasty groups were 31%, 23.1%, and 6.1%, respectively. Logistic regression analysis revealed that chronic obstructive pulmonary disease, left ventricular dysfunction (ejection fraction < 0.50), pulmonary artery pressure ≥60 mmHg, and the De Vega annuloplasty technique were risk factors for medium-term recurrent grades 3 and 4 tricuspid insufficiency. Medium-term survival was 90.6% for the De Vega group, 96.3% for the modified De Vega group, and 97.1% for the ring annuloplasty group. Ring annuloplasty provided the best relief from recurrent tricuspid insufficiency when compared with De Vega annuloplasty. Modified De Vega annuloplasty might be a suitable alternative to ring annuloplasty when rings are not available.

  15. Threat and error management for anesthesiologists: a predictive risk taxonomy

    PubMed Central

    Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas

    2015-01-01

    Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has been focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268

  16. Infrared spectroscopy as a screening technique for colitis

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Ghimire, Hemendra; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil

    2017-05-01

    There remains a great need for improved diagnosis of inflammatory bowel disease (IBD), for which the current technique, colonoscopy, is not cost-effective and presents a non-negligible risk of complications. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy is a new screening technique for evaluating colitis. Comparing infrared spectra of sera is challenging because the complexity of their biological constituents gives rise to a plethora of vibrational modes. We discuss how to overcome these inherent difficulties of infrared spectral analysis, which involve highly overlapping absorbance peaks, and how curve fitting of the data improves the resolution. The proposed technique uses ATR-FTIR spectra of dried serum from colitic and normal wild-type mice to effectively differentiate colitic from normal mice. Using this method, the Amide I group frequency (specifically, the alpha-helix to beta-sheet ratio of protein secondary structure) was identified as a disease-associated spectral signature, in addition to the previously reported glucose and mannose signatures in sera of chronic and acute mouse models of colitis. Hence, this technique will be able to identify changes in sera due to various diseases.

  17. Multiparametric evaluation of risk factors associated to seroma formation in abdominal wall surgery.

    PubMed

    Licari, L; Salamone, G; Parinisi, Z; Campanella, S; Sabatino, C; Ciolino, G; De Marco, P; Falco, N; Boventre, S; Gulotta, G

    2017-01-01

    Incisional hernia is one of the main topics in general surgery, since there is no unanimous consensus on the best surgical methodology to adopt. Prosthetic repair appears to be the best technique, even though it is responsible for the development of periprosthetic seroma. The aim of this study is to assess whether preoperative abnormalities of bio-humoral parameters may be considered risk factors for seroma. From July 2016 to July 2017 at the "Policlinico Paolo Giaccone", Palermo, Department of Emergency Surgery, 56 patients included in this study underwent laparotomic mesh repair. The inclusion criteria were: age > 18 years, incisional hernia W2R0 according to the Chevrel classification, and a monoperator technique. The main variables were sex, age, BMI, smoking, ASA score, and co-morbidities. Among the main serum-blood variables were natraemia, kalaemia, chloraemia, calcaemia, PCR, and levels of glucose, creatinine, albumin and proteins in the blood. The data were analyzed using SPSS software. Univariate analysis highlighted hypo- and hyper-natraemia, hyper-kalaemia, hypo-chloraemia, high levels of PCR, hyper-glycemia, and low levels of serum-blood albumin and proteins as statistically significant variables. Multivariate analysis revealed p<0.05 for PCR, hypo-albuminemia, and total serum-blood protein level. Alterations of preoperative bio-humoral parameters could be associated with a greater risk of seroma development. A better understanding of such alterations may lead to more efficient risk stratification methods, which could be essential to better allocate medical resources, reducing postoperative complications and outpatient controls as well as the risks associated with seroma.

  18. Surveying the interest of individuals with upper limb loss in novel prosthetic control techniques.

    PubMed

    Engdahl, Susannah M; Christie, Breanne P; Kelly, Brian; Davis, Alicia; Chestek, Cynthia A; Gates, Deanna H

    2015-06-13

    Novel techniques for the control of upper limb prostheses may allow users to operate more complex prostheses than those that are currently available. Because many of these techniques are surgically invasive, it is important to understand whether individuals with upper limb loss would accept the associated risks in order to use a prosthesis. An online survey of individuals with upper limb loss was conducted. Participants read descriptions of four prosthetic control techniques. One technique was noninvasive (myoelectric) and three were invasive (targeted muscle reinnervation, peripheral nerve interfaces, cortical interfaces). Participants rated how likely they were to try each technique if it offered each of six different functional features. They also rated their general interest in each of the six features. A two-way repeated measures analysis of variance with Greenhouse-Geisser corrections was used to examine the effect of the technique type and feature on participants' interest in each technique. Responses from 104 individuals were analyzed. Many participants were interested in trying the techniques - 83 % responded positively toward myoelectric control, 63 % toward targeted muscle reinnervation, 68 % toward peripheral nerve interfaces, and 39 % toward cortical interfaces. Common concerns about myoelectric control were weight, cost, durability, and difficulty of use, while the most common concern about the invasive techniques was surgical risk. Participants expressed greatest interest in basic prosthesis features (e.g., opening and closing the hand slowly), as opposed to advanced features like fine motor control and touch sensation. The results of these investigations may be used to inform the development of future prosthetic technologies that are appealing to individuals with upper limb loss.

  19. Review Article: Multi-criteria decision making for flood risk management: a survey of the current state-of-the-art

    NASA Astrophysics Data System (ADS)

    de Brito, M. M.; Evers, M.

    2015-11-01

    This paper provides a review of Multi-Criteria Decision Making (MCDM) applications to flood risk management, seeking to highlight trends and identify research gaps. In total, 128 peer-reviewed papers published from 1995 to June 2015 were systematically analysed and classified into the following application areas: (1) ranking of alternatives for flood mitigation, (2) reservoir flood control, (3) susceptibility, (4) hazard, (5) vulnerability, (6) risk, (7) coping capacity, and (8) emergency management. Additionally, the articles were categorized based on the publication year, MCDM method, whether or not they were carried out in a participatory process, and whether uncertainty and sensitivity analyses were performed. Results showed that the number of flood MCDM publications grew exponentially during this period, with over 82 % of all papers published since 2009. The Analytic Hierarchy Process (AHP) was the most popular technique, followed by the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) and Simple Additive Weighting (SAW); a minimal TOPSIS sketch is given below. Although there is growing interest in MCDM, uncertainty analysis remains an issue and is seldom applied in flood-related studies. In addition, participation of multiple stakeholders has generally been fragmented, focusing on particular stages of the decision-making process, especially the definition of criteria weights. Based on the survey, some suggestions for further investigation are provided.
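
    As a pointer to how one of the surveyed methods works, here is a minimal TOPSIS implementation; the decision matrix, weights, and criteria directions are illustrative, not taken from any reviewed study.

```python
# Minimal TOPSIS implementation (one of the MCDM methods surveyed above).
# The decision matrix, weights, and criteria directions are illustrative.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)           # vector-normalize
    v = norm * weights                                       # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))  # best per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))   # worst per criterion
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)  # closeness coefficient, higher is better

# Three flood-mitigation alternatives scored on cost (minimize),
# protected population (maximize), and ecological impact (minimize).
m = np.array([[120.0, 5000, 3],
              [ 80.0, 3500, 2],
              [200.0, 9000, 5]])
scores = topsis(m, weights=np.array([0.4, 0.4, 0.2]),
                benefit=np.array([False, True, False]))
print(np.argsort(scores)[::-1])  # ranking of alternatives, best first
```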

  20. Insufficient Knowledge of Breast Cancer Risk Factors Among Malaysian Female University Students

    PubMed Central

    Samah, Asnarulkhadi Abu; Ahmadian, Maryam; Latiff, Latiffah A.

    2016-01-01

    Background: Despite continuing debate about the efficacy of breast self-examination, it could still be a life-saving technique by inspiring and empowering women to take better control over their body/breast and health. This study investigated Malaysian female university students’ knowledge about breast cancer risk factors, signs, and symptoms and assessed the frequency of breast self-examination among students. Method: A cross-sectional survey was conducted in 2013 in nine public and private universities in the Klang Valley and Selangor. A total of 842 female students responded to the self-administered survey. Simple descriptive and inferential statistics were employed for data analysis. Results: The uptake of breast self-examination (BSE) was less than 50% among the students. Most students had insufficient knowledge of several breast cancer risk factors. Conclusion: Actions and efforts should be taken to increase knowledge of breast cancer through the development of ethnically and culturally sensitive educational training on BSE and breast cancer literacy. PMID:26234996

  1. Audience segmentation as a social-marketing tool in health promotion: use of the risk perception attitude framework in HIV prevention in Malawi.

    PubMed

    Rimal, Rajiv N; Brown, Jane; Mkandawire, Glory; Folda, Lisa; Böse, Kirsten; Creel, Alisha H

    2009-12-01

    We sought to determine whether individuals' risk perceptions and efficacy beliefs could be used to meaningfully segment audiences to assist interventions that seek to change HIV-related behaviors. A household-level survey of individuals (N = 968) was conducted in 4 districts in Malawi. On the basis of responses about perceptions of risk and beliefs about personal efficacy, we used cluster analysis to create 4 groups within the risk perception attitude framework: responsive (high risk, strong efficacy), avoidance (high risk, weak efficacy), proactive (low risk, strong efficacy), and indifference (low risk, weak efficacy). We ran analysis of covariance models (controlling for known predictors) to determine how membership in the risk perception attitude framework groups would affect knowledge about HIV, HIV-testing uptake, and condom use. A significant association was found between membership in 1 or more of the 4 risk perception attitude framework groups and the 3 study variables of interest: knowledge about HIV (F(8, 956) = 20.77; P < .001), HIV-testing uptake (F(8, 952) = 10.91; P < .001), and condom use (F(8, 885) = 29.59; P < .001). The risk perception attitude framework can serve as a theoretically sound audience segmentation technique that can be used to determine whether messages should augment perceptions of risk, beliefs about personal efficacy, or both.
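
    A minimal sketch of the segmentation step, assuming two survey-derived scores per respondent (perceived risk and efficacy) and using k-means in place of whatever clustering algorithm the authors applied; all data, scale midpoints, and labels below are synthetic assumptions.

```python
# Sketch of the audience-segmentation step: cluster respondents on
# perceived-risk and efficacy scores, then name the four RPA-framework
# groups. Scores, midpoints, and use of k-means are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
scores = rng.uniform(1, 5, size=(968, 2))  # columns: perceived risk, efficacy

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores)

def rpa_label(center, risk_mid=3.0, eff_mid=3.0):
    # Map a cluster centre to the RPA quadrant it falls into.
    hi_risk = bool(center[0] > risk_mid)
    hi_eff = bool(center[1] > eff_mid)
    return {(True, True): "responsive", (True, False): "avoidance",
            (False, True): "proactive", (False, False): "indifference"}[(hi_risk, hi_eff)]

for i, center in enumerate(km.cluster_centers_):
    print(rpa_label(center), "-", int((km.labels_ == i).sum()), "respondents")
```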

  2. Audience Segmentation as a Social-Marketing Tool in Health Promotion: Use of the Risk Perception Attitude Framework in HIV Prevention in Malawi

    PubMed Central

    Brown, Jane; Mkandawire, Glory; Folda, Lisa; Böse, Kirsten; Creel, Alisha H.

    2009-01-01

    Objectives. We sought to determine whether individuals' risk perceptions and efficacy beliefs could be used to meaningfully segment audiences to assist interventions that seek to change HIV-related behaviors. Methods. A household-level survey of individuals (N = 968) was conducted in 4 districts in Malawi. On the basis of responses about perceptions of risk and beliefs about personal efficacy, we used cluster analysis to create 4 groups within the risk perception attitude framework: responsive (high risk, strong efficacy), avoidance (high risk, weak efficacy), proactive (low risk, strong efficacy), and indifference (low risk, weak efficacy). We ran analysis of covariance models (controlling for known predictors) to determine how membership in the risk perception attitude framework groups would affect knowledge about HIV, HIV-testing uptake, and condom use. Results. A significant association was found between membership in 1 or more of the 4 risk perception attitude framework groups and the 3 study variables of interest: knowledge about HIV (F(8, 956) = 20.77; P < .001), HIV testing uptake (F(8, 952) = 10.91; P < .001), and condom use (F(8, 885) = 29.59; P < .001). Conclusions. The risk perception attitude framework can serve as a theoretically sound audience segmentation technique that can be used to determine whether messages should augment perceptions of risk, beliefs about personal efficacy, or both. PMID:19833992

  3. Feature selection through validation and un-censoring of endovascular repair survival data for predicting the risk of re-intervention.

    PubMed

    Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong

    2017-08-03

    The feature selection (FS) process is essential in the medical area, as it reduces the effort and time needed for physicians to measure unnecessary features. Choosing useful variables is a difficult task in the presence of censoring, which is a unique characteristic of survival analysis. Most survival FS methods depend on Cox's proportional hazards model; machine learning techniques (MLT) would be preferred but are not commonly used because of censoring. Techniques that have been proposed to adapt MLT to perform FS with survival data cannot be used with high levels of censoring. The authors' previous publications proposed a technique to deal with a high level of censoring and used existing FS techniques to reduce dataset dimensionality. In this paper, however, a new FS technique is proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. The proposed FS technique is based on an artificial neural network (ANN) MLT and deals with highly censored Endovascular Aortic Repair (EVAR) data. EVAR survival datasets were collected from 2004 to 2010 from two vascular centers in order to produce a final stable model; they contain almost 91% censored patients. The proposed approach used a wrapper FS method with an ANN to select a reduced subset of features that predict the risk of EVAR re-intervention after 5 years for patients from two different centers located in the United Kingdom, to allow it to be potentially applied to cross-center predictions. The proposed model is compared with two popular FS techniques, the Akaike and Bayesian information criteria (AIC, BIC), used with Cox's model. The final model outperforms the other methods in distinguishing the high- and low-risk groups: both its concordance index and estimated AUC are better than those of Cox's model based on the AIC, BIC, Lasso, and SCAD approaches. These models have p-values lower than 0.05, meaning that patients in different risk groups can be separated significantly and those who would need re-intervention can be correctly predicted. The proposed approach will save the time and effort physicians spend collecting unnecessary variables. The final reduced model was able to predict the long-term risk of aortic complications after EVAR. This predictive model can help clinicians decide patients' future observation plans.
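
    The wrapper idea, an ANN whose cross-validated performance guides which features are kept, can be sketched with scikit-learn as below; this is not the authors' implementation, and the data are synthetic stand-ins for the uncensored EVAR features.

```python
# Hedged sketch of wrapper feature selection with an ANN, in the spirit of
# the approach above (not the authors' code); the data are synthetic.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, n_features=20, n_informative=5,
                           random_state=0)  # stand-in for uncensored EVAR data

ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
selector = SequentialFeatureSelector(ann, n_features_to_select=5,
                                     direction="forward", cv=5)
selector.fit(X, y)
print("selected feature indices:", selector.get_support(indices=True))
```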

  4. [Mannheim Rating Scales for the analysis of mother-child interaction in toddlers].

    PubMed

    Dinter-Jörg, M; Polowczyk, M; Herrle, J; Esser, G; Laucht, M; Schmidt, M H

    1997-12-01

    As part of a prospective study of child development from birth to age 11, the Mannheim Rating Scales for the Analysis of Mother-Child Interaction in Toddlers were constructed. Ten-minute interactions of 352 mothers and their toddlers were videotaped in the laboratory and evaluated with micro- and macroanalytic techniques. The instrument consists of a combination of second-by-second codings and dimensional ratings of 5-second to 1-minute periods. Interrater reliability, assessed by having two raters analyze 16 mother-child dyads, proved satisfactory. Dyads at psychosocial risk showed different interaction patterns from those at low risk. Interactions of mothers and daughters seemed to be more harmonious than interactions of mothers and sons.

  5. Approaches to reduce urinary tract injury during management of placenta accreta, increta, and percreta: a systematic review.

    PubMed

    Tam Tam, Kiran Babu; Dozier, James; Martin, James Nello

    2012-04-01

    A systematic review of the literature was conducted to answer the following question: are there enhancements to the standard peripartum hysterectomy technique that minimize unintentional urinary tract (UT) injury in pregnancies complicated by invasive placental attachment (INPLAT)? A PubMed search of English-language articles on INPLAT published by June 2010 was conducted. Data regarding the following parameters were required for inclusion in the quantitative analysis of the review's objective: (1) type of INPLAT, (2) details pertaining to medical and surgical management of INPLAT, and (3) complications, if any, associated with management. An attempt was made to identify approaches that may lower the risk of unintentional UT injury. Most cases (285 of 292) were managed by hysterectomy. There were 83 (29%) cases of unintentional UT injury. Antenatal diagnosis of INPLAT lowered the rate of UT injury (39% vs. 63%; P = 0.04). Information regarding surgical technique or medical management was available for 90 cases; 14 of these underwent a standard hysterectomy technique. Methotrexate treatment and 11 modifications of the surgical technique were associated with a 16% unintentional UT injury rate, as opposed to 57% for standard hysterectomy (P = 0.002). The use of ureteral stents reduced the risk of urologic injury (P = 0.01). Multiple logistic regression analysis identified antenatal diagnosis as the significant predictor of an intact UT. Antenatal diagnosis of INPLAT is paramount to minimize UT injury. Utilization of the management modifications identified in this review may reduce urologic injury due to INPLAT.

  6. Taller height as a risk factor for venous thromboembolism: a Mendelian randomization meta-analysis.

    PubMed

    Roetker, N S; Armasu, S M; Pankow, J S; Lutsey, P L; Tang, W; Rosenberg, M A; Palmer, T M; MacLehose, R F; Heckbert, S R; Cushman, M; de Andrade, M; Folsom, A R

    2017-07-01

    Essentials Observational data suggest taller people have a higher risk of venous thromboembolism (VTE). We used Mendelian randomization techniques to further explore this association in three studies. Risk of VTE increased by 30-40% for each 10 cm increment in height. Height was more strongly associated with deep vein thrombosis than with pulmonary embolism. Background Taller height is associated with a greater risk of venous thromboembolism (VTE). Objectives To use instrumental variable (IV) techniques (Mendelian randomization) to further explore this relationship. Methods Participants of European ancestry were included from two cohort studies (Atherosclerosis Risk in Communities [ARIC] study and Cardiovascular Health Study [CHS]) and one case-control study (Mayo Clinic VTE Study [Mayo]). We created two weighted genetic risk scores (GRSs) for height; the full GRS included 668 single-nucleotide polymorphisms (SNPs) from a previously published meta-analysis, and the restricted GRS included a subset of 362 SNPs not associated with weight independently of height. Standard logistic regression and IV models were used to estimate odds ratios (ORs) for VTE per 10-cm increment in height. ORs were pooled across the three studies by the use of inverse variance-weighted random effects meta-analysis. Results Among 9143 ARIC and 3180 CHS participants free of VTE at baseline, there were 367 and 109 incident VTE events. There were 1143 VTE cases and 1292 controls included from Mayo. The pooled ORs from non-IV models and models using the full and restricted GRSs as IVs were 1.27 (95% confidence interval [CI] 1.11-1.46), 1.34 (95% CI 1.04-1.73) and 1.45 (95% CI 1.04-2.01) per 10-cm greater height, respectively. Conclusions Taller height is associated with an increased risk of VTE in adults of European ancestry. Possible explanations for this association, including taller people having a greater venous surface area, a higher number of venous valves, or greater hydrostatic pressure, need to be explored further. © 2017 International Society on Thrombosis and Haemostasis.
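
    A minimal sketch of the instrumental-variable logic, using a genetic risk score as the instrument for height in a two-stage (predictor-substitution) approximation; the data file, column names, and covariates are hypothetical, and this is not the study's exact estimator.

```python
# Hedged sketch of the Mendelian randomization (IV) logic: a genetic risk
# score (GRS) instruments height; two-stage predictor substitution is a
# common approximation. File and columns (sex coded 0/1) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cohort.csv")  # hypothetical: grs, height_cm, vte, age, sex

# Stage 1: regress the exposure (height) on the instrument plus covariates.
stage1 = sm.OLS(df["height_cm"],
                sm.add_constant(df[["grs", "age", "sex"]])).fit()
df["height_hat"] = stage1.fittedvalues

# Stage 2: logistic model of VTE on genetically predicted height.
stage2 = sm.Logit(df["vte"],
                  sm.add_constant(df[["height_hat", "age", "sex"]])).fit(disp=0)

or_per_10cm = float(np.exp(10 * stage2.params["height_hat"]))
print(f"IV odds ratio per 10 cm of height: {or_per_10cm:.2f}")
```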

  7. MPATHav: A software prototype for multiobjective routing in transportation risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganter, J.H.; Smith, J.D.

    Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. 'Objectives' to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid 'multiobjective spatial analysis' (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
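
    The core multiobjective idea, retaining only routes that no other route beats on every objective, can be sketched as a Pareto-front filter; the candidate routes below are illustrative.

```python
# Sketch of the multiobjective idea: keep only non-dominated routes when
# every objective (distance, exposure, accident rate) is to be minimized.
# The candidate routes are illustrative.
import numpy as np

def pareto_front(objectives):
    """objectives: routes x criteria, all minimized; mask of non-dominated rows."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Row i is dominated if some other row is <= everywhere and < somewhere.
        dominated = (np.all(objectives <= objectives[i], axis=1) &
                     np.any(objectives < objectives[i], axis=1))
        if dominated.any():
            keep[i] = False
    return keep

routes = np.array([[310.0, 12000, 0.8],   # short but high exposure
                   [420.0,  4000, 0.5],   # long but low exposure
                   [450.0, 11000, 0.9]])  # dominated by route 1
print(pareto_front(routes))  # -> [ True  True False]
```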

  8. Use of labour induction and risk of cesarean delivery: a systematic review and meta-analysis

    PubMed Central

    Mishanina, Ekaterina; Rogozinska, Ewelina; Thatthi, Tej; Uddin-Khan, Rehan; Khan, Khalid S.; Meads, Catherine

    2014-01-01

    Background: Induction of labour is common, and cesarean delivery is regarded as its major complication. We conducted a systematic review and meta-analysis to investigate whether the risk of cesarean delivery is higher or lower following labour induction compared with expectant management. Methods: We searched 6 electronic databases for relevant articles published through April 2012 to identify randomized controlled trials (RCTs) in which labour induction was compared with placebo or expectant management among women with a viable singleton pregnancy. We assessed risk of bias and obtained data on rates of cesarean delivery. We used regression analysis techniques to explore the effect of patient characteristics, induction methods and study quality on risk of cesarean delivery. Results: We identified 157 eligible RCTs (n = 31 085). Overall, the risk of cesarean delivery was 12% lower with labour induction than with expectant management (pooled relative risk [RR] 0.88, 95% confidence interval [CI] 0.84–0.93; I2 = 0%). The effect was significant in term and post-term gestations but not in preterm gestations. Meta-regression analysis showed that initial cervical score, indication for induction and method of induction did not alter the main result. There was a reduced risk of fetal death (RR 0.50, 95% CI 0.25–0.99; I2 = 0%) and admission to a neonatal intensive care unit (RR 0.86, 95% CI 0.79–0.94), and no impact on maternal death (RR 1.00, 95% CI 0.10–9.57; I2 = 0%) with labour induction. Interpretation: The risk of cesarean delivery was lower among women whose labour was induced than among those managed expectantly in term and post-term gestations. There were benefits for the fetus and no increased risk of maternal death. PMID:24778358
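
    The pooling step of such a meta-analysis can be sketched with an inverse-variance random-effects (DerSimonian-Laird) model; the per-trial log relative risks below are illustrative, not the review's data.

```python
# Minimal inverse-variance random-effects pooling (DerSimonian-Laird), the
# kind of technique used to combine trial relative risks; data illustrative.
import numpy as np

log_rr = np.array([-0.15, -0.05, -0.20, -0.10])  # per-trial log relative risks
se = np.array([0.10, 0.08, 0.12, 0.09])          # their standard errors

w = 1 / se**2                                    # fixed-effect weights
q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w))**2)  # heterogeneity Q
df = len(log_rr) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                    # between-trial variance

w_re = 1 / (se**2 + tau2)                        # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se_pooled = np.sqrt(1 / np.sum(w_re))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```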

  9. Comparative Risk Analysis for Metropolitan Solid Waste Management Systems

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Wang, S. F.

    1996-01-01

    Conventional solid waste management planning usually focuses on economic optimization, in which the related environmental impacts or risks are rarely considered. The purpose of this paper is to illustrate the methodology of how optimization concepts and techniques can be applied to structure and solve risk management problems such that the impacts of air pollution, leachate, traffic congestion, and noise increments can be regulated in the long-term planning of metropolitan solid waste management systems. Management alternatives are sequentially evaluated by adding several environmental risk control constraints stepwise in an attempt to improve the management strategies and reduce the risk impacts in the long run. Statistics associated with those risk control mechanisms are presented as well. Siting, routing, and financial decision making in such solid waste management systems can also be achieved with respect to various resource limitations and disposal requirements.

  10. Bad splits in bilateral sagittal split osteotomy: systematic review and meta-analysis of reported risk factors.

    PubMed

    Steenen, S A; van Wijk, A J; Becking, A G

    2016-08-01

    An unfavourable and unanticipated pattern of the bilateral sagittal split osteotomy (BSSO) is generally referred to as a 'bad split'. Patient factors predictive of a bad split reported in the literature are controversial. Suggested risk factors are reviewed in this article. A systematic review was undertaken, yielding a total of 30 studies published between 1971 and 2015 reporting the incidence of bad split and patient age, and/or surgical technique employed, and/or the presence of third molars. These included 22 retrospective cohort studies, six prospective cohort studies, one matched-pair analysis, and one case series. Spearman's rank correlation showed a statistically significant but weak correlation between increasing average age and increasing occurrence of bad splits in 18 studies (ρ=0.229; P<0.01). No comparative studies were found that assessed the incidence of bad split among the different splitting techniques. A meta-analysis pooling the effect sizes of seven cohort studies showed no significant difference in the incidence of bad split between cohorts of patients with third molars present and concomitantly removed during surgery, and patients in whom third molars were removed at least 6 months preoperatively (odds ratio 1.16, 95% confidence interval 0.73-1.85, Z=0.64, P=0.52). In summary, there is no robust evidence to date to show that any risk factor influences the incidence of bad split. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  11. Draft SEI Program Plans: 1995-1999

    DTIC Science & Technology

    1994-08-01

    risk management because of our belief that (1) structured techniques, even quite simple ones, can be effective in identifying and quantifying risk; and (2) techniques existed to…

  12. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components, such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities; a sketch of this regression step is given below. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global asset exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments, such as Average Annual Losses (AAL) and Probable Maximum Losses (PML), in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow-onset hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
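
    A minimal sketch of the regression step referenced above, assuming a hypothetical per-event table with loss and driver columns; the log-log specification is an illustrative choice, not the author's published model.

```python
# Sketch of the multiple-regression step: explaining (log) event losses from
# hazard intensity, exposure, and vulnerability drivers. The file name,
# columns, and log-log form are hypothetical assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

events = pd.read_csv("past_events.csv")  # hypothetical per-event table

X = sm.add_constant(np.log(events[["hazard_intensity", "exposed_population",
                                   "poverty_index", "governance_index"]]))
y = np.log(events["losses"])

model = sm.OLS(y, X).fit()
print(model.summary())  # coefficients weight each risk component
```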

  13. Risk of Revision Was Not Reduced by a Double-bundle ACL Reconstruction Technique: Results From the Scandinavian Registers.

    PubMed

    Aga, Cathrine; Kartus, Jüri-Tomas; Lind, Martin; Lygre, Stein Håkon Låstad; Granan, Lars-Petter; Engebretsen, Lars

    2017-10-01

    Double-bundle anterior cruciate ligament (ACL) reconstruction has demonstrated improved biomechanical properties and moderately better objective outcomes compared with single-bundle reconstructions. This could make an impact on the rerupture rate and reduce the risk of revisions in patients undergoing double-bundle ACL reconstruction compared with patients reconstructed with a traditional single-bundle technique. The National Knee Ligament Registers in Scandinavia provide information that can be used to evaluate the revision outcome after ACL reconstructions. The purposes of the study were (1) to compare the risk of revision between double-bundle and single-bundle reconstructions, reconstructed with autologous hamstring tendon grafts; (2) to compare the risk of revision between double-bundle hamstring tendon and single-bundle bone-patellar tendon-bone autografts; and (3) to compare the hazard ratios for the same two research questions after Cox regression analysis was performed. Data on primary ACL reconstructions from the National Knee Ligament Registers in Denmark, Norway, and Sweden from July 1, 2005, to December 31, 2014, were retrospectively analyzed. A total of 60,775 patients were included in the study; 994 patients were reconstructed with double-bundle hamstring tendon grafts, 51,991 with single-bundle hamstring tendon grafts, and 7790 with single-bundle bone-patellar tendon-bone grafts. The double-bundle ACL-reconstructed patients were compared with the two other groups. The risk of revision for each research question was assessed by the risk ratio, the hazard ratio, and the corresponding 95% confidence intervals. Kaplan-Meier analysis was used to estimate survival at 1, 2, and 5 years for the three different groups. Furthermore, a Cox proportional hazard regression model was applied and the hazard ratios were adjusted for country, age, sex, meniscal or chondral injury, and the fixation devices utilized on the femoral and tibial sides. There were no differences in the crude risk of revision between the patients undergoing the double-bundle technique and the two other groups. A total of 3.7% of patients were revised in the double-bundle group (37 of 994 patients) versus 3.8% in the single-bundle hamstring tendon group (1952 of 51,991; risk ratio, 1.01; 95% confidence interval (CI), 0.73-1.39; p = 0.96), and 2.8% of the patients were revised in the bone-patellar tendon-bone group (219 of the 7790 bone-patellar tendon-bone patients; risk ratio, 0.76; 95% CI, 0.54-1.06; p = 0.11). Cox regression analysis with adjustment for country, age, sex, menisci or cartilage injury, and utilized fixation device on the femoral and tibial sides did not reveal any further difference in the risk of revision between the single-bundle hamstring tendon and double-bundle hamstring tendon groups (hazard ratio, 1.18; 95% CI, 0.85-1.62; p = 0.33), but the adjusted hazard ratio showed a lower risk of revision in the single-bundle bone-patellar tendon-bone group compared with the double-bundle group (hazard ratio, 0.62; 95% CI, 0.43-0.90; p = 0.01). Comparisons of the graft revision rates reported separately for each country revealed that double-bundle hamstring tendon reconstructions in Sweden had a lower hazard ratio compared with the single-bundle hamstring tendon reconstructions (hazard ratio, 1.00 versus 1.89; 95% CI, 1.09-3.29; p = 0.02).
Survival at 5 years after index surgery was 96.0% for the double-bundle group, 95.4% for the single-bundle hamstring tendon group, and 97.0% for the single-bundle bone-patellar tendon-bone group. Based on the data from all three national registers, the risk of revision was not influenced by the reconstruction technique in terms of using single- or double-bundle hamstring tendons, although national differences in survival existed. Using bone-patellar tendon-bone grafts lowered the risk of revision compared with double-bundle hamstring tendon grafts. These findings should be considered when deciding what reconstruction technique to use in ACL-deficient knees. Future studies identifying the reasons for graft rerupture in single- and double-bundle reconstructions would be of interest to understand the findings of the present study. Level III, therapeutic study.
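
    Register-style survival analyses of this kind can be sketched with the lifelines package: Kaplan-Meier survival by technique and a covariate-adjusted Cox model; the data file and column coding are hypothetical.

```python
# Sketch of a register-style analysis with lifelines: Kaplan-Meier survival
# by technique plus an adjusted Cox model. The data file and column coding
# (double_bundle = 1/0, sex = 1/0) are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("acl_registers.csv")  # hypothetical register extract

kmf = KaplanMeierFitter()
for name, grp in df.groupby("double_bundle"):
    kmf.fit(grp["years"], event_observed=grp["revised"], label=str(name))
    print("double_bundle =", name, "5-year survival:",
          float(kmf.survival_function_at_times(5.0).iloc[0]))

cph = CoxPHFitter()
cph.fit(df[["years", "revised", "double_bundle", "age", "sex"]],
        duration_col="years", event_col="revised")
cph.print_summary()  # adjusted hazard ratios with 95% CIs
```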

  14. An Overview of NASA's Orbital Debris Environment Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    Using updated measurement data, analysis tools, and modeling techniques, the NASA Orbital Debris Program Office has created a new Orbital Debris Environment Model. This model extends the coverage of orbital debris flux throughout the Earth orbit environment, and includes information on the mass density of the debris as well as the uncertainties in the model environment. This paper will give an overview of this model and its implications for spacecraft risk analysis.

  15. Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand

    NASA Astrophysics Data System (ADS)

    Kaiser, G.; Kortenhaus, A.

    2009-04-01

    The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, water & power supply etc., caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as a vulnerability analysis of the socio-economic and ecological systems, in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have modelled tsunami wave generation and propagation in the Indian Ocean, there is still a lack of detailed inundation modelling, i.e. of water depth and flow dynamics. For risk management and evacuation planning, however, this knowledge is essential. As the accuracy of the inundation simulation depends strongly on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches, as these data are freely available almost worldwide. However, to model tsunami-induced inundation for risk analysis and management purposes the accuracy of these data is not sufficient, as the processes in the near-shore zone cannot be modelled accurately enough and the spatial resolution of the topography is too coarse. Moreover, the SRTM data provide a digital surface model which includes vegetation and buildings in the surface description. To improve the data basis, additional bathymetric data were used in the near-shore zone of the Phang Nga and Phuket coastlines, and various remote sensing techniques as well as additional GPS measurements were applied to derive a high-resolution topography from satellite and airborne data. Land-use classifications and filter methods were developed to correct the digital surface models to digital elevation models. Simulations were then performed with a non-linear shallow water model to reproduce the 2004 Asian Tsunami and to simulate possible future ones. Results of water elevation near the coast were compared with field measurements and observations, and the influence of the resolution of the topography on inundation patterns such as water depth, velocity, dispersion and duration of the flood was analysed. The inundation simulation provides detailed hazard maps and is considered a reliable basis for risk assessment and risk zone mapping. The results are regarded as vital for the estimation of tsunami-induced damages and for evacuation planning. Results of the aforementioned simulations will be discussed during the conference.
Differences in the numerical results obtained using topographic data of different scales, modified by different post-processing techniques, will be analysed and explained. Further use of the results with respect to tsunami risk analysis and management will also be demonstrated.

  16. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  17. Progress of statistical analysis in biomedical research through the historical review of the development of the Framingham score.

    PubMed

    Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija

    2017-12-02

    Developing risk models in medicine is not only appealing but also fraught with obstacles in different aspects of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was established through statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on the combined results of several metrics. When logistic regression or Cox proportional hazards regression analysis is used, the calibration test and ROC curve analysis should be mandatory and eliminatory steps, with a central place then taken by newer statistical techniques. In order to obtain complete information on a new marker added to a model, the recent recommendation is to use reclassification tables, calculating the net reclassification index (NRI) and the integrated discrimination improvement (IDI); a small NRI sketch is given below. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. It may be noted that the customization and fine-tuning of the Framingham risk score spurred this development of statistical analysis. A clinically applicable predictive model should be a trade-off among all the abovementioned statistical metrics: a trade-off between calibration and discrimination, accuracy and decision-making, costs and benefits, and the patient's quality and quantity of life.
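
    The NRI mentioned above can be computed directly from risk-category assignments under the old and new models; a small sketch with synthetic categories follows.

```python
# Sketch of the categorical net reclassification index (NRI) for a new
# marker added to a baseline risk model; categories and data are synthetic.
import numpy as np

def nri(old_cat, new_cat, event):
    """old_cat/new_cat: per-subject risk-category indices; event: 0/1."""
    event = event.astype(bool)
    up, down = new_cat > old_cat, new_cat < old_cat
    nri_events = up[event].mean() - down[event].mean()        # correct up-moves
    nri_nonevents = down[~event].mean() - up[~event].mean()   # correct down-moves
    return nri_events + nri_nonevents

rng = np.random.default_rng(1)
event = rng.integers(0, 2, 500)
old = rng.integers(0, 3, 500)  # three risk categories from the baseline model
shift = rng.choice([-1, 0, 1], 500, p=[0.2, 0.6, 0.2])
# Synthetic "new model": events tend to be reclassified upward.
new = np.clip(old + np.where(event == 1, np.abs(shift), shift), 0, 2)
print(f"NRI = {nri(old, new, event):.3f}")
```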

  18. Spatial analysis and health risk assessment of heavy metals concentration in drinking water resources.

    PubMed

    Fallahzadeh, Reza Ali; Ghaneian, Mohammad Taghi; Miri, Mohammad; Dashti, Mohamad Mehdi

    2017-11-01

    Heavy metals in drinking water can be considered a threat to human health, and the carcinogenic risk of such metals has been demonstrated in several studies. The present study aimed to investigate the concentrations of the heavy metals As, Cd, Cr, Cu, Fe, Hg, Mn, Ni, Pb, and Zn in 39 water supply wells and 5 water reservoirs within the cities of Ardakan, Meibod, Abarkouh, Bafgh, and Bahabad. The spatial distribution of the concentrations was mapped with the software ArcGIS. Simulations of non-carcinogenic hazard and lifetime cancer risk were conducted for lead and nickel using a Monte Carlo technique, and a sensitivity analysis was carried out to find the parameters with the greatest effect on the risk estimates. The results indicated that the concentrations of all metals in the 39 wells (except iron in 3 cases) complied with the levels given in the EPA, World Health Organization, and Pollution Control Department standards. The spatial distribution results for all studied regions showed the highest concentrations for iron and zinc. Calculated HQ values for non-carcinogenic hazard indicated an acceptable level of risk. Average lifetime cancer risks for lead in Ardakan and nickel in Meibod and Bahabad were 1.09 × 10^-3, 1.67 × 10^-1, and 2 × 10^-1, respectively, demonstrating high carcinogenic risk compared with similar standards and studies; a Monte Carlo sketch of this type of calculation is given below. The sensitivity analysis suggests a strong influence of concentration and body weight (BW) on carcinogenic risk.
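
    A Monte Carlo cancer-risk calculation of the general type used here multiplies a simulated chronic daily intake by a slope factor; the distributions and slope factor below are illustrative assumptions, not the study's inputs.

```python
# Monte Carlo sketch of an incremental lifetime cancer risk (ILCR) estimate
# for a metal in drinking water. Distributions and the slope factor are
# illustrative assumptions, not values from the study.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
conc = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)  # mg/L in water
intake = rng.normal(2.0, 0.3, n).clip(min=0.5)              # L/day ingested
bw = rng.normal(70.0, 10.0, n).clip(min=40)                 # body weight, kg
ef, ed, at = 350, 30, 70 * 365                              # d/yr, yr, days

cdi = conc * intake * ef * ed / (bw * at)  # chronic daily intake, mg/kg-day
slope_factor = 0.0085                      # (mg/kg-day)^-1, illustrative
ilcr = cdi * slope_factor

print("mean ILCR:", ilcr.mean())
print("95th percentile:", np.quantile(ilcr, 0.95))
```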

  19. Modelling second malignancy risks from low dose rate and high dose rate brachytherapy as monotherapy for localised prostate cancer.

    PubMed

    Murray, Louise; Mason, Joshua; Henry, Ann M; Hoskin, Peter; Siebert, Frank-Andre; Venselaar, Jack; Bownes, Peter

    2016-08-01

    To estimate the risks of radiation-induced rectal and bladder cancers following low dose rate (LDR) and high dose rate (HDR) brachytherapy as monotherapy for localised prostate cancer, and compare them to external beam radiotherapy techniques. LDR and HDR brachytherapy monotherapy plans were generated for three prostate CT datasets. Second cancer risks were assessed using Schneider's concept of organ equivalent dose. LDR risks were assessed according to a mechanistic model and a bell-shaped model; HDR risks were assessed according to a bell-shaped model. Relative risks and excess absolute risks were estimated and compared to external beam techniques. Excess absolute risks of second rectal or bladder cancer were low for both LDR (irrespective of the model used for calculation) and HDR techniques. Average excess absolute risks of second rectal and bladder cancer for LDR brachytherapy were 0.71 and 0.84 per 10,000 person-years (PY), respectively, according to the mechanistic model, and 0.47 and 0.78 per 10,000 PY, respectively, according to the bell-shaped model. For HDR, the average excess absolute risks for second rectal and bladder cancers were 0.74 and 1.62 per 10,000 PY, respectively. The absolute differences between techniques were very low and clinically irrelevant. Compared to external beam prostate radiotherapy techniques, LDR and HDR brachytherapy resulted in the lowest risks of second rectal and bladder cancer. This study shows that both LDR and HDR brachytherapy monotherapy result in low estimated risks of radiation-induced rectal and bladder cancer. LDR resulted in lower bladder cancer risks than HDR, and lower or similar risks of rectal cancer. In absolute terms these differences between techniques were very small. Compared to external beam techniques, second rectal and bladder cancer risks were lowest for brachytherapy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
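
    The SPAR-H quantification style the paper extends can be sketched as a nominal error probability adjusted by performance shaping factor (PSF) multipliers; the multiplier values below are illustrative assignments for a usability context, not the published SPAR-H tables.

```python
# Illustrative SPAR-H-style calculation: a nominal human error probability
# (HEP) adjusted by performance shaping factor (PSF) multipliers. The
# multiplier values are examples, not the published SPAR-H tables.
import math

nominal_hep = 0.001          # SPAR-H nominal HEP for an action-type task
psf_multipliers = {          # example PSF assignments for a usability context
    "available_time": 1.0,
    "stress": 2.0,
    "complexity": 2.0,
    "experience_training": 1.0,
    "procedures": 5.0,       # e.g., poorly designed dialogs stand in here
    "ergonomics_hmi": 10.0,
    "fitness_for_duty": 1.0,
    "work_processes": 1.0,
}
composite = math.prod(psf_multipliers.values())

# SPAR-H-style adjustment keeps the result a valid probability when the
# composite multiplier is large:
hep = nominal_hep * composite / (nominal_hep * (composite - 1) + 1)
print(f"adjusted HEP = {hep:.4f}")
```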

  1. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems theory-based approach, Rasmussen's (1997) risk management framework and associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for the formulation of clinical protocols and guidelines, and for decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. Regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been compared in head-to-head studies. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence; the simplest indirect comparison is sketched below. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, the assumptions involved, and the steps for performing the analysis. PMID:28503228
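
    The simplest building block of such indirect evidence is the Bucher adjusted indirect comparison: two treatments B and C, each compared with a common comparator A, yield an indirect B-versus-C estimate. A sketch with illustrative numbers follows.

```python
# Sketch of the simplest network-meta-analysis building block: an adjusted
# indirect comparison (Bucher method) of B vs C through a common comparator
# A. The log odds ratios and standard errors are illustrative.
import math

# Direct estimates from trials: B vs A and C vs A.
log_or_ba, se_ba = -0.40, 0.15
log_or_ca, se_ca = -0.10, 0.20

# Indirect B vs C estimate; variances of independent comparisons add.
log_or_bc = log_or_ba - log_or_ca
se_bc = math.sqrt(se_ba**2 + se_ca**2)

lo = log_or_bc - 1.96 * se_bc
hi = log_or_bc + 1.96 * se_bc
print(f"indirect OR(B vs C) = {math.exp(log_or_bc):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```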

  3. Cognitive task analysis: harmonizing tasks to human capacities.

    PubMed

    Neerincx, M A; Griffioen, E

    1996-04-01

    This paper presents the development of a cognitive task analysis that assesses the task load of jobs and provides indicators for the redesign of jobs. General principles of human task performance were selected and subsequently integrated into current task modelling techniques. The resulting cognitive task analysis centres around four aspects of task load: the number of actions in a period, the ratio between knowledge- and rule-based actions, lengthy uninterrupted actions, and momentary overloading. The method consists of three stages: (1) construction of a hierarchical task model, (2) a time-line analysis and task load assessment, and (3), if necessary, adjustment of the task model. An application of the cognitive task analysis in railway traffic control showed its benefits over the 'old' task load analysis of the Netherlands Railways. It provided a provisional standard for traffic control jobs, conveyed two load risks -- momentary overloading and underloading -- and resulted in proposals to satisfy the standard and to diminish the two load risks.

  4. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  5. Measurement techniques of exposure to nanomaterials in the workplace for low- and medium-income countries: A systematic review.

    PubMed

    Boccuni, Fabio; Gagliardi, Diana; Ferrante, Riccardo; Rondinone, Bruna Maria; Iavicoli, Sergio

    2017-10-01

    Nanotechnology offers many opportunities, but there is still considerable uncertainty about the health risks and how to assess them. In the field of risk analysis for workers potentially exposed to nano-objects and their agglomerates and aggregates (NOAA), different methodological approaches to measuring airborne NOAA have been proposed. This study presents a systematic review of the scientific literature on occupational exposure to NOAA in the workplace, with the aim of identifying exposure measurement techniques that can be recommended in low- and medium-income countries. We gathered scientific papers reporting techniques for NOAA exposure measurement in the workplace, summarized the data for each eligible technique according to PRISMA guidelines, and rated the quality of evidence following an adapted GRADE approach. We found 69 eligible studies to include in the qualitative synthesis: the majority of studies reported a technique of moderate quality, and only two studies demonstrated the use of a high-quality exposure measurement technique. The review demonstrates that basic exposure measurement, i.e. evidence for the presence or absence of NOAA in the workplace air, can be achieved with moderate (40 techniques) to high (2 techniques) quality; comprehensive exposure measurement, which allows the quantification of NOAA in the workplace, can be achieved with moderate (11 techniques) to high (2 techniques) quality. The findings of the study also allowed the authors to finalize a list of requirements that must be fulfilled by an effective measurement technique (either basic or comprehensive) and to highlight the main weaknesses that need to be tackled for an effective affordability evaluation of measurement techniques to be recommended in low- and medium-income countries. Copyright © 2017 Elsevier GmbH. All rights reserved.

  6. Managing Student Loan Default Risk: Evidence from a Privately Guaranteed Portfolio.

    ERIC Educational Resources Information Center

    Monteverde, Kirk

    2000-01-01

    Application of the statistical techniques of survival analysis and credit scoring to private education loans extended to law students found a pronounced seasoning effect for such loans and the robust predictive power of credit bureau scoring of borrowers. Other predictors of default included school-of-attendance, school's geographic location, and…

  7. A Menu Technique for Utilizing VERT Interactively

    DTIC Science & Technology

    1982-07-01

    use. One reason for this lies not with VERT, but with the inadequate understanding of risk-analysis concepts in general. Many program managers…

  8. Reduction of Risk in Exploration and Prospect Generation through a Multidisciplinary Basin-Analysis Program in the South-Central Mid-Continent Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banerjee, S.; Barker, C.; Fite, J.

    1999-04-02

    This report will discuss a series of regional studies that were undertaken within the South-Central Mid-Continent region of the U.S. Coverage is also provided of a series of innovative techniques that were used for this assessment.

  9. To Spray or Not To Spray? A Debate Over DDT.

    ERIC Educational Resources Information Center

    Dinan, Frank J.; Bieron, Joseph F.

    2001-01-01

    Presents an activity in which students grapple with the complex issues surrounding the use of DDT to control malaria, which affects millions of people in developing nations. Considers risk/benefit analysis and the precautionary principle, two techniques used when making policy decisions involving the impact of science and technology on society.…

  10. Spatial analysis of falls in an urban community of Hong Kong

    PubMed Central

    Lai, Poh C; Low, Chien T; Wong, Martin; Wong, Wing C; Chan, Ming H

    2009-01-01

    Background Falls are an issue of great public health concern. This study focuses on outdoor falls within an urban community in Hong Kong. Urban environmental hazards are often place-specific and dependent upon the built features, landscape characteristics, and habitual activities. Therefore, falls must be examined with respect to local situations. Results This paper uses spatial analysis methods to map fall occurrences and examine possible environmental attributes of falls in an urban community of Hong Kong. The Nearest neighbour hierarchical (Nnh) and Standard Deviational Ellipse (SDE) techniques can offer additional insights about the circumstances and environmental factors that contribute to falls. The results affirm the multi-factorial nature of falls at specific locations and for selected groups of the population. Conclusion The techniques to detect hot spots of falls yield meaningful results that enable the identification of high risk locations. The combined use of descriptive and spatial analyses can be beneficial to policy makers because different preventive measures can be devised based on the types of environmental risk factors identified. The analyses are also important preludes to establishing research hypotheses for more focused studies. PMID:19291326
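
    The SDE half of that toolkit is straightforward to sketch: fit an ellipse to fall coordinates from the covariance of the point cloud. The coordinates below are synthetic, not the study's fall locations, and the Nnh clustering step is not shown.

```python
# Sketch of a standard deviational ellipse (SDE): summarize the spread of
# fall locations via the point cloud's covariance. Coordinates are synthetic.
import numpy as np

rng = np.random.default_rng(3)
pts = rng.multivariate_normal([834000, 816000], [[900, 400], [400, 400]], 200)

center = pts.mean(axis=0)                # mean centre of fall locations
d = pts - center
cov = d.T @ d / len(pts)                 # spatial covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # axis lengths and directions
angle = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))

print("mean centre:", center)
print("semi-axes (one std dev):", np.sqrt(eigvals))
print("major-axis orientation (deg):", round(angle, 1))
```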

  11. Effectiveness and risks of combining antipsychotic drugs with electroconvulsive treatment.

    PubMed

    Sanz-Fuentenebro, Francisco Javier; Vidal Navarro, Ignacio; Ballesteros Sanz, Daniel; Verdura Vizcaíno, Ernesto

    2011-01-01

    The simultaneous application of electroconvulsive therapy (ECT) and psychotropic drugs is based on sparse data. Despite this, and despite the restrictive approach of guidelines and consensus statements, the combination is widely used in routine clinical care. We reviewed the results of searches on the topic in MEDLINE, PsychINFO, EMBASE and Cochrane, as well as the main guidelines on the subject, and analyzed the findings by drug group. Except for some reservations with regard to classical MAOIs, antidepressants are safe and effective enhancers of ECT. It is desirable to discontinue benzodiazepines whenever clinically possible before the course of ECT because of the risk of interference; if this is not possible, proper technique will be needed to ensure effective stimuli. It is advisable to stop or reduce the dose of lithium prior to ECT based on a cost-benefit analysis of the risk of relapse; if it is maintained, lower levels should be targeted and techniques to minimize cognitive effects applied. The combination with "classic" and "atypical" antipsychotics potentiates positive clinical effects, and the risk of combined use is low. Positive data have been collected for clozapine combined with ECT in resistant psychosis, with little effect from clozapine's lowering of the seizure threshold and an important potentiation effect, though of limited duration. Although each situation must be assessed in terms of drug, patient and ECT technique, and trials providing methodologically sound data still need to be conducted, the combined use of ECT and psychotropic drugs in general presents an acceptable level of risk, with efficacy data suggesting potentiation. Copyright © 2010 SEP y SEPB. Published by Elsevier Espana. All rights reserved.

  12. Total coliform and Escherichia coli contamination in rural well water: analysis for passive surveillance.

    PubMed

    Invik, Jesse; Barkema, Herman W; Massolo, Alessandro; Neumann, Norman F; Checkley, Sylvia

    2017-10-01

    With increasing stress on our water resources and recent waterborne disease outbreaks, understanding the epidemiology of waterborne pathogens is crucial to building surveillance systems. The purpose of this study was to explore techniques for describing microbial water quality in rural drinking water wells, based on spatiotemporal analysis, time series analysis and relative risk mapping. Test results for Escherichia coli and coliforms from private and small public well water samples, collected between 2004 and 2012 in Alberta, Canada, were used for the analysis. Overall, 14.6 and 1.5% of the wells were total coliform and E. coli-positive, respectively. Private well samples were more often total coliform or E. coli-positive than untreated public well samples. Using relative risk mapping, we identified areas of higher risk for bacterial contamination of groundwater in the province that had not previously been recognized. Incorporation of time series analysis demonstrated peak contamination for E. coli in July and a later peak for total coliforms in September, suggesting a temporal dissociation between these indicators in terms of groundwater quality and highlighting the potential need to increase monitoring during certain periods of the year.

  13. Front-line intraperitoneal versus intravenous chemotherapy in stage III-IV epithelial ovarian, tubal, and peritoneal cancer with minimal residual disease: a competing risk analysis.

    PubMed

    Chang, Yen-Hou; Li, Wai-Hou; Chang, Yi; Peng, Chia-Wen; Cheng, Ching-Hsuan; Chang, Wei-Pin; Chuang, Chi-Mu

    2016-03-17

    In the analysis of survival data for cancer patients, the problem of competing risks is often ignored. Competing risks have been recognized as a special case of time-to-event analysis. The conventional techniques for time-to-event analysis applied in the presence of competing risks often give biased or uninterpretable results. Using a prospectively collected administrative health care database in a single institution, we identified patients diagnosed with stage III or IV primary epithelial ovarian, tubal, and peritoneal cancers with minimal residual disease after primary cytoreductive surgery between 1995 and 2012. Here, we sought to evaluate whether intraperitoneal chemotherapy outperforms intravenous chemotherapy in the presence of competing risks. Unadjusted and multivariable subdistribution hazards models were applied to this database with two types of competing risks (cancer-specific mortality and other-cause mortality) coded to measure the relative effects of intraperitoneal chemotherapy. A total of 1263 patients were recruited as the initial cohort. After propensity score matching, 381 patients in each arm entered the final competing risk analysis. Cumulative incidence estimates for cancer-specific mortality were statistically significantly lower (p = 0.017, Gray test) in patients receiving intraperitoneal chemotherapy (5-year estimate, 34.5%; 95% confidence interval [CI], 29.5-39.6%; 10-year estimate, 60.7%; 95% CI, 52.2-68.0%) than intravenous chemotherapy (5-year estimate, 41.3%; 95% CI, 36.2-46.3%; 10-year estimate, 67.5%; 95% CI, 61.6-72.7%). In the subdistribution hazards analysis of cancer-specific mortality, intraperitoneal chemotherapy outperformed intravenous chemotherapy (subdistribution hazard ratio, 0.82; 95% CI, 0.70-0.96) after correcting for other covariates. In conclusion, results from this comparative effectiveness study provide supportive evidence for previously published randomized trials that intraperitoneal chemotherapy outperforms intravenous chemotherapy, even after eliminating the confounding effect of competing risks. We suggest that competing risk analysis should be strongly considered in investigations of cancer patients with medium- to long-term follow-up periods.
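    For readers wanting to reproduce the cumulative incidence step, the sketch below uses the Aalen-Johansen estimator from the Python lifelines package; the follow-up data are invented. Note that the Gray test and the Fine-Gray subdistribution model reported above are not part of this sketch (they are available, for instance, in R's cmprsk package); this only shows how other-cause mortality is treated as a competing risk rather than as censoring.

        import pandas as pd
        from lifelines import AalenJohansenFitter

        # Hypothetical follow-up data: event 0 = censored,
        # 1 = cancer-specific death, 2 = death from other causes.
        df = pd.DataFrame({
            "years": [1.2, 3.4, 5.0, 2.1, 6.7, 4.3, 0.8, 7.9],
            "event": [1,   0,   2,   1,   0,   1,   2,   0],
        })

        # Cumulative incidence of cancer-specific death, with other-cause
        # death handled as a competing risk rather than as censoring.
        ajf = AalenJohansenFitter()
        ajf.fit(df["years"], df["event"], event_of_interest=1)
        print(ajf.cumulative_density_)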

  14. Pros and cons of transcatheter aortic valve implantation (TAVI)

    PubMed Central

    Terré, Juan A.; George, Isaac

    2017-01-01

    Transcatheter aortic valve implantation (TAVI) or replacement (TAVR) was recently approved by the FDA for intermediate-risk patients with severe aortic stenosis (AS). The technique had already been adopted worldwide for inoperable and high-risk patients. Improved device technology, imaging analysis and operator expertise have reduced the initially worrisome higher complication rate associated with TAVR, making it comparable to surgical aortic valve replacement (SAVR). However, many questions need to be answered before adoption in lower-risk patients. This paper highlights the pros and cons of TAVI based mostly on randomized clinical trials involving the two device platforms approved in the United States. We focused our analysis on metrics that will play a key role in expanding the TAVR indication to healthier individuals. We review the significance of, and give a perspective on, paravalvular leak (PVL), valve performance, valve durability, leaflet thrombosis, stroke and pacemaker requirement. PMID:29062739

  15. Safety analysis, risk assessment, and risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamali, K.; Stack, D.W.; Sullivan, L.H.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.
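    As a toy illustration of comparing PRA output against a numerical RAC, the sketch below aggregates hypothetical scenario frequencies and consequences into a single risk figure; every number, scenario name and the acceptance threshold is invented, and a real comparison would also carry the uncertainty bounds discussed above.

        # Toy PRA output: (scenario, frequency per year, consequence); all
        # values and the acceptance criterion below are hypothetical.
        scenarios = [
            ("loss of confinement",  1e-4,  0.3),
            ("fire in process cell", 5e-6,  2.0),
            ("seismic event",        2e-7, 15.0),
        ]
        RAC = 1e-4   # hypothetical numerical risk acceptance criterion

        total_risk = sum(freq * cons for _, freq, cons in scenarios)
        verdict = "acceptable" if total_risk < RAC else "exceeds RAC"
        print(f"aggregated risk = {total_risk:.2e} per year -> {verdict}")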

  16. Is non-cavitated proximal lesion sealing an effective method for caries control in primary and permanent teeth? A systematic review and meta-analysis.

    PubMed

    Ammari, Michelle Mikhael; Soviero, Vera Mendes; da Silva Fidalgo, Tatiana Kelly; Lenzi, Michele; Ferreira, Daniele Masterson T P; Mattos, Cláudia Trindade; de Souza, Ivete Pomarico Ribeiro; Maia, Lucianne Cople

    2014-10-01

    The aim of this study was to perform a systematic review and meta-analysis on the effectiveness of sealing non-cavitated proximal caries lesions in primary and permanent teeth. Only controlled clinical trials and randomized controlled clinical trials that evaluated the effectiveness of sealing non-cavitated proximal caries with a minimum follow-up of 12 months were included in the study. The primary outcome was arrestment/progression of proximal caries evaluated by bitewing radiographs. A risk of bias evaluation based on the Cochrane Collaboration common scheme for bias was carried out for each study. The meta-analysis was performed, through RevMan software, on the studies considered at low risk of bias and with pair-wise visual reading results. A comprehensive search was performed in the electronic databases PubMed, Cochrane Library, Scopus, ISI Web of Science, Lilacs and SIGLE, and on the website ClinicalTrials.gov, through June 2013. Of the 967 studies identified, 10 articles and 3 studies with partial results were assessed for eligibility; three articles were excluded, and our final sample included 10 studies. According to the risk of bias evaluation, six studies were considered at "high" risk of bias and four at "low" risk of bias. The forest plot of the meta-analysis showed low heterogeneity (I² = 29%) and a favourable outcome for the infiltrant: the chance of caries progression when this technique was used was significantly lower (p = 0.002) compared with placebo. Our results suggest that the technique of sealing non-cavitated proximal caries seems to be effective in controlling proximal caries in the short and medium term. Further long-term randomized clinical trials are still necessary to strengthen this evidence. Contemporary dentistry is focused on minimally invasive approaches that prevent the destruction of sound dental tissue next to carious lesions. This paper searches for evidence of the efficacy of sealing/infiltrating non-cavitated proximal caries in arresting caries progression in both permanent and primary teeth. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The use of failure mode and effect analysis in a radiation oncology setting: the Cancer Treatment Centers of America experience.

    PubMed

    Denny, Diane S; Allen, Debra K; Worthington, Nicole; Gupta, Digant

    2014-01-01

    Delivering radiation therapy in an oncology setting is a high-risk process where system failures are more likely to occur because of increasing utilization, complexity, and sophistication of the equipment and related processes. Healthcare failure mode and effect analysis (FMEA) is a method used to proactively detect risks to the patient in a particular healthcare process and correct potential errors before adverse events occur. FMEA is a systematic, multidisciplinary team-based approach to error prevention and enhancing patient safety. We describe our experience of using FMEA as a prospective risk-management technique in radiation oncology at a national network of oncology hospitals in the United States, capitalizing not only on the use of a team-based tool but also creating momentum across a network of collaborative facilities seeking to learn from and share best practices with each other. The major steps of our analysis across 4 sites and collectively were: choosing the process and subprocesses to be studied, assembling a multidisciplinary team at each site responsible for conducting the hazard analysis, and developing and implementing actions related to our findings. We identified 5 areas of performance improvement for which risk-reducing actions were successfully implemented across our enterprise. © 2012 National Association for Healthcare Quality.

  18. Use of a systematic risk analysis method (FMECA) to improve quality in a clinical laboratory procedure.

    PubMed

    Serafini, A; Troiano, G; Franceschini, E; Calzoni, P; Nante, N; Scapellato, C

    2016-01-01

    Risk management is a set of actions to recognize or identify risks, errors and their consequences and to take steps to counter them. The aim of our study was to apply FMECA (Failure Mode, Effects and Criticality Analysis) to the Activated Protein C resistance (APCR) test in order to detect and avoid mistakes in this process. We created a team and the process was divided into phases and subphases. For each phase we calculated the probability of occurrence (O) of an error, the detectability score (D) and the severity (S). The product of these three indexes yields the RPN (Risk Priority Number); phases with a higher RPN need corrective actions with a higher priority. The calculation of the RPN showed that more than 20 activities have a score higher than 150 and need important preventive actions; 8 have a score between 100 and 150. Only 23 actions obtained an acceptable score, lower than 100. This was one of the first experiences of applying FMECA analysis to a laboratory process, and the first applying this technique to the identification of factor V Leiden. Our results confirm that FMECA can be a simple, powerful and useful tool in risk management that helps to quickly identify the critical points in a laboratory process.
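    A minimal sketch of the RPN screening just described, using the same thresholds (above 150, 100-150, below 100); the phase names and O/D/S scores are invented, not the study's actual scoring.

        # O, D and S are each scored by the team; the phase names and
        # scores below are invented for illustration.
        phases = {
            "sample reception":     {"O": 4, "D": 7, "S": 6},
            "plasma preparation":   {"O": 3, "D": 5, "S": 8},
            "APCR ratio reporting": {"O": 2, "D": 9, "S": 9},
        }

        for name, s in phases.items():
            rpn = s["O"] * s["D"] * s["S"]          # Risk Priority Number
            if rpn > 150:
                action = "important preventive action"
            elif rpn >= 100:
                action = "corrective action advisable"
            else:
                action = "acceptable"
            print(f"{name}: RPN = {rpn} -> {action}")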

  19. Individual and couple-level risk factors for hepatitis C infection among heterosexual drug users: a multilevel dyadic analysis.

    PubMed

    McMahon, James M; Pouget, Enrique R; Tortu, Stephanie

    2007-06-01

    Hepatitis C virus (HCV) is the most common bloodborne pathogen in the United States and is a leading cause of liver-related morbidity and mortality. Although it is known that HCV is most commonly transmitted among injection drug users, the role of sexual transmission in the spread of HCV remains controversial because of inconsistent findings across studies involving heterosexual couples. A novel multilevel modeling technique designed to overcome the limitations of previous research was performed to assess multiple risk factors for HCV while partitioning the source of risk at the individual and couple level. The analysis was performed on risk exposure and HCV screening data obtained from 265 drug-using couples in East Harlem, New York City. In multivariable analysis, significant individual risk factors for HCV included a history of injection drug use, tattooing, and older age. At the couple level, HCV infection tended to cluster within couples, and this interdependence was accounted for by couples' drug-injection behavior. Individual and couple-level sexual behavior was not associated with HCV infection. Our results are consistent with prior research indicating that sexual contact plays little role in HCV transmission. Rather, couples' injection behavior appears to account for the clustering of HCV within heterosexual dyads.

  20. TH-E-19A-01: Quality and Safety in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, E; Ezzell, G; Miller, B

    2014-06-15

    Clinical radiotherapy data clearly demonstrate the link between the quality and safety of radiation treatments and the outcome for patients. The medical physicist plays an essential role in this process. To ensure the highest quality treatments, the medical physicist must understand and employ modern quality improvement techniques. This extends well beyond the duties traditionally associated with prescriptive QA measures. This session will review the current best practices for improving quality and safety in radiation therapy. General elements of quality management will be reviewed including: what makes a good quality management structure, the use of prospective risk analysis such as FMEA, and the use of incident learning. All of these practices are recommended in society-level documents and are incorporated into the new Practice Accreditation program developed by ASTRO. To be effective, however, these techniques must be practical in a resource-limited environment. This session will therefore focus on practical tools such as the newly-released radiation oncology incident learning system, RO-ILS, supported by AAPM and ASTRO. With these general constructs in mind, a case study will be presented of quality management in an SBRT service. An example FMEA risk assessment will be presented along with incident learning examples including root cause analysis. As the physicist's role as "quality officer" continues to evolve it will be essential to understand and employ the most effective techniques for quality improvement. This session will provide a concrete overview of the fundamentals in quality and safety. Learning Objectives: Recognize the essential elements of a good quality management system in radiotherapy. Understand the value of incident learning and the AAPM/ASTRO RO-ILS incident learning system. Appreciate failure mode and effects analysis as a risk assessment tool and its use in resource-limited environments. Understand the fundamental principles of good error proofing that extend beyond traditional prescriptive QA measures.

  1. Risk Perception of Pregnancy Promotes Disapproval of Gestational Surrogacy: Analysis of a Nationally Representative Opinion Survey in Japan

    PubMed Central

    Suzuki, Kohta; Sawa, Rintaro; Muto, Kaori; Kusuda, Satoshi; Banno, Kouji; Yamagata, Zentaro

    2011-01-01

    Background To clarify the relationship between the general attitude towards gestational surrogacy and risk perception about pregnancy and infertility treatment. Materials and Methods This study analysed the data of nationally representative cross-sectional surveys from 2007 concerning assisted reproductive technologies. The participants represented the general Japanese population. We used this data to carry out multivariate analysis. The main outcome measures were adjusted odds ratios and 95% confidence intervals from logistic regression models for factors including the effect of pregnancy risk perception on the attitude toward gestational surrogacy. Results In this survey, 3412 participants responded (response rate of 68.2%). With regard to the attitude towards gestational surrogacy, 54.0% of the respondents approved of it, and 29.7% stated that they were undecided. The perception of a high level of risk concerning ectopic pregnancy, threatened miscarriage or premature birth, and pregnancy-induced hypertension influenced the participants’ attitudes towards gestational surrogacy. Moreover, this perception of risk also contributed to a disapproval of the technique. Conclusion Our findings suggest that a person who understands the risks associated with pregnancy might clearly express their disapproval of gestational surrogacy. PMID:24963363

  2. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.

    PubMed

    Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L

    2008-06-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series that is not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
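    The coarse-graining idea behind multiscale entropy can be sketched compactly in Python. The implementation below is a simplified illustration (template lengths m and m + 1, tolerance r as a fraction of the series' standard deviation) applied to a synthetic interbeat series; it is not the authors' validated code, and published implementations differ in details such as how r is fixed across scales.

        import numpy as np

        def sample_entropy(x, m=2, r=0.15):
            # SampEn(m, r): negative log of the conditional probability that
            # sequences matching for m points also match for m + 1 points.
            tol = r * np.std(x)
            def match_count(mm):
                t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
                return np.sum(d <= tol) - len(t)   # drop self-matches
            b, a = match_count(m), match_count(m + 1)
            return -np.log(a / b)

        def multiscale_entropy(x, scales=range(1, 6)):
            # Coarse-grain the series at each scale tau, then compute SampEn.
            out = {}
            for tau in scales:
                n = len(x) // tau
                cg = x[:n * tau].reshape(n, tau).mean(axis=1)
                out[tau] = sample_entropy(cg)
            return out

        rr = np.random.default_rng(0).normal(0.8, 0.05, 1000)  # synthetic RR series
        print(multiscale_entropy(rr))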

  3. Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures

    PubMed Central

    Peng, Chung-Kang; Goldberger, Ary L.

    2016-01-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series that is not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763

  4. Reliability of diagnosis and clinical efficacy of visceral osteopathy: a systematic review.

    PubMed

    Guillaud, Albin; Darbois, Nelly; Monvoisin, Richard; Pinsault, Nicolas

    2018-02-17

    In 2010, the World Health Organization published benchmarks for training in osteopathy in which osteopathic visceral techniques are included. The purpose of this study was to identify and critically appraise the scientific literature concerning the reliability of diagnosis and the clinical efficacy of techniques used in visceral osteopathy. The databases MEDLINE, OSTMED.DR, the Cochrane Library, Osteopathic Research Web, Google Scholar, the Journal of the American Osteopathic Association (JAOA) website, the International Journal of Osteopathic Medicine (IJOM) website, and the catalog of the Académie d'ostéopathie de France website were searched through December 2017. Only inter-rater reliability studies including at least two raters, or intra-rater reliability studies including at least two assessments by the same rater, were included. For efficacy studies, only randomized controlled trials (RCTs) or crossover studies on unhealthy subjects (any condition, duration and outcome) were included. Risk of bias was determined using a modified version of the quality appraisal tool for studies of diagnostic reliability (QAREL) for the reliability studies; for the efficacy studies, the Cochrane risk of bias tool was used to assess their methodological design. Two authors performed data extraction and analysis. Eight reliability studies and six efficacy studies were included. The analysis of the reliability studies shows that the diagnostic techniques used in visceral osteopathy are unreliable. Regarding the efficacy studies, the least biased study shows no significant difference for the main outcome. The main risks of bias found in the included studies were the absence of blinding of the examiners, an unsuitable statistical method, or the absence of a primary study outcome. The results of the systematic review lead us to conclude that well-conducted and sound evidence on the reliability and the efficacy of techniques in visceral osteopathy is absent. The review was registered with PROSPERO on 12 December 2016; the registration number is CRD4201605286.

  5. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
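    Dakota has its own input syntax, but the kind of parameter ensemble it automates can be sketched in plain Python: below, a Latin hypercube design drives a stand-in simulator via scipy's qmc module. The bounds and the response function are invented; a real study would replace the function with an interface to the computational model.

        import numpy as np
        from scipy.stats import qmc

        def simulator(k, q):
            # Stand-in for a real computational model (hypothetical response).
            return q / (1.0 + k)

        sampler = qmc.LatinHypercube(d=2, seed=42)   # two uncertain inputs
        unit = sampler.random(n=100)                 # samples in [0, 1)^2
        x = qmc.scale(unit, l_bounds=[0.1, 5.0], u_bounds=[2.0, 20.0])

        y = np.array([simulator(k, q) for k, q in x])
        print(f"mean={y.mean():.3f}  std={y.std():.3f}  p95={np.quantile(y, 0.95):.3f}")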

  6. Robust Derivation of Risk Reduction Strategies

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Port, Daniel; Feather, Martin

    2007-01-01

    Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the Strategic Method defined by Kazman, Port et al [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affects the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.
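    The Strategic Method and DDP are more sophisticated than this, but the flavour of mechanically deriving a strategy from quantified risks can be shown with a deliberately naive greedy baseline: pick the technique with the best risk-reduction-per-cost ratio until the budget runs out. All names, costs and reduction values are invented.

        # Invented (name, cost, risk reduced) values; a naive greedy baseline,
        # not the Strategic Method or DDP themselves.
        techniques = [
            ("peer review",        4.0, 10.0),
            ("unit testing",       6.0, 14.0),
            ("formal inspection", 12.0, 20.0),
            ("prototyping",        9.0, 11.0),
        ]

        budget, plan = 20.0, []
        for name, cost, reduction in sorted(
                techniques, key=lambda t: t[2] / t[1], reverse=True):
            if cost <= budget:     # take the best reduction-per-cost that fits
                plan.append(name)
                budget -= cost
        print(plan)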

  7. Study of the role of tumor necrosis factor-α (-308 G/A) and interleukin-10 (-1082 G/A) polymorphisms as potential risk factors to acute kidney injury in patients with severe sepsis using high-resolution melting curve analysis.

    PubMed

    Hashad, Doaa I; Elsayed, Eman T; Helmy, Tamer A; Elawady, Samier M

    2017-11-01

    Septic acute kidney injury (AKI) is a prevalent complication in intensive care units with an increased incidence of complications. The aim of the present study was to assess the use of high-resolution melting curve (HRM) analysis in investigating whether the genetic polymorphisms -308 G/A of tumor necrosis factor-α (TNF-α) and -1082 G/A of interleukin-10 (IL-10) may predispose patients diagnosed with severe sepsis to the development of AKI. One hundred and fifty patients with severe sepsis participated in the present study, of whom sixty-six developed AKI. Both polymorphisms were studied using HRM analysis. The low-producer genotypes of both studied polymorphisms of the TNF-α and IL-10 genes were associated with AKI. Using logistic regression analysis, the low-producer genotypes remained an independent risk factor for AKI. A statistically significant difference was detected between the two studied groups: the low-producer genotypes of both the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms were prevalent in patients developing AKI. Principal conclusions: the low-producer genotypes of both the TNF-α (-308 G/A) and IL-10 (-1082 G/A) polymorphisms could be considered a risk factor for the development of AKI in critically ill patients with severe sepsis; management strategies for this patient category should therefore be adjusted to protect them from deterioration to acute kidney injury. HRM genotyping proved to be a highly efficient, simple and cost-effective technique that is well suited to the routine study of large-scale samples.

  8. Failure Modes and Effects Analysis (FMEA): A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.

  9. The postgenomic era: implications for the clinical laboratory.

    PubMed

    Kiechle, Frederick L; Zhang, Xinbo

    2002-03-01

    To review the advances in clinically useful molecular biological techniques and to identify their applications in clinical practice, as presented at the Tenth Annual William Beaumont Hospital DNA Symposium. The 11 manuscripts submitted were reviewed and their major findings were compared with literature on the same topic. Manuscripts address creative thinking techniques applied to DNA discovery, extraction of DNA from clotted blood, the relationship of mitochondrial dysfunction in neurodegenerative disorders, and molecular methods to identify human lymphocyte antigen class I and class II loci. Two other manuscripts review current issues in molecular microbiology, including detection of hepatitis C virus and biological warfare. The last 5 manuscripts describe current issues in molecular cardiovascular disease, including assessing thrombotic risk, genomic analysis, gene therapy, and a device for aiding in cardiac angiogenesis. Novel problem-solving techniques have been used in the past and will be required in the future in DNA discovery. The extraction of DNA from clotted blood demonstrates a potential cost-effective strategy. Cybrids created from mitochondrial DNA-depleted cells and mitochondrial DNA from a platelet donor have been useful in defining the role mitochondria play in neurodegeneration. Mitochondrial depletion has been reported as a genetically inherited disorder or after human immunodeficiency virus therapy. Hepatitis C viral detection by qualitative, quantitative, or genotyping techniques is useful clinically. Preparedness for potential biological warfare is a responsibility of all clinical laboratorians. Thrombotic risk in cardiovascular disorders may be assessed by coagulation screening assays and further defined by mutation analysis for specific genes for prothrombin and factor V Leiden. Gene therapy for reducing arteriosclerotic risk has been hindered primarily by complications introduced by the vectors used to introduce the therapeutic genes. Neovascularization in cardiac muscle with occluded vessels represents a promising method for recovery of viable tissue following ischemia. The sequence of the human genome was reported by 2 groups in February 2001. The postgenomic era will emphasize the use of microarrays and database software for genomic and proteomic screening in the search for useful clinical assays. The number of molecular pathologic techniques and assays will expand as additional disease-associated mutations are defined. Gene therapy and tissue engineering will represent successful therapeutic adjuncts.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberkampf, William Louis; Tucker, W. Troy; Zhang, Jianzhong

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
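    One standard way to simulate correlated variates for a given dependence level, of the kind the report reviews, is a Gaussian copula: draw correlated normals, map them to uniforms, then push the uniforms through the target marginal inverse CDFs. The marginals and correlation value below are illustrative choices, not the report's examples.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        rho = 0.6                                  # assumed dependence level
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
        u = stats.norm.cdf(z)                      # correlated uniforms in (0, 1)

        x = stats.lognorm.ppf(u[:, 0], s=0.5)      # arbitrary marginal 1
        y = stats.beta.ppf(u[:, 1], a=2, b=5)      # arbitrary marginal 2
        print(stats.spearmanr(x, y)[0])            # close to the target dependence

    Because the copula acts on ranks, the induced rank correlation survives the transformation to any continuous marginals, which is why this construction is a common building block in dependence modeling.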

  11. Evaluation of Military Field-Water Quality. Volume 7. Performance Evaluation of the 600-GPH Reverse Osmosis Water Purification Unit (ROWPU): reverse Osmosis (RO) Components

    DTIC Science & Technology

    1986-02-01

    Data for Assessing Health Risks in Potential Theaters of Operation for U.S. Military Forces. UNCLASSIFIED. Volume 7... Bypass, Potable-Water Disinfection, and Water-Quality Analysis Techniques; and Vol. 9, Data for Assessing Health Risks in Potential Theaters of Operation ...cleaning the RO elements, with objectives of improving solute rejection and reducing operating pressure. The most common method is to flush citric acid

  12. A framework for risk assessment and decision-making strategies in dangerous good transportation.

    PubMed

    Fabiano, B; Currò, F; Palazzi, E; Pastorino, R

    2002-07-01

    The risk from dangerous goods transport by road, and strategies for selecting loads and routes, are addressed in this paper by developing an original site-oriented framework of general applicability at the local level. A realistic evaluation of accident frequency must take into account, on one side, inherent factors (e.g. tunnels, rail bridges, bend radii, slope, characteristics of the neighborhood) and, on the other, factors correlated with traffic conditions (e.g. the number of dangerous goods trucks). Field data were collected on the selected highway by systematic investigation, providing input for a database reporting tendencies and intrinsic parameter/site-oriented statistics. The developed technique was applied to a pilot area, considering both individual risk and societal risk and making reference to flammable and explosive scenarios. In this way, a risk assessment sensitive to route features and the population exposed is proposed, so that the overall uncertainties in risk analysis can be lowered.
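    Societal risk in frameworks like this is commonly summarised as an F-N curve: the cumulative frequency F of events causing N or more fatalities. The sketch below builds one from invented per-scenario results, not the paper's data.

        import numpy as np

        # Invented per-scenario societal-risk results for a road section.
        freq = np.array([3e-4, 8e-5, 1e-5, 2e-6])   # events per year
        n_fat = np.array([1, 5, 20, 100])           # fatalities per event

        order = np.argsort(n_fat)
        N = n_fat[order]
        F = np.cumsum(freq[order][::-1])[::-1]      # frequency of >= N fatalities
        for n, f in zip(N, F):
            print(f"N >= {n:3d}: F = {f:.1e} per year")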

  13. Patient positioning and ventilator-associated pneumonia.

    PubMed

    Hess, Dean R

    2005-07-01

    Rotational beds, prone position, and semi-recumbent position have been proposed as procedures to prevent ventilator-associated pneumonia (VAP). Rotational therapy uses a special bed designed to turn the patient continuously, or nearly continuously, from side to side; specific designs include kinetic therapy and continuous lateral rotation therapy. A meta-analysis of studies evaluating the effect of rotational bed therapy shows a decrease in the risk of pneumonia but no effect on mortality. Two studies reported a lower risk of VAP in patients placed in a prone position, with no effect on mortality. Studies using radiolabeled enteral feeding solutions in mechanically ventilated patients have reported that aspiration of gastric contents occurs to a greater degree when patients are in the supine position compared with the semi-recumbent position. One study reported a lower rate of VAP in patients randomized to the semi-recumbent position compared to the supine position. Although each of the techniques discussed in this paper has been shown to reduce the risk of VAP, none has been shown to affect mortality. The available evidence suggests that the semi-recumbent position should be used routinely, rotational therapy should be considered in selected patients, and the prone position should not be used as a technique to reduce the risk of VAP.

  14. Flapless versus Conventional Flapped Dental Implant Surgery: A Meta-Analysis

    PubMed Central

    Chrcanovic, Bruno Ramos; Albrektsson, Tomas; Wennerberg, Ann

    2014-01-01

    The aim of this study was to test the null hypothesis of no difference in implant failure rates, postoperative infection, and marginal bone loss for patients rehabilitated with dental implants inserted by a flapless surgical procedure versus the open flap technique, against the alternative hypothesis of a difference. An electronic search without time or language restrictions was undertaken in March 2014. Eligibility criteria included clinical human studies, either randomized or not. The search strategy resulted in 23 publications. The I² statistic was used to express the percentage of the total variation across studies due to heterogeneity. The inverse variance method was used for the random-effects or fixed-effects model, as indicated. The estimates of relative effect were expressed as risk ratios (RR) and mean differences (MD) in millimeters. Sixteen studies were judged to be at high risk of bias, two at moderate risk of bias, and five at low risk of bias. The funnel plots indicated absence of publication bias for the three outcomes analyzed. The test for overall effect showed that the difference between the procedures (flapless vs. open flap surgery) significantly affects the implant failure rates (P = 0.03), with a RR of 1.75 (95% CI 1.07–2.86). However, a sensitivity analysis revealed differences when studies of high and low risk of bias were pooled separately; thus, the results must be interpreted carefully. No apparent significant effects of the flapless technique on the occurrence of postoperative infection (P = 0.96; RR 0.96, 95% CI 0.23–4.03) or on marginal bone loss (P = 0.16; MD −0.07 mm, 95% CI −0.16–0.03) were observed. PMID:24950053
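    The pooling behind such an analysis is an inverse-variance weighted average of per-study log risk ratios. The fixed-effect sketch below uses invented 2x2 tables, not the review's data, and omits the random-effects and heterogeneity machinery (I², sensitivity analyses) the authors also applied.

        import numpy as np

        # (events, n) for flapless and open-flap arms of three invented studies.
        studies = [
            (8, 120, 4, 118),
            (5,  90, 3,  92),
            (11, 200, 6, 195),
        ]

        log_rr, var = [], []
        for a, n1, c, n2 in studies:
            log_rr.append(np.log((a / n1) / (c / n2)))
            var.append(1/a - 1/n1 + 1/c - 1/n2)     # variance of log RR

        w = 1 / np.array(var)                       # inverse-variance weights
        pooled = np.sum(w * np.array(log_rr)) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
        print(f"pooled RR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")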

  15. Approach to the health-risk management on municipal reclaimed water reused in landscape water system

    NASA Astrophysics Data System (ADS)

    Liu, X.; Li, J.; Liu, W.

    2008-12-01

    Water pollution and severe water shortage are two major environmental problems in China. Reclaimed water reuse is an important approach to lessening water pollution and solving the urban water shortage crisis, and the health risk of reclaimed water has become a focus of public concern; it is urgent to evaluate it with risk assessment techniques. Considering the ways in which reclaimed water is reused, this study examines the health risk produced by toxic pollutants and pathogenic microbes when municipal reclaimed water is reused in landscape water systems. Techniques for monitoring pathogenic microbes in wastewater and reclaimed water are discussed, and hygienic indicators, risk assessment methods and concentration limits for pathogenic microbes for various reclaimed water uses are studied. The principles of health risk assessment are applied to study the exposure level and health risk of the people concerned in a wastewater reuse project in which reclaimed water is used for green-area irrigation in a public park in Beijing. An exposure assessment method and model for various reclaimed water uses are built around the Beijing reclaimed water project. The daily ingested dose and lifetime average daily dose (LADD) of exposed people are first derived via field work and monitoring analysis, which can be used in health risk assessment as a quantitative reference. The results show that the main risk comes from pathogenic pollutants, toxic pollutants, eutrophication pollutants, pathogenic microbes and secondary pollutants when municipal wastewater is reclaimed for landscape water. The major water quality parameters to be limited should include pathogenic microbes, toxic pollutants, and heavy metals. Keywords: municipal wastewater, reclaimed water, landscape water, health risk
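    The LADD mentioned above is conventionally computed with the standard intake equation LADD = (C x IR x EF x ED) / (BW x AT). The sketch below applies it with invented parameter values, not the Beijing project's measurements.

        # Standard intake equation with invented parameter values.
        C  = 0.02      # concentration in reclaimed water, mg/L (assumed)
        IR = 0.001     # incidental ingestion per event, L/day (assumed)
        EF = 180       # exposure frequency, days/year (assumed)
        ED = 10        # exposure duration, years (assumed)
        BW = 60        # body weight, kg (assumed)
        AT = 70 * 365  # averaging time over a lifetime, days

        ladd = (C * IR * EF * ED) / (BW * AT)
        print(f"LADD = {ladd:.2e} mg/(kg*day)")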

  16. Potential application of quantitative microbiological risk assessment techniques to an aseptic-UHT process in the food industry.

    PubMed

    Pujol, Laure; Albert, Isabelle; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2013-04-01

    Aseptic ultra-high-temperature (UHT)-type processed food products (e.g., milk or soup) are ready-to-eat products which are consumed extensively globally due to a combination of their comparative high quality and long shelf life, with no cold chain or other preservation requirements. Due to the inherent microbial vulnerability of aseptic-UHT product formulations, the safety and stability-related performance objectives (POs) required at the end of the manufacturing process are the most demanding found in the food industry. The key determinants of achieving sterility, which also differentiate aseptic-UHT from in-pack sterilised products, are the challenges associated with the processes of aseptic filling and sealing. This is a complex process that has traditionally been run using deterministic or empirical process settings. Quantifying the risk of microbial contamination and recontamination along the aseptic-UHT process, using the scientifically based process of quantitative microbial risk assessment (QMRA), offers the possibility to improve on the currently tolerable sterility failure rate (i.e., 1 defect per 10,000 units). In addition, the benefits of applying QMRA are (i) to implement process settings in a transparent and scientific manner; (ii) to develop a uniform common structure whatever the production line, leading to a harmonisation of these process settings; and (iii) to bring elements of a cost-benefit analysis of the management measures. The objective of this article is to explore how QMRA techniques and risk management metrics may be applied to aseptic-UHT-type processed food products. In particular, the aseptic-UHT process should benefit from a number of novel mathematical and statistical concepts that have been developed in the field of QMRA. Probabilistic techniques such as Monte Carlo simulation, Bayesian inference and sensitivity analysis should help in assessing compliance with the safety and stability-related POs set at the end of the manufacturing process. The understanding of aseptic-UHT process contamination will be extended beyond the current "as-low-as-reasonably-achievable" targets to a risk-based framework, through which current sterility performance and future process designs can be optimised. Copyright © 2013 Elsevier B.V. All rights reserved.
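    A minimal Monte Carlo sketch of the PO check just described: per-unit recontamination probabilities for the filling and sealing steps are sampled from assumed lognormal distributions, and the resulting defect rate is compared with the 1-in-10,000 performance objective. All distribution parameters are invented; a real QMRA would calibrate them from process data, possibly via the Bayesian inference mentioned above.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 1_000_000   # simulated production units

        # Assumed lognormal distributions for per-unit recontamination
        # probability at the filling and sealing steps (parameters invented).
        p_fill = rng.lognormal(mean=np.log(2e-5), sigma=0.8, size=n)
        p_seal = rng.lognormal(mean=np.log(1e-5), sigma=0.8, size=n)
        p_defect = 1 - (1 - p_fill) * (1 - p_seal)  # either route contaminates

        po = 1e-4   # performance objective: 1 defect per 10,000 units
        print(f"mean defect rate = {p_defect.mean():.2e}")
        print(f"fraction of units with p > PO = {(p_defect > po).mean():.3f}")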

  17. Hazard Experience, Geophysical Vulnerability, and Flood Risk Perceptions in a Postdisaster City, the Case of New Orleans.

    PubMed

    Gotham, Kevin Fox; Campanella, Richard; Lauve-Moon, Katie; Powers, Bradford

    2018-02-01

    This article investigates the determinants of flood risk perceptions in New Orleans, Louisiana (United States), a deltaic coastal city highly vulnerable to seasonal nuisance flooding and hurricane-induced deluges and storm surges. Few studies have investigated the influence of hazard experience, geophysical vulnerability (hazard proximity), and risk perceptions in cities undergoing postdisaster recovery and rebuilding. We use ordinal logistic regression techniques to analyze experiential, geophysical, and sociodemographic variables derived from a survey of 384 residents in seven neighborhoods. We find that residents living in neighborhoods that flooded during Hurricane Katrina exhibit higher levels of perceived risk than those residents living in neighborhoods that did not flood. In addition, findings suggest that flood risk perception is positively associated with female gender, lower income, and direct flood experiences. In conclusion, we discuss the implications of these findings for theoretical and empirical research on environmental risk, flood risk communication strategies, and flood hazards planning. © 2017 Society for Risk Analysis.

  18. New developments in exposure assessment: the impact on the practice of health risk assessment and epidemiological studies.

    PubMed

    Nieuwenhuijsen, Mark; Paustenbach, Dennis; Duarte-Davidson, Raquel

    2006-12-01

    The field of exposure assessment has matured significantly over the past 10-15 years. Dozens of studies have measured the concentrations of numerous chemicals in many media to which humans are exposed. Others have catalogued the various exposure pathways and identified typical values which can be used in the exposure calculations for the general population, such as the amount of water or soil ingested per day or the percent of a chemical that can pass through the skin. In addition, studies of the duration of exposure for many tasks (e.g. showering, jogging, working in the office) have been conducted which allow for more general descriptions of the likely range of exposures. All of this information, as well as the development of new and better models (e.g. air dispersion or groundwater models), allows for better estimates of exposure. In addition to identifying better exposure factors, and better mathematical models for predicting the aerial distribution of chemicals, the conduct of simulation studies and dose-reconstruction studies can offer extraordinary opportunities for filling in data gaps regarding historical exposures which are critical to improving the power of epidemiology studies. The use of probabilistic techniques such as Monte Carlo analysis and Bayesian statistics has revolutionized the practice of exposure assessment and has greatly enhanced the quality of the risk characterization. Lastly, the field of epidemiology is about to undergo a sea change with respect to the exposure component because each year better environmental and exposure models, statistical techniques and new biological monitoring techniques are being introduced. This paper reviews these techniques and discusses where additional research is likely to pay a significant dividend. Exposure assessment techniques are now available which can significantly improve the quality of epidemiology and health risk assessment studies and vastly improve their usefulness. As more quantitative exposure components can now be incorporated into these studies, they can be better used to identify safe levels of exposure using customary risk assessment methodologies. Examples are drawn from both environmental and occupational studies illustrating how these techniques have been used to better understand exposure to specific chemicals. Some thoughts are also presented on what lessons have been learned about conducting exposure assessment for health risk assessments and epidemiological studies.

  19. Spatial assessment of Geo-environmental data by the integration of Remote Sensing and GIS techniques for Sitakund Region, Eastern foldbelt, Bangladesh.

    NASA Astrophysics Data System (ADS)

    Gazi, M. Y.; Rahman, M.; Islam, M. A.; Kabir, S. M. M.

    2016-12-01

    Remote sensing and geographic information system (GIS) techniques have been applied to the analysis and interpretation of a geo-environmental assessment of the Sitakund area, located within the administrative boundaries of the Chittagong district, Bangladesh. A Landsat ETM+ image with a ground resolution of 30 m and a digital elevation model (DEM) were adopted in this study to produce a set of thematic maps. The diversity of terrain characteristics plays a major role in the diversity of soil compositions and types, which depend on the geological structure, and has also contributed to the diversity of land cover and land use in the region; the geological setting has shaped the general landscape of the study area. The research problem lies in assessing how well remote sensing and GIS techniques can evaluate the natural data of the study area spatially, and in determining suitability grades for the terrain consistent with the reality of the region. Remote sensing and GIS software were used in the analysis, classification and interpretation of the prepared thematic maps in order to build the geo-environmental assessment map of the study area. Low-risk geo-environmental land mostly covers areas of Quaternary deposits, especially areas of slope-wash deposits carried by streams. Medium- and high-risk geo-environmental land is distributed over areas of other formations within the study area; the high-risk class mostly corresponds to areas of folds and faults. The study has assessed the suitability of land for agricultural purposes and for settlements in less vulnerable areas within this region.

  20. Reducing the Risk of Human Space Missions with INTEGRITY

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.

    2003-01-01

    The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.

  1. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of design fires is specified based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time of untenable conditions, and fire risk to life safety can be evaluated from the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and a discussion compares the assessment results of the case study with fire statistics.
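    The Markov-chain element of such a framework can be illustrated with a toy transition matrix over fire states; the states, one-minute transition probabilities and time horizon below are all invented, not the paper's model.

        import numpy as np

        states = ["incipient", "suppressed", "spreading", "untenable"]
        T = np.array([                  # invented one-minute transition matrix
            [0.70, 0.20, 0.10, 0.00],
            [0.00, 1.00, 0.00, 0.00],   # suppressed is absorbing
            [0.00, 0.10, 0.70, 0.20],
            [0.00, 0.00, 0.00, 1.00],   # untenable is absorbing
        ])

        p = np.array([1.0, 0.0, 0.0, 0.0])   # fire starts in 'incipient'
        for minute in range(15):
            p = p @ T                        # state probabilities evolve in time
        print(dict(zip(states, np.round(p, 3))))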

  2. Hazard Analysis and Safety Requirements for Small Drone Operations: To What Extent Do Popular Drones Embed Safety?

    PubMed

    Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela

    2018-03-01

    Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.

  3. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
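    Full sequential Gaussian simulation is beyond a short sketch, but the core idea, a random field whose spatial correlation is controlled by a variogram, can be shown with Cholesky (LU) simulation, a simpler relative of SGS. The grid size, variogram range and GSI statistics below are invented, and the field is unconditional (no data conditioning as in the Ok Tedi study).

        import numpy as np

        rng = np.random.default_rng(3)
        nx = ny = 25
        xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
        pts = np.column_stack([xs.ravel(), ys.ravel()])

        a = 8.0                                    # variogram range (assumed)
        h = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
        C = np.exp(-3.0 * h / a)                   # exponential covariance model
        L = np.linalg.cholesky(C + 1e-8 * np.eye(len(pts)))

        z = L @ rng.standard_normal(len(pts))      # spatially correlated N(0, 1)
        gsi = 55 + 10 * z                          # GSI field with assumed stats
        print(gsi.reshape(ny, nx).round(1))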

  4. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations were found between each of the two spectroscopic techniques and HPLC. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. USING BIOASSAYS TO EVALUATE THE PERFORMANCE OF RISK MANAGEMENT TECHNIQUES

    EPA Science Inventory

    Often, the performance of risk management techniques is evaluated by measuring the concentrations of the chemicals of concern before and after risk management efforts. However, using bioassays and chemical data provides a more robust understanding of the effectiveness of risk man...

  6. Combining Cluster Analysis and Small Unmanned Aerial Systems (sUAS) for Accurate and Low-cost Bathymetric Surveying

    NASA Astrophysics Data System (ADS)

    Maples, B. L.; Alvarez, L. V.; Moreno, H. A.; Chilson, P. B.; Segales, A.

    2017-12-01

    Classical in-situ direct surveying for geomorphological subsurface information in rivers is time-consuming, labor-intensive, costly, and often involves high-risk activities. Non-intrusive technologies, such as UAS-based and LIDAR-based remote sensing, therefore hold promising potential for efficient and accurate measurement of channel topography over large areas within a short time, and a tremendous amount of attention has been paid to their development. Over the past two decades, efforts have been undertaken to develop a specialized technique that can penetrate the water body and detect the channel bed to derive river and coastal bathymetry. In this research, we develop a low-cost, effective technique for water body bathymetry. With the use of a sUAS and a light-weight sonar, the bathymetry and volume of a small reservoir have been surveyed. The survey is conducted at low altitude (2 meters above the water), with the sUAS towing a small boat carrying the sonar. A cluster analysis is conducted to optimize the sUAS data collection and minimize the standard deviation created by under-sampling in areas of highly variable bathymetry, so measurements are densified in regions featuring steep slopes and drastic changes in the reservoir bed. This technique provides flexibility, efficiency, and freedom from risk to humans while obtaining high-quality information. The irregularly-spaced bathymetric survey is then interpolated using unstructured Triangular Irregular Network (TIN)-based maps to avoid re-gridding or re-sampling issues.
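
    The final interpolation step can be sketched on synthetic soundings: a Delaunay-based linear interpolator plays the role of the TIN, and a volume estimate is formed on a display grid. The point locations, depths and grid spacing below are invented for illustration.

```python
# Minimal sketch: TIN-style interpolation of an irregular bathymetric
# survey via Delaunay-based linear interpolation (synthetic soundings).
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(0)

# Irregularly spaced soundings: easting, northing (m) and depth (m)
pts = rng.uniform(0.0, 100.0, size=(300, 2))
depth = 2.0 + 0.05 * pts[:, 0] + 1.5 * np.sin(pts[:, 1] / 15.0)

# The TIN (a Delaunay triangulation) is built inside the interpolator
tin = LinearNDInterpolator(pts, depth)

# Query on a fine grid only for display/volume estimates; the TIN itself
# avoids re-gridding the raw survey.
gx, gy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
grid_depth = tin(gx, gy)                   # NaN outside the convex hull

cell_area = 1.0 * 1.0                      # m^2 per grid cell
volume = np.nansum(grid_depth) * cell_area # crude reservoir volume
print(f"estimated water volume: {volume:.0f} m^3")
```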

  7. Cardiovascular risk from water arsenic exposure in Vietnam: Application of systematic review and meta-regression analysis in chemical health risk assessment.

    PubMed

    Phung, Dung; Connell, Des; Rutherford, Shannon; Chu, Cordia

    2017-06-01

    A systematic review (SR) and meta-analysis cannot provide the endpoint answer for a chemical risk assessment (CRA). The objective of this study was to apply SR and meta-regression (MR) analysis to address this limitation using a case study of cardiovascular risk from arsenic exposure in Vietnam. Published studies were searched from PubMed using the keywords of arsenic exposure and cardiovascular diseases (CVD). Random-effects meta-regression was applied to model the linear relationship between arsenic concentration in water and risk of CVD, and the no-observable-adverse-effect level (NOAEL) was then identified from the regression function. The probabilistic risk assessment (PRA) technique was applied to characterize the risk of CVD due to arsenic exposure by estimating the overlapping coefficient between the dose-response and exposure distribution curves. The risks were evaluated for groundwater, treated water and drinking water. A total of 8 high-quality studies for dose-response and 12 studies for exposure data were included in the final analyses. The results of the MR suggested a NOAEL of 50 μg/L and a guideline of 5 μg/L for arsenic in water, half the NOAEL and guideline values recommended by previous studies and authorities. The results of the PRA indicated that the proportion of observed exposure levels carrying excess CVD risk was 52% for groundwater, 24% for treated water, and 10% for drinking water in Vietnam. The study found that systematic review and meta-regression can be considered an effective approach to chemical risk assessment because of their ability to answer the endpoint question of a CRA. Copyright © 2017 Elsevier Ltd. All rights reserved.
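
    A toy version of the two steps described above is sketched here: an inverse-variance weighted meta-regression of log relative risk on water arsenic concentration, followed by a Monte Carlo comparison of an assumed lognormal exposure distribution against the derived threshold. Every number, including the threshold criterion, is an invented placeholder rather than a value from the study.

```python
# Minimal sketch: (i) weighted meta-regression of CVD risk on arsenic
# concentration, (ii) probabilistic comparison of exposures against the
# derived threshold. All study-level data are invented.
import numpy as np

# (i) mean water arsenic (ug/L), log relative risk, standard error
conc   = np.array([10.0, 50.0, 100.0, 250.0, 500.0])
log_rr = np.array([0.02, 0.05, 0.18, 0.35, 0.62])
se     = np.array([0.05, 0.04, 0.06, 0.08, 0.10])

w = 1.0 / se**2                                   # inverse-variance weights
X = np.column_stack([np.ones_like(conc), conc])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * log_rr))
print(f"meta-regression: log RR = {beta[0]:.3f} + {beta[1]:.5f} * conc")

# Toy threshold: concentration where the predicted log RR reaches 0.10.
# The study derives its NOAEL from the fitted regression function itself.
threshold = (0.10 - beta[0]) / beta[1]

# (ii) assumed lognormal exposure distribution vs. the threshold
rng = np.random.default_rng(1)
exposure = rng.lognormal(mean=np.log(30.0), sigma=1.0, size=100_000)
frac = (exposure > threshold).mean()
print(f"threshold ~ {threshold:.0f} ug/L; exposures above it: {frac:.2%}")
```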

  8. Monitoring of human brain functions in risk decision-making task by diffuse optical tomography using voxel-wise general linear model

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Jing; Li, Lin; Cazzell, Marry; Liu, Hanli

    2013-03-01

    Functional near-infrared spectroscopy (fNIRS) is a non-invasive imaging technique which measures the hemodynamic changes that reflect brain activity. Diffuse optical tomography (DOT), a variant of fNIRS with multi-channel NIRS measurements, has demonstrated the capability of three-dimensional (3D) reconstruction of hemodynamic changes due to brain activity. The conventional method of DOT image analysis for defining brain activation is based upon a paired t-test between two different states, such as resting-state versus task-state. However, it has a limitation because the selection of the activation and post-activation periods is relatively subjective. General linear model (GLM) based analysis can overcome this limitation. In this study, we combine 3D DOT image reconstruction with GLM-based analysis (i.e., voxel-wise GLM analysis) to investigate the brain activity that is associated with the risk decision-making process. Risk decision-making is an important cognitive process and thus an essential topic in the field of neuroscience. The balloon analogue risk task (BART) is a valid experimental model and has been commonly used in behavioral measures to assess human risk-taking actions and tendencies when facing risks. We utilized the BART paradigm with a blocked design to investigate brain activations in the prefrontal and frontal cortical areas during decision-making. Voxel-wise GLM analysis was performed on 18 human participants (10 males and 8 females). In this work, we wish to demonstrate the feasibility of using voxel-wise GLM analysis to image and study cognitive functions in response to risk decision-making by DOT. Results have shown significant changes in the dorsolateral prefrontal cortex (DLPFC) during the active choice mode and a different hemodynamic pattern between genders, in good agreement with published functional magnetic resonance imaging (fMRI) and fNIRS studies.
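
    The essence of a voxel-wise GLM for a blocked design can be sketched in a few lines: a boxcar task regressor is convolved with a canonical hemodynamic response function and fitted at every voxel by least squares, yielding a t-statistic map. The timing, voxel count and synthetic time series below are assumptions, not the study's acquisition parameters.

```python
# Minimal sketch of a voxel-wise GLM for a blocked design: a boxcar task
# regressor convolved with a double-gamma HRF, fitted at each voxel by
# ordinary least squares (all timing/voxel values assumed).
import numpy as np
from scipy.stats import gamma

tr, n_scans, n_vox = 1.0, 200, 5000
t = np.arange(n_scans) * tr

boxcar = ((t // 20) % 2 == 0).astype(float)       # 20 s task / 20 s rest

ht = np.arange(0, 32, tr)                          # SPM-style double gamma
hrf = gamma.pdf(ht, 6) - gamma.pdf(ht, 16) / 6.0
hrf /= hrf.sum()

task = np.convolve(boxcar, hrf)[:n_scans]
X = np.column_stack([np.ones(n_scans), task])      # intercept + task

rng = np.random.default_rng(7)                     # synthetic HbO series
Y = 0.5 * task[:, None] + rng.standard_normal((n_scans, n_vox))

beta, ss_res, *_ = np.linalg.lstsq(X, Y, rcond=None)
sigma2 = ss_res / (n_scans - X.shape[1])           # residual variance/voxel
c = np.array([0.0, 1.0])                           # contrast: task effect
var_c = c @ np.linalg.inv(X.T @ X) @ c
t_map = (c @ beta) / np.sqrt(sigma2 * var_c)       # voxel-wise t statistics
print("voxels with |t| > 3:", int((np.abs(t_map) > 3).sum()))
```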

  9. Anxiety Disorders in Children and Adolescents with Autistic Spectrum Disorders: A Meta-Analysis

    ERIC Educational Resources Information Center

    van Steensel, Francisca J. A.; Bogels, Susan M.; Perrin, Sean

    2011-01-01

    There is considerable evidence that children and adolescents with autistic spectrum disorders (ASD) are at increased risk of anxiety and anxiety disorders. However, it is less clear which of the specific DSM-IV anxiety disorders occur most in this population. The present study used meta-analytic techniques to help clarify this issue. A systematic…

  10. Information support of monitoring of technical condition of buildings in construction risk area

    NASA Astrophysics Data System (ADS)

    Skachkova, M. E.; Lepihina, O. Y.; Ignatova, V. V.

    2018-05-01

    The paper presents the results of research devoted to the development of a model of information support for monitoring the technical condition of buildings located in a construction risk area. As a result of a visual and instrumental survey, as well as an analysis of existing approaches and techniques, attributive and cartographic databases have been created. These databases allow monitoring of defects and damage to buildings located within a 30-meter risk area of the object under construction. A classification of the structures and defects of the buildings under survey is presented. The functional capabilities of the developed model and the field of its practical application are determined.

  11. Functional-analytical capabilities of GIS technology in the study of water use risks

    NASA Astrophysics Data System (ADS)

    Nevidimova, O. G.; Yankovich, E. P.; Yankovich, K. S.

    2015-02-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climate and hydrological situation in Tomsk Oblast with respect to water use risks was carried out. Based on techniques developed by the authors, an informational and analytical database was created using the ArcGIS software platform, which combines statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.

  12. Mapping ecological risks with a portfolio-based technique: incorporating uncertainty and decision-making preferences

    Treesearch

    Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler

    2013-01-01

    Geographic mapping of risks is a useful analytical step in ecological risk assessments and in particular, in analyses aimed to estimate risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...

  13. Risk Factors for Infection After Rotator Cuff Repair.

    PubMed

    Vopat, Bryan G; Lee, Bea J; DeStefano, Sherilyn; Waryasz, Gregory R; Kane, Patrick M; Gallacher, Stacey E; Fava, Joseph; Green, Andrew G

    2016-03-01

    To identify risk factors for infection after rotator cuff repair. We hypothesized that patient characteristics and surgical technique would affect the rate of infection. The records of 1,824 rotator cuff repairs performed by a single surgeon from 1995 to 2010 were reviewed retrospectively. Fourteen patients had an early deep postoperative wound infection that was treated with surgical irrigation and debridement. One hundred eighty-five control patients who were treated with rotator cuff repair and did not develop an infection were selected randomly for comparison and statistical analysis. Data regarding preoperative and intraoperative risk factors for infection were recorded, and a multiple logistic regression was conducted to investigate predictors of infection. The infection rate was 0.77% (14/1,822). On average, 2.1 (range 1 to 4) surgical debridements were performed in addition to treatment with intravenous antibiotics. Patients who had open or mini-open rotator cuff repair had a significantly greater risk of acute postoperative infection (odds ratio [OR] = 8.63, P = .002). Seventy-nine percent of the patients in the infection group had an open or mini-open repair, whereas only 28% of the control group did. Male patients also had a significantly greater risk of acute postoperative infection (OR = 9.52, P = .042): 92% of the infection patients were male compared with 58% of the control group. In addition, as body mass index increased, the odds of infection decreased (OR = 0.81, P = .023). The results of this case-control study demonstrate that open or mini-open surgical technique and male sex are significant risk factors for infection after rotator cuff repair. In our study, arthroscopic rotator cuff repair reduced the risk of infection compared with open techniques. Level IV. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  14. Tensions in perspectives on suicide prevention between men who have attempted suicide and their support networks: Secondary analysis of qualitative data.

    PubMed

    Fogarty, Andrea S; Spurrier, Michael; Player, Michael J; Wilhelm, Kay; Whittle, Erin L; Shand, Fiona; Christensen, Helen; Proudfoot, Judith

    2018-02-01

    Men generally have higher rates of suicide, despite fewer overt indicators of risk. Differences in presentation and response suggest a need to better understand why suicide prevention is less effective for men. To explore the views of at-risk men, friends and family about the tensions inherent in suicide prevention and to consider how prevention may be improved. Secondary analysis of qualitative interview and focus group data, using thematic analysis techniques, alongside bracketing, construction and contextualisation. A total of 35 men who had recently made a suicide attempt participated in interviews, and 47 family and friends of men who had made a suicide attempt took part in focus groups. Participants recounted their experiences with men's suicide attempts and associated interventions, and suggested ways in which suicide prevention may be improved. Five tensions in perspectives emerged between men and their support networks, which complicated effective management of suicide risk: (i) respecting privacy vs monitoring risk, (ii) differentiating normal vs risky behaviour changes, (iii) familiarity vs anonymity in personal information disclosure, (iv) maintaining autonomy vs imposing constraints to limit risk, and (v) perceived need for vs failures of external support services. Tension between the different perspectives increased systemic stress, compounding problems and risk, thereby decreasing the effectiveness of detection of and interventions for men at risk of suicide. Suggested solutions included improving risk communication, reducing reliance on single source supports and increasing intervention flexibility in response to individual needs. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  15. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    PubMed

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-06-01

    Possible advantages and risks associated with ultrasound-guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound-guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound-guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables, respectively. Data from 1895 patients in 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound-guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound-guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001]. No difference was seen in time to cannulation [SMD (95% CI) -0.31 (-0.65, 0.04); p = 0.30] or mean number of attempts [MD (95% CI) -0.65 (-1.32, 0.02); p = 0.06] between the ultrasound-guided and palpation techniques. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success rate when compared to the digital palpation technique. However, the results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018 Elsevier Inc. All rights reserved.
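
    For readers unfamiliar with the pooling machinery behind such results, the sketch below implements a DerSimonian-Laird random-effects combination of odds ratios, a standard approach when heterogeneity is present; the study-level numbers are invented and are not the trials in this meta-analysis.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of odds ratios
# (study data invented for illustration).
import numpy as np

or_i = np.array([2.1, 3.4, 1.8, 2.9, 2.2])        # per-study odds ratios
ci_u = np.array([4.5, 8.0, 3.9, 6.5, 4.8])        # upper 95% CI bounds

y = np.log(or_i)                                   # log odds ratios
se = (np.log(ci_u) - y) / 1.96                     # SE recovered from CI
w = 1.0 / se**2                                    # fixed-effect weights

# Between-study heterogeneity: DL moment estimator
y_fe = (w * y).sum() / w.sum()
Q = (w * (y - y_fe) ** 2).sum()
df = len(y) - 1
tau2 = max(0.0, (Q - df) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
y_re = (w_re * y).sum() / w_re.sum()
se_re = 1.0 / np.sqrt(w_re.sum())

lo, hi = np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
print(f"pooled OR = {np.exp(y_re):.2f} (95% CI {lo:.2f}, {hi:.2f}); "
      f"tau^2 = {tau2:.3f}")
```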

  16. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  17. Microfluidics for Single-Cell Genetic Analysis

    PubMed Central

    Thompson, A. M.; Paguirigan, A. L.; Kreutz, J. E.; Radich, J. P.; Chiu, D. T.

    2014-01-01

    The ability to correlate single-cell genetic information to cellular phenotypes will provide the kind of detailed insight into human physiology and disease pathways that is not possible to infer from bulk cell analysis. Microfluidic technologies are attractive for single-cell manipulation due to precise handling and low risk of contamination. Additionally, microfluidic single-cell techniques can allow for high-throughput and detailed genetic analyses that increase accuracy and decrease reagent cost compared to bulk techniques. Incorporating these microfluidic platforms into research and clinical laboratory workflows can fill an unmet need in biology, delivering the highly accurate, highly informative data necessary to develop new therapies and monitor patient outcomes. In this perspective, we describe the current and potential future uses of microfluidics at all stages of single-cell genetic analysis, including cell enrichment and capture, single-cell compartmentalization and manipulation, and detection and analyses. PMID:24789374

  18. Polynuclear aromatic hydrocarbon analysis using the synchronous scanning luminoscope

    NASA Astrophysics Data System (ADS)

    Hyfantis, George J., Jr.; Teglas, Matthew S.; Wilbourn, Robert G.

    2001-02-01

    The Synchronous Scanning Luminoscope (SSL) is a field-portable, synchronous luminescence spectrofluorometer that was developed for on-site analysis of contaminated soil and ground water. The SSL is capable of quantitative analysis of total polynuclear aromatic hydrocarbons (PAHs) using phosphorescence and fluorescence techniques with a high correlation to laboratory data as illustrated by this study. The SSL is also capable of generating benzo(a)pyrene equivalency results, based on seven carcinogenic PAHs and Navy risk numbers, with a high correlation to laboratory data as illustrated by this study. These techniques allow rapid field assessments of total PAHs and benzo(a)pyrene equivalent concentrations. The Luminoscope is capable of detecting total PAHs to the parts per billion range. This paper describes standard field methods for using the SSL and describes the results of field/laboratory testing of PAHs. SSL results from two different hazardous waste sites are discussed.

  19. Early phase drug discovery: cheminformatics and computational techniques in identifying lead series.

    PubMed

    Duffy, Bryan C; Zhu, Lei; Decornez, Hélène; Kitchen, Douglas B

    2012-09-15

    Early drug discovery processes rely on hit finding procedures followed by extensive experimental confirmation in order to select high priority hit series which then undergo further scrutiny in hit-to-lead studies. The experimental cost and the risk associated with poor selection of lead series can be greatly reduced by the use of many different computational and cheminformatic techniques to sort and prioritize compounds. We describe the steps in typical hit identification and hit-to-lead programs and then describe how cheminformatic analysis assists this process. In particular, scaffold analysis, clustering and property calculations assist in the design of high-throughput screening libraries, the early analysis of hits and then organizing compounds into series for their progression from hits to leads. Additionally, these computational tools can be used in virtual screening to design hit-finding libraries and as procedures to help with early SAR exploration. Copyright © 2012 Elsevier Ltd. All rights reserved.
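
    One of the cheminformatic building blocks named above, clustering compounds by fingerprint similarity, can be sketched compactly. The leader-style clustering and random bit-vector fingerprints below are illustrative stand-ins; a real pipeline would use 2D structural fingerprints (e.g. from a toolkit such as RDKit) and a curated similarity threshold.

```python
# Minimal sketch: Tanimoto similarity between binary fingerprints and a
# naive single-pass (leader) clustering at a similarity threshold.
import numpy as np

rng = np.random.default_rng(9)
fps = rng.integers(0, 2, size=(50, 256)).astype(bool)   # 50 "compounds"

def tanimoto(a, b):
    """Tanimoto similarity of two binary fingerprints."""
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

# Leader clustering: each compound joins the first cluster whose leader
# is similar enough, otherwise it starts a new cluster.
threshold, leaders, clusters = 0.4, [], []
for i, fp in enumerate(fps):
    for k, lead in enumerate(leaders):
        if tanimoto(fp, fps[lead]) >= threshold:
            clusters[k].append(i)
            break
    else:
        leaders.append(i)
        clusters.append([i])

print(f"{len(clusters)} clusters from {len(fps)} compounds")
```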

  20. Radiation-induced second primary cancer risks from modern external beam radiotherapy for early prostate cancer: impact of stereotactic ablative radiotherapy (SABR), volumetric modulated arc therapy (VMAT) and flattening filter free (FFF) radiotherapy

    NASA Astrophysics Data System (ADS)

    Murray, Louise J.; Thompson, Christopher M.; Lilley, John; Cosgrove, Vivian; Franks, Kevin; Sebag-Montefiore, David; Henry, Ann M.

    2015-02-01

    Risks of radiation-induced second primary cancer following prostate radiotherapy using 3D-conformal radiotherapy (3D-CRT), intensity-modulated radiotherapy (IMRT), volumetric modulated arc therapy (VMAT), flattening filter free (FFF) and stereotactic ablative radiotherapy (SABR) were evaluated. Prostate plans were created using 10 MV 3D-CRT (78 Gy in 39 fractions) and 6 MV 5-field IMRT (78 Gy in 39 fractions), VMAT (78 Gy in 39 fractions, with standard flattened and energy-matched FFF beams) and SABR (42.7 Gy in 7 fractions with standard flattened and energy-matched FFF beams). Dose-volume histograms from pelvic planning CT scans of three prostate patients, each planned using all 6 techniques, were used to calculate organ equivalent doses (OED) and excess absolute risks (EAR) of second rectal and bladder cancers, and pelvic bone and soft tissue sarcomas, using mechanistic, bell-shaped and plateau models. For organs distant from the treatment field, chamber measurements recorded in an anthropomorphic phantom were used to calculate OEDs and EARs using a linear model. Ratios of OED give relative radiation-induced second cancer risks. SABR resulted in lower second cancer risks at all sites relative to 3D-CRT. FFF resulted in lower second cancer risks in out-of-field tissues relative to equivalent flattened techniques, with increasing impact in organs at greater distances from the field. For example, FFF reduced second cancer risk by up to 20% in the stomach and up to 56% in the brain, relative to the equivalent flattened technique. Relative to 10 MV 3D-CRT, 6 MV IMRT or VMAT with flattening filter increased second cancer risks in several out-of-field organs, by up to 26% and 55%, respectively. For all techniques, EARs were consistently low, and although the relative differences between techniques appeared large, the corresponding absolute differences were very small, highlighting the importance of considering absolute risks alongside relative risks: when absolute risks are very low, large relative risks become less meaningful. A relative radiation-induced second cancer risk benefit from SABR and FFF techniques was therefore theoretically predicted, although absolute radiation-induced second cancer risks were low for all techniques, and absolute differences between techniques were small.
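
    The organ equivalent dose calculation underlying these comparisons can be sketched compactly. Below, a dose-volume histogram is reduced to an OED under linear, bell-shaped and plateau risk-equivalent-dose models in the Schneider style; the DVH bins and the model parameters are assumed values for illustration only.

```python
# Minimal sketch: organ equivalent dose (OED) from a DVH under linear,
# bell-shaped and plateau dose-response models (parameters assumed).
import numpy as np

dose = np.array([2.0, 10.0, 25.0, 45.0, 70.0])    # Gy, DVH bin doses
vol  = np.array([0.40, 0.25, 0.15, 0.12, 0.08])   # fractional organ volume

alpha, delta = 0.06, 0.06                          # assumed model parameters

red_linear  = dose
red_bell    = dose * np.exp(-alpha * dose)         # turnover at high dose
red_plateau = (1.0 - np.exp(-delta * dose)) / delta

for name, red in [("linear", red_linear), ("bell", red_bell),
                  ("plateau", red_plateau)]:
    oed = (vol * red).sum()                        # volume-weighted RED
    print(f"OED ({name:7s}) = {oed:6.2f} Gy")

# The ratio of OEDs between two plans gives the relative second cancer
# risk used for the plan comparisons above.
```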

  1. Consumption of different types of meat and the risk of renal cancer: meta-analysis of case-control studies.

    PubMed

    Faramawi, Mohammed F; Johnson, Eric; Fry, M Whitney; Sall, Macodu; Zhou, Yi; Yi, Zhou

    2007-03-01

    Kidney cancers account for almost 2% of all cancers worldwide, with 150,000 new cases and 78,000 deaths from the disease occurring annually. An increase in the incidence of kidney neoplasms in western countries has been noticed in the past few years. Between 1988 and 1992, the incidence of renal cancer per 100,000 person-years among males in the USA, Norway, and France was 34.1, 9.00, and 16.10, respectively. Among females in the same countries, it was 5.70, 5.00, and 7.30, respectively. Although several individual case-control studies examined the association of meat intake and renal cancer risk, the results were inconsistent because of the insufficient statistical power of the individual studies. Therefore, the following meta-analysis was designed to help clarify the association. An electronic search of the MEDLINE, OVID, and PUBMED databases for articles published between 1966 and 2006 was conducted to select studies for this meta-analysis. Fixed- and random-effects meta-analytical techniques were used to estimate the overall association between meat consumption and kidney cancer. Thirteen case-control studies were found. This meta-analysis supported a positive relationship between meat consumption and risk of renal cancer. Summary results indicated that there was a 20% to 22% higher risk of renal cancer among those in the highest relative to the lowest category of poultry and processed meat consumption. Consumption of all meat and red meat was associated with 27% and 30% higher risk, respectively. The increased risks were statistically significant. Increased consumption of all meat, red meat, poultry, and processed meat is associated with an increased risk of kidney cancer. Reduction of meat consumption is an important approach to decreasing the incidence of kidney cancer in the general population.

  2. Endogenous fluorescence emission of the ovary

    NASA Astrophysics Data System (ADS)

    Utzinger, Urs; Kirkpatrick, Nathaniel D.; Drezek, Rebekah A.; Brewer, Molly A.

    2005-03-01

    Epithelial ovarian cancer has the highest mortality rate among the gynecologic cancers. Early detection would significantly improve survival and quality of life of women at increased risk of developing ovarian cancer. We have constructed a device to investigate endogenous signals of the ovarian tissue surface in the UV-C to visible range and describe our initial investigation of the use of optical spectroscopy to characterize the condition of the ovary. We have acquired data from more than 33 patients. A tabletop spectroscopy system was used to collect endogenous fluorescence with a fiberoptic probe that is compatible with endoscopic techniques. Samples were divided into the following groups: Normal-Low Risk (for developing ovarian cancer), Normal-High Risk, Benign, and Cancer. Rigorous statistical analysis was applied to the data using variance tests for direct intensity versus diagnostic group comparisons and principal component analysis (PCA) to study the variance of the whole data set. We conclude that the diagnostically most useful excitation wavelengths are located in the UV. Furthermore, our results indicate that UV-B and UV-C are most useful. A safety analysis indicates that UV-C imaging can be conducted at exposure levels below safety thresholds. We found that fluorescence excited in the UV-C and UV-B range increases from benign to normal to cancerous tissues. This is in contrast to the emission created with UV-A excitation, which decreased in the same order. We hypothesize that an increase in protein production and a decrease in the fluorescence contributions of the extracellular matrix could explain this behavior. Variance analysis also identified fluctuation of fluorescence at 320/380 nm, which is associated with collagen cross-link residues. Small differences were observed between the groups at high and normal risk for ovarian cancer. High risk samples deviated towards the cancer group and low risk samples towards the benign group.

  3. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  4. Prospect theory in the valuation of health.

    PubMed

    Moffett, Maurice L; Suarez-Almazor, Maria E

    2005-08-01

    Prospect theory is the prominent nonexpected utility theory in the estimation of health state preference scores for quality-adjusted life year calculation. Until recently, the theory was not considered to be developed to the point of implementation in economic analysis. This review focuses on the research and evidence that tests the implementation of prospect theory in health state valuation. The typical application of expected utility theory assumes that a decision maker has stable preferences under conditions of risk and uncertainty. Under prospect theory, preferences depend on whether the decision maker regards the outcome of a choice as a gain or a loss relative to a reference point. The conceptual preference for standard gamble utilities in the valuation of health states has led to the development of elicitation techniques. Empirical evidence using these techniques indicates that when individual preferences are elicited, a prospect theory consistent framework appears to be necessary for adequate representation of individual health utilities. The relevance of prospect theory to policy making and resource allocation remains to be established. Societal preferences may not embody the same attitudes towards risk as individual preferences, and may remain largely risk neutral.
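
    The value and probability-weighting functions at the heart of prospect theory are concrete enough to state in code. The sketch below uses the Tversky-Kahneman functional forms with their commonly cited parameter estimates; the health lottery at the end is an invented example.

```python
# Minimal sketch: Tversky-Kahneman prospect theory value and probability
# weighting functions with their commonly cited parameter estimates.
import numpy as np

ALPHA, BETA, LAM, GAMMA = 0.88, 0.88, 2.25, 0.61

def value(x):
    """Reference-dependent value: concave for gains, convex and steeper
    (loss aversion) for losses."""
    x = np.asarray(x, dtype=float)
    mag = np.abs(x) ** np.where(x >= 0, ALPHA, BETA)
    return np.where(x >= 0, mag, -LAM * mag)

def weight(p):
    """Inverse-S probability weighting: overweights small probabilities."""
    p = np.asarray(p, dtype=float)
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA) ** (1 / GAMMA)

# A health lottery framed as a loss: a 50% chance of losing 8 quality-
# adjusted days versus a sure loss of 4 days (equal expected value).
pt_gamble = weight(0.50) * value(-8.0)
pt_sure = value(-4.0)
print(f"gamble: {pt_gamble:.2f}, sure loss: {pt_sure:.2f}")
# The gamble is valued less negatively than the sure loss, reproducing
# the classic prediction of risk seeking over moderate-probability losses.
```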

  5. Failure Mode and Effects Analysis: views of hospital staff in the UK.

    PubMed

    Shebl, Nada; Franklin, Bryony; Barber, Nick; Burnett, Susan; Parand, Anam

    2012-01-01

    To explore health care professionals' experiences and perceptions of Failure Mode and Effects Analysis (FMEA), a team-based, prospective risk analysis technique. Semi-structured interviews were conducted with 21 operational leads (20 pharmacists, one nurse) in the medicines management teams of hospitals participating in a national quality improvement programme. Interviews were transcribed, coded and emergent themes identified using framework analysis. Themes identified included perceptions and experiences of participants with FMEA, validity and reliability issues, and FMEA's use in practice. FMEA was considered to be a structured but subjective process that helps health care professionals get together to identify high-risk areas of care. Both positive and negative opinions were expressed, with the majority of interviewees expressing positive views towards FMEA in relation to its structured nature and the use of a multidisciplinary team. Other participants criticised FMEA for being subjective and lacking validity. The factors most likely to restrict its widespread use were its time-consuming nature and its perceived lack of validity and reliability. FMEA is a subjective but systematic tool that helps identify high-risk areas, but its time-consuming nature, difficulty with the scores and perceived lack of validity and reliability may limit its widespread use.

  6. Analysis Methodologies and Ameliorative Techniques for Mitigation of the Risk in Churches with Drum Domes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zingone, Gaetano; Licata, Vincenzo; Calogero, Cucchiara

    2008-07-08

    The present work fits into the interesting theme of seismic prevention for protection of the monumental patrimony made up of churches with drum domes. Specifically, with respect to a church in the historic area of Catania, chosen as a monument exemplifying the typology examined, the seismic behavior is analyzed in the linear field using modern dynamic identification techniques. The dynamically identified computational model made it possible to identify the macro-element most at risk, the dome-drum system. With respect to this system, the behavior in the nonlinear field is analyzed through dynamic tests on large-scale models in the presence of various types of improving reinforcement. The results are used to appraise the ameliorative contribution afforded by each of them and to choose the most suitable type of reinforcement, optimizing the stiffness/ductility ratio of the system.

  7. Method of Evaluating the Life Cycle Cost of Small Earth Dams Considering the Risk of Heavy Rainfall and Selection Method of the Optimum Countermeasure

    NASA Astrophysics Data System (ADS)

    Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru

    In recent years, the increase in the number of heavy rainfall occurrences, such as unpredictable cloudbursts, has meant that the safety of the embankments of small earth dams needs to be improved. However, the severe financial condition of the government and local autonomous bodies necessitates that the cost of improving them be reduced. This study concerns the development of a method of evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of the downstream areas of small earth dams at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and hence the life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, use of the life cycle cost evaluation method will lead to the development of a technique for selecting the optimal improvement or countermeasures against heavy rainfall.
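
    A bare-bones version of such a life cycle cost comparison is sketched below: each countermeasure is scored as its upfront cost plus the discounted expected annual breach damage, and the option with the minimal LCC would be selected. The breach probabilities, damage figure, horizon and discount rate are invented placeholders, not values from the study.

```python
# Minimal sketch: life cycle cost of dam countermeasures as upfront cost
# plus discounted expected breach losses (all figures assumed).
import numpy as np

def life_cycle_cost(upfront, p_breach, damage, horizon=50, rate=0.02):
    """Upfront cost plus discounted expected breach damage per year."""
    years = np.arange(1, horizon + 1)
    expected_losses = p_breach * damage / (1.0 + rate) ** years
    return upfront + expected_losses.sum()

damage = 500e6                                   # downstream damage (yen)
options = {
    "do nothing":        (0.0,   1e-2),          # (cost, annual P(breach))
    "partial reinforce": (30e6,  2e-4),
    "full rebuild":      (120e6, 1e-5),
}

for name, (cost, p) in options.items():
    lcc = life_cycle_cost(cost, p, damage)
    print(f"{name:18s} LCC = {lcc / 1e6:8.1f} M yen")
```

    With these toy numbers the partial reinforcement minimises the LCC, illustrating how the method trades improvement cost against breach risk.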

  8. An analysis of spatial and socio-economic determinants of tuberculosis in Hermosillo, Mexico, 2000-2006.

    PubMed

    Alvarez-Hernández, G; Lara-Valencia, F; Reyes-Castro, P A; Rascón-Pacheco, R A

    2010-06-01

    The city of Hermosillo, in Northwest Mexico, has a higher incidence of tuberculosis (TB) than the national average. However, the intra-urban TB distribution, which could limit the effectiveness of preventive strategies and control, is unknown. Using geographic information systems (GIS) and spatial analysis, we characterized the geographical distribution of TB by basic geostatistical area (BGA), and compared it with a social deprivation index. Univariate and bivariate techniques were used to detect risk areas. Globally, TB in the city of Hermosillo is not spatially auto-correlated, but local clusters with high incidence and mortality rates were identified in the northwest, central-east and southwest sections of the city. BGAs with high social deprivation had an excess risk of TB. GIS and spatial analysis are useful tools to detect high TB risk areas in the city of Hermosillo. Such areas may be vulnerable due to low socio-economic status. The study of small geographical areas in urban settings similar to Hermosillo could indicate the best course of action to be taken for TB prevention and control.
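
    Spatial autocorrelation statements like the one above are typically backed by a statistic such as Moran's I. The sketch below computes a global Moran's I over synthetic areal incidence rates with a toy ring-contiguity weights matrix; a real analysis would use the actual BGA adjacency structure.

```python
# Minimal sketch: global Moran's I for areal incidence rates with a
# row-standardised contiguity matrix (synthetic data and neighbours).
import numpy as np

rng = np.random.default_rng(3)
n = 50
# Synthetic clustered pattern: a smooth trend plus noise
rates = 20 + 10 * np.sin(2 * np.pi * np.arange(n) / n) + rng.normal(0, 2, n)

# Toy neighbour structure: each unit touches its two index neighbours
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0
W /= W.sum(axis=1, keepdims=True)                # row standardisation

z = rates - rates.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
expected = -1.0 / (n - 1)                        # E[I] under no autocorrelation
print(f"Moran's I = {moran_i:.3f} (expected {expected:.3f} under randomness)")
```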

  9. Assessing risk factors for periodontitis using regression

    NASA Astrophysics Data System (ADS)

    Lobo Pereira, J. A.; Ferreira, Maria Cristina; Oliveira, Teresa

    2013-10-01

    Multivariate statistical analysis is indispensable for assessing the associations and interactions between different factors and the risk of periodontitis. Among others, regression analysis is a statistical technique widely used in healthcare to investigate and model the relationship between variables. In our work we study the impact of socio-demographic, medical and behavioral factors on periodontal health. Using linear and logistic regression models, we assess the relevance, as risk factors for periodontitis, of the following independent variables (IVs): Age, Gender, Diabetic Status, Education, Smoking Status and Plaque Index. A multiple linear regression model was built to evaluate the influence of the IVs on mean Attachment Loss (AL), yielding the regression coefficients together with the p-values of the corresponding significance tests. The case definition adopted in the logistic model was extensive destruction of periodontal tissues, defined as an Attachment Loss greater than or equal to 4 mm in at least 25% (AL≥4mm/≥25%) of the sites surveyed. The association measures include the Odds Ratios together with the corresponding 95% confidence intervals.
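
    A hedged sketch of the logistic component is shown below: case status under the AL≥4mm/≥25% definition is regressed on a few of the listed IVs and reported as odds ratios with 95% confidence intervals. The data are simulated placeholders, and the variable set is a subset chosen for brevity.

```python
# Minimal sketch: logistic regression of periodontitis case status on
# assumed risk factors, reported as odds ratios (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 400
df = pd.DataFrame({
    "age":      rng.integers(25, 75, n),
    "smoker":   rng.integers(0, 2, n),
    "diabetic": rng.integers(0, 2, n),
    "plaque":   rng.uniform(0.0, 3.0, n),        # plaque index
})
# Synthetic outcome standing in for the AL >= 4 mm in >= 25% of sites case
logit = (-6.0 + 0.05 * df["age"] + 0.8 * df["smoker"]
         + 0.6 * df["diabetic"] + 0.7 * df["plaque"])
df["case"] = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["age", "smoker", "diabetic", "plaque"]])
fit = sm.Logit(df["case"].astype(int), X).fit(disp=0)

or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.round(2))
```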

  10. Assessing Suicide Risk Among Callers to Crisis Hotlines: A Confirmatory Factor Analysis

    PubMed Central

    Witte, Tracy K.; Gould, Madelyn S.; Munfakh, Jimmie Lou Harris; Kleinman, Marjorie; Joiner, Thomas E.; Kalafat, John

    2012-01-01

    Our goal was to investigate the factor structure of a risk assessment tool utilized by suicide hotlines and to determine the predictive validity of the obtained factors in predicting subsequent suicidal behavior. 1,085 suicidal callers to crisis hotlines were divided into three sub-samples, which allowed us to conduct an independent Exploratory Factor Analysis (EFA), EFA in a Confirmatory Factor Analysis (EFA/CFA) framework, and CFA. Similar to previous factor analytic studies (Beck et al., 1997; Holden & DeLisle, 2005; Joiner, Rudd, & Rajab, 1997; Witte et al., 2006), we found consistent evidence for a two-factor solution, with one factor representing a more pernicious form of suicide risk (i.e., Resolved Plans and Preparations) and one factor representing more mild suicidal ideation (i.e., Suicidal Desire and Ideation). Using structural equation modeling techniques, we found preliminary evidence that the Resolved Plans and Preparations factor trended toward being more predictive of suicidal ideation than the Suicidal Desire and Ideation factor. This factor analytic study is the first longitudinal study of the obtained factors. PMID:20578186

  11. Microfluidic Devices for Forensic DNA Analysis: A Review.

    PubMed

    Bruijns, Brigitte; van Asten, Arian; Tiggelaar, Roald; Gardeniers, Han

    2016-08-05

    Microfluidic devices may offer various advantages for forensic DNA analysis, such as reduced risk of contamination, shorter analysis time and direct application at the crime scene. Microfluidic chip technology has already proven to be functional and effective within medical applications, such as for point-of-care use. In the forensic field, one may expect microfluidic technology to become particularly relevant for the analysis of biological traces containing human DNA. This would require a number of consecutive steps, including sample work up, DNA amplification and detection, as well as secure storage of the sample. This article provides an extensive overview of microfluidic devices for cell lysis, DNA extraction and purification, DNA amplification and detection and analysis techniques for DNA. Topics to be discussed are polymerase chain reaction (PCR) on-chip, digital PCR (dPCR), isothermal amplification on-chip, chip materials, integrated devices and commercially available techniques. A critical overview of the opportunities and challenges of the use of chips is discussed, and developments made in forensic DNA analysis over the past 10-20 years with microfluidic systems are described. Areas in which further research is needed are indicated in a future outlook.

  12. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy.

    PubMed

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-12-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
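
    The flavour of a fuzzy rule-based RPN can be conveyed with a small hand-rolled Mamdani-style system: the three classical inputs are fuzzified into low/medium/high sets, a handful of min-rules fire, and a weighted centroid gives a crisp ranking score. The membership shapes, rule base and output centroids below are illustrative inventions, not the model proposed in the paper.

```python
# Minimal sketch: a tiny fuzzy rule-based RPN. Occurrence (O), severity
# (S) and detectability (D) on 1-10 scales are fuzzified, a sparse toy
# rule base fires, and a weighted centroid gives the crisp score.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def fuzzify(x):
    return {"low": tri(x, 0, 1, 5), "med": tri(x, 2, 5, 8),
            "high": tri(x, 5, 10, 11)}

def fuzzy_rpn(o, s, d):
    O, S, D = fuzzify(o), fuzzify(s), fuzzify(d)
    out_centroid = {"low": 100.0, "med": 400.0, "high": 800.0}
    rules = [
        (min(O["high"], S["high"]), "high"),          # likely and severe
        (min(O["med"], S["high"], D["low"]), "high"), # severe, hard to catch
        (min(O["med"], S["med"]), "med"),
        (min(O["low"], S["low"]), "low"),
        (min(O["low"], S["med"], D["high"]), "low"),  # rare, easy to catch
    ]
    num = sum(w * out_centroid[c] for w, c in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den

# Rank a component failure against a human error (toy scores)
print("component failure RPN:", round(fuzzy_rpn(3, 8, 4), 1))
print("human error RPN:      ", round(fuzzy_rpn(6, 7, 7), 1))
```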

  13. Multielemental speciation analysis by advanced hyphenated technique - HPLC/ICP-MS: A review.

    PubMed

    Marcinkowska, Monika; Barałkiewicz, Danuta

    2016-12-01

    Speciation analysis has become an invaluable tool in human health risk assessment, environmental monitoring or food quality control. Another step is to develop reliable multielemental speciation methodologies, to reduce costs, waste and time needed for the analysis. Separation and detection of species of several elements in a single analytical run can be accomplished by high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-MS). Our review assembles articles concerning multielemental speciation determination of: As, Se, Cr, Sb, I, Br, Pb, Hg, V, Mo, Te, Tl, Cd and W in environmental, biological, food and clinical samples analyzed with HPLC/ICP-MS. It addresses the procedures in terms of following issues: sample collection and pretreatment, selection of optimal conditions for elements species separation by HPLC and determination using ICP-MS as well as metrological approach. The presented work is the first review article concerning multielemental speciation analysis by advanced hyphenated technique HPLC/ICP-MS. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. The Risk of Sexually Transmitted Infection and Its Influence on Condom Use among Pregnant Women in the Kintampo North Municipality of Ghana

    PubMed Central

    Afari-Asiedu, Samuel; Gyabaa-Febir, Lawrence; Adjei, Kwame Kesse; Mahama, Emmanuel; Tawiah-Agyemang, Charlotte; Newton, Sam K.; Asante, Kwaku Poku; Owusu-Agyei, Seth

    2017-01-01

    Sexually transmitted infection (STI) affects the reproductive health of both men and women worldwide. Condoms are an important part of the available preventive strategies for STI control. The lack of proper risk perception continues to impede women's ability to negotiate condom use with their partners. This paper is the outcome of a secondary analysis of data collected in a cross-sectional survey that explored the perception of risk of STI and its influence on condom use among 504 pregnant women attending antenatal clinics at two health facilities in the Kintampo North Municipality. In addition, three Focus Group Discussions were conducted among 22 pregnant women and analyzed using a thematic analysis technique. Multivariate logistic regression analysis was used to identify possible predictors of condom use and risk of STI. Respondents' mean age was 26.0 ± 5.9 years. 47% of respondents identified themselves as being at high risk of contracting STI, 50% of whom were married. High risk status (OR = 2.1, 95% CI: 1.1–4.4), ability to ask for condoms during sex (OR = 0.3, 95% CI: 0.1–0.73), and partner's approval of condom use (OR = 0.2, 95% CI: 0.01–0.05) were independent predictors of condom use. Condom use (OR 2.9 (1.5–5.7); p = 0.001) and marital status (engaged, OR 2.6 (1.5–4.5); p = 0.001) were independent predictors of risk of STI. Women who identified themselves as being at high risk of STI successfully negotiated condom use with their partners. This is however influenced by the partner's approval and the ability to convince the partner to use condoms. Self-assessment of STI risk by women and the cooperation of male partners remain critical. PMID:28246570

  15. Perception of Climate Risk among Rural Farmers in Vietnam: Consistency within Households and with the Empirical Record.

    PubMed

    Cullen, Alison C; Anderson, C Leigh

    2017-03-01

    Rural farmers in Vietnamese communes perceive climate risk and potential impacts on livelihood within a complex context that may influence individual and household decisions. In a primary survey of 1,145 residents of the Thach Ha district of Ha Tinh province, we gathered data regarding perception about stability in climate, potential risks to livelihood, demographic characteristics, orientation toward risk, and interest in expanding economic activity. Temporal analysis of meteorological and economic indicator data forms an empirical basis for comparison with human perception. We ask the basic question: Are rural farmers' perceptions of climate consistent with the historical record and reproducible within households? We find that respondents do perceive climate anomalies, with some anchoring on recent extreme events as revealed by climate observational data, and further that spouses disproportionately share perceptions relative to randomly simulated pairings. To put climate-related risk perception in a larger context, we examine patterns across a range of risks to livelihood faced by farmers (livestock disease, pests, markets, health), using dimension reduction techniques. We find that our respondents distinguish among potential causes of low economic productivity, with substantial emphasis on climate-related impacts. They do not express uniform concern across risks, but rather average patterns reveal common modes and distinguish climate concern. Still, among those expressing concern about climate-related risks to livelihood we do not find an association with expressed intention to pursue changes in economic activity as a risk management response. © 2016 Society for Risk Analysis.

  16. Influence of the National Trauma Data Bank on the study of trauma outcomes: is it time to set research best practices to further enhance its impact?

    PubMed

    Haider, Adil H; Saleem, Taimur; Leow, Jeffrey J; Villegas, Cassandra V; Kisat, Mehreen; Schneider, Eric B; Haut, Elliott R; Stevens, Kent A; Cornwell, Edward E; MacKenzie, Ellen J; Efron, David T

    2012-05-01

    Risk-adjusted analyses are critical in evaluating trauma outcomes. The National Trauma Data Bank (NTDB) is a statistically robust registry that allows such analyses; however, analytical techniques are not yet standardized. In this study, we examined peer-reviewed manuscripts published using NTDB data, with particular attention to characteristics strongly associated with trauma outcomes. Our objective was to determine if there are substantial variations in the methodology and quality of risk-adjusted analyses and therefore, whether development of best practices for risk-adjusted analyses is warranted. A database of all studies using NTDB data published through December 2010 was created by searching PubMed and Embase. Studies with multivariate risk-adjusted analyses were examined for their central question, main outcomes measures, analytical techniques, covariates in adjusted analyses, and handling of missing data. Of 286 NTDB publications, 122 performed a multivariable adjusted analysis. These studies focused on clinical outcomes (51 studies), public health policy or injury prevention (30), quality (16), disparities (15), trauma center designation (6), or scoring systems (4). Mortality was the main outcome in 98 of these studies. There were considerable differences in the covariates used for case adjustment. The 3 covariates most frequently controlled for were age (95%), Injury Severity Score (85%), and sex (78%). Up to 43% of studies did not control for the 5 basic covariates necessary to conduct a risk-adjusted analysis of trauma mortality. Less than 10% of studies used clustering to adjust for facility differences or imputation to handle missing data. There is significant variability in how risk-adjusted analyses using data from the NTDB are performed. Best practices are needed to further improve the quality of research from the NTDB. Copyright © 2012 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Mediation of late adolescent health-risk behaviors and gender influences.

    PubMed

    Christopherson, Toni Michelle; Conner, Bradley T

    2012-11-01

    This study explored how multiple bioecological constructs operate to explain health-risk behaviors in late adolescence and tested for moderator effects of gender. This was a descriptive, cross-sectional study with a convenience sample of 437 predominantly Caucasian late adolescents with an average age of 19 years who lived in Northern California. Parental Attachment, Shyness, Loneliness, Law Abidance, and Youth Risk Behaviors were measured with self-report tools and analyzed using structural equation modeling. Confirmatory factor analysis indicated that the data fit the model well. Analysis of group differences revealed that gender moderated the relationships among the measured variables; thus, data were analyzed in independent gender-based models. Structural modeling demonstrated good model fit for each gender. Shyness and parental attachment were each associated with loneliness. Loneliness was associated with smoking, and mediated the relationship between shyness, parental attachment, and smoking. Parental attachment was associated with law abidance. Law abidance was associated with sexual behaviors for female adolescents only. This study provides valuable insights for public health nurses as they pertain to late adolescent health-risk behaviors. Nurses should use screening tools and techniques to ensure appropriate referrals and interventions to meet the needs of at-risk adolescents. © 2012 Wiley Periodicals, Inc.

  18. Molecular profiling--a tool for addressing emerging gaps in the comparative risk assessment of GMOs.

    PubMed

    Heinemann, Jack A; Kurenbach, Brigitta; Quist, David

    2011-10-01

    Assessing the risks of genetically modified organisms (GMOs) is required by both international agreement and domestic legislation. Many view the use of the "omics" tools for profiling classes of molecules as useful in risk assessment, but no consensus has formed on the need or value of these techniques for assessing the risks of all GMOs. In this and many other cases, experts support case-by-case use of molecular profiling techniques for risk assessment. We review the latest research on the applicability and usefulness of molecular profiling techniques for GMO risk assessment. As more and more kinds of GMOs and traits are developed, broader use of molecular profiling in a risk assessment may be required to supplement the comparative approach to risk assessment. The literature-based discussions on the use of profiling appear to have settled on two findings: 1. profiling techniques are reliable and relevant, at least no less so than other techniques used in risk assessment; and 2. although not required routinely, regulators should be aware of when they are needed. The dismissal of routine molecular profiling may be confusing to regulators who then lack guidance on when molecular profiling might be worthwhile. Molecular profiling is an important way to increase confidence in risk assessments if the profiles are properly designed to address relevant risks and are applied at the correct stage of the assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

    Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.

  20. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks, such as coastal erosion or marine submersion, by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping draws on multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at the Monika and Sablette beaches. This approach relies on the efficiency of the Geographic Information System tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
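
    The criteria-weighting step of FAHP can be illustrated with Buckley's geometric mean method over triangular fuzzy numbers, one common way such weights are computed; the pairwise judgments and the three-criteria subset below are invented for the sketch and are not the paper's comparison matrix.

```python
# Minimal sketch: FAHP criteria weights via Buckley's geometric mean
# method with triangular fuzzy numbers (l, m, u); judgments invented.
import numpy as np

# Fuzzy pairwise comparison matrix for three coastal-risk criteria:
# [sea level rise, coastal erosion, elevation]
M = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1],   [1, 1, 1]],
])

# Fuzzy geometric mean of each row (componentwise over l, m, u)
g = M.prod(axis=1) ** (1.0 / M.shape[0])     # shape (3, 3): rows x (l,m,u)

# Fuzzy weights: g_i * (sum_j g_j)^(-1); inverting a TFN flips l and u
s = g.sum(axis=0)                            # (l, m, u) of the total
w_fuzzy = g * np.array([1 / s[2], 1 / s[1], 1 / s[0]])

# Defuzzify by the centroid and normalise to crisp weights
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print("criteria weights:", np.round(w, 3))
```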

  1. Skewed Riskscapes and Gentrified Inequities: Environmental Exposure Disparities in Seattle, Washington

    PubMed Central

    White, Jonah

    2011-01-01

    Objectives. Few studies have considered the sociohistorical intersection of environmental injustice and gentrification; a gap addressed by this case study of Seattle, Washington. This study explored the advantages of integrating air toxic risk screening with gentrification research to enhance proximity and health equity analysis methodologies. It was hypothesized that Seattle's industrial air toxic exposure risk was unevenly dispersed, that gentrification stratified the city's neighborhoods, and that the inequities of both converged. Methods. Spatial characterizations of air toxic pollution risk exposures from 1990 to 2007 were combined with longitudinal cluster analysis of census block groups in Seattle, Washington, from 1990 to 2000. Results. A cluster of air toxic exposure inequality and socioeconomic inequity converged in 1 area of south central Seattle. Minority and working class residents were more concentrated in the same neighborhoods near Seattle's worst industrial pollution risks. Conclusions. Not all pollution was distributed equally in a dynamic urban landscape. Using techniques to examine skewed riskscapes and socioeconomic urban geographies provided a foundation for future research on the connections among environmental health hazard sources, socially vulnerable neighborhoods, and health inequity. PMID:21836115

  2. Skewed riskscapes and gentrified inequities: environmental exposure disparities in Seattle, Washington.

    PubMed

    Abel, Troy D; White, Jonah

    2011-12-01

    Few studies have considered the sociohistorical intersection of environmental injustice and gentrification; a gap addressed by this case study of Seattle, Washington. This study explored the advantages of integrating air toxic risk screening with gentrification research to enhance proximity and health equity analysis methodologies. It was hypothesized that Seattle's industrial air toxic exposure risk was unevenly dispersed, that gentrification stratified the city's neighborhoods, and that the inequities of both converged. Spatial characterizations of air toxic pollution risk exposures from 1990 to 2007 were combined with longitudinal cluster analysis of census block groups in Seattle, Washington, from 1990 to 2000. A cluster of air toxic exposure inequality and socioeconomic inequity converged in 1 area of south central Seattle. Minority and working class residents were more concentrated in the same neighborhoods near Seattle's worst industrial pollution risks. Not all pollution was distributed equally in a dynamic urban landscape. Using techniques to examine skewed riskscapes and socioeconomic urban geographies provided a foundation for future research on the connections among environmental health hazard sources, socially vulnerable neighborhoods, and health inequity.

  3. [Comparison of 2 lacrimal punctal occlusion methods].

    PubMed

    Shalaby, O; Rivas, L; Rivas, A I; Oroza, M A; Murube, J

    2001-09-01

    To study and compare two methods for canalicular occlusion: cautery and punctal patch. The study included forty patients divided into two groups of 20 patients. The end point was 4 occluded puncta. The first group underwent deep cauterization resulting in occlusion of the full vertical aspect of the canaliculus. The second group underwent the punctal patch technique for canalicular occlusion. The differential parameters were the following: time of intervention, ease of use, risks, and precision. Postoperatively, discomfort, subjective and objective improvement in the ocular surface, as well as the long-term result of each technique, were analysed. Time of intervention was longer for punctal patch than for cautery. Both methods exhibited similar ease of use and improvement in the ocular surface. Precision was high with the punctal patch technique, showing complete and final occlusion with no punctum needing reopening, while the cautery technique presented a 20% rate of reopening. Postoperative discomfort and irritation were markedly evident with the punctal patch technique, while minimal with the cautery technique. Survival analysis after one year of follow-up showed a higher rate of advantages for the punctal patch technique over the cautery technique.

  4. A comparative study of charge transfer inefficiency value and trap parameter determination techniques making use of an irradiated ESA-Euclid prototype CCD

    NASA Astrophysics Data System (ADS)

    Prod'homme, Thibaut; Verhoeve, P.; Kohley, R.; Short, A.; Boudin, N.

    2014-07-01

    The science objectives of space missions using CCDs to carry out accurate astronomical measurements are put at risk by the radiation-induced increase in charge transfer inefficiency (CTI) that results from trapping sites in the CCD silicon lattice. A variety of techniques are used to obtain CTI values and derive trap parameters; however, their results often differ. To identify and understand these differences, we take advantage of an ongoing comprehensive characterisation of an irradiated Euclid prototype CCD, including the following techniques: X-ray, trap pumping, flat field extended pixel edge response, and first pixel response. We then present a comparative analysis of the results obtained.

  5. An information diffusion technique to assess integrated hazard risks.

    PubMed

    Huang, Chongfu; Huang, Yundong

    2018-02-01

    An integrated risk is a future scenario associated with an adverse incident caused by multiple hazards, and an integrated probability risk is the expected value of the disaster. Because assessing an integrated probability risk from a small sample is difficult, weighting methods and copulas are commonly employed to sidestep this obstacle. To address the problem directly, in this paper we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface, so that an integrated risk can be assessed directly from a small sample. A case of an integrated risk caused by flood and earthquake shows how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
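
    As a rough illustration of the core idea, the sketch below applies normal information diffusion to a small sample: each observation spreads one unit of information over a discretized universe through a Gaussian kernel, giving a usable probability estimate where a histogram would be too sparse. The sample values and the diffusion coefficient are assumptions for the example, not data or constants from the paper:

        import numpy as np

        sample = np.array([1.2, 2.5, 3.1, 4.8, 6.0])   # small sample of losses
        u = np.linspace(0.0, 8.0, 41)                  # discretized universe

        # Diffusion coefficient; 0.8146*(b - a) is a tabulated choice for
        # n = 5 in the information-diffusion literature (assumed here).
        b, a = sample.max(), sample.min()
        h = 0.8146 * (b - a)

        # Each observation diffuses one unit of information over u.
        q = np.exp(-((sample[:, None] - u[None, :]) ** 2) / (2.0 * h * h))
        q /= q.sum(axis=1, keepdims=True)
        p = q.sum(axis=0) / len(sample)                # estimated distribution
        print("estimate sums to", round(p.sum(), 3))   # -> 1.0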

  6. SU-F-T-243: Major Risks in Radiotherapy. A Review Based On Risk Analysis Literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López-Tarjuelo, J; Guasp-Tortajada, M; Iglesias-Montenegro, N

    Purpose: We present a literature review of risk analyses in radiotherapy to highlight the most frequently reported risks and to spread this valuable information so that professionals can be aware of these major threats before performing their own studies. Methods: We considered studies with at least an estimate of the probability of occurrence of an adverse event (O) and its associated severity (S). They cover external beam radiotherapy, brachytherapy, intraoperative radiotherapy, and stereotactic techniques. We selected only the works containing a detailed ranked series of elements or failure modes and focused on at most the first fully reported quartile. Afterward, we sorted the risk elements according to a regular radiotherapy procedure, so that the resulting groups were cited in several works and could be ranked in this way. Results: 29 references published between 2007 and February 2016 were studied. The publication trend has been generally rising. The most frequently employed analysis was failure mode and effects analysis (FMEA). Among the references, we selected 20 works listing 258 ranked risk elements. These were sorted into 31 groups appearing in at least two different works; 11 groups appeared in at least 5 references, and 5 groups in 7 or more papers. These last sets of risks were choosing another set of images or plan for planning or treating, errors related to contours, errors in patient positioning for treatment, human mistakes when programming treatments, and planning errors. Conclusion: There is a sufficient amount and variety of references for identifying which failure modes or elements should be addressed in a radiotherapy department before attempting a specific analysis. FMEA prevailed, but other studies such as "risk matrix" or "occurrence × severity" analyses can also guide professionals' efforts. Risks associated with human actions rank very high; therefore, such actions should be automated or at least peer-reviewed.
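
    A toy version of the "occurrence × severity" ranking that underlies most of the reviewed analyses; the failure modes echo the review's top risk groups, but the O and S scores are invented for illustration:

        # FMEA-style ranking on occurrence (O) and severity (S) alone,
        # i.e. the O x S criterion the review uses to compare studies.
        failure_modes = [
            ("wrong set of images or plan used",     4, 9),
            ("contouring error",                     5, 8),
            ("patient positioning error",            6, 7),
            ("treatment programming (human) error",  3, 9),
            ("planning error",                       4, 8),
        ]

        for name, o, s in sorted(failure_modes, key=lambda m: m[1] * m[2],
                                 reverse=True):
            print(f"O x S = {o * s:3d}  {name}")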

  7. Managing total corporate electricity/energy market risks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henney, A.; Keers, G.

    1998-10-01

    The banking industry has developed a tool kit of very useful value at risk techniques for hedging risk, but these techniques must be adapted to the special complexities of the electricity market. This paper starts with a short history of the use of value-at-risk (VAR) techniques in banking risk management and then examines the specific and, in many instances, complex risk management challenges faced by electric companies from the behavior of prices in electricity markets and from the character of generation and electric retailing risks. The third section describes the main methods for making VAR calculations along with an analysis of their suitability for analyzing the risks of electricity portfolios and the case for using profit at risk and downside risk as measures of risk. The final section draws the threads together and explains how to look at managing total corporate electricity market risk, which is a big step toward managing total corporate energy market risk.
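
    A minimal historical-simulation sketch of the VAR-style tail measures discussed here; the simulated profit-and-loss series, with occasional spike losses standing in for electricity price jumps, is an assumption of the example rather than market data:

        import numpy as np

        rng = np.random.default_rng(42)
        # Daily P&L: mostly normal, with rare spike losses mimicking the
        # jumpy behavior of power prices that plain bank VAR understates.
        pnl = rng.normal(0.0, 1.0, 1000) + rng.choice([0.0, -8.0], 1000,
                                                      p=[0.98, 0.02])

        var_95 = -np.percentile(pnl, 5)   # 95% value-at-risk
        tail_99 = -np.percentile(pnl, 1)  # deeper tail, in the spirit of
                                          # profit-at-risk / downside risk
        print(f"95% VaR = {var_95:.2f}, 99% tail loss = {tail_99:.2f}")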

  8. [The Léon [correction of Laurent] Guedj implant concept: simplification of the surgical phase in implantology].

    PubMed

    Fabie, L; Guedj, L; Pichaud, Ch; Fabie, M

    2002-11-01

    We present a new self-drilling self-tapping dental implant that simplifies the operative technique and optimizes osseointegration. The implant, the instrumentation, and the operative technique are described. An experimental study was conducted in a sheep with pathological and histomorphological analysis at three months. A clinical evaluation was also conducted in 18 patients who had 27 implants. The experimental study demonstrated good quality osseointegration, without bone necrosis. Three sectors were identified. Histomorphometric analysis demonstrated that mean bone contact reached 40% on cancellous bone and 65% on cortical bone. In the clinical series, one implant had to be removed due to a problem with gum healing. All the other implants were well tolerated. The advantage of this new technique is the use of the implant as the drilling instrument. Much time is saved. In addition, the bone-implant contact is better since the bone cavity is exactly adapted to the implant. The risk of bone lesion is reduced due to the smaller number of drillings.

  9. [Application of text mining approach to pre-education prior to clinical practice].

    PubMed

    Koinuma, Masayoshi; Koike, Katsuya; Nakamura, Hitoshi

    2008-06-01

    We developed a new survey analysis technique to understand students' actual aims, for effective pretraining prior to clinical practice. We asked third-year undergraduate students to write fixed-style complete and free sentences on "preparation of drug dispensing." We then converted their sentence data into text format and performed Japanese-language morphologic analysis on the data using language analysis software. Key words, created on the basis of the word-class information from the morphologic analysis, were classified into categories based on causes and characteristics. Using the KJ method, we further classified the characteristics into six categories covering concepts such as "knowledge," "skill and attitude," and "image." The results showed that students' awareness of "preparation of drug dispensing" tended to be approximately three-fold more frequent in "skill and attitude," "risk," etc. than in "knowledge." Regarding the characteristics in the "image" category, words like "hard," "challenging," "responsibility," and "life" occurred frequently. Correspondence analysis showed that the characteristics of the words "knowledge" and "skill and attitude" were independent. A cause-and-effect diagram demonstrated that the phrase "hanging tough" accounted for most of the various factors. We thus could understand students' actual feelings by applying text mining as a new survey analysis technique.

  10. A decision support framework for characterizing and managing dermal exposures to chemicals during Emergency Management and Operations.

    PubMed

    Dotson, G Scott; Hudson, Naomi L; Maier, Andrew

    2015-01-01

    Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management.

  11. A decision support framework for characterizing and managing dermal exposures to chemicals during Emergency Management and Operations

    PubMed Central

    Dotson, G. Scott; Hudson, Naomi L.; Maier, Andrew

    2016-01-01

    Emergency Management and Operations (EMO) personnel are in need of resources and tools to assist in understanding the health risks associated with dermal exposures during chemical incidents. This article reviews available resources and presents a conceptual framework for a decision support system (DSS) that assists in characterizing and managing risk during chemical emergencies involving dermal exposures. The framework merges principles of three decision-making techniques: 1) scenario planning, 2) risk analysis, and 3) multicriteria decision analysis (MCDA). This DSS facilitates dynamic decision making during each of the distinct life cycle phases of an emergency incident (ie, preparedness, response, or recovery) and identifies EMO needs. A checklist tool provides key questions intended to guide users through the complexities of conducting a dermal risk assessment. The questions define the scope of the framework for resource identification and application to support decision-making needs. The framework consists of three primary modules: 1) resource compilation, 2) prioritization, and 3) decision. The modules systematically identify, organize, and rank relevant information resources relating to the hazards of dermal exposures to chemicals and risk management strategies. Each module is subdivided into critical elements designed to further delineate the resources based on relevant incident phase and type of information. The DSS framework provides a much needed structure based on contemporary decision analysis principles for 1) documenting key questions for EMO problem formulation and 2) a method for systematically organizing, screening, and prioritizing information resources on dermal hazards, exposures, risk characterization, and management. PMID:26312660

  12. The assisted prediction modelling frame with hybridisation and ensemble for business risk forecasting and an implementation

    NASA Astrophysics Data System (ADS)

    Li, Hui; Hong, Lu-Yao; Zhou, Qing; Yu, Hai-Jie

    2015-08-01

    The business failure of numerous companies results in financial crises. The high social costs associated with such crises have led people to search for effective tools for business risk prediction, among which the support vector machine is very effective. Several modelling means, including single-technique modelling, hybrid modelling, and ensemble modelling, have been suggested for forecasting business risk with support vector machines. However, the existing literature seldom focuses on a general modelling frame for business risk prediction, and seldom investigates performance differences among the different modelling means. We reviewed research on forecasting business risk with support vector machines, proposed the general assisted prediction modelling frame with hybridisation and ensemble (APMF-WHAE), and finally investigated the use of principal components analysis, support vector machines, random sampling, and group decision under the general frame in forecasting business risk. Under the APMF-WHAE frame with support vector machine as the base predictive model, four specific predictive models were produced, namely, a pure support vector machine, a hybrid support vector machine involving principal components analysis, a support vector machine ensemble involving random sampling and group decision, and an ensemble of hybrid support vector machines using group decision to integrate various hybrid support vector machines built on variables produced from principal components analysis and samples from random sampling. The experimental results indicate that the hybrid support vector machine and the ensemble of hybrid support vector machines produced dominating performance over the pure support vector machine and the support vector machine ensemble.
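
    The ensemble-of-hybrids variant can be sketched in a few lines with scikit-learn: each member is a PCA+SVM pipeline trained on a bootstrap sample, and the group decision is a majority vote. The synthetic dataset and all hyperparameters are placeholders, not the paper's setup:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-in for a business-failure dataset.
        X, y = make_classification(n_samples=600, n_features=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        rng = np.random.default_rng(0)
        votes = []
        for _ in range(10):  # ensemble of hybrid PCA+SVM members
            idx = rng.choice(len(X_tr), size=len(X_tr), replace=True)  # random sampling
            member = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())
            member.fit(X_tr[idx], y_tr[idx])
            votes.append(member.predict(X_te))

        pred = (np.mean(votes, axis=0) >= 0.5).astype(int)  # group decision
        print("ensemble accuracy:", (pred == y_te).mean())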

  13. Stigma, social inequality, and HIV risk disclosure among Dominican male sex workers☆

    PubMed Central

    Padilla, Mark; Castellanos, Daniel; Guilamo-Ramos, Vincent; Reyes, Armando Matiz; Sánchez Marte, Leonardo E.; Soriano, Martha Arredondo

    2010-01-01

    Some quantitative behavioral studies in the USA have concluded that bisexually behaving Latino men are less likely than White men to disclose to their female partners that they have engaged in same-sex risk behavior and/or are HIV-positive, presumably exposing female partners to elevated risk for HIV infection. Nevertheless, very little theoretical or empirical research has been conducted to understand the social factors that promote or inhibit sexual risk disclosure among Latino men who have sex with men (MSM), and much of the existing literature has neglected to contextualize disclosure patterns within broader experiences of stigma and social inequality. This paper examines decisions about disclosure of sex work, same-sex behavior, and sexual risk for HIV among male sex workers in two cities in the Dominican Republic. Data derived from long-term ethnography and qualitative in-depth interviews with 72 male sex workers were used to analyze the relationships among experiences of stigma, social inequality, and patterns of sexual risk disclosure. Thematic analysis of interviews and ethnographic evidence revealed a wide range of stigma management techniques utilized by sex workers to minimize the effects of marginality due to their engagement in homosexuality and sex work. These techniques imposed severe constraints on men's sexual risk disclosure, and potentially elevated their own and their female partners' vulnerability to HIV infection. Based on the study's findings, we conclude that future studies of sexual risk disclosure among ethnic minority MSM should avoid analyzing disclosure as a decontextualized variable, and should seek to examine sexual risk communication as a dynamic social process constrained by hierarchical systems of power and inequality. PMID:18410986

  14. Evaluation of 3D Ground Penetrating Radar Efficiency for Abandoned Tailings Pond Internal Structure Analysis and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Cortada, Unai; Martínez, Julián; Hidalgo, Mª Carmen; Rey, Javier

    2017-04-01

    Abandoned tailings ponds constitute a severe environmental problem in old Pb mining districts due to their high contents of metallic and semi-metallic elements. In most cases, there is a lack of information about the construction procedures and the previous environmental situation, which hinders environmental risk evaluation. In such cases, Ground Penetrating Radar (GPR) can be an interesting technique for analyzing the internal structure of tailings ponds and detecting zones vulnerable to leaching processes; it could thereby support the environmental risk assessment of abandoned tailings ponds. In this study, a 3D GPR campaign was carried out with a 250 MHz antenna in order to evaluate the efficiency of this technique in both the analysis of internal structures and environmental risk assessment. Subsequently, 2D and 3D models were built to represent the results graphically. The studied tailings pond is located on the bank of the Guadiel river, a water course draining the mining district of Linares, Spain. The dam is 150 m long and 80 m wide. The 3D GPR survey was done in a selected area near the central part of the pond; the analyzed grid was 25x50 m, with a line spacing of 1 m. The study revealed that the contact between the tailings and the substratum is located at 2.5 m. No intermediate layer was found, which means that the tailings pond was raised on the fluvial terrace without any insulation system. Within the first meter of the pond, cross stratification was identified. The orientation of these laminations changed with depth, which means that the stockpiling was performed from the different sides of the tailings pond. Furthermore, the direction of these stratifications is slightly concentric toward the middle of the dam, which could be associated with a central drainage system. The internal zone of the tailings pond therefore appears to be the most vulnerable to leaching processes that could contaminate the groundwater. This technique gave detailed information on the internal structure over the first few meters despite the rapid attenuation of the GPR signal. In consequence, 3D GPR with a 250 MHz antenna appears to be effective for detecting cross stratification and the tailings-soil contact in dams less than 5 m thick.

  15. An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft

    NASA Technical Reports Server (NTRS)

    Olson, E. D.; Mavris, D. N.

    2000-01-01

    An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.

  16. Fetuin-A levels and risk of type 2 diabetes mellitus: a systematic review and meta-analysis.

    PubMed

    Guo, Vivian Yawei; Cao, Bing; Cai, Chunyan; Cheng, Kenneth King-Yip; Cheung, Bernard Man Yung

    2018-01-01

    Fetuin-A has been linked to insulin resistance and obesity. Its role in the pathogenesis of type 2 diabetes mellitus (T2DM) has also been discussed. We aimed to investigate the prospective association between fetuin-A and the risk of T2DM in a systematic review and meta-analysis. A systematic search of studies in MEDLINE, EMBASE, PubMed, and Web of Science using fetuin-A, diabetes, and various synonyms was conducted up to June 5, 2017. Relevant studies were extracted by two reviewers independently. Study quality was assessed using the Newcastle-Ottawa scale. Overall estimates were pooled using fixed-effect inverse-variance meta-analysis. Subgroup analyses by gender, study population, technique for assessing fetuin-A, diabetes ascertainment method, follow-up duration, and measure of association were conducted. Seven studies comprising a total of 11,497 individuals and 2176 cases of T2DM were included in the systematic review and meta-analysis. Overall, a one-SD increment in fetuin-A level was associated with a 23% greater risk of incident T2DM (RR: 1.23, 95% CI 1.16-1.31). No significant heterogeneity or publication bias was found. The association was relatively stable across subgroups; however, it was evident only in women, not in men. Higher circulating fetuin-A levels were associated with an increased risk of T2DM, but causality deserves further analysis.
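
    Fixed-effect inverse-variance pooling is mechanical enough to show in a few lines; the per-study relative risks and confidence intervals below are invented placeholders, not the seven studies pooled here:

        import numpy as np

        # Illustrative per-study RRs with 95% CIs (placeholder values).
        rr    = np.array([1.15, 1.30, 1.22, 1.18, 1.35, 1.10, 1.28])
        ci_lo = np.array([1.02, 1.10, 1.05, 0.98, 1.12, 0.95, 1.08])
        ci_hi = np.array([1.30, 1.54, 1.42, 1.42, 1.63, 1.27, 1.52])

        log_rr = np.log(rr)
        se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se**2                                    # inverse-variance weights

        pooled = np.sum(w * log_rr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print("pooled RR %.2f (95%% CI %.2f-%.2f)"
              % (np.exp(pooled), np.exp(lo), np.exp(hi)))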

  17. Perceived Instrumentality and Normativeness of Corporal Punishment Use among Black Mothers

    PubMed Central

    Taylor, Catherine A.; Hamvas, Lauren; Paris, Ruth

    2011-01-01

    Corporal punishment (CP) remains highly prevalent in the U.S. despite its association with increased risk for child aggression and physical abuse. Five focus groups were conducted with parents (n=18) from a community at particularly high risk for using CP (Black, low socioeconomic status, Southern) in order to investigate their perceptions about why CP use is so common. A systematic qualitative analysis was conducted using grounded theory techniques within an overall thematic analysis. Codes were collapsed and two broad themes emerged. CP was perceived to be: 1) instrumental in achieving parenting goals and 2) normative within participants' key social identity groups, including race/ethnicity, religion, and family of origin. Implications for the reduction of CP are discussed using a social ecological framework. PMID:22707816

  18. When Do Simpler Sexual Behavior Data Collection Techniques Suffice?: An Analysis of Consequent Uncertainty in HIV Acquisition Risk Estimates

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Benotsch, Eric G.; Mikytuck, John

    2007-01-01

    The "gold standard" for evaluating human immunodeficiency virus (HIV) prevention programs is a partner-by-partner sexual behavior assessment that elicits information about each sex partner and the activities engaged in with that partner. When collection of detailed partner-by-partner data is not feasible, aggregate data (e.g., total…

  19. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher resolution imagery, the ability to derive better elevation information. Combining these geospatial data to create land cover and usage maps helps inform catastrophe modelling systems. From 30 m Landsat imagery to 2.44 m QuickBird multispectral imagery, and 1 m radar data from TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly be joined by a twin satellite enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, flooding in the UK cost approximately £3bn and affected over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload local drainage infrastructure, causing widespread flooding of properties and infrastructure. To create a risk model able to simulate such an event requires much more accurate source data than can be provided from satellite or radar. Because these flood events cause considerable damage within relatively small, complex urban environments, new high resolution remote sensing techniques have to be applied to model them better. Detailed terrain data for England and Wales, plus cities in Scotland, have been produced by combining measurements from the latest digital airborne sensors, both optical and lidar, to form the input layer for surface water flood modelling, and a national flood map product has been created. The new product utilises sophisticated modelling techniques, perfected over many years, which harness graphical processing power. This product will prove particularly valuable for risk assessment decision support within insurance/reinsurance, property/environmental, utilities, risk management and government agencies. However, it is not just the ground elevation that determines the behaviour of surface water. By combining height information (surface and terrain) with high resolution aerial photography and colour infrared imagery, a high definition land cover mapping dataset (LandBase) is being produced, which provides a precise measure of sealed versus non-sealed surfaces. This will allow even more sophisticated modelling of flood scenarios. Thus, the value of airborne survey data can be demonstrated by flood risk analysis down to individual addresses in urban areas. For some risks, however, an even more detailed survey may be justified. To achieve this, Infoterra is testing new 360˚ mobile lidar technology. Collecting lidar data from a moving vehicle allows each street to be mapped in very high detail, capturing precise information about the location, size and shape of features such as kerbstones, gullies, road camber and building threshold levels quickly and accurately. These data can then be used to model the problem of overland flood risk at the scale of individual properties. Whilst at present it might be impractical to undertake such detailed modelling for all properties, these techniques can certainly be used to improve the flood risk analysis of key locations. This paper will demonstrate how these new high resolution remote sensing techniques can be combined to provide a new level of detail to aid urban flood modelling.

  20. Real-time assessment and neuromuscular training feedback techniques to prevent ACL injury in female athletes

    PubMed Central

    Myer, Gregory D.; Brent, Jensen L.; Ford, Kevin R.; Hewett, Timothy E.

    2011-01-01

    Lead Summary: Some athletes may be more susceptible to at-risk knee positions during sports activities, but the underlying causes are not clearly defined. This manuscript synthesizes in vivo, in vitro, and in silico (computer simulated) data to delineate likely risk factors for the mechanism(s) of non-contact ACL injuries. From these identified risk factors, we discuss newly developed real-time screening techniques that can be used in training sessions to identify modifiable risk factors. The techniques provided target and correct altered mechanics, which may reduce or eliminate risk factors and aid in the prevention of non-contact ACL injuries in high-risk athletes. PMID:21643474

  1. Surgical management of gynecomastia--a 10-year analysis.

    PubMed

    Handschin, A E; Bietry, D; Hüsler, R; Banic, A; Constantinescu, M

    2008-01-01

    Gynecomastia is defined as benign enlargement of the male breast. Most studies on the surgical treatment of gynecomastia report only small series and lack histopathology results. The aim of this study was to analyze the surgical approach to the treatment of gynecomastia and the related outcomes over a 10-year period. All patients undergoing surgical gynecomastia correction in our department between 1996 and 2006 were included for retrospective evaluation. The data were analyzed for etiology, stage of gynecomastia, surgical technique, complications, risk factors, and histological results. A total of 100 patients with 160 operations were included. Techniques included subcutaneous mastectomy alone or with additional hand-assisted liposuction, isolated liposuction, and formal breast reduction. Atypical histological findings were found in 3% of the patients (spindle-cell hemangioendothelioma, papilloma). The surgical revision rate among all patients was 7%. Body mass index and a resected specimen weight higher than 40 g were identified as significant risk factors for complications (p < 0.05). The treatment of gynecomastia requires an individualized approach. Caution must be taken in performing large resections, which are associated with increased complication rates. Histological tissue analysis should be routinely performed in all true gynecomastia corrections, because it may reveal atypical cellular pathology.

  2. A Survey of Injuries Affecting Pre-Professional Ballet Dancers.

    PubMed

    Caine, Dennis; Bergeron, Glen; Goodwin, Brett J; Thomas, Jessica; Caine, Caroline G; Steinfeld, Sam; Dyck, Kevin; André, Suzanne

    2016-01-01

    A cross-sectional design was employed retrospectively to evaluate injuries self-reported by 71 pre-professional ballet dancers over one season. Some of the descriptive findings of this survey were consistent with those of previous research and suggest particular demographic and injury trends in pre-professional ballet. These results include gender distribution, mean age and age range of participants, training hours, injury location, acute versus overuse injuries, as well as average number of physiotherapy treatments per dancer. Other results provide information that was heretofore unreported or inconsistent with previous investigations. These findings involved proportion of dancers injured, average number of injuries per dancer, overall injury incidence during an 8.5 month period, incidence rate by technique level, mean time loss per injury, proportion of recurrent injury, and activity practiced at time of injury. The results of univariate analyses revealed several significant findings, including a decrease in incidence rate of injury with increased months of experience in the pre-professional program, dancers having lower injury risk in rehearsal and performance than in class, and a reduced risk of injury for dancers at certain technique levels. However, only this latter finding remained significant in multivariate analysis. The results of this study underscore the importance of determining injury rates by gender, technique level, and activity setting in addition to overall injury rates. They also point to the necessity of looking at both overall and individual dancer-based injury risks.

  3. The Effect of Cooperative Gaming Techniques on Teacher Confidence toward At-Risk Students.

    ERIC Educational Resources Information Center

    Barrington, Kyle D.

    1995-01-01

    Three groups of secondary school teachers received different types of training with regard to teaching at-risk students. Teachers who learned techniques of experiential cooperative gaming for use with at-risk students demonstrated an increase in self-confidence in dealing with at-risk students, while teachers who received only information…

  4. Public Perception of Climate Risk: The Case of Greece

    NASA Astrophysics Data System (ADS)

    Voskaki, Asimina; Tsermenidis, Konstantinos

    2015-04-01

    Climate change is generally considered one of the greatest challenges our world is facing. In the case of Greece, climatic change seems to be associated with sea level rise, increases in temperature, variations in precipitation patterns, and extreme weather events. As a result of changing climate patterns, a series of consequences are expected in areas involving the built environment, infrastructure, health, and various sectors of the economy. Even though climate change is likely to affect Greece in terms of human welfare and economic growth, public perception and attitudes do not always identify it as the most important environmental concern, or rank it highly compared to various socio-economic issues. Considering that topics related to climate change involve a certain degree of uncertainty, public perception is important when designing adaptation strategies to manage or prevent risks from climate change impacts, and people's reactions to risks are therefore an issue of great importance in future policy planning and implementation. The key aim of this paper is to investigate and analyse public perception of climate change risk in Greece. Through a questionnaire survey, this research investigates people's understanding, specific knowledge, opinions, awareness, emotions, and behavior with regard to climate change risks, and their willingness to pay to minimize or prevent risk. In addition, it examines people's willingness to alter their current lifestyle and adapt to a changing climate. The information derived from the survey data concerns the topics and the perceived importance of the causes of climate change among certain groups of people; the analysis of the data focuses on the correlation between perceived risk and knowledge about the issues involved. Rather than applying a single technique extensively, we deploy a number of methodologies with which we can draw different aspects from the data: descriptive statistics, cluster analysis techniques, and logistic regression. Descriptive statistics yield some general conclusions from the data concerning sex, age, location, residential characteristics, level of education, and level of actual knowledge. Cluster analysis gives an intuition of how the subjects are grouped into certain profiles according to their attitude towards climate change and the associated risk. Logistic regression provides a probabilistic approach for interpreting the way the subjects respond to our questions in relation to their specific background. Based on the analysis results, this paper points out, among other things, the vulnerability of Greek society to climate risk and highlights factors that influence individual perception; in addition, it identifies drivers of behavior change that can facilitate efficient adaptation plans for future use. The results of this research could be used as a basis for understanding public responses to climate change risk and for facilitating communication between experts, policy makers, and communities.
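
    A compact sketch of the logistic regression step: modelling a binary response (say, willingness to pay) from demographic and knowledge predictors. The synthetic responses and variable names are assumptions for illustration, not the survey data:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import StandardScaler

        # Placeholder survey matrix: [age, education_years, knowledge_score].
        rng = np.random.default_rng(1)
        X = rng.normal(size=(300, 3))
        # Synthetic target: willingness to pay, driven mainly by knowledge.
        y = (0.8 * X[:, 2] + 0.3 * X[:, 1] + rng.normal(size=300) > 0).astype(int)

        model = LogisticRegression().fit(StandardScaler().fit_transform(X), y)
        # Standardized coefficients show which backgrounds shift the odds.
        print(dict(zip(["age", "education", "knowledge"],
                       model.coef_[0].round(2))))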

  5. Perineal injury associated with hands on/hands poised and directed/undirected pushing: A retrospective cross-sectional study of non-operative vaginal births, 2011-2016.

    PubMed

    Lee, Nigel; Firmin, Meaghan; Gao, Yu; Kildea, Sue

    2018-07-01

    Clinicians' hand position and advised pushing techniques may affect rates of perineal injury. To assess the association of four techniques used in the management of second stage with the risk of moderate and severe perineal injury. Retrospective cross-sectional study. A metropolitan maternity hospital and a private maternity hospital in Brisbane, Australia. Term women with singleton, cephalic presentations experiencing a non-operative vaginal birth from January 2011 to December 2016. The research sites' perinatal databases recorded data on clinicians' approach to instructing women during second stage and hand position at birth. Women matching the inclusion criteria were identified (n = 26,393) and then grouped based on combinations of hands-on, hands-poised, directed, and undirected pushing. Associations with perineal injury were estimated using odds ratios obtained by multivariate analysis. The primary outcomes were the risk of moderate and severe perineal injury. Significance was set at 0.001. In nulliparous women there was no difference in the risk of moderate or severe perineal injury between the techniques. In multiparous women, the use of a hands-on/directed approach was associated with a significant increase in the risk of moderate (AOR 1.18, 95% CI 1.10-1.27, p < 0.001) and severe perineal injury (AOR 1.50, 95% CI 1.20-1.88, p < 0.001) compared to hands-poised/undirected. A hands-poised/undirected approach could be utilised in strategies for the prevention of moderate and severe perineal injury. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Role of new endoscopic techniques in inflammatory bowel disease management: Has the change come?

    PubMed

    Goran, Loredana; Negreanu, Lucian; Negreanu, Ana Maria

    2017-06-28

    Despite significant therapeutic progress in recent years, inflammatory bowel disease (IBD), which includes Crohn's disease and ulcerative colitis, remains a challenge regarding its pathogenesis and long-term complications. New concepts have emerged in the management of this disease, such as the "treat-to-target" concept, in which mucosal healing plays a key role in the evolution of IBD, the risk of recurrence and the need for surgery. Endoscopy is essential for the assessment of mucosal inflammation and plays a pivotal role in the analysis of mucosal healing in patients with IBD. Endoscopy is also essential in the detection of dysplasia and in the identification of the risk of colon cancer. The current surveillance strategy for dysplasia in IBD patients indicates white-light endoscopy with non-targeted biopsies. The new chromoendoscopy techniques provide substantial benefits for both clinicians and patients. Narrow-band imaging (NBI) has similar rates of dysplastic lesion detection as white-light endoscopy, and it seems that NBI identifies more adenoma-like lesions. Because it is used instinctively by many endoscopists, the combination of these two techniques might improve the rate of dysplasia detection. Flexible spectral imaging color enhancement can help differentiate dysplastic and non-dysplastic lesions and can also predict the risk of recurrence, which allows us to modulate the treatment to gain better control of the disease. The combination of non-invasive serum and stool biomarkers with endoscopy will improve the monitoring and limit the evolution of IBD because it enables the use of a personalized approach to each patient based on that patient's history and risk factors.

  7. Single, double or multiple-injection techniques for non-ultrasound guided axillary brachial plexus block in adults undergoing surgery of the lower arm.

    PubMed

    Chin, Ki Jinn; Alakkad, Husni; Cubillos, Javier E

    2013-08-08

    Regional anaesthesia comprising axillary block of the brachial plexus is a common anaesthetic technique for distal upper limb surgery. This is an update of a review first published in 2006 and updated in 2011. To compare the relative effects (benefits and harms) of three injection techniques (single, double and multiple) of axillary block of the brachial plexus for distal upper extremity surgery. We considered these effects primarily in terms of anaesthetic effectiveness; the complication rate (neurological and vascular); and pain and discomfort caused by performance of the block. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE, EMBASE and reference lists of trials. We contacted trial authors. The date of the last search was March 2013 (updated from March 2011). We included randomized controlled trials that compared double with single-injection techniques, multiple with single-injection techniques, or multiple with double-injection techniques for axillary block in adults undergoing surgery of the distal upper limb. We excluded trials using ultrasound-guided techniques. Independent study selection, risk of bias assessment and data extraction were performed by at least two investigators. We undertook meta-analysis. The 21 included trials involved a total of 2148 participants who received regional anaesthesia for hand, wrist, forearm or elbow surgery. Risk of bias assessment indicated that trial design and conduct were generally adequate; the most common areas of weakness were in blinding and allocation concealment. Eight trials comparing double versus single injections showed a statistically significant decrease in primary anaesthesia failure (risk ratio (RR) 0.51, 95% confidence interval (CI) 0.30 to 0.85). Subgroup analysis by method of nerve location showed that the effect size was greater when neurostimulation was used rather than the transarterial technique. Eight trials comparing multiple with single injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.25, 95% CI 0.14 to 0.44) and of incomplete motor block (RR 0.61, 95% CI 0.39 to 0.96) in the multiple injection group. Eleven trials comparing multiple with double injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.28, 95% CI 0.20 to 0.40) and of incomplete motor block (RR 0.55, 95% CI 0.36 to 0.85) in the multiple injection group. Tourniquet pain was significantly reduced with multiple injections compared with double injections (RR 0.53, 95% CI 0.33 to 0.84). Otherwise there were no statistically significant differences between groups in any of the three comparisons on secondary analgesia failure, complications and patient discomfort. The time for block performance was significantly shorter for single and double injections compared with multiple injections. This review provides evidence that multiple-injection techniques using nerve stimulation for axillary plexus block produce more effective anaesthesia than either double or single-injection techniques. However, there was insufficient evidence for a significant difference in other outcomes, including safety.

  8. Duct-to-mucosa versus dunking techniques of pancreaticojejunostomy after pancreaticoduodenectomy: Do we need more trials? A systematic review and meta-analysis with trial sequential analysis.

    PubMed

    Kilambi, Ragini; Singh, Anand Narayan

    2018-03-25

    Pancreaticojejunostomy (PJ) is the most widely used reconstruction technique after pancreaticoduodenectomy. Despite several randomized trials, the ideal technique of pancreaticojejunostomy remains debatable. We planned a meta-analysis of randomized trials comparing the two most common techniques of PJ (duct-to-mucosa and dunking) to identify the best available evidence in the current literature. We searched the PubMed/Medline, Web of Science, Science Citation Index, Google Scholar and Cochrane Central Register of Controlled Trials electronic databases up to October 2017 for all English-language randomized trials comparing the two approaches. Statistical analysis was performed using Review Manager (RevMan) Version 5.3 (Copenhagen: The Nordic Cochrane Center, The Cochrane Collaboration, 2014), and results were expressed as odds ratios for dichotomous variables and mean differences for continuous variables. A p-value ≤ 0.05 was considered significant. Trial sequential analysis was performed using TSA version 0.9.5.5 (Copenhagen: The Copenhagen Trial Unit, Center for Clinical Intervention Research, 2016). A total of 8 trials were included, with a total of 1043 patients (duct-to-mucosa: 518; dunking: 525). There was no significant difference between the two groups in terms of the overall as well as the clinically relevant postoperative pancreatic fistula (POPF) rate. Similarly, both groups were comparable for the secondary outcomes. Trial sequential analysis revealed that the required information size had been crossed without a clinically significant difference for overall POPF; and though the required information size had not been reached for clinically relevant POPF (CR-POPF), the current data have already crossed the futility line for CR-POPF with a 10% risk difference, 80% power and 5% α error. This meta-analysis found no significant difference between the two techniques in terms of overall and CR-POPF rates. Further, the existing evidence is sufficient to conclude a lack of difference, and further trials are unlikely to change the outcome. (CRD42017074886). © 2018 Wiley Periodicals, Inc.

  9. [Knowledge of breast cancer risk factors as one of the conditions in undertaking prophylactic treatments among midwives].

    PubMed

    Iwanowicz-Palus, Grazyna J; Skurzak, Agnieszka

    2004-01-01

    The aim of the study was to estimate the knowledge of breast cancer risk factors among midwives at different education levels. A diagnostic survey using a questionnaire was conducted among 186 persons representing different education levels of the midwifery profession: licentiate students (37.63%), master's degree students (29.03%), and participants of a family nursing course (33.33%). The collected data were submitted to statistical analysis, and the chi-squared test was used to check the significance of the investigated features. The general knowledge of breast cancer risk factors among persons representing different education levels of the midwifery profession is satisfactory. In the investigated group, the stage of education correlated with the level of knowledge about risk factors connected with familial transmission, the influence of age, time of menopause, and breast self-examination (p < 0.05).

  10. Evaluating the risk of water distribution system failure: A shared frailty model

    NASA Astrophysics Data System (ADS)

    Clark, Robert M.; Thurnau, Robert C.

    2011-12-01

    Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on the application of a Cox proportional hazards (PH)/shared frailty model and applies it to evaluating the risk of failure in drinking water networks using data from the Laramie Water Utility (located in Laramie, Wyoming, USA). Using the risk model, a cost/benefit analysis incorporating the inspection value method (IVM) is used to assist in making improved repair, replacement, and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.
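
    A minimal sketch of fitting a Cox PH model to pipe survival data with the lifelines library; the records below are invented, and since lifelines has no shared-frailty term, clustered standard errors on a pipe identifier stand in roughly for the within-pipe correlation a frailty term would capture:

        import pandas as pd
        from lifelines import CoxPHFitter

        # Invented pipe-failure records: time to failure (years), failure
        # indicator, covariates, and a pipe ID grouping repeat failures.
        df = pd.DataFrame({
            "years":    [12.0, 8.5, 20.1, 5.2, 15.3, 9.8, 3.1, 17.6],
            "failed":   [1, 1, 0, 1, 0, 1, 1, 0],
            "diameter": [150, 100, 200, 100, 150, 100, 80, 200],
            "age":      [45, 60, 30, 70, 40, 55, 80, 25],
            "pipe_id":  [1, 1, 2, 3, 3, 4, 5, 6],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years", event_col="failed",
                cluster_col="pipe_id")   # robust SEs by pipe, not true frailty
        cph.print_summary()              # hazard ratios for diameter and age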

  11. Nuclear Radiation Fields on the Mars Surface: Risk Analysis for Long-term Living Environment

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke M.; Clowdsley, Martha S.; Qualls, Garry D.; Nealy, John E.

    2005-01-01

    Mars, our nearest planet outward from the sun, has been targeted for several decades as a prospective site for expanded human habitation. Background space radiation exposures on Mars are expected to be orders of magnitude higher than on Earth. Recent risk analysis procedures based on detailed dosimetric techniques applicable to sensitive human organs have been developed along with experimental data regarding cell mutation rates resulting from exposures to a broad range of particle types and energy spectra. In this context, simulated exposure and subsequent risk for humans in residence on Mars are examined. A conceptual habitat structure, CAD-modeled with duly considered inherent shielding properties, has been implemented. Body self-shielding is evaluated using NASA standard computerized male and female models. The background environment is taken to consist not only of exposure from incident cosmic ray ions and their secondaries, but also include the contribution from secondary neutron fields produced in the tenuous atmosphere and the underlying regolith.

  12. Neurologic complications after off-pump coronary artery bypass grafting with and without aortic manipulation: meta-analysis of 11,398 cases from 8 studies.

    PubMed

    Misfeld, Martin; Brereton, R John L; Sweetman, Elizabeth A; Doig, Gordon S

    2011-08-01

    Neurologic complications after coronary artery bypass grafting remain a concern. Off-pump coronary artery bypass grafting is a surgical strategy proposed to decrease this risk. Use of an off-pump anaortic technique, which leaves the ascending aorta untouched, may result in further reductions. This systematic review of all published evidence compares neurologic complications after anaortic off-pump coronary artery bypass grafting versus that with aortic manipulation. PubMed and Embase were searched up to August 2008. Experts were contacted, and reference lists of retrieved articles were hand searched. The search process was not limited to English-language sources. Observational studies comparing standard off-pump coronary artery bypass grafting technique with anaortic technique were eligible for inclusion if they reported neurologic complications (stroke and transient ischemic attack). Meta-analysis was conducted to assess differences between groups with regard to neurologic complications. Electronic search identified 1428 abstracts, which resulted in retrieval and detailed review of 331 full-text articles. Eight observational studies reported neurologic complications in 5619 anaortic off-pump coronary artery bypass grafting cases and 5779 cases with aortic manipulation. Postsurgical neurologic complications were significantly lower in anaortic off-pump coronary artery bypass grafting cases (odds ratio, 0.46; 95% confidence interval, 0.29-0.72; I(2) = 0.8%; P = .0008). Avoidance of aortic manipulation during off-pump coronary artery bypass grafting decreases neurologic complications relative to standard technique in which the ascending aorta is manipulated. In patients at high risk for stroke or transient ischemic attack, we recommend avoidance of aortic manipulation during off-pump coronary artery bypass grafting. Copyright © 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  13. Methodology for heritage conservation in Belgium based on multi-temporal interferometry

    NASA Astrophysics Data System (ADS)

    Bejarano-Urrego, L.; Verstrynge, E.; Shimoni, M.; Lopez, J.; Walstra, J.; Declercq, P.-Y.; Derauw, D.; Hayen, R.; Van Balen, K.

    2017-09-01

    Soil differential settlements that cause structural damage to heritage buildings lead to losses of cultural and economic value. Adequate damage assessment as well as protection and preservation of the built patrimony are priorities at national and local levels, and they require advanced integration and analysis of environmental, architectural, and historical parameters. The GEPATAR project (GEotechnical and Patrimonial Archives Toolbox for ARchitectural conservation in Belgium) aims to create an online interactive geo-information tool that allows the user to view and be informed about Belgian heritage buildings at risk due to differential soil settlements. Multi-temporal interferometry (MTI) techniques have proven to be powerful tools for analyzing earth surface deformation patterns through time series of Synthetic Aperture Radar (SAR) images. These techniques allow ground movements to be measured over wide areas at high precision and relatively low cost. In this project, Persistent Scatterer Synthetic Aperture Radar Interferometry (PS-InSAR) and Multidimensional Small Baseline Subsets (MSBAS) are used to measure and monitor the temporal evolution of surface deformations across Belgium. This information is integrated with the Belgian heritage data by means of an interactive toolbox in a GIS environment in order to identify the level of risk. At country scale, the toolbox includes ground deformation hazard maps, geological information, locations of patrimony buildings, and land use; at local scale, it includes settlement rates, photographic and historical surveys, and architectural and geotechnical information. Some case studies are investigated by means of on-site monitoring techniques and stability analysis to evaluate the applied approaches. This paper presents a description of the methodology being implemented in the project, together with the case study of Saint Vincent's church, which is located in a former colliery zone. For this building, damage is assessed by means of PS-InSAR.

  14. Human health risk assessment with spatial analysis: study of a population chronically exposed to arsenic through drinking water from Argentina.

    PubMed

    Navoni, J A; De Pietri, D; Olmos, V; Gimenez, C; Bovi Mitre, G; de Titto, E; Villaamil Lepori, E C

    2014-11-15

    Arsenic (As) is a ubiquitous element widely distributed in the environment. This metalloid has proven carcinogenic action in humans. The aim of this work was to assess the health risk related to As exposure through drinking water in an Argentinean population, applying spatial analytical techniques in addition to conventional approaches. The study involved 650 inhabitants of the Chaco and Santiago del Estero provinces. Arsenic in drinking water (Asw) and urine (UAs) was measured by hydride generation atomic absorption spectrophotometry. The average daily dose (ADD), hazard quotient (HQ), and carcinogenic risk (CR) were estimated, geo-referenced, and integrated with demographic data through a health composite index (HI) using geographic information system (GIS) analysis. Asw covered a wide range of concentrations, from non-detectable (ND) to 2000 μg/L. More than 90% of the population was exposed to As, with UAs levels above the intervention level of 100 μg/g creatinine. GIS analysis described an expected level of exposure lower than that observed, indicating possible additional source(s) of exposure to inorganic arsenic. In 68% of the locations the population had an HQ greater than 1, and the CR ranged between 5·10(-5) and 2.1·10(-2). An environmental exposure area defined through ADD geo-referencing provided a baseline scenario for space-time risk assessment. The time of residence, the demographic density, and the potential health outcomes considered helped characterize the health risk in the region. The geospatial analysis contributed to delimiting and analyzing changing risk tendencies in the region, broadening the scope of the results for decision-making. Copyright © 2014 Elsevier B.V. All rights reserved.
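
    The screening-level metrics named here (ADD, HQ, CR) follow standard risk-assessment arithmetic; the sketch below uses illustrative exposure values, not the study's data, with arsenic toxicity factors quoted from US EPA IRIS as an assumption:

        # Screening-level dose/risk arithmetic for arsenic in drinking water.
        C   = 200.0        # As concentration in water, ug/L (illustrative)
        IR  = 2.0          # water ingestion rate, L/day
        EF  = 350.0        # exposure frequency, days/year
        ED  = 30.0         # exposure duration, years
        BW  = 70.0         # body weight, kg
        AT_NC = ED * 365.0 # averaging time (non-cancer), days
        AT_C  = 70 * 365.0 # averaging time (cancer, lifetime), days

        RFD = 3.0e-4       # oral reference dose, mg/kg-day (EPA IRIS value)
        SF  = 1.5          # oral slope factor, (mg/kg-day)^-1 (EPA IRIS value)

        add_nc = (C / 1000.0) * IR * EF * ED / (BW * AT_NC)  # mg/kg-day
        add_c  = (C / 1000.0) * IR * EF * ED / (BW * AT_C)

        hq = add_nc / RFD  # hazard quotient: > 1 flags non-cancer concern
        cr = add_c * SF    # incremental lifetime cancer risk
        print(f"ADD = {add_nc:.2e} mg/kg-day, HQ = {hq:.1f}, CR = {cr:.1e}")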

  15. Environmental mediation: A method for protecting environmental sciences and scientists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigerstad, T.J.; Berdt Romilly, G. de; MacKeigan, P.

    1995-12-31

    The primary role of scientific analysis of environmental and human risks has been to support decisions arising from a regulatory decision-making model called "Command and Control" or "Decide and Defend". A project or a policy is proposed and permission for its implementation is sought. Permission-gaining sometimes requires a number of technical documents: Environmental Impact Statements, Public Health Risk Evaluations, policy analysis documents. Usually, little of this analysis is used to make any real decisions, a fact that has led to enormous frustration and an atmosphere of distrust of government, industry and consulting scientists. There have been a number of responses by governmental and industrial managers, some scientists, and even the legal system, to mitigate the frustration and distrust. One response has been to develop methods of packaging information using language which is considered more "understandable" to the public: Ecosystem Health, Social Risk Assessment, Economic Risk Management, Enviro-hazard Communication, Risk Focus Analysis, etc. A second is to develop more sophisticated persuasion techniques, a potential misuse of Risk Communication. A third is proposing to change the practice of science itself: e.g., "post-normal science" and "popular epidemiology". A fourth has been to challenge the definition of "expert" in legal proceedings. None of these approaches appears to address the underlying issue: lack of trust and credibility. Addressing this issue requires an understanding of the nature of environmental disputes and the development of an atmosphere of trust and credibility. The authors propose Environmental Mediation as a response to the dilemma faced by professional environmental scientists, engineers, and managers that protects the professionals and their disciplines.

  16. Gis-Based Multi-Criteria Decision Analysis for Forest Fire Risk Mapping

    NASA Astrophysics Data System (ADS)

    Akay, A. E.; Erdoğan, A.

    2017-11-01

    The forested areas along the coastal zone of the Mediterranean region in Turkey are classified as first-degree fire sensitive areas. Forest fires are major environmental disasters that affect the sustainability of forest ecosystems. Besides, forest fires result in important economic losses and even threaten human lives. Thus, it is critical to determine the forested areas with fire risk and thereby minimize the damage to forest resources by taking the necessary precautions in these areas. The risk of forest fire can be assessed based on various factors such as forest vegetation structure (tree species, crown closure, tree stage), topographic features (slope and aspect), and climatic parameters (temperature, wind). In this study, a GIS-based Multi-Criteria Decision Analysis (MCDA) method was used to generate a forest fire risk map. The study was implemented in the forested areas within the Yayla Forest Enterprise Chief at the Dursunbey Forest Enterprise Directorate, which is classified as a first-degree fire sensitive area. In the solution process, the "extAhp 2.0" plug-in, running the Analytic Hierarchy Process (AHP) method in ArcGIS 10.4.1, was used to categorize the study area into fire risk classes: extreme risk, high risk, moderate risk, and low risk. The results indicated that 23.81% of the area was of extreme risk, while 25.81% was of high risk. The results also indicated that the most effective criterion was tree species, followed by tree stage; aspect was the least effective criterion on forest fire risk. It was revealed that GIS techniques integrated with MCDA methods are effective tools to quickly estimate forest fire risk at low cost, as sketched below. The integration of these factors into GIS can be very useful to determine forested areas with high fire risk and also to plan forestry management after fire.
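
    A minimal sketch of the weighted-overlay step behind such a GIS-MCDA risk map, assuming hypothetical AHP weights consistent with the reported criterion ranking (tree species heaviest, aspect lightest) and toy criterion rasters:

      import numpy as np

      # Toy 4x4 rasters scored 1 (low) .. 5 (extreme) for each criterion.
      rng = np.random.default_rng(0)
      criteria = {name: rng.integers(1, 6, size=(4, 4))
                  for name in ("tree_species", "tree_stage", "slope", "aspect")}

      # Hypothetical AHP-derived weights (must sum to 1), ordered as in the study:
      # tree species most influential, aspect least influential.
      weights = {"tree_species": 0.45, "tree_stage": 0.30, "slope": 0.15, "aspect": 0.10}

      risk = sum(weights[k] * criteria[k].astype(float) for k in criteria)

      # Bin the continuous score into discrete risk classes (0=low .. 3=extreme).
      classes = np.digitize(risk, bins=[2.0, 3.0, 4.0])
      print(risk.round(2))
      print(classes)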

  17. Surgery for left ventricular aneurysm: early and late survival after simple linear repair and endoventricular patch plasty.

    PubMed

    Lundblad, Runar; Abdelnoor, Michel; Svennevig, Jan Ludvig

    2004-09-01

    Simple linear resection and endoventricular patch plasty are alternative techniques to repair postinfarction left ventricular aneurysm. The aim of the study was to compare these 2 methods with regard to early mortality and long-term survival. We retrospectively reviewed 159 patients undergoing operations between 1989 and 2003. The epidemiologic design was of an exposed (simple linear repair, n = 74) versus nonexposed (endoventricular patch plasty, n = 85) cohort with 2 endpoints: early mortality and long-term survival. The crude effect of aneurysm repair technique versus endpoint was estimated by odds ratio, rate ratio, or relative risk and their 95% confidence intervals. Stratification analysis using the Mantel-Haenszel method was done to quantify confounders and pinpoint effect modifiers. Adjustment for multiconfounders was performed using logistic regression and Cox regression analysis. Survival curves were analyzed with the Breslow test and the log-rank test. Early mortality was 8.2% for all patients: 13.5% after linear repair and 3.5% after endoventricular patch plasty. When adjusted for multiconfounders, the risk of early mortality was significantly higher after simple linear repair than after endoventricular patch plasty (odds ratio, 4.4; 95% confidence interval, 1.1-17.8). Mean follow-up was 5.8 ± 3.8 years (range, 0-14.0 years). Overall 5-year cumulative survival was 78%: 70.1% after linear repair and 91.4% after endoventricular patch plasty. The risk of total mortality was significantly higher after linear repair than after endoventricular patch plasty when controlled for multiconfounders (relative risk, 4.5; 95% confidence interval, 2.0-9.7). Linear repair dominated early in the series and patch plasty dominated later, giving a possible learning-curve bias in favor of patch plasty that could not be adjusted for in the regression analysis. Postinfarction left ventricular aneurysm can be repaired with satisfactory early and late results. Surgical risk was lower and long-term survival was higher after endoventricular patch plasty than after simple linear repair. Differences in outcome should be interpreted with care because of the retrospective study design and the chronology of the 2 repair methods.

  18. Multifactor valuation models of energy futures and options on futures

    NASA Astrophysics Data System (ADS)

    Bertus, Mark J.

    The intent of this dissertation is to investigate continuous-time pricing models for commodity derivative contracts that incorporate mean reversion. Pricing commodity futures and options on futures contracts leads to improved practical risk management techniques in markets where uncertainty is increasing. In the dissertation, closed-form futures-price solutions are developed for mean-reverting one-, two-, and three-factor Brownian motion models. These solutions are obtained through risk-neutral pricing methods that yield tractable expressions for futures prices, which are linear in the state variables, hence making them attractive for estimation. These functions, however, are expressed in terms of latent variables (i.e., spot prices, convenience yield), which complicates the estimation of the futures pricing equation. To address this complication, a discussion of dynamic factor analysis is given. This procedure estimates the latent variables using a Kalman filter, and illustrations show how the technique may be used for the analysis. In addition to the futures contracts, closed-form solutions for two option models are obtained. Solutions to the one- and two-factor models are tailored solutions of the Black-Scholes pricing model. Furthermore, since these contracts are written on the futures contracts, they too are influenced by the same underlying parameters of the state variables used to price the futures contracts. To conclude, the analysis finishes with an investigation of commodity futures options that incorporate random discrete jumps.
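
    A minimal sketch of the kind of one-factor mean-reverting (Schwartz-type) futures price such models yield in closed form: under the risk-neutral measure the log-spot follows an Ornstein-Uhlenbeck process, so ln F is linear in the state variable. All parameters below are illustrative, not estimates from the dissertation.

      from math import exp, log

      def futures_price_one_factor(spot, tau, kappa, alpha_star, sigma):
          """One-factor mean-reverting model: X = ln(spot) is OU under the
          risk-neutral measure, so F = E*[S_T] = exp(E*[X_T] + Var*[X_T]/2);
          note that ln F is linear in the state variable X."""
          x0 = log(spot)
          decay = exp(-kappa * tau)
          mean_xt = decay * x0 + (1.0 - decay) * alpha_star
          var_xt = sigma**2 * (1.0 - exp(-2.0 * kappa * tau)) / (2.0 * kappa)
          return exp(mean_xt + 0.5 * var_xt)

      for tau in (0.25, 1.0, 5.0):          # time to maturity in years
          f = futures_price_one_factor(spot=60.0, tau=tau, kappa=1.5,
                                       alpha_star=log(55.0), sigma=0.35)
          print(f"tau={tau:4.2f}y  F={f:7.3f}")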

  19. Selecting Strategies to Reduce High-Risk Unsafe Work Behaviors Using the Safety Behavior Sampling Technique and Bayesian Network Analysis.

    PubMed

    Ghasemi, Fakhradin; Kalatpour, Omid; Moghimbeigi, Abbas; Mohammadfam, Iraj

    2017-03-04

    High-risk unsafe behaviors (HRUBs) have been known as the main cause of occupational accidents. Considering the financial and societal costs of accidents and the limitations of available resources, there is an urgent need for managing unsafe behaviors at workplaces. The aim of the present study was to find strategies for decreasing the rate of HRUBs using an integrated approach combining the safety behavior sampling technique with Bayesian network analysis. In this cross-sectional study, the Bayesian network was constructed using a focus group approach. The required data were collected using safety behavior sampling, and the parameters of the network were estimated using the Expectation-Maximization algorithm. Sensitivity analysis and belief updating were used to determine which factors had the highest influence on unsafe behavior. Based on the BN analyses, safety training was the most important factor influencing employees' behavior at the workplace. High-quality safety training courses can reduce the rate of HRUBs by about 10%. Moreover, the rate of HRUBs increased with decreasing employee age. The rate of HRUBs was higher in the afternoon and on the last days of the week. Among the investigated variables, training was the most important factor affecting the safety behavior of employees. By holding high-quality safety training courses, companies would be able to reduce the rate of HRUBs significantly.
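
    A minimal sketch of the belief-updating step on a toy Bayesian network, using the pgmpy library (class names per recent pgmpy releases); the structure and probabilities below are illustrative assumptions, not the network elicited in the study.

      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      model = BayesianNetwork([("Training", "UnsafeBehavior"),
                               ("Shift", "UnsafeBehavior")])

      cpd_training = TabularCPD("Training", 2, [[0.6], [0.4]])   # 0=high, 1=low
      cpd_shift = TabularCPD("Shift", 2, [[0.5], [0.5]])         # 0=morning, 1=afternoon
      # P(UnsafeBehavior | Training, Shift); columns follow the evidence order.
      cpd_behavior = TabularCPD(
          "UnsafeBehavior", 2,
          [[0.90, 0.85, 0.75, 0.65],    # 0 = safe
           [0.10, 0.15, 0.25, 0.35]],   # 1 = high-risk unsafe
          evidence=["Training", "Shift"], evidence_card=[2, 2])

      model.add_cpds(cpd_training, cpd_shift, cpd_behavior)
      assert model.check_model()
      infer = VariableElimination(model)

      # Belief updating: how does training quality shift the unsafe rate?
      print(infer.query(["UnsafeBehavior"], evidence={"Training": 0}))
      print(infer.query(["UnsafeBehavior"], evidence={"Training": 1}))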

  20. The Analysis of Rush Orders Risk in Supply Chain: A Simulation Approach

    NASA Technical Reports Server (NTRS)

    Mahfouz, Amr; Arisha, Amr

    2011-01-01

    Satisfying customers by delivering demands at the agreed time, at competitive prices, and at a satisfactory quality level are crucial requirements for supply chain survival. The incidence of risks in a supply chain often causes sudden disruptions in its processes and consequently leads to customers losing their trust in a company's competence. Rush orders are considered to be one of the main types of supply chain risk due to their negative impact on overall performance. Using integrated definition modeling approaches (i.e., IDEF0 and IDEF3) and simulation modeling, a comprehensive integrated model has been developed to assess rush order risks and examine two risk mitigation strategies. Detailed function sequences and object flows were conceptually modeled to reflect the macro and micro levels of the studied supply chain. Discrete event simulation models were then developed to assess and investigate the mitigation strategies for rush order risks, with the objective of minimizing order cycle time and cost.
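
    A minimal discrete-event sketch of the rush-order effect with SimPy, standing in for the paper's IDEF-based model: rush orders jump the queue ahead of regular orders at a single capacity-constrained resource, and mean order cycle times are compared. All rates are illustrative assumptions.

      import random
      import simpy

      random.seed(1)
      cycle_times = {"regular": [], "rush": []}

      def order(env, kind, machine):
          arrive = env.now
          prio = 0 if kind == "rush" else 1            # lower number = served first
          with machine.request(priority=prio) as req:
              yield req                                # rush orders jump the queue
              yield env.timeout(random.expovariate(1 / 4.0))   # processing time
          cycle_times[kind].append(env.now - arrive)

      def arrivals(env, machine):
          while True:
              yield env.timeout(random.expovariate(1 / 5.0))   # inter-arrival time
              kind = "rush" if random.random() < 0.2 else "regular"
              env.process(order(env, kind, machine))

      env = simpy.Environment()
      machine = simpy.PriorityResource(env, capacity=1)
      env.process(arrivals(env, machine))
      env.run(until=10_000)

      for kind, times in cycle_times.items():
          print(f"{kind:8s} mean cycle time = {sum(times) / len(times):.2f}")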

  1. MEDIASSIST: medical assistance for intraoperative skill transfer in minimally invasive surgery using augmented reality

    NASA Astrophysics Data System (ADS)

    Sudra, Gunther; Speidel, Stefanie; Fritz, Dominik; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2007-03-01

    Minimally invasive surgery is a highly complex medical discipline with various risks for surgeon and patient, but it also has numerous advantages on the patient's side. The surgeon has to adopt special operating techniques and deal with difficulties like complex hand-eye coordination, a limited field of view and restricted mobility. To alleviate these problems, we propose to support the surgeon's spatial cognition by using augmented reality (AR) techniques to directly visualize virtual objects in the surgical site. In order to generate intelligent support, it is necessary to have an intraoperative assistance system that recognizes the surgical skills during the intervention and provides context-aware assistance to the surgeon using AR techniques. With MEDIASSIST we bundle our research activities in the field of intraoperative intelligent support and visualization. Our experimental setup consists of a stereo endoscope, an optical tracking system and a head-mounted display for 3D visualization. The framework will be used as a platform for the development and evaluation of our research in the field of skill recognition and context-aware assistance generation. This includes methods for surgical skill analysis, skill classification, context interpretation as well as assistive visualization and interaction techniques. In this paper we present the objectives of MEDIASSIST and first results in the fields of skill analysis, visualization and multi-modal interaction. In detail, we present markerless instrument tracking for surgical skill analysis as well as visualization techniques and recognition of interaction gestures in an AR environment.

  2. Laparoscopic Nissen (total) versus anterior 180° fundoplication for gastro-esophageal reflux disease: A meta-analysis and systematic review.

    PubMed

    Du, Xing; Wu, Ji-Min; Hu, Zhi-Wei; Wang, Feng; Wang, Zhong-Gao; Zhang, Chao; Yan, Chao; Chen, Mei-Ping

    2017-09-01

    Laparoscopic Nissen fundoplication (LNF) has been the gold standard for the surgical management of gastro-esophageal reflux disease (GERD). Laparoscopic anterior 180° fundoplication (180° LAF) is reported to reduce the incidence of postoperative complications while obtaining similar control of reflux. The present meta-analysis was conducted to compare the value of the 2 techniques. PubMed, Medline, Embase, Cochrane Library, Springerlink, and China National Knowledge Infrastructure Platform databases were searched for randomized controlled trials (RCTs) comparing LNF and 180° LAF. Data regarding the benefits and adverse results of the 2 techniques were extracted and compared using a meta-analysis. Six eligible RCTs comparing LNF (n = 266) and 180° LAF (n = 265) were identified. There were no significant differences between LNF and 180° LAF with regard to operating time, perioperative complications, length of hospital stay, patient satisfaction, willingness to undergo surgery again, quality of life, postoperative heartburn, proton pump inhibitor (PPI) use, postoperative DeMeester scores, postoperative lower esophageal sphincter (LES) pressure, postoperative gas bloating, inability to belch, diarrhea, or overall reoperation. LNF was associated with a higher prevalence of postoperative dysphagia compared with 180° LAF, while 180° LAF was followed by more reoperations for recurrent reflux symptoms. LNF and 180° LAF are equally effective in controlling reflux symptoms and achieve comparable patient satisfaction. 180° LAF can reduce the incidence of postoperative dysphagia, although this is offset by a higher risk of reoperation for recurrent symptoms. The risk of recurrent symptoms needs to be balanced against the risk of dysphagia when surgeons choose surgical procedures for each individual with GERD.

  3. Investigation of fault modes in permanent magnet synchronous machines for traction applications

    NASA Astrophysics Data System (ADS)

    Choi, Gilsu

    Over the past few decades, electric motor drives have been more widely adopted to power the transportation sector, reducing dependence on foreign oil and cutting carbon emissions. Permanent magnet synchronous machines (PMSMs) are popular in many aerospace and automotive applications that require high power density and high efficiency. However, the presence of magnets that cannot be turned off in the event of a fault has always been an issue that hinders the adoption of PMSMs in these demanding applications. This work investigates the design and analysis of PMSMs for automotive traction applications with particular emphasis on fault-mode operation caused by faults appearing at the terminals of the machine. New models and analytical techniques are introduced for evaluating the steady-state and dynamic response of PMSM drives to various fault conditions. Attention is focused on modeling the PMSM drive, including nonlinear magnetic behavior, under several different fault conditions; evaluating the risks of irreversible demagnetization caused by the large fault currents; and developing fault mitigation techniques in terms of both the fault currents and demagnetization risks. Of the major classes of machine terminal faults that can occur in PMSMs, short-circuit (SC) faults produce much more dangerous fault currents than open-circuit faults. The impact of different PMSM topologies and parameters on their responses to symmetrical and asymmetrical short-circuit (SSC & ASC) faults has been investigated, and a detailed investigation of both SSC and ASC faults is presented, including both closed-form and numerical analysis. The demagnetization characteristics caused by high fault-mode stator currents (i.e., armature reaction) for different types of PMSMs are investigated, with a thorough analysis and comparison of the relative demagnetization vulnerability of different types of PMSMs. This analysis includes design guidelines and recommendations for minimizing demagnetization risks while examining the corresponding trade-offs. Two PM machines have been tested to validate the predicted fault currents, braking torque, and demagnetization risks in PMSM drives. The generality and scalability of key results have also been demonstrated by analyzing several PM machines with a variety of stator, rotor, and winding configurations for various power ratings.
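
    For intuition on why short-circuit faults dominate, below is a sketch of the classic steady-state phasor solution for a symmetrical three-phase short circuit of a PMSM; the machine parameters are assumed for illustration, and the high-speed asymptote |i_d| → ψ_m/L_d is the demagnetizing current tied to demagnetization risk.

      import numpy as np

      def ssc_steady_state(omega_e, R, Ld, Lq, psi_m, pole_pairs):
          """Steady-state dq currents and braking torque under a symmetrical
          three-phase short circuit (classic steady-state solution)."""
          den = R**2 + omega_e**2 * Ld * Lq
          i_d = -omega_e**2 * Lq * psi_m / den
          i_q = -omega_e * R * psi_m / den
          torque = 1.5 * pole_pairs * (psi_m * i_q + (Ld - Lq) * i_d * i_q)
          return i_d, i_q, torque

      # Illustrative machine parameters (assumed, not from the dissertation).
      R, Ld, Lq, psi_m, pp = 0.05, 0.8e-3, 1.6e-3, 0.08, 4
      for rpm in (500, 2000, 8000):
          w_e = pp * rpm * 2 * np.pi / 60.0
          i_d, i_q, T = ssc_steady_state(w_e, R, Ld, Lq, psi_m, pp)
          print(f"{rpm:5d} rpm: id={i_d:7.1f} A  iq={i_q:6.1f} A  T={T:7.2f} Nm")
      # High-speed asymptote: |id| -> psi_m / Ld (the demagnetizing current).
      print("asymptote psi_m/Ld =", psi_m / Ld, "A")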

  4. Management of the second phase of labour: perineum protection techniques.

    PubMed

    Laganà, A S; Burgio, M A; Retto, G; Pizzo, A; Granese, R; Sturlese, E; Ciancimino, L; Chiofalo, B; Retto, A; Triolo, O

    2015-06-01

    Obstetric experience, alongside the scientific evidence in the literature, indicates several management techniques for the expulsive period of labour that minimize obstetric complications. Among the various methods that can be used for the protection of the perineum during the expulsive phase, some are performed prepartum (perineal massage), while most are used during childbirth. Among the second group, progressively increasing importance is assumed by the manual techniques to protect the perineum (the "hands-on" and "hands-off" approaches) and by episiotomy. These techniques, when used in accordance with the guidelines, may favour the reduction of adverse outcomes for both the mother and the newborn, both immediately after birth and over the longer term. The midwife should be aware of the evidence in the literature so that a critical analysis of the available techniques can be made and put into action during the expulsive phase in order to protect the mother and the foetus from any unfavourable outcomes. Currently, clinical evidence in the literature is directing obstetric and medical staff towards a careful analysis of the maternal-foetal parameters, in order to achieve a precise assessment of the risk factors for intrapartum and postpartum outcomes. Increasingly, there is a need for close collaboration between the midwife and medical staff to ensure proper personalized assistance based on the particular characteristics of the woman and the fetus.

  5. Risk Assessment of Alzheimer's Disease using the Information Diffusion Model from Structural Magnetic Resonance Imaging.

    PubMed

    Beheshti, Iman; Olya, Hossain G T; Demirel, Hasan

    2016-04-05

    Recently, automatic risk assessment methods have been a target for the detection of Alzheimer's disease (AD) risk. This study aims to develop an automatic computer-aided AD diagnosis technique for the risk assessment of AD using information diffusion theory. Information diffusion is a set-valued fuzzy mathematics method used for the risk assessment of natural phenomena characterized by fuzziness (uncertainty) and incomplete data. Data were obtained from voxel-based morphometry analysis of structural magnetic resonance imaging. The information diffusion model results revealed that the risk of AD increases with a reduction of the normalized gray matter ratio (risk probability > 0.5 at a normalized gray matter ratio < 40%). The information diffusion model results were evaluated by calculating the correlation with two traditional risk assessments of AD, the Mini-Mental State Examination and the Clinical Dementia Rating. The correlation results revealed that the information diffusion model findings were in line with the Mini-Mental State Examination and Clinical Dementia Rating results. Application of the information diffusion model contributes to the computerization of the risk assessment of AD, which has a practical implication for the early detection of AD.
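
    A minimal sketch of normal information diffusion (Huang's small-sample estimator), which underlies such models: each observation spreads a unit of information over a discrete grid through a Gaussian kernel. The sample values and the bandwidth coefficient are illustrative assumptions.

      import numpy as np

      def information_diffusion(sample, grid, h=None):
          """Normal information diffusion: each observation diffuses its unit
          of information over the grid via a Gaussian kernel, giving a smooth
          probability estimate from a small, incomplete sample."""
          sample = np.asarray(sample, dtype=float)
          a, b, n = sample.min(), sample.max(), len(sample)
          if h is None:
              h = 1.4208 * (b - a) / (n - 1)   # one published coefficient choice
          # f[i, j]: information that observation i diffuses to grid point j
          f = np.exp(-((sample[:, None] - grid[None, :]) ** 2) / (2 * h**2))
          f /= f.sum(axis=1, keepdims=True)    # each observation carries weight 1
          q = f.sum(axis=0)
          return q / q.sum()                   # normalized probability vector

      # Toy sample of normalized gray-matter ratios (%), illustrative only.
      sample = [35, 38, 41, 44, 46, 49, 52, 55]
      grid = np.linspace(30, 60, 31)
      p = information_diffusion(sample, grid)
      # P(ratio < 40%), the region the abstract links to elevated AD risk
      print("P(ratio < 40%) ~", p[grid < 40].sum().round(3))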

  6. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale one (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the computational cost of a RISMC analysis by decreasing the number of simulation runs; for this improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
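
    A minimal sketch of the surrogate idea, with a Gaussian-process regressor from scikit-learn standing in for a reduced order model; the "expensive simulation" here is just a placeholder analytic function.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def expensive_simulation(x):          # placeholder for a long-running code
          return np.sin(3 * x) + 0.5 * x

      X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)   # a few costly runs
      y_train = expensive_simulation(X_train).ravel()

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
      gp.fit(X_train, y_train)

      # The surrogate now answers in microseconds and reports its own uncertainty,
      # which can steer where additional simulation runs are most valuable.
      X_query = np.array([[0.37], [1.11], [1.93]])
      mean, std = gp.predict(X_query, return_std=True)
      for x, m, s in zip(X_query.ravel(), mean, std):
          print(f"x={x:.2f}  surrogate={m:+.3f} +/- {s:.3f}  "
                f"truth={expensive_simulation(x):+.3f}")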

  7. Poverty Risk Index as a New Methodology for Social Inequality Distribution Assessment

    NASA Astrophysics Data System (ADS)

    Swiader, Małgorzata; Szewrański, Szymon; Kazak, Jan

    2017-10-01

    The paper presents a new concept of poverty risk index measurement that reflects the dynamics of urban development over the years. Rapid urbanization can seriously outstrip the capacity of most cities, which may lead to insufficient services for their inhabitants. The consequence of this situation can be polarized, socially differentiated cities with high rates of urban poverty. The measurement and analysis of the urban poverty phenomenon require dedicated tools and techniques; data-based assessment can allow planners and public policy makers to develop more socially integrated cities. This paper presents an analysis of the urban poverty phenomenon in the city of Wrocław (Poland) during the period 2010-2012. The analysis was conducted for ten Social Assistance Terrain Units (SATU) delineated within the city area. Our primary study objective concerns the proposal and calculation of a poverty risk index based on diagnostic features that represent the most common causes of granting social benefits: the number of single households granted permanent benefits, the number of people in families granted permanent benefits, the number of people in families granted temporary benefits due to unemployment, the number of people in families granted temporary benefits due to disability, and the number of people in families granted meals for children. The calculation used the theory of the development pattern, Hellwig's economic development measure; a minimal sketch of this computation follows below. The analysis of the poverty risk index showed that the central and south-eastern parts of the city are generally characterized by the highest poverty risk. The obtained spatial distribution of inequalities corresponds to European and American patterns of poverty concentration in urban structures.
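
    A minimal sketch of Hellwig's development-pattern computation as it could be applied to such diagnostic features; the feature matrix is illustrative, not the Wrocław data.

      import numpy as np

      # Rows: SATUs; columns: benefit-count diagnostic features (toy values).
      X = np.array([[120, 340,  80, 40, 60],
                    [ 60, 150,  30, 20, 25],
                    [200, 500, 120, 70, 90],
                    [ 90, 260,  60, 35, 45]], dtype=float)

      Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize features

      # All features here are "destimulants" (more benefits = more poverty), so
      # the development pattern (ideal point) is the feature-wise minimum.
      pattern = Z.min(axis=0)
      d = np.linalg.norm(Z - pattern, axis=1)     # distance of each SATU to pattern
      d0 = d.mean() + 2 * d.std()                 # Hellwig's normalizing constant
      measure = 1 - d / d0                        # ~1 = closest to the ideal

      poverty_risk = 1 - measure                  # invert: higher = more poverty
      print(poverty_risk.round(3))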

  8. Poster - 30: Use of a Hazard-Risk Analysis for development of a new eye immobilization tool for treatment of choroidal melanoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prooijen, Monique van; Breen, Stephen

    Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the hazard risk analysis technique of the Department of Defense Standard Practice - System Safety. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. The HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis, resulting in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low-probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients as well as the lower costs for the hospital, we will implement this new tool.
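
    A minimal sketch of the severity-by-probability lookup at the core of such a hazard-risk analysis per the Department of Defense Standard Practice (MIL-STD-882 style); the category labels follow the standard, while the acceptance banding shown is an illustrative assumption.

      # Severity categories (1-4) and probability levels (A-E), per the standard.
      SEVERITY = {"catastrophic": 1, "critical": 2, "marginal": 3, "negligible": 4}
      PROBABILITY = {"frequent": "A", "probable": "B", "occasional": "C",
                     "remote": "D", "improbable": "E"}

      # Illustrative banding of the hazard risk index (assumed, not normative).
      HIGH = {"1A", "1B", "1C", "2A", "2B", "3A"}
      SERIOUS = {"1D", "2C", "3B", "4A"}
      MEDIUM = {"1E", "2D", "2E", "3C", "3D", "4B", "4C"}

      def hazard_risk_index(severity, probability):
          code = f"{SEVERITY[severity]}{PROBABILITY[probability]}"
          if code in HIGH:
              return code, "high - redesign required"
          if code in SERIOUS:
              return code, "serious - mitigate or accept at senior level"
          if code in MEDIUM:
              return code, "medium - accept with review"
          return code, "low - acceptable"

      # e.g. the LED arm detaching onto the patient during treatment
      print(hazard_risk_index("critical", "remote"))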

  9. Brain tumor classification using AFM in combination with data mining techniques.

    PubMed

    Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt

    2013-01-01

    Although the classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in technical histopathological procedures like staining protocols, and finally the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM) derived images made from histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. Combined with modern image analysis techniques, AFM may thus become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, which carry a lower risk of malignant transformation, can be identified unambiguously and would benefit from early adjuvant therapies.

  10. Pancreatic thickness as a predictive factor for postoperative pancreatic fistula after distal pancreatectomy using an endopath stapler.

    PubMed

    Okano, Keiichi; Oshima, Minoru; Kakinoki, Keitaro; Yamamoto, Naoki; Akamoto, Shintaro; Yachida, Shinichi; Hagiike, Masanobu; Kamada, Hideki; Masaki, Tsutomu; Suzuki, Yasuyuki

    2013-02-01

    No consistent risk factor has yet been established for the development of pancreatic fistula (PF) after distal pancreatectomy (DP) with a stapler. A total of 31 consecutive patients underwent DP with an endopath stapler between June 2006 and December 2010 using a slow parenchymal flattening technique. The risk factors for PF after DP with an endopath stapler were identified based on univariate and multivariate analyses. Clinical PF developed in 7 of 31 (22%) patients who underwent DP with a stapler. The pancreata were significantly thicker at the transection line in patients with PF (19.4 ± 1.47 mm) than in patients without PF (12.6 ± 0.79 mm; p = 0.0003). A 16-mm cut-off for pancreatic thickness was established based on the receiver operating characteristic (ROC) curve; the area under the ROC curve was 0.875 (p = 0.0215). Pancreatic thickness (p = 0.0006) and blood transfusion (p = 0.028) were associated with postoperative PF in the univariate analysis. Pancreatic thickness was the only significant independent factor (odds ratio 9.99; p = 0.036) in the multivariate analysis, with a specificity of 72% and a sensitivity of 85%. Pancreatic thickness is thus a significant independent risk factor for PF development after DP with an endopath stapler, and the stapler technique is considered an appropriate modality in patients with a pancreatic thickness of <16 mm.
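
    A minimal sketch of deriving such a cut-off from a ROC curve with scikit-learn and the Youden index; the toy thickness/fistula data are illustrative, not the study's measurements.

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      # thickness (mm) and whether a clinical pancreatic fistula developed
      thickness = np.array([10, 11, 12, 12, 13, 14, 15, 15, 16, 17, 18, 19, 20, 22])
      fistula = np.array([0,  0,  0,  0,  0,  0,  0,  1,  0,  1,  1,  1,  1,  1])

      fpr, tpr, thresholds = roc_curve(fistula, thickness)
      youden = tpr - fpr                       # J = sensitivity + specificity - 1
      best = np.argmax(youden)
      print("AUC:", roc_auc_score(fistula, thickness).round(3))
      print(f"cut-off ~ {thresholds[best]} mm  "
            f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")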

  11. A novel pretreatment method combining sealing technique with direct injection technique applied for improving biosafety.

    PubMed

    Wang, Xinyu; Gao, Jing-Lin; Du, Chaohui; An, Jing; Li, MengJiao; Ma, Haiyan; Zhang, Lina; Jiang, Ye

    2017-01-01

    There is growing concern about biosafety risks in clinical bioanalysis, and a safe, simple, effective sample preparation method is urgently needed. To improve the biosafety of clinical analysis, we used the antiviral drugs adefovir and tenofovir as model drugs and developed a safe pretreatment method combining a sealing technique with a direct injection technique. The inter- and intraday precision (RSD%) of the method was <4%, and the extraction recoveries ranged from 99.4 to 100.7%. Meanwhile, the results showed that standard solution could be used to prepare the calibration curve instead of spiked plasma, yielding more accurate results. Compared with traditional methods, the novel method not only improved the biosafety of the pretreatment significantly, but also achieved several advantages including higher precision, favorable sensitivity and satisfactory recovery. With these highly practical and desirable characteristics, the novel method may become a feasible platform in bioanalysis.

  12. Posthemispherectomy hydrocephalus: results of a comprehensive, multiinstitutional review.

    PubMed

    Lew, Sean M; Matthews, Anne E; Hartman, Adam L; Haranhalli, Neil

    2013-02-01

    Hemispherectomy surgery for medically intractable epilepsy is known to cause hydrocephalus in a subset of patients. Existing data regarding the incidence of, and risk factors for, developing posthemispherectomy hydrocephalus have been limited by the relatively small number of cases performed by any single center. Our goal was to better understand this phenomenon and to identify risk factors that may predispose patients to developing hydrocephalus after hemispherectomy surgery. Fifteen pediatric epilepsy centers participated in this study. A retrospective chart review was performed on all available patients who had hemispherectomy surgery. Data collected included surgical techniques, etiology of seizures, prior brain surgery, symptoms and signs of hydrocephalus, timing of shunt placement, and basic demographics. Data were collected from 736 patients who underwent hemispherectomy surgery between 1986 and 2011. Forty-six patients had preexisting shunted hydrocephalus and were excluded from analysis, yielding 690 patients for this study. One hundred sixty-two patients (23%) required hydrocephalus treatment. The timing of hydrocephalus ranged from the immediate postoperative period to 8.5 years after surgery, with 43 patients (27%) receiving shunts >90 days after surgery. Multivariate regression analysis revealed anatomic hemispherectomies (odds ratio [OR] 4.1, p < 0.0001) and previous brain surgery (OR 1.7, p = 0.04) as independent significant risk factors for developing hydrocephalus. There was a trend toward significance for the use of hemostatic agents (OR 2.2, p = 0.07) and the involvement of basal ganglia or thalamus in the resection (OR 2.2, p = 0.08) as risk factors. Hydrocephalus is a common sequela of hemispherectomy surgery. Surgical technique and prior brain surgery influence the occurrence of posthemispherectomy hydrocephalus. A significant portion of patients develop hydrocephalus on a delayed basis, indicating the need for long-term surveillance. Wiley Periodicals, Inc. © 2012 International League Against Epilepsy.

  13. Assessing the Relative Risk of Aerocapture Using Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Percy, Thomas K.; Bright, Ellanee; Torres, Abel O.

    2005-01-01

    A recent study performed for the Aerocapture Technology Area in the In-Space Propulsion Technology Projects Office at the Marshall Space Flight Center investigated the relative risk of various capture techniques for Mars missions. Aerocapture has been proposed as a possible capture technique for future Mars missions but has been perceived by many in the community as a higher risk option as compared to aerobraking and propulsive capture. By performing a probabilistic risk assessment on aerocapture, aerobraking and propulsive capture, a comparison was made to uncover the projected relative risks of these three maneuvers. For mission planners, this knowledge will allow them to decide if the mass savings provided by aerocapture warrant any incremental risk exposure. The study focuses on a Mars Sample Return mission currently under investigation at the Jet Propulsion Laboratory (JPL). In each case (propulsive, aerobraking and aerocapture), the Earth return vehicle is inserted into Martian orbit by one of the three techniques being investigated. A baseline spacecraft was established through initial sizing exercises performed by JPL's Team X. While Team X design results provided the baseline and common thread between the spacecraft, in each case the Team X results were supplemented by historical data as needed. Propulsion, thermal protection, guidance, navigation and control, software, solar arrays, navigation and targeting and atmospheric prediction were investigated. A qualitative assessment of human reliability was also included. Results show that different risk drivers contribute significantly to each capture technique. For aerocapture, the significant drivers include propulsion system failures and atmospheric prediction errors. Software and guidance hardware contribute the most to aerobraking risk. Propulsive capture risk is mainly driven by anomalous solar array degradation and propulsion system failures. While each subsystem contributes differently to the risk of each technique, results show that there exists little relative difference in the reliability of these capture techniques although uncertainty for the aerocapture estimates remains high given the lack of in-space demonstration.

  14. The Negative Impact of Early Peritonitis on Continuous Ambulatory Peritoneal Dialysis Patients

    PubMed Central

    Hsieh, Yao-Peng; Wang, Shu-Chuan; Chang, Chia-Chu; Wen, Yao-Ko; Chiu, Ping-Fang; Yang, Yu

    2014-01-01

    ♦ Background: Peritonitis rate has been reported to be associated with technique failure and overall mortality in the previous literature. However, information on the impact of the timing of the first peritonitis episode on continuous ambulatory peritoneal dialysis (CAPD) patients is sparse. The aim of this research is to study the influence of time to first peritonitis on clinical outcomes, including technique failure, patient mortality and dropout from peritoneal dialysis (PD). ♦ Methods: A retrospective observational cohort study was conducted over 10 years at a single PD unit in Taiwan. A total of 124 patients on CAPD with at least one peritonitis episode comprised the study subjects, who were dichotomized by the median of time to first peritonitis into either early peritonitis patients or late peritonitis patients. A Cox proportional hazards model was used to analyze the correlation of the timing of first peritonitis with clinical outcomes. ♦ Results: Early peritonitis patients were older, more often diabetic and had lower serum levels of creatinine than the late peritonitis patients. Early peritonitis patients were associated with worse technique survival, patient survival and stay on PD than late peritonitis patients, as indicated by Kaplan-Meier analysis (log-rank test, p = 0.04, p < 0.001, p < 0.001, respectively). In the multivariate Cox regression model, early peritonitis was still a significant predictor of technique failure (hazard ratio (HR), 0.54; 95% confidence interval (CI), 0.30 - 0.98), patient mortality (HR, 0.34; 95% CI, 0.13 - 0.92) and dropout from PD (HR, 0.50; 95% CI, 0.30 - 0.82). In continuous analyses, a 1-month increase in the time to the first peritonitis episode was associated with a 2% decreased risk of technique failure (HR, 0.98; 95% CI, 0.97 - 0.99), a 3% decreased risk of patient mortality (HR, 0.97; 95% CI, 0.95 - 0.99), and a 2% decreased risk of dropout from PD (HR, 0.98; 95% CI, 0.97 - 0.99). Peritonitis rate was inversely correlated with time to first peritonitis according to the Spearman analysis (r = -0.64, p < 0.001). ♦ Conclusions: Time to first peritonitis is significantly correlated with the clinical outcomes of peritonitis patients, with early peritonitis patients having a poorer prognosis. Patients with a shorter time to first peritonitis were prone to having a higher peritonitis rate. PMID:24497590
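
    A minimal sketch of the Cox proportional hazards step with the lifelines library; the columns and values form an illustrative toy cohort, not the study data, and the small penalizer only stabilizes the fit on such tiny data.

      import pandas as pd
      from lifelines import CoxPHFitter

      df = pd.DataFrame({
          "months_on_pd":       [14, 48, 9, 60, 22, 75, 18, 40, 6, 55, 30, 66],
          "technique_failure":  [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
          "time_to_first_peri": [3, 24, 2, 14, 6, 30, 20, 18, 1, 28, 8, 12],
          "age":                [68, 45, 72, 50, 63, 41, 55, 58, 75, 48, 66, 52],
      })

      cph = CoxPHFitter(penalizer=0.1)
      cph.fit(df, duration_col="months_on_pd", event_col="technique_failure")
      cph.print_summary()   # exp(coef) < 1 for time_to_first_peri would mirror
                            # the reported protective effect of a later episode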

  15. Combined predictive value of the expanded donor criteria for long-term graft survival of kidneys from donors after cardiac death: A single-center experience over three decades.

    PubMed

    Kusaka, Mamoru; Kubota, Yusuke; Sasaki, Hitomi; Fukami, Naohiko; Fujita, Tamio; Hirose, Yuichi; Takahashi, Hiroshi; Kenmochi, Takashi; Shiroki, Ryoichi; Hoshinaga, Kiyotaka

    2016-04-01

    Kidneys procured from deceased donors hold great potential for expanding the donor pool. The aims of the present study were to investigate the post-transplant outcomes of renal allografts recovered from donors after cardiac death, to identify risk factors affecting the renal prognosis, and to compare the long-term survival of grafts from donors after cardiac death according to the number of expanded criteria donor risk factors. A total of 443 grafts recovered using an in situ regional cooling technique from 1983 to 2011 were assessed. To assess the combined predictive value of the significant expanded criteria donor risk criteria, the patients were divided into three groups: those with no expanded criteria donor risk factors (no risk), one expanded criteria donor risk factor (single-risk) and two or more expanded criteria donor risk factors (multiple-risk). Among the donor factors, age ≥50 years, hypertension, maximum serum creatinine level ≥1.5 mg/dL and a warm ischemia time ≥30 min were identified as independent predictors of long-term graft failure on multivariate analysis. Regarding the expanded criteria donor criteria for marginal donors, cerebrovascular disease, hypertension and maximum serum creatinine level ≥1.5 mg/dL were identified as significant predictors on univariate analysis. The single- and multiple-risk groups showed 2.01- and 2.40-fold higher risks of graft loss, respectively. Renal grafts recovered from donors after cardiac death show good renal function with excellent long-term graft survival; however, an increasing number of expanded criteria donor risk factors increases the risk of graft loss. © 2016 The Japanese Urological Association.

  16. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report presents new methods based on synergetic, physical, and computational models and concentrates on four approaches. The first is the synergetic approach: the synergetic approach to problems of the self-controlled synthesis of structures and the creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties, and synergetics methods and mathematical design are considered in relation to current problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties; this technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS and a predicted level of ionizing radiation and space weather. The focus here is mainly on the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment; the probabilistic risk assessment method for the TPS is presented considering both deterministic and stochastic factors. The fourth approach concerns the results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum; equipment that equalizes the temperature fields in a product to form welded joints of equal strength is considered. Many tasks in computational materials science can be posed as optimization problems, including the generation of realizations of materials with specified but limited microstructural information: an intriguing inverse problem of both fundamental and practical importance. Computational models based on the theories of molecular dynamics or quantum mechanics would enable the prediction and modification of fundamental materials properties, and this problem is solved using deterministic and stochastic optimization techniques. The main optimization approaches in the frame of the project are discussed, together with an optimization approach to alloys for obtaining materials with required properties using modeling techniques and experimental data. This report is supported by the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)".

  17. Assessment of Equipment for the Determination of Nutrients in Marine Waters: A Case Study of the Microplate Technique

    NASA Astrophysics Data System (ADS)

    Aminot, A.

    1996-09-01

    An essential prerequisite for quality assurance of the colorimetric determination of nutrients in seawater is the use of suitable photometric equipment. Based on a knowledge of the optical characteristics of a particular system and the absorption coefficient of the analyte, a statistical approach can be used to predict the limit of detection and the limit of quantitation for a given determinand. The microplate technique, widely used for bioassays, is applicable to colorimetric analysis in general, and its use for the determination of nutrients in seawater has been suggested. This paper reports a theoretical assessment of its capabilities in this context and a practical check on its performance, taking the determination of nitrite in seawater as typical. The conclusion is that short optical path length and insufficient repeatability of the absorbance measurement render it unsuitable for the determination of the low concentrations generally encountered in marine work, with the possible exception of nitrate. The perceived advantage of high-speed analysis is a secondary consideration in the overall process of determining nutrients, and the microplate technique's small scale of operation is a definite disadvantage as this increases the risk of exposure to contamination problems, in comparison with conventional techniques.
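
    A minimal sketch of the statistical argument above: with the Beer-Lambert law A = εlc, the concentration detection limit scales as the blank absorbance repeatability divided by the optical path length, which is what penalizes the short-path microplate well. The molar absorptivity and repeatability figures are order-of-magnitude assumptions, not the paper's measured values.

      def detection_limit(s_blank_abs, molar_abs, path_cm):
          """c_LOD = 3 * s_blank / (eps * l), in mol/L (Beer-Lambert law)."""
          return 3.0 * s_blank_abs / (molar_abs * path_cm)

      EPS_NITRITE = 4.6e4   # L mol^-1 cm^-1, order of magnitude for the azo dye

      for label, path, s_blank in (("10-cm flow cell", 10.0, 0.0005),
                                   ("microplate well", 0.6, 0.002)):
          lod = detection_limit(s_blank, EPS_NITRITE, path) * 1e9   # -> nmol/L
          print(f"{label:15s}: LOD ~ {lod:6.1f} nmol/L")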

  18. Using GIS techniques to detect the impact of territorial evolution on producing natural hazard in Northern Romania, commune Vorniceni

    NASA Astrophysics Data System (ADS)

    Gălbău, Ionela

    2015-04-01

    Information technologies such as geographic information systems (GIS) offer numerous possibilities in spatial analysis for characterizing a study area and mapping hazard-risk zones (especially landslides). Although ultra-modern techniques have advanced, GIS remains the most important technique used in spatial planning. Moreover, the maps obtained with GIS are more objective than maps drawn by hand from the same data and the same conceptual model. The study area, commune Vorniceni, is situated in the north of Romania, in the Ibaneasa river basin, a tributary of the Jijia, and occupies an area of 63 km². Over the past 50 years the area has undergone an evolution that is not only territorial but also morphological and morphometric. This study relates the evolution of the territorial distribution of the population of commune Vorniceni to its influence on the environment. The construction of the dam of the Ibaneasa River reservoir using poor borrow pits was a starting point for the development of landslides. Abrupt anthropic intervention in the environment, through the building of the dam and the clogging of the two reservoirs (ponds), increased the possibility of negative phenomena in the area. These phenomena directly affect the village population, as territorial evolution has involved the construction of settlements in areas with potential landslide risk. The analysis of the factors that have influenced territorial evolution and the production of negative phenomena, together with the creation of a GIS database, is followed by the production of maps of hypsometry, slope inclination and land use. All this highlights the relationship between the anthropic and the natural environment and provides another opportunity to use the land in a beneficial way by harnessing the risk map obtained. Although not without shortcomings, the method proved to be a feasible and cost-effective approach for assessing and mapping landslide susceptibility. Acknowledgment: This paper has been financially supported within the project entitled "SOCERT. Knowledge society, dynamism through research", contract number POSDRU/159/1.5/S/132406. This project is co-financed by the European Social Fund through the Sectoral Operational Programme for Human Resources Development 2007-2013. Investing in people!

  19. System safety in Stirling engine development

    NASA Technical Reports Server (NTRS)

    Bankaitis, H.

    1981-01-01

    The DOE/NASA Stirling Engine Project Office has required that contractors make safety considerations an integral part of all phases of the Stirling engine development program. As an integral part of each engine design subtask, analyses are developed to determine possible modes of failure. The accepted system safety analysis techniques (fault tree, FMEA, hazards analysis, etc.) are applied in varying degrees of depth at the system, subsystem and component levels. The primary objectives are to identify critical failure areas, to enable removal of susceptibility to such failures or their effects from the system, and to minimize risk.
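
    A minimal sketch of the fault-tree arithmetic such analyses rest on, combining independent basic events through AND/OR gates; the events and probabilities are illustrative, not from the Stirling program.

      def p_or(*ps):   # at least one input event occurs (independence assumed)
          q = 1.0
          for p in ps:
              q *= (1.0 - p)
          return 1.0 - q

      def p_and(*ps):  # all input events occur (independence assumed)
          q = 1.0
          for p in ps:
              q *= p
          return q

      p_seal_leak = 1e-3
      p_overtemp = 5e-4
      p_sensor_fail = 2e-3
      p_relief_fail = 1e-3

      # Loss of containment needs an initiating leak OR overtemperature,
      # AND failure of both protective layers.
      p_top = p_and(p_or(p_seal_leak, p_overtemp),
                    p_and(p_sensor_fail, p_relief_fail))
      print(f"P(top event) = {p_top:.2e}")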

  20. Applying Failure Modes, Effects, And Criticality Analysis And Human Reliability Analysis Techniques To Improve Safety Design Of Work Process In Singapore Armed Forces

    DTIC Science & Technology

    2016-09-01

    an instituted safety program that utilizes a generic risk assessment method involving the 5-M (Mission, Man, Machine, Medium and Management) factor... the Safety core value is hinged upon three key principles: (1) each soldier has a crucial part to play, by adopting safety as a core value and making it a way of life in his unit; (2) safety is an integral part of training, operations and mission success; and (3) safety is an individual, team and...

  1. Risk prioritisation using the analytic hierarchy process

    NASA Astrophysics Data System (ADS)

    Sum, Rabihah Md.

    2015-12-01

    This study demonstrated how to use the Analytic Hierarchy Process (AHP) to prioritise the risks of an insurance company. AHP is a technique for structuring complex problems by arranging the elements of the problem in a hierarchy, assigning numerical values to subjective judgements on the relative importance of the elements, and synthesizing the judgements to determine which elements have the highest priority. The study is motivated by the wide application of AHP as a prioritisation technique in complex problems. It aims to show that AHP can mitigate some limitations of the likelihood-and-impact risk assessment technique. The study shows that AHP provides a consistency check on subjective judgements, organises a large number of risks into a structured framework, assists risk managers in making explicit risk trade-offs, and provides an easy-to-understand and systematic risk assessment process.
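
    A minimal sketch of the AHP computations described: the priority vector is derived from a pairwise comparison matrix via the principal eigenvector, and Saaty's consistency ratio checks the judgements. The comparison values are illustrative.

      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],    # risk A vs A, B, C (Saaty's 1-9 scale)
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                          # priority vector

      n = A.shape[0]
      lambda_max = eigvals.real[k]
      ci = (lambda_max - n) / (n - 1)       # consistency index
      ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
      cr = ci / ri                          # judgements acceptable if CR < 0.10

      print("priorities:", w.round(3), " CR =", round(cr, 3))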

  2. The HSE management system in practice-implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Primrose, M.J.; Bentley, P.D.; Sykes, R.M.

    1996-11-01

    This paper sets out the necessary strategic issues that must be dealt with when setting up a management system for HSE. It touches on the setting of objectives using a form of risk matrix and the establishment of corporate risk tolerability levels. Such issue management is vital but can be seen as yet another corporate HQ initiative. It must therefore be linked, and made relevant, to those in middle management tasked with implementing the system and also to those at risk "at the sharp end" of the business. Setting acceptance criteria is aimed at demonstrating a necessary and sufficient level of control or coverage for those hazards considered as being within the objective setting of the Safety or HSE Case. Critical risk areas addressed via the Safety Case, within Shell companies at least, must show how this coverage is extended to critical health and environmental issues. Methods of achieving this vary, ranging from specific Case deliverables (like the Hazard Register and Accountability Matrices) through to the incorporation of topics from the hazard analysis in toolbox talks and meetings. Risk analysis techniques are increasingly seen as complementary rather than separate, with environmental assessments, health risk assessments and safety risk analyses taking place together and results being considered jointly. The paper ends with some views on the way ahead regarding the linking of risk decisions to target setting at the workplace, and views on how Case information may be retrieved and used on a daily basis.

  3. Tailoring a Human Reliability Analysis to Your Industry Needs

    NASA Technical Reports Server (NTRS)

    DeMott, D. L.

    2016-01-01

    Human errors can cause accidents with catastrophic consequences across many industries: airline mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Building on the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" an HRA for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk and reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determining the primary risk factors contributing to a potential human error, a more detailed analysis method may be employed, versus a requirement to provide a numerical value as part of a probabilistic risk assessment. Industries involving humans operating large equipment or transport systems (e.g., railroads or airlines) have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally even beneficial. In cases where the results can have disastrous consequences, the use of human reliability techniques to identify and classify the risk of human errors allows a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.

  4. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    PubMed

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known about the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and their relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m³, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m³ range). However, because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. The authors also consider the implications of arsenic speciation in air for potential exposure and risks, and conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and that more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.

  5. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  6. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
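
    The paper's expert parsers are not specified here, but a minimal sketch of the kind of static semantic check they enable, verifying that an annotated formula is dimensionally consistent, might look as follows. The unit encoding and the checked formula are invented for illustration and are not the paper's implementation.

```python
# Dimensions encoded as (length, time) exponent vectors; a real parser would
# derive these from the user's semantic declarations on primitive variables.
UNITS = {
    "distance": (1, 0),   # m
    "time":     (0, 1),   # s
    "velocity": (1, -1),  # m/s
}

def divide_units(u1, u2):
    """Unit exponents of the quotient u1 / u2."""
    return tuple(a - b for a, b in zip(UNITS[u1], UNITS[u2]))

# Static check of the annotated assignment: velocity = distance / time.
assert divide_units("distance", "time") == UNITS["velocity"], "dimension error"
print("velocity = distance / time is dimensionally consistent")
```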

  7. Microfluidic Devices for Forensic DNA Analysis: A Review

    PubMed Central

    Bruijns, Brigitte; van Asten, Arian; Tiggelaar, Roald; Gardeniers, Han

    2016-01-01

    Microfluidic devices may offer various advantages for forensic DNA analysis, such as reduced risk of contamination, shorter analysis time and direct application at the crime scene. Microfluidic chip technology has already proven to be functional and effective within medical applications, such as for point-of-care use. In the forensic field, one may expect microfluidic technology to become particularly relevant for the analysis of biological traces containing human DNA. This would require a number of consecutive steps, including sample work up, DNA amplification and detection, as well as secure storage of the sample. This article provides an extensive overview of microfluidic devices for cell lysis, DNA extraction and purification, DNA amplification and detection and analysis techniques for DNA. Topics to be discussed are polymerase chain reaction (PCR) on-chip, digital PCR (dPCR), isothermal amplification on-chip, chip materials, integrated devices and commercially available techniques. A critical overview of the opportunities and challenges of the use of chips is discussed, and developments made in forensic DNA analysis over the past 10–20 years with microfluidic systems are described. Areas in which further research is needed are indicated in a future outlook. PMID:27527231

  8. Risk Factors For Stroke, Myocardial Infarction, or Death Following Carotid Endarterectomy: Results From the International Carotid Stenting Study.

    PubMed

    Doig, D; Turner, E L; Dobson, J; Featherstone, R L; de Borst, G J; Stansby, G; Beard, J D; Engelter, S T; Richards, T; Brown, M M

    2015-12-01

    Carotid endarterectomy (CEA) is standard treatment for symptomatic carotid artery stenosis but carries a risk of stroke, myocardial infarction (MI), or death. This study investigated risk factors for these procedural complications occurring within 30 days of endarterectomy in the International Carotid Stenting Study (ICSS). Patients with recently symptomatic carotid stenosis >50% were randomly allocated to endarterectomy or stenting. Analysis is reported of patients in ICSS assigned to endarterectomy, limited to those in whom CEA was initiated. The occurrence of stroke, MI, or death within 30 days of the procedure was reported by investigators and adjudicated. Demographic and technical risk factors for these complications were analysed sequentially in a binomial regression analysis and subsequently in a multivariable model. Eight hundred and twenty-one patients were included in the analysis. The risk of stroke, MI, or death within 30 days of CEA was 4.0%. The risk was higher in female patients (risk ratio [RR] 1.98, 95% CI 1.02-3.87, p = .05) and with increasing baseline diastolic blood pressure (dBP) (RR 1.30 per +10 mmHg, 95% CI 1.02-1.66, p = .04). Mean baseline dBP, obtained at the time of randomization in the trial, was 78 mmHg (SD 13 mmHg). In a multivariable model, only dBP remained a significant predictor. The risk was not related to the type of surgical reconstruction, anaesthetic technique, or perioperative medication regimen. Patients undergoing CEA stayed a median of 4 days before discharge, and 21.2% of events occurred on or after the day of discharge. Increasing diastolic blood pressure was the only independent risk factor for stroke, MI, or death following CEA. Careful attention to blood pressure control following symptoms attributable to carotid stenosis could reduce the risks associated with subsequent CEA. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
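
    For readers unfamiliar with the risk-ratio arithmetic reported above, the sketch below computes an RR and its 95% CI from a 2×2 table using the standard log-scale formula. The counts are hypothetical and are not the ICSS data.

```python
import math

def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio with a 95% CI computed on the log scale."""
    r1 = events_exposed / n_exposed
    r0 = events_unexposed / n_unexposed
    rr = r1 / r0
    # Standard error of log(RR) for a 2x2 table.
    se_log = math.sqrt(1 / events_exposed - 1 / n_exposed
                       + 1 / events_unexposed - 1 / n_unexposed)
    lo, hi = (math.exp(math.log(rr) + z * se_log) for z in (-1.96, 1.96))
    return rr, lo, hi

# Hypothetical counts: 15/250 events among women, 18/571 among men.
print("RR = %.2f (95%% CI %.2f-%.2f)" % risk_ratio(15, 250, 18, 571))
```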

  9. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large-scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at these three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  10. The occurrence of zoonotic parasites in rural dog populations from northern Portugal.

    PubMed

    Cardoso, A S; Costa, I M H; Figueiredo, C; Castro, A; Conceição, M A P

    2014-06-01

    A survey of intestinal parasites in dogs was carried out in a rural region around Cantanhede, in northern Portugal, where 301 dog faecal samples were collected from small-ruminant farms. Saturated salt flotation and formol-ether sedimentation techniques were used. An enquiry was conducted on 234 farms and a risk-factor evaluation for zoonotic helminths was performed among the 195 farmers who owned dogs. The overall parasite prevalence in faecal samples of dogs was 58.8%, with specific prevalences of 40.9% for Ancylostomidae, followed by species of Trichuris (29.9%), Toxocara (8%), Isospora (4%), Capillaria (0.7%) and Spirometra (0.3%). Taeniidae eggs were present in five samples (1.7%), which were analysed with the polymerase chain reaction (PCR) technique and shown to be from Taenia sp., not Echinococcus granulosus. This rural region has a traditional small-farm system, in which farm products are mainly for in-house consumption and home slaughtering is a current practice (57%). Analysis showed home slaughtering to be a statistically significant risk factor for the presence of Ancylostomidae (P = 0.007) and Toxocara sp. (P = 0.049). Owning cattle was found to be a significant risk factor for Taenia sp. (P = 0.031).
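
    A significance test of the kind reported above (for example, home slaughtering versus Ancylostomidae presence) can be reproduced with Fisher's exact test on a 2×2 table. The counts below are invented for illustration and are not the study's raw data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table:
# rows = home slaughtering yes/no, columns = Ancylostomidae positive/negative.
table = [[62, 49],   # slaughtering households: positive, negative
         [31, 53]]   # non-slaughtering households: positive, negative
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```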

  11. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    PubMed

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using the Chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of the critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ2 CBR model at the preprocessing stage of the predictive procedure. The implementation of the proposed risk calculator is achieved through an in-house developed PHP program running on the XAMPP/Apache HTTP server as the hosting server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperforms the baseline methods studied. The results of our experiment show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk level prediction to both patients and physicians at any time, online and on a real-time basis.
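
    The paper's full engine is PHP/MySQL-based, but the core retrieval idea, ranking stored cases by chi-square distance to a query case, can be sketched in a few lines. The feature vectors and outcome labels below are hypothetical.

```python
import numpy as np

def chi_square_distance(x, y, eps=1e-12):
    """Chi-square distance between two non-negative feature vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 0.5 * np.sum((x - y) ** 2 / (x + y + eps))

def retrieve_nearest_case(query, case_base):
    """Return the index and distance of the most similar stored case."""
    distances = [chi_square_distance(query, features)
                 for features, _ in case_base]
    best = int(np.argmin(distances))
    return best, distances[best]

# Hypothetical case base: (feature vector, recorded risk level).
case_base = [([0.9, 0.2, 0.4], "high risk"),
             ([0.1, 0.7, 0.3], "low risk"),
             ([0.8, 0.3, 0.5], "high risk")]
idx, d = retrieve_nearest_case([0.85, 0.25, 0.45], case_base)
print(f"nearest case #{idx} ({case_base[idx][1]}), distance {d:.4f}")
```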

  12. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

    The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems that give aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
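
    As a minimal example of the combinatorial reliability equations such a technique systematizes, the sketch below evaluates the failure probability of a two-out-of-three redundant channel, a configuration typical of triplex fly-by-wire computers. The per-unit failure probability is illustrative, not an F-8 DFBW figure.

```python
# A 2-of-3 voting channel fails when at least two of its three units fail:
# P(channel failure) = 3 p^2 (1 - p) + p^3, with p the per-unit failure
# probability over the mission.
def two_of_three_failure(p):
    return 3 * p**2 * (1 - p) + p**3

p_unit = 1e-4   # hypothetical per-flight unit failure probability
print(f"P(loss of 2-of-3 function) = {two_of_three_failure(p_unit):.3e}")
```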

  13. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified: the Jet Propulsion Laboratory (JPL) Reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to Earth followed by a rendezvous with the space shuttle in Earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify them, were conducted for these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers, and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combined events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the alternatives. The results were the probabilities of success of various end states for each candidate scenario, ranging from complete success through various levels of partial success to complete failure. The overall probability of success was determined to be 66% for the Direct/SEP mission (return of at least one sample) and 58% for the JPL mission (return of at least one sample cache). Values were also determined for intermediate events and end states, as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions, respectively.
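
    A stripped-down version of the event-tree-plus-Monte-Carlo approach described above might look like the following. The mission stages and per-event probabilities are invented and do not reflect either study scenario.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-event success probabilities for a simplified sequence:
# launch, Mars landing, sample collection, ascent, Earth return.
p_events = [0.98, 0.90, 0.95, 0.92, 0.94]

N = 100_000
outcomes = rng.random((N, len(p_events))) < p_events  # one row per trial
stages_passed = np.cumprod(outcomes, axis=1)          # a failed stage stays failed

print(f"P(complete success) ~ {stages_passed[:, -1].mean():.3f}")
# Partial end states: probability of reaching each stage.
for name, p in zip(["launch", "landing", "collection", "ascent", "return"],
                   stages_passed.mean(axis=0)):
    print(f"  reached {name}: {p:.3f}")
```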

  14. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).

  15. Spatio-temporal analysis of the relationship between WNV dissemination and environmental variables in Indianapolis, USA.

    PubMed

    Liu, Hua; Weng, Qihao; Gaines, David

    2008-12-18

    This study developed a multi-temporal analysis of the relationship between West Nile Virus (WNV) dissemination and environmental variables by using an integrated approach of remote sensing, GIS, and statistical techniques. WNV mosquito cases in seven months (April-October) of the six years (2002-2007) were collected in Indianapolis, USA. Epidemic curves were plotted to identify the temporal outbreaks of WNV. Spatial-temporal analysis and k-means cluster analysis were further applied to determine the high-risk areas. Finally, the relationship between environmental variables and WNV outbreaks was examined by using discriminant analysis. The results show that the WNV epidemic curve reached its peak in August for all years in the study area except 2007, when the peak was reached in July. WNV dissemination started from the central longitudinal corridor of the city and spread out to the east and west. Different years and seasons had different high-risk areas, but the southwest and southeast corners show the highest risk for WNV infection due to their high percentages of agriculture and water sources. Major environmental factors contributing to the outbreak of WNV in Indianapolis were the percentages of agriculture and water, total length of streams, and total size of wetlands. This study provides important information for urban public health prevention and management. It also contributes to the optimization of mosquito control and the arrangement of future sampling efforts.
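
    As a sketch of the k-means step used to delineate high-risk clusters, the code below groups hypothetical case coordinates with scikit-learn. The locations are synthetic, not the Indianapolis data, and the number of clusters is arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical georeferenced WNV-positive mosquito pool locations (x, y in km).
cases = np.vstack([rng.normal((3.0, 4.0), 0.5, (40, 2)),
                   rng.normal((8.0, 2.0), 0.7, (25, 2))])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(cases)
for c, centre in enumerate(kmeans.cluster_centers_):
    n = int((kmeans.labels_ == c).sum())
    print(f"cluster {c}: {n} cases around ({centre[0]:.1f}, {centre[1]:.1f}) km")
```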

  16. A cross-cultural study of perceived benefit versus risk as mediators in the trust-acceptance relationship.

    PubMed

    Bronfman, Nicolás C; Vázquez, Esperanza López

    2011-12-01

    Several recent studies have identified the significant role social trust in regulatory organizations plays in the public acceptance of various technologies and activities. In a cross-cultural investigation, the current work explores empirically the relationship between social trust in management authorities and the degree of public acceptability of hazards for individuals residing in either developed or emerging Latin American economies, using confirmatory rather than exploratory techniques. Undergraduates in Mexico, Brazil, and Chile, and in the United States and Spain, assessed trust in regulatory authorities, public acceptance, personal knowledge, and the risks and benefits for 23 activities and technological hazards. Four findings emerged. (i) In Latin American nations, trust in regulatory entities was strongly and significantly (directly as well as indirectly) linked with the public's acceptance of any activity or technology. In developed countries trust and acceptability are essentially linked indirectly (through perceived risk and perceived benefit). (ii) Lack of knowledge strengthened the magnitude and statistical significance of the trust-acceptability relationship in both developed and developing countries. (iii) For high levels of claimed knowledge, the impact on the trust-acceptability relationship varied depending upon the origin of the sample. (iv) Confirmatory analysis revealed the relative importance of perceived benefit over perceived risk in mediating the trust-acceptability causal chain. © 2011 Society for Risk Analysis.

  17. Rapid-prenatal diagnosis through fluorescence in situ hybridization for preventing aneuploidy related birth defects

    PubMed Central

    Fauzdar, Ashish; Chowdhry, Mohit; Makroo, R. N.; Mishra, Manoj; Srivastava, Priyanka; Tyagi, Richa; Bhadauria, Preeti; Kaul, Anita

    2013-01-01

    BACKGROUND AND OBJECTIVE: Women with high-risk pregnancies are offered prenatal diagnosis through amniocentesis for cytogenetic analysis of fetal cells. The aim of this study was to evaluate the effectiveness of the rapid fluorescence in situ hybridization (FISH) technique for detecting numerical aberrations of chromosomes 13, 21, 18, X and Y in high-risk pregnancies in an Indian scenario. MATERIALS AND METHODS: A total of 163 samples were received for FISH and/or a full karyotype for prenatal diagnosis from high-risk pregnancies. In 116 samples, conventional culture with G-banding karyotyping was applied in conjunction with the FISH test using the AneuVysion kit (Abbott Molecular, Inc.), following the standard recommended protocol, to compare the two techniques in our setup. RESULTS: Of the 116 patients, 96 were normal for the five major chromosome abnormalities and seven were found to be abnormal (04 trisomy 21, 02 monosomy X, and 01 trisomy 13); all the FISH results correlated with conventional cytogenetics. Summarizing the results for all 163 patients for the major chromosomal abnormalities analyzed by cytogenetics and/or FISH, there were 140 (86%) normal cases, 9 (6%) abnormal cases, another 4 (2.5%) suspicious mosaic cases, and 10 (6%) cases of culture failure. The diagnostic detection rate with FISH in 116 patients was 97.5%. There were no false-positive or false-negative autosomal or sex chromosomal results within our established criteria for reporting FISH signals. CONCLUSION: Rapid FISH is a reliable and prompt method for detecting numerical chromosomal aberrations and has now been implemented as a routine diagnostic procedure for detection of fetal aneuploidy in India. PMID:23901191

  18. Rapid-prenatal diagnosis through fluorescence in situ hybridization for preventing aneuploidy related birth defects.

    PubMed

    Fauzdar, Ashish; Chowdhry, Mohit; Makroo, R N; Mishra, Manoj; Srivastava, Priyanka; Tyagi, Richa; Bhadauria, Preeti; Kaul, Anita

    2013-01-01

    Women with high-risk pregnancies are offered prenatal diagnosis through amniocentesis for cytogenetic analysis of fetal cells. The aim of this study was to evaluate the effectiveness of the rapid fluorescence in situ hybridization (FISH) technique for detecting numerical aberrations of chromosomes 13, 21, 18, X and Y in high-risk pregnancies in an Indian scenario. A total of 163 samples were received for FISH and/or a full karyotype for prenatal diagnosis from high-risk pregnancies. In 116 samples, conventional culture with G-banding karyotyping was applied in conjunction with the FISH test using the AneuVysion kit (Abbott Molecular, Inc.), following the standard recommended protocol, to compare the two techniques in our setup. Of the 116 patients, 96 were normal for the five major chromosome abnormalities and seven were found to be abnormal (04 trisomy 21, 02 monosomy X, and 01 trisomy 13); all the FISH results correlated with conventional cytogenetics. Summarizing the results for all 163 patients for the major chromosomal abnormalities analyzed by cytogenetics and/or FISH, there were 140 (86%) normal cases, 9 (6%) abnormal cases, another 4 (2.5%) suspicious mosaic cases, and 10 (6%) cases of culture failure. The diagnostic detection rate with FISH in 116 patients was 97.5%. There were no false-positive or false-negative autosomal or sex chromosomal results within our established criteria for reporting FISH signals. Rapid FISH is a reliable and prompt method for detecting numerical chromosomal aberrations and has now been implemented as a routine diagnostic procedure for detection of fetal aneuploidy in India.

  19. Incidence of major hemorrhage after aggressive image-guided liver mass biopsy in the era of individualized medicine.

    PubMed

    Boyum, James H; Atwell, Thomas D; Wall, Darci J; Mansfield, Aaron S; Kerr, Sarah E; Gunderson, Tina M; Rumilla, Kandelaria M; Weisbrod, Adam J; Kurup, A Nicholas

    2018-05-17

    To analyze a large volume of image-guided liver mass biopsies to assess for an increased incidence of major hemorrhage after aggressive liver mass sampling, and to determine if coaxial technique reduces major hemorrhage rate. Patients who underwent image-guided liver mass biopsy over a 15-year period (December 7, 2001-September 22, 2016) were retrospectively identified. An aggressive biopsy was defined as a biopsy event in which ≥ 4 core needle passes were performed. Association of major hemorrhage after aggressive liver mass biopsy and other potential risk factors of interest were assessed using logistic regression analysis. For the subset of aggressive biopsies, Fisher's exact test was used to compare the incidence of major hemorrhage using coaxial versus noncoaxial techniques. Aggressive biopsies constituted 11.6% of biopsy events (N = 579/5011). The incidence of major hemorrhage with <4 passes was 0.4% (N = 18/4432) and with ≥4 passes 1.2% (N = 6/579). In univariable models, aggressive biopsy was significantly associated with major hemorrhage (OR 3.0, 95% CI 1.16-6.92, p = 0.025). After adjusting for gender and platelet count, the association was not significant at the p = 0.05 level (OR 2.58, 95% CI 0.927-6.24, p = 0.067). The rate of major hemorrhage in the coaxial biopsy technique group was 1.4% (N = 3/209) compared to 1.1% (N = 4/370) in the noncoaxial biopsy technique group, which was not a significant difference (p = 0.707). Although aggressive image-guided liver mass biopsies had an increased incidence of major hemorrhage, the overall risk of bleeding remained low. The benefit of such biopsies will almost certainly outweigh the risk in most patients.

  20. Recommendations for benefit-risk assessment methodologies and visual representations.

    PubMed

    Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain

    2016-03-01

    The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
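
    Many of the quantitative frameworks surveyed reduce, at their simplest, to a weighted multi-criteria score. The toy sketch below illustrates that idea only; the criteria, values, and weights are invented, and this is not any specific PROTECT methodology.

```python
# Toy multi-criteria benefit-risk score: a weighted sum of normalized
# criterion values, with benefits counted positively and risks negatively.
criteria = {          # name: (value in [0, 1], weight; weights sum to 1)
    "symptom relief":        (0.80, 0.40),
    "quality of life gain":  (0.60, 0.20),
    "serious adverse event": (0.10, 0.25),   # higher value = worse
    "discontinuation":       (0.20, 0.15),
}
benefits = {"symptom relief", "quality of life gain"}

score = sum(w * (v if name in benefits else -v)
            for name, (v, w) in criteria.items())
print(f"net benefit-risk score = {score:+.3f}")  # > 0 favours benefit
```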

  1. Balancing benefit and risk of medicines: a systematic review and classification of available methodologies.

    PubMed

    Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah

    2014-07-01

    The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from relevant stakeholders to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is not a 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.

  2. Risk Predictors and Causes of Technique Failure Within the First Year of Peritoneal Dialysis: An Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) Study.

    PubMed

    See, Emily J; Johnson, David W; Hawley, Carmel M; Pascoe, Elaine M; Badve, Sunil V; Boudville, Neil; Clayton, Philip A; Sud, Kamal; Polkinghorne, Kevan R; Borlace, Monique; Cho, Yeoungjee

    2017-12-22

    Concern regarding technique failure is a major barrier to increased uptake of peritoneal dialysis (PD), and the first year of therapy is a particularly vulnerable time. A cohort study using competing-risk regression analyses to identify the key risk factors and risk periods for early transfer to hemodialysis therapy or death in incident PD patients. All adult patients who initiated PD therapy in Australia and New Zealand in 2000 through 2014. Patient demographics and comorbid conditions, duration of prior renal replacement therapy, timing of referral, PD modality, dialysis era, and center size. Technique failure within the first year, defined as transfer to hemodialysis therapy for more than 30 days or death. Of 16,748 patients included in the study, 4,389 developed early technique failure. Factors associated with increased risk included age older than 70 years, diabetes or vascular disease, prior renal replacement therapy, late referral to a nephrology service, or management in a smaller center. Asian or other race and use of continuous ambulatory PD were associated with reduced risk, as was initiation of PD therapy in 2010 through 2014. Although the risk for technique failure due to death or infection was constant during the first year, mechanical and other causes accounted for a greater number of cases within the initial 9 months of treatment. Potential for residual confounding due to limited data for residual kidney function, dialysis prescription, and socioeconomic factors. Several modifiable and nonmodifiable factors are associated with early technique failure in PD. Targeted interventions should be considered in high-risk patients to avoid the consequences of an unplanned transfer to hemodialysis therapy or death. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
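
    Competing-risk analyses like this one rest on the cumulative incidence function rather than naive Kaplan-Meier estimates. The sketch below implements a basic Aalen-Johansen-style estimate on a tiny invented cohort; it is not the ANZDATA analysis, which used competing-risk regression.

```python
import numpy as np

def cumulative_incidence(times, events, cause, horizon):
    """Aalen-Johansen estimate of P(failure from `cause` by `horizon`).

    times  : follow-up time for each patient
    events : 0 = censored, 1..K = failure cause
             (e.g., 1 = transfer to hemodialysis, 2 = death)
    """
    times, events = np.asarray(times, float), np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    surv, cif = 1.0, 0.0
    for i in range(n):
        at_risk = n - i
        if events[i] == cause and times[i] <= horizon:
            cif += surv * (1.0 / at_risk)   # increment uses survival just before t_i
        if events[i] != 0:
            surv *= 1.0 - 1.0 / at_risk     # overall survival drops for any event
    return cif

# Hypothetical PD cohort: times in months; cause 1 = technique failure.
t = [2, 3, 5, 6, 7, 8, 9, 10, 11, 12]
e = [1, 0, 2, 1, 0, 1, 0, 2, 1, 0]
print(f"CIF(technique failure, 12 mo) = {cumulative_incidence(t, e, 1, 12):.2f}")
```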

  3. A combined field/remote sensing approach for characterizing landslide risk in coastal areas

    NASA Astrophysics Data System (ADS)

    Francioni, Mirko; Coggan, John; Eyre, Matthew; Stead, Doug

    2018-05-01

    Understanding the key factors controlling slope failure mechanisms in coastal areas is the first and most important step for analyzing, reconstructing and predicting the scale, location and extent of future instability in rocky coastlines. Different failure mechanisms may be possible depending on the influence of the engineering properties of the rock mass (including the fracture network), the persistence and type of discontinuity and the relative aspect or orientation of the coastline. Using a section of the North Coast of Cornwall, UK, as an example, we present a multi-disciplinary approach for characterizing landslide risk associated with coastal instabilities in a blocky rock mass. Remotely captured terrestrial and aerial LiDAR and photogrammetric data were interrogated using Geographic Information System (GIS) techniques to provide a framework for subsequent analysis, interpretation and validation. The remote sensing mapping data were used to define the rock mass discontinuity network of the area and to differentiate between major and minor geological structures controlling the evolution of the North Coast of Cornwall. Kinematic instability maps generated from aerial LiDAR data using GIS techniques and results from structural and engineering geological surveys are presented. With this method, it was possible to highlight the types of kinematic failure mechanism that may generate coastal landslides and to identify areas that are more susceptible to instability or at increased risk of future instability. Multi-temporal aerial LiDAR data and orthophotos were also studied using GIS techniques to locate recent landslide failures, validate the results obtained from the kinematic instability maps through site observations and provide improved understanding of the factors controlling the coastal geomorphology. The approach adopted is useful not only for academic research, but also for local authorities and consultancies when assessing the likely risks of coastal instability.
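
    Kinematic instability mapping of this kind typically applies Markland-style feasibility tests to each slope cell. A minimal planar-sliding check is sketched below; the slope and joint orientations are hypothetical, and the lateral-release tolerance would need site-specific justification.

```python
def planar_sliding_feasible(slope_dip, slope_dip_dir,
                            joint_dip, joint_dip_dir,
                            friction_angle, tolerance=20.0):
    """Markland-style kinematic test for planar sliding (angles in degrees)."""
    daylights = joint_dip < slope_dip          # plane daylights in the slope face
    can_slide = joint_dip > friction_angle     # plane is steeper than friction
    # Joint dip direction within +/- tolerance of the slope dip direction.
    aligned = abs((joint_dip_dir - slope_dip_dir + 180) % 360 - 180) <= tolerance
    return daylights and can_slide and aligned

# Hypothetical cliff section (70/290) and joint set (42/300), phi = 30 deg.
print(planar_sliding_feasible(70, 290, 42, 300, friction_angle=30))  # True
```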

  4. PREVENtion of HeartMate II Pump Thrombosis Through Clinical Management: The PREVENT multi-center study.

    PubMed

    Maltais, Simon; Kilic, Ahmet; Nathan, Sriram; Keebler, Mary; Emani, Sitaramesh; Ransom, John; Katz, Jason N; Sheridan, Brett; Brieke, Andreas; Egnaczyk, Gregory; Entwistle, John W; Adamson, Robert; Stulak, John; Uriel, Nir; O'Connell, John B; Farrar, David J; Sundareswaran, Kartik S; Gregoric, Igor

    2017-01-01

    Recommended structured clinical practices including implant technique, anti-coagulation strategy, and pump speed management (PREVENT [PREVENtion of HeartMate II Pump Thrombosis Through Clinical Management] recommendations) were developed to address risk of early (<3 months) pump thrombosis (PT) risk with HeartMate II (HMII; St. Jude Medical, Inc. [Thoratec Corporation], Pleasanton, CA). We prospectively assessed the HMII PT rate in the current era when participating centers adhered to the PREVENT recommendations. PREVENT was a prospective, multi-center, single-arm, non-randomized study of 300 patients implanted with HMII at 24 participating sites. Confirmed PT (any suspected PT confirmed visually and/or adjudicated by an independent assessor) was evaluated at 3 months (primary end-point) and at 6 months after implantation. The population included 83% men (age 57 years ± 13), 78% destination therapy, and 83% Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) Profile 1-3. Primary end-point analysis showed a confirmed PT of 2.9% at 3 months and 4.8% at 6 months. Adherence to key recommendations included 78% to surgical recommendations, 95% to heparin bridging, and 79% to pump speeds ≥9,000 RPMs (92% >8,600 RPMs). Full adherence to implant techniques, heparin bridging, and pump speeds ≥9,000 RPMs resulted in a significantly lower risk of PT (1.9% vs 8.9%; p < 0.01) and lower composite risk of suspected thrombosis, hemolysis, and ischemic stroke (5.7% vs 17.7%; p < 0.01) at 6 months. Adoption of all components of a structured surgical implant technique and clinical management strategy (PREVENT recommendations) is associated with low rates of confirmed PT. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. The evolving field of prognostication and risk stratification in MDS: Recent developments and future directions.

    PubMed

    Lee, Eun-Ju; Podoltsev, Nikolai; Gore, Steven D; Zeidan, Amer M

    2016-01-01

    The clinical course of patients with myelodysplastic syndromes (MDS) is characterized by wide variability reflecting the underlying genetic and biological heterogeneity of the disease. Accurate prediction of outcomes for individual patients is an integral part of the evidence-based risk/benefit calculations that are necessary for tailoring the aggressiveness of therapeutic interventions. While several prognostication tools have been developed and validated for risk stratification, each of these systems has limitations. The recent progress in genomic sequencing techniques has led to discoveries of recurrent molecular mutations in MDS patients with independent impact on relevant clinical outcomes. Reliable assays of these mutations have already entered the clinic and efforts are currently ongoing to formally incorporate mutational analysis into the existing clinicopathologic risk stratification tools. Additionally, mutational analysis holds promise for going beyond prognostication to therapeutic selection and individualized treatment-specific prediction of outcomes; abilities that would revolutionize MDS patient care. Despite these exciting developments, the best way of incorporating molecular testing for use in prognostication and prediction of outcomes in clinical practice remains undefined and further research is warranted. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Conscious worst case definition for risk assessment, part I: a knowledge mapping approach for defining most critical risk factors in integrative risk management of chemicals and nanomaterials.

    PubMed

    Sørensen, Peter B; Thomsen, Marianne; Assmuth, Timo; Grieger, Khara D; Baun, Anders

    2010-08-15

    This paper helps bridge the gap between scientists and other stakeholders in the areas of human and environmental risk management of chemicals and engineered nanomaterials. This connection is needed due to the evolution of stakeholder awareness and scientific progress related to human and environmental health which involves complex methodological demands on risk management. At the same time, the available scientific knowledge is also becoming more scattered across multiple scientific disciplines. Hence, the understanding of potentially risky situations is increasingly multifaceted, which again challenges risk assessors in terms of giving the 'right' relative priority to the multitude of contributing risk factors. A critical issue is therefore to develop procedures that can identify and evaluate worst case risk conditions which may be input to risk level predictions. Therefore, this paper suggests a conceptual modelling procedure that is able to define appropriate worst case conditions in complex risk management. The result of the analysis is an assembly of system models, denoted the Worst Case Definition (WCD) model, to set up and evaluate the conditions of multi-dimensional risk identification and risk quantification. The model can help optimize risk assessment planning by initial screening level analyses and guiding quantitative assessment in relation to knowledge needs for better decision support concerning environmental and human health protection or risk reduction. The WCD model facilitates the evaluation of fundamental uncertainty using knowledge mapping principles and techniques in a way that can improve a complete uncertainty analysis. Ultimately, the WCD is applicable for describing risk contributing factors in relation to many different types of risk management problems since it transparently and effectively handles assumptions and definitions and allows the integration of different forms of knowledge, thereby supporting the inclusion of multifaceted risk components in cumulative risk management. Copyright 2009 Elsevier B.V. All rights reserved.

  7. Identifying desertification risk areas using fuzzy membership and geospatial technique - A case study, Kota District, Rajasthan

    NASA Astrophysics Data System (ADS)

    Dasgupta, Arunima; Sastry, K. L. N.; Dhinwa, P. S.; Rathore, V. S.; Nathawat, M. S.

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify areas under risk of desertification along with their severity in terms of degradation in natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, values falling beyond μ + 2σ or below μ - 2σ do not represent that class but rather a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of belonging to a certain class and hence the highest probability of being extended into the next class or narrowed into the previous one. Consequently, these are the values that can most easily be altered under exogenic influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multicriteria-based geostatistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are Gully and Ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
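
    The membership idea described above can be made concrete with a normal-pdf membership function scaled to peak at 1, together with the μ ± 2σ transitional-zone flag. The class statistics below are invented for illustration.

```python
from scipy.stats import norm

def class_membership(value, mu, sigma):
    """Membership of `value` in a class ~ N(mu, sigma), scaled to peak at 1."""
    return norm.pdf(value, mu, sigma) / norm.pdf(mu, mu, sigma)

mu, sigma = 0.45, 0.08        # hypothetical NDVI class statistics
for v in (0.45, 0.55, 0.62):  # 0.62 lies beyond mu + 2*sigma
    zone = "transitional/risk" if abs(v - mu) > 2 * sigma else "core"
    print(f"NDVI={v:.2f}  membership={class_membership(v, mu, sigma):.3f}  ({zone})")
```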

  8. Population viability analysis for endangered Roanoke logperch

    USGS Publications Warehouse

    Roberts, James H.; Angermeier, Paul; Anderson, Gregory B.

    2016-01-01

    A common strategy for recovering endangered species is ensuring that populations exceed the minimum viable population size (MVP), a demographic benchmark that theoretically ensures low long-term extinction risk. One method of establishing MVP is population viability analysis, a modeling technique that simulates population trajectories and forecasts extinction risk based on a series of biological, environmental, and management assumptions. Such models also help identify key uncertainties that have a large influence on extinction risk. We used stochastic count-based simulation models to explore extinction risk, MVP, and the possible benefits of alternative management strategies in populations of Roanoke logperch Percina rex, an endangered stream fish. Estimates of extinction risk were sensitive to the assumed population growth rate and model type, carrying capacity, and catastrophe regime (frequency and severity of anthropogenic fish kills), whereas demographic augmentation did little to reduce extinction risk. Under density-dependent growth, the estimated MVP for Roanoke logperch ranged from 200 to 4200 individuals, depending on the assumed severity of catastrophes. Thus, depending on the MVP threshold, anywhere from two to all five of the logperch populations we assessed were projected to be viable. Despite this uncertainty, these results help identify populations with the greatest relative extinction risk, as well as management strategies that might reduce this risk the most, such as increasing carrying capacity and reducing fish kills. Better estimates of population growth parameters and catastrophe regimes would facilitate the refinement of MVP and extinction-risk estimates, and they should be a high priority for future research on Roanoke logperch and other imperiled stream-fish species.
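
    A count-based PVA of the kind described can be sketched as a stochastic simulation with environmental noise, a catastrophe (fish-kill) regime, and a quasi-extinction threshold. Every parameter below is an illustrative placeholder, not an estimate for Percina rex.

```python
import numpy as np

rng = np.random.default_rng(42)

def quasi_extinction_risk(n0, years=100, reps=2000, mu=0.01, sigma=0.15,
                          k=5000, p_kill=0.05, kill_severity=0.5, threshold=50):
    """Fraction of simulated trajectories falling below a quasi-extinction
    threshold: mu/sigma are the mean/SD of the log growth rate, k is the
    carrying capacity, and p_kill/kill_severity give the annual probability
    and severity of a catastrophic fish kill."""
    extinct = 0
    for _ in range(reps):
        n = float(n0)
        for _ in range(years):
            n *= np.exp(rng.normal(mu, sigma))   # environmental stochasticity
            if rng.random() < p_kill:            # catastrophe regime
                n *= 1.0 - kill_severity
            n = min(n, k)                        # ceiling density dependence
            if n < threshold:
                extinct += 1
                break
    return extinct / reps

for n0 in (200, 1000, 4000):
    print(f"N0={n0:>4}: P(quasi-extinction within 100 y) = "
          f"{quasi_extinction_risk(n0):.3f}")
```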

  9. [Methods of cholesterol determination: conventional procedure or "dry chemistry"?].

    PubMed

    Riesen, W; Keller, H

    1990-06-01

    Screening for the cardiovascular risk factor cholesterol should essentially be done in the physician's office laboratory. The majority of such analyses are performed by 'dry' chemistry tests. This review compares this technique with conventional methods for the determination of cholesterol. The reagents and the reaction mechanisms are in principle the same for both techniques, i.e. fully enzymatic methods are used. In 'dry' chemistry the reagents are fixed on a solid carrier, and the reactive state is provided by the liquid of the specimen. Two principles are employed: the strip technique already used in urinalysis, and the multiple-film-layer system familiar from color-film technology. Three systems already on the market are discussed: the Seralyzer (Ames), the Ektachem (Kodak), and the Reflotron (Boehringer Mannheim), as well as one system still under evaluation (the Clinistat, Ames). All systems agree well provided they are operated by well-trained personnel. Problems arise with quality control, since matrix effects are particularly important. The accuracy of the results depends on the calibration. Both the Reflotron and the Clinistat are calibrated by the manufacturer; the user has no influence and is entirely dependent on the reliability of the producer. Although clinical chemistry analyses are facilitated by 'dry' chemistry, the approach is by no means devoid of risks, because errors are more difficult to recognize.

  10. Misplacement of left-sided double-lumen tubes into the right mainstem bronchus: incidence, risk factors and blind repositioning techniques.

    PubMed

    Seo, Jeong-Hwa; Bae, Jun-Yeol; Kim, Hyun Joo; Hong, Deok Man; Jeon, Yunseok; Bahk, Jae-Hyon

    2015-10-28

    Double-lumen endobronchial tubes (DLTs) are commonly advanced into the mainstem bronchus either blindly or under fiberoptic bronchoscopic guidance. However, blind advancement may result in misplacement of left-sided DLTs into the right bronchus. Therefore, the incidence, risk factors, and blind repositioning techniques for right bronchial misplacement of left-sided DLTs were investigated. This was an observational cohort study performed on a data repository collected consecutively from patients who underwent intubation with left-sided DLTs over 2 years. Patients' clinical and anatomical characteristics were analyzed to investigate risk factors for DLT misplacement with logistic regression analysis. Moreover, when a DLT was misplaced into the right bronchus, the bronchial tube was withdrawn into the trachea and blindly readvanced without rotation, or with 90° or 180° counterclockwise rotation while the patient's head was turned to the right. DLTs were inadvertently advanced into the right bronchus in 48 of 1135 (4.2%) patients. DLT misplacements occurred more frequently in females, in patients of short stature or with narrow trachea and bronchi, and when small-sized DLTs were used. All of these factors were significantly inter-correlated with each other (P < 0.001). In 40 of the 48 (83.3%) patients, blind repositioning was successful. Smaller left-sided DLTs were more frequently misplaced into the right mainstem bronchus than larger DLTs. Moreover, we were usually able to reposition the misplaced DLTs into the left bronchus using the blind techniques. ClinicalTrials.gov Identifier: NCT01371773.

  11. A Study on Re-entry Predictions of Uncontrolled Space Objects for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Choi, Eun-Jung; Cho, Sungki; Lee, Deok-Jin; Kim, Siwoo; Jo, Jung Hyun

    2017-12-01

    The key risk analysis technologies for the re-entry of space objects into Earth’s atmosphere are divided into four categories: cataloguing and databases of the re-entry of space objects, lifetime and re-entry trajectory predictions, break-up models after re-entry and multiple debris distribution predictions, and ground impact probability models. In this study, we focused on re-entry prediction, including orbital lifetime assessments, for space situational awareness systems. Re-entry predictions are very difficult and are affected by various sources of uncertainty. In particular, during uncontrolled re-entry, large spacecraft may break into several pieces of debris, and the surviving fragments can be a significant hazard for persons and properties on the ground. In recent years, specific methods and procedures have been developed to provide clear information for predicting and analyzing the re-entry of space objects and for ground-risk assessments. Representative tools include object reentry survival analysis tool (ORSAT) and debris assessment software (DAS) developed by National Aeronautics and Space Administration (NASA), spacecraft atmospheric re-entry and aerothermal break-up (SCARAB) and debris risk assessment and mitigation analysis (DRAMA) developed by European Space Agency (ESA), and semi-analytic tool for end of life analysis (STELA) developed by Centre National d’Etudes Spatiales (CNES). In this study, various surveys of existing re-entry space objects are reviewed, and an efficient re-entry prediction technique is suggested based on STELA, the life-cycle analysis tool for satellites, and DRAMA, a re-entry analysis tool. To verify the proposed method, the re-entry of the Tiangong-1 Space Lab, which is expected to re-enter Earth’s atmosphere shortly, was simulated. Eventually, these results will provide a basis for space situational awareness risk analyses of the re-entry of space objects.

  12. Network analysis of a financial market based on genuine correlation and threshold method

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Shirazi, A. H.; Raei, R.; Jafari, G. R.

    2011-10-01

    A financial market is an example of an adaptive complex network consisting of many interacting units; this network reflects the market's behavior. In this paper, we use the Random Matrix Theory (RMT) notion of the largest eigenvector of the correlation matrix as the market mode of the stock network. For better risk management, we clean the correlation matrix by removing the market mode from the data and then construct the matrix from the residuals. We show that this technique has an important effect on the correlation coefficient distribution by applying it to the Dow Jones Industrial Average (DJIA). To study the topological structure of a network, we apply the market-mode-removal technique and the threshold method to the Tehran Stock Exchange (TSE) as an example. We show that this network follows a power-law model in certain intervals. We also show the behavior of the clustering coefficients and component numbers of this network for different thresholds. These outputs are useful for both theoretical and practical purposes, such as asset allocation and risk management.
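
    One common variant of the market-mode cleaning described above removes the largest eigencomponent directly from the correlation matrix before thresholding. The sketch below does this on synthetic one-factor returns; the factor loading, threshold, and dimensions are arbitrary choices, not the paper's data or exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily returns for 20 stocks sharing one common "market" factor.
T, N = 500, 20
market = rng.normal(0.0, 1.0, T)
returns = 0.6 * market[:, None] + rng.normal(0.0, 1.0, (T, N))

C = np.corrcoef(returns, rowvar=False)
eigval, eigvec = np.linalg.eigh(C)            # eigenvalues in ascending order
u = eigvec[:, -1]                             # eigenvector of the market mode
C_residual = C - eigval[-1] * np.outer(u, u)  # correlations minus market mode

# Threshold method: connect stocks whose residual correlation exceeds theta.
theta = 0.2
off_diag = ~np.eye(N, dtype=bool)
adjacency = (np.abs(C_residual) > theta) & off_diag
print(f"mean off-diagonal correlation: raw {C[off_diag].mean():.3f}, "
      f"residual {C_residual[off_diag].mean():.3f}")
print(f"network edges with |residual corr| > {theta}: {adjacency.sum() // 2}")
```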

  13. A method for scenario-based risk assessment for robust aerospace systems

    NASA Astrophysics Data System (ADS)

    Thomas, Victoria Katherine

    In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, so that the product's potential to be profitable is also examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation. A multi-attribute decision-making process is also implemented to aid in decision making. A demonstration problem inspired by Airbus' mid-1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture additional types of risk compared with previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external-environment risk analysis in the early stages of conceptual design can add another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process allows for the elimination of potentially feasible and viable but too-risky alternatives. The addition of a scenario-based analysis instead of a traditional probabilistic analysis enables uncertainty to be effectively bounded and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.

  14. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient. PMID:27370140
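
    The FMEA step at the heart of the TG-100 framework ranks failure modes by risk priority number, RPN = O × S × D (occurrence, severity, lack of detectability, each scored 1-10). The sketch below shows that ranking on invented failure modes and scores; TG-100's actual tables are far more extensive.

```python
# Each failure mode scored 1-10 for occurrence (O), severity (S), and
# lack of detectability (D); RPN = O * S * D. Modes/scores are illustrative.
failure_modes = [
    ("wrong CT dataset imported",        2, 9, 6),
    ("contour drawn on wrong structure", 4, 8, 4),
    ("MLC leaf calibration drift",       3, 7, 7),
    ("plan parameters mistyped",         5, 6, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, o, s, d in ranked:
    print(f"RPN {o * s * d:>3}  O={o} S={s} D={d}  {name}")
```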

  15. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M. Saiful, E-mail: HUQS@UPMC.EDU

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.

  16. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.

  17. Closure methods for laparotomy incisions for preventing incisional hernias and other wound complications.

    PubMed

    Patel, Sunil V; Paskar, David D; Nelson, Richard L; Vedula, Satyanarayana S; Steele, Scott R

    2017-11-03

    Surgeons who perform laparotomy have a number of decisions to make regarding abdominal closure. Material and size of potential suture types varies widely. In addition, surgeons can choose to close the incision in anatomic layers or mass ('en masse'), as well as using either a continuous or interrupted suturing technique, of which there are different styles of each. There is ongoing debate as to which suturing techniques and suture materials are best for achieving definitive wound closure while minimising the risk of short- and long-term complications. The objectives of this review were to identify the best available suture techniques and suture materials for closure of the fascia following laparotomy incisions, by assessing the following comparisons: absorbable versus non-absorbable sutures; mass versus layered closure; continuous versus interrupted closure techniques; monofilament versus multifilament sutures; and slow absorbable versus fast absorbable sutures. Our objective was not to determine the single best combination of suture material and techniques, but to compare the individual components of abdominal closure. On 8 February 2017 we searched CENTRAL, MEDLINE, Embase, two trials registries, and Science Citation Index. There were no limitations based on language or date of publication. We searched the reference lists of all included studies to identify trials that our searches may have missed. We included randomised controlled trials (RCTs) that compared suture materials or closure techniques, or both, for fascial closure of laparotomy incisions. We excluded trials that compared only types of skin closures, peritoneal closures or use of retention sutures. We abstracted data and assessed the risk of bias for each trial. We calculated a summary risk ratio (RR) for the outcomes assessed in the review, all of which were dichotomous. We used random-effects modelling, based on the heterogeneity seen throughout the studies and analyses. We completed subgroup analysis planned a priori for each outcome, excluding studies where the interventions being compared differed by more than one component, making it impossible to determine which variable impacted on the outcome, or the possibility of a synergistic effect. We completed sensitivity analysis, excluding trials with at least one trait with high risk of bias. We assessed the quality of evidence using the GRADEpro guidelines. Fifty-five RCTs with a total of 19,174 participants met the inclusion criteria and were included in the meta-analysis. Included studies were heterogeneous in the type of sutures used, methods of closure and patient population. Many of the included studies reported multiple comparisons. For our primary outcome, the proportion of participants who developed incisional hernia at one year or more of follow-up, we did not find evidence that suture absorption (absorbable versus non-absorbable sutures, RR 1.07, 95% CI 0.86 to 1.32, moderate-quality evidence; or slow versus fast absorbable sutures, RR 0.81, 95% CI 0.63 to 1.06, moderate-quality evidence), closure method (mass versus layered, RR 1.92, 95% CI 0.58 to 6.35, very low-quality evidence) or closure technique (continuous versus interrupted, RR 1.01, 95% CI 0.76 to 1.35, moderate-quality evidence) resulted in a difference in the risk of incisional hernia. We did, however, find evidence to suggest that monofilament sutures reduced the risk of incisional hernia when compared with multifilament sutures (RR 0.76, 95% CI 0.59 to 0.98, I² = 30%, moderate-quality evidence). For our secondary outcomes, we found that none of the interventions reduced the risk of wound infection, whether based on suture absorption (absorbable versus non-absorbable sutures, RR 0.99, 95% CI 0.84 to 1.17, moderate-quality evidence; or slow versus fast absorbable sutures, RR 1.16, 95% CI 0.85 to 1.57, moderate-quality evidence), closure method (mass versus layered, RR 0.93, 95% CI 0.67 to 1.30, low-quality evidence) or closure technique (continuous versus interrupted, RR 1.13, 95% CI 0.96 to 1.34, moderate-quality evidence). Similarly, none of the interventions reduced the risk of wound dehiscence, whether based on suture absorption (absorbable versus non-absorbable sutures, RR 0.78, 95% CI 0.55 to 1.10, moderate-quality evidence; or slow versus fast absorbable sutures, RR 1.55, 95% CI 0.92 to 2.61, moderate-quality evidence), closure method (mass versus layered, RR 0.69, 95% CI 0.31 to 1.52, moderate-quality evidence) or closure technique (continuous versus interrupted, RR 1.21, 95% CI 0.90 to 1.64, moderate-quality evidence). Absorbable sutures, compared with non-absorbable sutures (RR 0.49, 95% CI 0.26 to 0.94, low-quality evidence), reduced the risk of sinus or fistula tract formation. None of the other comparisons showed a difference (slow versus fast absorbable sutures, RR 0.88, 95% CI 0.05 to 16.05, very low-quality evidence; mass versus layered, RR 0.49, 95% CI 0.15 to 1.62, low-quality evidence; continuous versus interrupted, RR 1.51, 95% CI 0.64 to 3.61, very low-quality evidence). Based on this moderate-quality body of evidence, monofilament sutures may reduce the risk of incisional hernia. Absorbable sutures may also reduce the risk of sinus or fistula tract formation, but this finding is based on low-quality evidence. We had serious concerns about the design or reporting of several of the 55 included trials. The comparator arms in many trials differed by more than one component, making it impossible to attribute differences between groups to any one component. In addition, the patient population included in many of the studies was very heterogeneous. Trials included both emergency and elective cases, different types of disease pathology (e.g. colon surgery, hepatobiliary surgery, etc.) or different types of incisions (e.g. midline, paramedian, subcostal). Consequently, larger, high-quality trials to further address this clinical challenge are warranted. Future studies should ensure that proper randomisation and allocation techniques are performed, wound assessors are blinded, and that the duration of follow-up is adequate. It is important that only one type of intervention is compared between groups. In addition, a homogeneous patient population would allow for a more accurate assessment of the interventions.
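
    The summary risk ratios above come from random-effects pooling of per-trial outcomes. As a rough illustration of the mechanics, the sketch below implements DerSimonian-Laird random-effects pooling of risk ratios; the trial counts are hypothetical, not taken from the review.

```python
import numpy as np

def pooled_rr_random_effects(events_a, n_a, events_b, n_b):
    """DerSimonian-Laird random-effects pooled risk ratio.

    events_*/n_* are per-trial event counts and group sizes.
    """
    e_a, n_a = np.asarray(events_a, float), np.asarray(n_a, float)
    e_b, n_b = np.asarray(events_b, float), np.asarray(n_b, float)
    log_rr = np.log((e_a / n_a) / (e_b / n_b))
    # Large-sample variance of each trial's log risk ratio
    var = 1/e_a - 1/n_a + 1/e_b - 1/n_b
    w = 1 / var                                    # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w*log_rr)/np.sum(w))**2)
    df = len(log_rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2)/np.sum(w)))
    w_re = 1 / (var + tau2)                        # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return np.exp(mu), np.exp([mu - 1.96*se, mu + 1.96*se])

# Hypothetical counts for three trials (hernia events / group size)
rr, ci = pooled_rr_random_effects([12, 8, 20], [150, 120, 300],
                                  [18, 11, 26], [145, 118, 290])
print(f"pooled RR = {rr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```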

  18. Dialyzer Reuse with Peracetic Acid Does Not Impact Patient Mortality

    PubMed Central

    Bond, T. Christopher; Krishnan, Mahesh; Wilson, Steven M.; Mayne, Tracy

    2011-01-01

    Background and objectives: Numerous studies have shown the overall benefits of dialysis filter reuse, including superior biocompatibility and decreased nonbiodegradable medical waste generation, without increased risk of mortality. A recent study reported that dialyzer reprocessing was associated with decreased patient survival; however, it did not control for sources of potential confounding. We sought to determine the effect of dialyzer reprocessing with peracetic acid on patient mortality using contemporary outcomes data and rigorous analytical techniques. Design, setting, participants, & measurements: We conducted a series of analyses of hemodialysis patients examining the effects of reuse on mortality using three techniques to control for potential confounding: instrumental variables, propensity-score matching, and time-dependent survival analysis. Results: In the instrumental variables analysis, patients at high reuse centers had 16.2 deaths/100 patient-years versus 15.9 in nonreuse centers. In the propensity-score matched analysis, patients with reuse had a lower death rate per 100 patient-years than those without reuse (15.2 versus 15.5). The risk ratios for the time-dependent survival analyses were 0.993 (per percent of sessions with reuse) and 0.995 (per unit of last reuse), respectively. Over the study period, 13.8 million dialyzers were saved, representing 10,000 metric tons of medical waste. Conclusions: Despite the large sample size, powered to detect minuscule effects, neither the instrumental variables nor propensity-matched analyses were statistically significant. The time-dependent survival analysis showed a protective effect of reuse. These data are consistent with the preponderance of evidence showing reuse limits medical waste generation without negatively affecting clinical outcomes. PMID:21566107
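
    Propensity-score matching, one of the three confounding-control techniques used above, can be sketched in a few lines. The cohort below is simulated and the confounders are invented for illustration; a real analysis would use clinical covariates and a more careful matching algorithm (calipers, balance diagnostics).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical cohort: two confounders, a reuse indicator, and death
n = 2000
x = rng.normal(size=(n, 2))                    # e.g., age, comorbidity score
reuse = rng.binomial(1, 1/(1 + np.exp(-(0.5*x[:, 0] - 0.3*x[:, 1]))))
death = rng.binomial(1, 1/(1 + np.exp(-(-2 + 0.8*x[:, 0] + 0.5*x[:, 1]))))

# 1. Fit the propensity model: P(reuse | confounders)
ps = LogisticRegression().fit(x, reuse).predict_proba(x)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score
treated = np.where(reuse == 1)[0]
control = np.where(reuse == 0)[0]
used, pairs = set(), []
for i in treated:
    j = min((c for c in control if c not in used),
            key=lambda c: abs(ps[c] - ps[i]), default=None)
    if j is not None:
        used.add(j)
        pairs.append((i, j))

# 3. Compare death rates within the matched sample
t_idx = [i for i, _ in pairs]
c_idx = [j for _, j in pairs]
print(f"matched pairs: {len(pairs)}")
print(f"death rate reuse={death[t_idx].mean():.3f} "
      f"vs no-reuse={death[c_idx].mean():.3f}")
```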

  19. How to freak a Black & Mild: a multi-study analysis of YouTube videos illustrating cigar product modification.

    PubMed

    Nasim, Aashir; Blank, Melissa D; Cobb, Caroline O; Berry, Brittany M; Kennedy, May G; Eissenberg, Thomas

    2014-02-01

    Cigar smoking is increasingly common among adolescents who perceive cigars as less harmful than cigarettes. This perception of reduced harm is especially true for cigars that are user-modified by removing the tobacco binder through a process called 'freaking'. Little is known about 'freaking' and this multi-study, mixed-methods analysis sought to understand better the rationale and prevailing beliefs about this smoking practice using YouTube videos. In Study 1, we conducted a descriptive content analysis on the characteristics of 26 randomly sampled cigar product modification (CPM) videos posted during 2006-10. In Study 2, a thematic analysis was performed on the transcripts of commentary associated with each video to characterize viewers' comments about video content. Study 1 results revealed that 90% of videos illustrated a four-step CPM technique: 'Loosening the tobacco'; 'Dumping the tobacco'; 'Removing the cigar binder' and 'Repacking the tobacco'. Four themes related to the purpose of CPM were also derived from video content: 'Easier to smoke' (54%), 'Beliefs in reduction of health risks' (31%), 'Changing the burn rate' (15%) and 'Taste enhancement' (12%). Study 2 results concerning the content characteristics of video comments were categorized into three themes: 'Disseminating information/answering questions' (81%), 'Seeking advice/asking questions' (69%) and 'Learning cigar modification techniques' (35%). Favorable comments were more common (81%) compared to unfavorable (58%) and comment content suggested low-risk perceptions and poor understanding of smoking harms. These findings highlight a novel means for youth to access information concerning CPM that may have important implications for tobacco control policy and prevention.

  20. Temporal Trends and Factors Associated with Home Hemodialysis Technique Survival in Canada.

    PubMed

    Perl, Jeffrey; Na, Yingbo; Tennankore, Karthik K; Chan, Christopher T

    2017-07-24

    The last 15 years have seen growth in home hemodialysis (HD) utilization in Canada owing to reports of improved outcomes relative to patients on conventional in-center HD. What effect growth has had on home HD technique and patient survival during this period is not known. We compared the risk of home HD technique failure, mortality, and the composite outcome among three incident cohorts of patients on home HD in Canada: 1996-2002, 2003-2007, and 2008-2012. A multivariable piece-wise exponential model was used to evaluate all outcomes using inverse probability of treatment and censoring weights. A total of 1869 incident patients on home HD were identified from the Canadian Organ Replacement Register. Relative to those treated between 2003 and 2007 (n=568), the risk of home HD technique failure was similar among patients treated between 1996 and 2002 (n=233; adjusted hazard ratio [AHR], 1.39; 95% confidence interval [95% CI], 0.78 to 2.46) but higher among incident patients on home HD treated between 2008 and 2012 (n=1068; AHR, 1.51; 95% CI, 1.06 to 2.15). Relative to patients treated between 2003 and 2007, adjusted mortality was similar among those treated between 2008 and 2012 (AHR, 0.83; 95% CI, 0.58 to 1.19) and those treated between 1996 and 2002 (AHR, 0.67; 95% CI, 0.38 to 1.21). The risk of the composite outcome of death and technique failure was similar across cohorts, as was the risk of receiving a kidney transplant. Increasing age, diabetes as a comorbidity, and smoking status were associated with an increased risk of death as well as the composite outcome. Medium-sized facilities had a lower risk of death, technique failure, and the composite outcome compared with larger facilities. A higher risk of technique failure was seen in the most contemporary era. Further characterization of the risk factors for, and causes of, technique failure is needed to develop strategies to improve patient retention on home HD. Copyright © 2017 by the American Society of Nephrology.

  1. Identifying risks in the realm of enterprise risk management.

    PubMed

    Carroll, Roberta

    2016-01-01

    An enterprise risk management (ERM) discipline is comprehensive and organization-wide. The effectiveness of ERM is governed in part by the strength and breadth of its practices and processes. An essential element in decision making is a thorough process by which organizational risks and value opportunities can be identified. This article will offer identification techniques that go beyond those used in traditional risk management programs and demonstrate how these techniques can be used to identify risks and opportunity in the ERM environment. © 2016 American Society for Healthcare Risk Management of the American Hospital Association.

  2. Spatial distribution variation and probabilistic risk assessment of exposure to chromium in ground water supplies; a case study in the east of Iran.

    PubMed

    Fallahzadeh, Reza Ali; Khosravi, Rasoul; Dehdashti, Bahare; Ghahramani, Esmail; Omidi, Fariborz; Adli, Abolfazl; Miri, Mohammad

    2018-05-01

    A high concentration of chromium (VI) in groundwater can threaten the health of consumers. In this study, the concentration of chromium (VI) in 18 drinking water wells in Birjand, Iran, was investigated over a period of two years. Non-carcinogenic risk assessment, sensitivity, and uncertainty analysis, as well as identification of the most important variables in determining the non-carcinogenic risk for three age groups (children, teens, and adults), were performed using the Monte Carlo simulation technique. The northern and southern regions of the study area had the highest and lowest chromium concentrations, respectively. The chromium concentrations in 16.66% of the samples, covering an area of 604.79 km², exceeded the World Health Organization (WHO) guideline value (0.05 mg/L). The Moran's index analysis showed that the distribution of contamination is clustered. The Hazard Index (HI) values for the children and teens groups were 1.02 and 2.02, respectively, both of which exceeded 1. A sensitivity analysis indicated that the most important factor in calculating the HQ was the concentration of chromium in the consumed water. HQ values higher than 1 represent a high risk for the children group, which should be controlled by reducing the chromium concentration of the drinking water. Copyright © 2018 Elsevier Ltd. All rights reserved.
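
    The Monte Carlo risk assessment described above propagates input distributions through the hazard quotient formula HQ = (C × IR) / (BW × RfD). A minimal sketch with illustrative distributions (not the study's) and the commonly used U.S. EPA oral reference dose for Cr(VI) of 0.003 mg/kg-day:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Hypothetical input distributions (values are illustrative, not the study's)
conc = rng.lognormal(mean=np.log(0.03), sigma=0.5, size=n)  # Cr(VI), mg/L
intake = rng.triangular(1.0, 2.0, 3.0, size=n)              # water, L/day
body_wt = rng.normal(70, 10, size=n).clip(40, 120)          # kg, adults
rfd = 0.003                                                 # oral RfD, mg/kg-day

# Hazard quotient for non-carcinogenic risk: HQ = (C * IR) / (BW * RfD)
hq = conc * intake / (body_wt * rfd)

print(f"median HQ = {np.median(hq):.2f}")
print(f"95th percentile HQ = {np.percentile(hq, 95):.2f}")
print(f"P(HQ > 1) = {(hq > 1).mean():.2%}")
```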

  3. Common side closure type, but not stapler brand or oversewing, influences side-to-side anastomotic leak rates.

    PubMed

    Fleetwood, V A; Gross, K N; Alex, G C; Cortina, C S; Smolevitz, J B; Sarvepalli, S; Bakhsh, S R; Poirier, J; Myers, J A; Singer, M A; Orkin, B A

    2017-03-01

    Anastomotic leak (AL) increases costs and cancer recurrence. Studies show decreased AL with side-to-side stapled anastomosis (SSA), but none identify risk factors within SSAs. We hypothesized that stapler characteristics and closure technique of the common enterotomy affect AL rates. A retrospective review of bowel SSAs was performed. Data included stapler brand, staple line oversewing, and closure method (handsewn, HC; linear stapler [Barcelona technique], BT; transverse stapler, TX). The primary endpoint was AL. Statistical analysis included Fisher's exact test and logistic regression. 463 patients were identified: 58.5% BT, 21.2% HC, and 20.3% TX. Covidien staplers comprised 74.9% of cases, Ethicon 18.1%. There were no differences in AL rates between stapler brands (Covidien 5.8%, Ethicon 6.0%). However, AL rates varied by common side closure (BT 3.7% vs. TX 10.6%, p = 0.017), remaining significant on multivariate analysis. Closure method of the common side impacts AL rates. The Barcelona technique has fewer leaks than transverse stapled closure. Further prospective evaluation is recommended. Copyright © 2017. Published by Elsevier Inc.
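
    The headline comparison (BT 3.7% vs. TX 10.6% leak rate) can be roughly reconstructed from the reported percentages. A sketch, assuming approximate counts back-calculated from the 463 patients; the exact per-group numbers are not given in the abstract:

```python
from scipy.stats import fisher_exact

# Approximate group sizes from the reported proportions of 463 patients
n_bt, n_tx = round(0.585 * 463), round(0.203 * 463)   # 271 and 94
leaks_bt = round(0.037 * n_bt)                        # ~10 leaks
leaks_tx = round(0.106 * n_tx)                        # ~10 leaks

table = [[leaks_bt, n_bt - leaks_bt],
         [leaks_tx, n_tx - leaks_tx]]
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.3f}")   # close to the reported p = 0.017
```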

  4. Advances in carbonate exploration and reservoir analysis

    USGS Publications Warehouse

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  5. Spatial analysis for the epidemiological study of cardiovascular diseases: A systematic literature search.

    PubMed

    Mena, Carlos; Sepúlveda, Cesar; Fuentes, Eduardo; Ormazábal, Yony; Palomo, Iván

    2018-05-07

    Cardiovascular diseases (CVDs) are the primary cause of death and disability in the world, and the detection of populations at risk as well as the localization of vulnerable areas is essential for adequate epidemiological management. Techniques developed for spatial analysis, among them geographical information systems and spatial statistics, such as cluster detection and spatial correlation, are useful for the study of the distribution of CVDs. These techniques, enabling recognition of events at different geographical levels of study (e.g., rural areas, deprived neighbourhoods, etc.), make it possible to relate CVDs to factors present in the immediate environment. The systematic literature search presented here shows that this group of diseases is clustered with regard to incidence, mortality and hospitalization as well as obesity, smoking, increased glycated haemoglobin levels, hypertension, physical activity and age. In addition, acquired variables such as income, residency (rural or urban) and education contribute to CVD clustering. Both local cluster detection and spatial regression techniques give statistical weight to the findings, providing valuable information that can influence response mechanisms in the health services by indicating locations in need of intervention and assignment of available resources.
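
    Cluster detection of the kind surveyed here often starts with a global spatial autocorrelation statistic. A minimal sketch of Moran's I with inverse-distance weights, on synthetic district data (dedicated libraries such as PySAL offer tested implementations):

```python
import numpy as np

def morans_i(values, coords):
    """Global Moran's I with inverse-distance spatial weights (zero diagonal)."""
    x = np.asarray(values, float)
    c = np.asarray(coords, float)
    d = np.linalg.norm(c[:, None, :] - c[None, :, :], axis=-1)
    w = np.where(d > 0, 1.0 / np.where(d == 0, np.inf, d), 0.0)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Hypothetical district centroids and age-standardised CVD rates
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(50, 2))
rates = coords[:, 0] * 0.5 + rng.normal(0, 1, 50)    # west-east gradient
print(f"Moran's I = {morans_i(rates, coords):.3f}")  # > 0 suggests clustering
```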

  6. West Europe Report, Science and Technology, No. 163

    DTIC Science & Technology

    1983-11-10

    recommendation. A representative of the unions thinks that the risk analysis of the commission was too limited and the spokesman of the...Gist-Brocades in Delft, Unilever in Vlaardingen and Organon (Akzo Pharma) in Oss. A few other firms are considering to start with this technique...large multinationals which already were engaged in the application of biotechnology. Ir R. Keuning, member of the board of Unilever Research Labor

  7. USING THE DELPHI TECHNIQUE TO DEVELOP EFFECTIVENESS INDICATORS FOR SOCIAL MARKETING COMMUNICATION TO REDUCE HEALTH-RISK BEHAVIORS AMONG YOUTH.

    PubMed

    Vantamay, Nottakrit

    2015-09-01

    This study aimed to develop effectiveness indicators for social marketing communication to reduce health-risk behaviors among Thai youth by using the Delphi technique. The Delphi technique is a research approach used to gain consensus through a series of two or more rounds of questionnaire surveys, where information and results are fed back to panel members between rounds; it has been extensively used to generate indicators relevant to health behaviors. The Delphi technique was conducted in 3 rounds by consulting a panel of 15 experts in the field of social marketing communication for public health campaigns in Thailand. We found that forty-nine effectiveness indicators across eight core components reached consensus. These components were: 1) attitude about health-risk behavior reduction, 2) subjective norms, 3) perceived behavioral control, 4) intention to reduce health-risk behaviors, 5) practices for reducing health-risk behaviors, 6) knowledge about the dangers and impact of health-risk behaviors, 7) campaign brand equity, and 8) communication networks. These effectiveness indicators could be applied by health promotion organizations to evaluate the effectiveness of social marketing communication to reduce health-risk behaviors among youth.
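
    Computationally, a Delphi round reduces to scoring panel ratings against pre-agreed consensus rules. The sketch below uses one common operationalisation (≥75% of experts rating 4-5 and IQR ≤ 1 on a 5-point scale); the thresholds actually used in this study are not stated in the abstract, so these are assumptions.

```python
import numpy as np

def delphi_consensus(ratings, agree_min=4, pct_needed=0.75, iqr_max=1.0):
    """Flag indicators reaching consensus in a Delphi round.

    ratings: (experts x indicators) matrix of 1-5 Likert scores.
    Consensus here = >=75% of the panel rating 4 or 5 and IQR <= 1,
    one common operationalisation (thresholds vary between studies).
    """
    r = np.asarray(ratings)
    pct_agree = (r >= agree_min).mean(axis=0)
    q75, q25 = np.percentile(r, [75, 25], axis=0)
    return (pct_agree >= pct_needed) & ((q75 - q25) <= iqr_max)

# 15 hypothetical experts scoring 6 candidate indicators
rng = np.random.default_rng(7)
ratings = rng.integers(1, 6, size=(15, 6))
ratings[:, 0] = rng.choice([4, 5], size=15)   # one clearly endorsed indicator
print(delphi_consensus(ratings))              # True where consensus reached
```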

  8. Selection of Representative Models for Decision Analysis Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. First, a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
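
    A simple stand-in for the paper's optimization tool is percentile matching: pick scenarios whose output values sit at evenly spaced percentiles of the full distribution, so the reduced set preserves the risk curve. This greedy sketch is a deliberate simplification of the published methodology, with invented data:

```python
import numpy as np

def representative_subset(outputs, k):
    """Pick k scenarios whose outputs sit closest to evenly spaced
    percentiles of the full distribution, so the subset's risk curve
    tracks the full risk curve without optimistic/pessimistic bias."""
    outputs = np.asarray(outputs, float)
    targets = np.percentile(outputs, np.linspace(5, 95, k))
    chosen = []
    for t in targets:
        order = np.argsort(np.abs(outputs - t))
        idx = next(i for i in order if i not in chosen)
        chosen.append(idx)
    return np.array(chosen)

# 1000 hypothetical scenarios of, e.g., net present value (arbitrary units)
rng = np.random.default_rng(3)
npv = rng.normal(100, 25, size=1000)
subset = representative_subset(npv, k=9)
print("chosen scenario ids:", subset)
print("subset mean vs full mean:",
      npv[subset].mean().round(1), npv.mean().round(1))
```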

  9. Hyperspectral range imaging for transportation systems evaluation

    NASA Astrophysics Data System (ADS)

    Bridgelall, Raj; Rafert, J. B.; Atwood, Don; Tolliver, Denver D.

    2016-04-01

    Transportation agencies expend significant resources to inspect critical infrastructure such as roadways, railways, and pipelines. Regular inspections identify important defects and generate data to forecast maintenance needs. However, cost and practical limitations prevent the scaling of current inspection methods beyond relatively small portions of the network. Consequently, existing approaches fail to discover many high-risk defect formations. Remote sensing techniques offer the potential for more rapid and extensive non-destructive evaluations of the multimodal transportation infrastructure. However, optical occlusions and limitations in the spatial resolution of typical airborne and space-borne platforms limit their applicability. This research proposes hyperspectral image classification to isolate transportation infrastructure targets for high-resolution photogrammetric analysis. A plenoptic swarm of unmanned aircraft systems will capture images with centimeter-scale spatial resolution, large swaths, and polarization diversity. The light field solution will incorporate structure-from-motion techniques to reconstruct three-dimensional details of the isolated targets from sequences of two-dimensional images. A comparative analysis of existing low-power wireless communications standards suggests an application dependent tradeoff in selecting the best-suited link to coordinate swarming operations. This study further produced a taxonomy of specific roadway and railway defects, distress symptoms, and other anomalies that the proposed plenoptic swarm sensing system would identify and characterize to estimate risk levels.

  10. Spatial diffusion of influenza outbreak-related climate factors in Chiang Mai Province, Thailand.

    PubMed

    Nakapan, Supachai; Tripathi, Nitin Kumar; Tipdecho, Taravudh; Souris, Marc

    2012-10-24

    Influenza is one of the leading causes of respiratory illness in the countries located in the tropical areas of South East Asia, including Thailand. In this study the climate factors associated with influenza incidence in Chiang Mai Province, Northern Thailand, were investigated. Identification of factors responsible for influenza outbreaks and the mapping of potential risk areas in Chiang Mai are long overdue. This work examines the association between yearly climate patterns and influenza outbreaks in the Chiang Mai Province between 2001 and 2008. The climatic factors included the amount of rainfall, percent of rainy days, relative humidity, maximum and minimum temperatures, and temperature difference. The study develops a statistical analysis to quantitatively assess the relationship between climate and influenza outbreaks and then evaluates its suitability for predicting influenza outbreaks. A multiple linear regression technique was used to fit the statistical model. The Inverse Distance Weighted (IDW) interpolation and Geographic Information System (GIS) techniques were used in mapping the spatial diffusion of influenza risk zones. The results show that there is a significant correlation between influenza outbreaks and climate factors for the majority of the studied area. A statistical analysis was conducted to assess the validity of the model, comparing model outputs and actual outbreaks.
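
    Inverse Distance Weighted interpolation, used above to map risk zones, estimates a value at an unsampled point as a distance-weighted average of nearby observations. A minimal sketch with invented station data:

```python
import numpy as np

def idw(known_xy, known_vals, query_xy, power=2.0):
    """Inverse Distance Weighted interpolation of point observations."""
    known_xy = np.asarray(known_xy, float)
    vals = np.asarray(known_vals, float)
    q = np.asarray(query_xy, float)
    d = np.linalg.norm(q[:, None, :] - known_xy[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)            # guard against zero distance
    w = 1.0 / d**power
    return (w @ vals) / w.sum(axis=1)

# Hypothetical station coordinates (km) and influenza incidence rates
stations = [[0, 0], [5, 2], [3, 8], [9, 5]]
incidence = [12.0, 20.0, 8.0, 15.0]
grid = [[2, 2], [6, 6]]
print(idw(stations, incidence, grid))   # interpolated rates at grid points
```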

  11. Perioperative Events and Complications in Minimally Invasive Live Donor Nephrectomy: A Systematic Review and Meta-Analysis.

    PubMed

    Kortram, Kirsten; Ijzermans, Jan N M; Dor, Frank J M F

    2016-11-01

    Minimally invasive live donor nephrectomy has become a fully implemented and accepted procedure. Donors have to be well educated about all risks and details during the informed consent process. For this to be successful, more information regarding short-term outcome is necessary. A literature search was performed; all studies discussing short-term complications after minimally invasive live donor nephrectomy were included. Outcomes evaluated were intraoperative and postoperative complications, conversions, operative and warm ischemia times, blood loss, length of hospital stay, pain score, convalescence, quality of life, and costs. One hundred ninety articles were included in the systematic review, 41 in the meta-analysis. Conversion rate was 1.1%. Intraoperative complication rate was 2.3%, mainly bleeding (1.5%). Postoperative complications occurred in 7.3% of donors, including infectious complications (2.6%), of which mainly wound infection (1.6%) and bleeding (1.0%). Reported mortality rate was 0.01%. All minimally invasive techniques were comparable with regard to complication or conversion rate. The used techniques for minimally invasive live donor nephrectomy are safe and associated with low complication rates and minimal risk of mortality. These data may be helpful to develop a standardized, donor-tailored informed consent procedure for live donor nephrectomy.

  12. Meta-analysis of individual registry results enhances international registry collaboration.

    PubMed

    Paxton, Elizabeth W; Mohaddes, Maziar; Laaksonen, Inari; Lorimer, Michelle; Graves, Stephen E; Malchau, Henrik; Namba, Robert S; Kärrholm, John; Rolfson, Ola; Cafri, Guy

    2018-03-28

    Background and purpose - Although common in medical research, meta-analysis has not been widely adopted in registry collaborations. A meta-analytic approach in which each registry conducts a standardized analysis on its own data followed by a meta-analysis to calculate a weighted average of the estimates allows collaboration without sharing patient-level data. The value of meta-analysis as an alternative to individual patient data analysis is illustrated in this study by comparing the risk of revision of porous tantalum cups versus other uncemented cups in primary total hip arthroplasties from Sweden, Australia, and a US registry (2003-2015). Patients and methods - For both individual patient data analysis and meta-analysis approaches a Cox proportional hazard model was fit for time to revision, comparing porous tantalum (n = 23,201) with other uncemented cups (n = 128,321). Covariates included age, sex, diagnosis, head size, and stem fixation. In the meta-analysis approach, treatment effect size (i.e., Cox model hazard ratio) was calculated within each registry and a weighted average for the individual registries' estimates was calculated. Results - Patient-level data analysis and meta-analytic approaches yielded the same results with the porous tantalum cups having a higher risk of revision than other uncemented cups (HR (95% CI) 1.6 (1.4-1.7) and HR (95% CI) 1.5 (1.4-1.7), respectively). Adding the US cohort to the meta-analysis led to greater generalizability, increased precision of the treatment effect, and similar findings (HR (95% CI) 1.6 (1.4-1.7)) with increased risk of porous tantalum cups. Interpretation - The meta-analytic technique is a viable option to address privacy, security, and data ownership concerns allowing more expansive registry collaboration, greater generalizability, and increased precision of treatment effects.
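
    The registry meta-analysis works because each registry only needs to report a hazard ratio and confidence interval from its own standardized Cox model. A sketch of the pooling step, with hypothetical per-registry estimates and fixed-effect inverse-variance weighting (the paper's exact weighting scheme is not specified in the abstract):

```python
import numpy as np

def meta_hr(hrs, ci_los, ci_his):
    """Fixed-effect inverse-variance pooling of registry hazard ratios.

    Each registry contributes only its HR and 95% CI, so no
    patient-level data ever leave the registry.
    """
    log_hr = np.log(hrs)
    # Back out the standard errors from the reported 95% CIs
    se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)
    w = 1 / se**2
    mu = np.sum(w * log_hr) / np.sum(w)
    se_pooled = np.sqrt(1 / np.sum(w))
    return np.exp(mu), np.exp([mu - 1.96*se_pooled, mu + 1.96*se_pooled])

# Hypothetical per-registry estimates for tantalum vs other uncemented cups
hr, ci = meta_hr(hrs=[1.55, 1.62, 1.48],
                 ci_los=[1.30, 1.35, 1.10],
                 ci_his=[1.85, 1.94, 1.99])
print(f"pooled HR = {hr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```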

  13. Somatotype of the individuals with lower extremity amputation and its association with cardiovascular risk.

    PubMed

    Mozumdar, Arupendra; Roy, Subrata K

    2008-03-01

    Anthropometric somatotyping is one of the methods to describe the shape of the human body, which shows some associations with an individual's health and disease condition, especially with cardiovascular diseases (CVD). Individuals with lower extremity amputation (LEA) are known to be more vulnerable to the cardiovascular risk. The objectives of the present study are to report the somatotype of the individuals having lower extremity amputation, to study the possible variation in somatotype between two groups of amputated individuals, and to study the association between cardiovascular disease risk factor and somatotype components among individuals with locomotor disability. 102 adult male individuals with unilateral lower-extremity amputation residing in Calcutta and adjoining areas were investigated. The anthropometric data for somatotyping and data on cardiovascular risk traits (such as body mass index, blood pressure measurements, blood lipids) have been collected. The somatotyping technique of Carter & Heath (1990) has been followed. The result shows high mean values of endomorphy and mesomorphy components and a low mean value of the ectomorphy component among the amputated individuals having cardiovascular risks. The results of both discriminant analysis and logistic regression analysis show a significant relationship between somatotype components and CVD risk among the individuals with LEA. The findings of the present study support the findings of similar studies conducted on the normal population. Diagnosis of CVD risk condition through somatotyping can be utilized in prevention/treatment management for the individuals with LEA.

  14. The effect of differences in time to detection of circulating microbubbles on the risk of decompression sickness

    NASA Technical Reports Server (NTRS)

    Kumar, K. V.; Gilbert, J. H.; Powell, M. R.; Waligora, J. M.

    1992-01-01

    Circulating microbubbles (CMB) are frequently detected prior to the appearance of symptoms of Decompression Sickness (DCS). It is difficult to analyze the effect of CMB on symptoms due to differences in the time to detection of CMB. This paper uses survival analysis models to evaluate the risk of symptoms in the presence of CMB. Methods: Information on 81 exposures to an altitude of 6,400 m (6.5 psi) for a period of three hours, with simulated extravehicular activities, was examined. The presence or absence of CMB was included as a time-dependent covariate in the Cox proportional hazards regression model. Using this technique, the subgroup of exposures with CMB was analyzed further. Mean (S.D.) times in minutes to onset of CMB and symptoms were 125 (63) and 165 (33), respectively, following the start of the three-hour exposure. The risk of symptoms (17/81) increased 14 times in the presence of CMB, after controlling for variations in time to detection of CMB. Further, the risk was lower when time to detection of CMB was greater than 60 minutes (risk ratio = 0.96; 95 percent confidence interval = 0.94 - 0.99; P less than 0.01) compared to CMB before 60 minutes at altitude. Conclusions: Survival analysis showed that individual risk of DCS changes significantly due to variations in time to detection of CMB. This information is important in evaluating the risk of DCS in the presence of CMB.
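
    Including CMB status as a time-dependent covariate requires splitting each subject's follow-up at the detection time. A sketch of this data layout and model fit, assuming the lifelines package and entirely invented records (a real analysis needs far more subjects):

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format records: one row per interval, with the
# time-dependent covariate `cmb` switching from 0 to 1 at detection.
rows = [
    # id, start, stop, cmb, event (1 = DCS symptoms at `stop`)
    (1,   0,  90, 0, 0), (1,  90, 160, 1, 1),   # CMB at 90 min, DCS at 160
    (2,   0, 180, 0, 0),                        # never develops CMB, censored
    (3,   0, 120, 0, 0), (3, 120, 180, 1, 0),   # CMB at 120 min, no DCS
    (4,   0,  60, 0, 0), (4,  60, 150, 1, 1),   # CMB at 60 min, DCS at 150
    (5,   0, 170, 0, 1),                        # DCS without detected CMB
]
df = pd.DataFrame(rows, columns=["id", "start", "stop", "cmb", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for `cmb` is the risk ratio of interest
```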

  15. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk; validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
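
    The Bayesian updating ingredient can be illustrated with the simplest conjugate case: refining a component failure probability as new demand data arrive. A sketch with invented numbers; a real application would embed this inside the network-level model:

```python
from scipy import stats

# Prior on a component failure probability (e.g., from generic industry data)
a, b = 2.0, 48.0                    # Beta prior, mean 0.04

# Newly available evidence: 1 failure observed in 30 demands
failures, demands = 1, 30

# Conjugate Bayesian update: posterior is Beta(a + f, b + n - f)
a_post, b_post = a + failures, b + demands - failures
post = stats.beta(a_post, b_post)
print(f"posterior mean = {post.mean():.4f}")
print(f"90% credible interval = {post.ppf([0.05, 0.95]).round(4)}")
```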

  16. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  17. Zero mortality in more than 300 hepatic resections: validity of preoperative volumetric analysis.

    PubMed

    Itoh, Shinji; Shirabe, Ken; Taketomi, Akinobu; Morita, Kazutoyo; Harimoto, Norifumi; Tsujita, Eiji; Sugimachi, Keishi; Yamashita, Yo-Ichi; Gion, Tomonobu; Maehara, Yoshihiko

    2012-05-01

    We reviewed a series of patients who underwent hepatic resection at our institution, to investigate the risk factors for postoperative complications after hepatic resection of liver tumors and for procurement of living donor liver transplantation (LDLT) grafts. Between April 2004 and August 2007, we performed 304 hepatic resections for liver tumors or to procure grafts for LDLT. Preoperative volumetric analysis was done using 3-dimensional computed tomography (3D-CT) prior to major hepatic resection. We compared the clinicopathological factors between patients with and without postoperative complications. There was no operative mortality. According to the 3D-CT volumetry, the mean error ratio between the actual and the estimated remnant liver volume was 13.4%. Postoperative complications developed in 96 (31.6%) patients. According to logistic regression analysis, histological liver cirrhosis and intraoperative blood loss >850 mL were significant risk factors of postoperative complications after hepatic resection. Meticulous preoperative evaluation based on volumetric analysis, together with sophisticated surgical techniques, achieved zero mortality and minimized intraoperative blood loss, which was classified as one of the most significant predictors of postoperative complications after major hepatic resection.

  18. Systems thinking applied to safety during manual handling tasks in the transport and storage industry.

    PubMed

    Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter

    2014-07-01

    Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  20. Practical Techniques for Modeling Gas Turbine Engine Performance

    NASA Technical Reports Server (NTRS)

    Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2016-01-01

    The cost and risk associated with the design and operation of gas turbine engine systems has led to an increasing dependence on mathematical models. In this paper, the fundamentals of engine simulation will be reviewed, an example performance analysis will be performed, and relationships useful for engine control system development will be highlighted. The focus will be on thermodynamic modeling utilizing techniques common in industry, such as: the Brayton cycle, component performance maps, map scaling, and design point criteria generation. In general, these topics will be viewed from the standpoint of an example turbojet engine model; however, demonstrated concepts may be adapted to other gas turbine systems, such as gas generators, marine engines, or high bypass aircraft engines. The purpose of this paper is to provide an example of gas turbine model generation and system performance analysis for educational uses, such as curriculum creation or student reference.
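
    The core of such a thermodynamic model is the ideal Brayton cycle relations. A worked sketch with illustrative cycle parameters, assuming an ideal gas, isentropic components, and no pressure losses:

```python
# Ideal Brayton cycle analysis -- a minimal sketch with illustrative numbers
gamma = 1.4                    # ratio of specific heats for air
cp = 1005.0                    # specific heat at constant pressure, J/(kg K)
T1, p_ratio = 288.0, 10.0      # compressor inlet temp (K), pressure ratio
T3 = 1400.0                    # turbine inlet temp (K), set by material limits

T2 = T1 * p_ratio ** ((gamma - 1) / gamma)   # after isentropic compression
T4 = T3 / p_ratio ** ((gamma - 1) / gamma)   # after isentropic expansion

w_comp = cp * (T2 - T1)        # specific compressor work, J/kg
w_turb = cp * (T3 - T4)        # specific turbine work, J/kg
q_in = cp * (T3 - T2)          # heat added in the combustor, J/kg

eta = (w_turb - w_comp) / q_in
print(f"T2 = {T2:.0f} K, T4 = {T4:.0f} K")
print(f"thermal efficiency = {eta:.1%}")     # equals 1 - r**-((gamma-1)/gamma)
```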

  1. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
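
    The probability of non-compliance P(nc) is the probability that the demand sight distance exceeds the supply. The abstract's analysis uses FORM; the sketch below substitutes crude Monte Carlo, with invented distributions for the design inputs, to show the limit-state idea:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000

# Hypothetical design inputs for a horizontal curve (illustrative values)
v = rng.normal(25.0, 2.5, n)                 # operating speed, m/s
t_pr = rng.lognormal(np.log(1.5), 0.3, n)    # perception-reaction time, s
decel = rng.normal(3.4, 0.6, n).clip(1.5)    # braking deceleration, m/s^2
supply = rng.normal(95.0, 8.0, n)            # available sight distance, m

# Demand: stopping sight distance = reaction distance + braking distance
demand = v * t_pr + v**2 / (2 * decel)

# Limit state g = supply - demand; non-compliance when g < 0
p_nc = np.mean(supply - demand < 0)
print(f"P(nc) = {p_nc:.4f}")
```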

  2. Optimizing protocols for risk prediction in asymptomatic carotid stenosis using embolic signal detection: the Asymptomatic Carotid Emboli Study.

    PubMed

    King, Alice; Shipley, Martin; Markus, Hugh

    2011-10-01

    Improved methods are required to identify patients with asymptomatic carotid stenosis at high risk for stroke. The Asymptomatic Carotid Emboli Study recently showed that embolic signals (ES) detected by transcranial Doppler on two 1-hour recordings independently predict 2-year stroke risk. ES detection is time-consuming, and whether similar predictive information could be obtained from simpler recording protocols is unknown. In a predefined secondary analysis of the Asymptomatic Carotid Emboli Study, we looked at the temporal variation of ES. We determined the predictive yield associated with different recording protocols and with the use of a higher threshold to indicate increased risk (≥2 ES). To compare the different recording protocols, sensitivity and specificity analyses were performed using analysis of receiver-operating characteristic curves. Of 477 patients, 467 had baseline recordings adequate for analysis; 77 of these had ES on 1 or both of the 2 recordings. ES status on the 2 recordings was significantly associated (P<0.0001), but there was poor agreement between ES positivity on the 2 recordings (κ=0.266). For the primary outcome of ipsilateral stroke or transient ischemic attack, the use of 2 baseline recordings lasting 1 hour had greater predictive accuracy than either the first baseline recording alone (P=0.0005), a single 30-minute recording (P<0.0001), or 2 recordings lasting 30 minutes (P<0.0001). For the outcome of ipsilateral stroke alone, two recordings lasting 1 hour had greater predictive accuracy when compared with all other recording protocols (all P<0.0001). Our analysis demonstrates the relative predictive yield of different recording protocols that can be used in application of the technique in clinical practice. Two baseline recordings lasting 1 hour, as used in the Asymptomatic Carotid Emboli Study, gave the best risk prediction.
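
    Comparing recording protocols amounts to comparing the discriminative ability of the ES counts each protocol yields. A sketch on simulated data, using ROC AUC as the accuracy summary (the study's actual comparison statistic is not given in the abstract):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 400

# Hypothetical data: true 2-year ipsilateral event indicator plus ES counts
# from two 1-hour recordings; shorter protocols observe fewer ES.
event = rng.binomial(1, 0.08, n)
rate = np.where(event == 1, 2.0, 0.4)          # mean ES/hour by outcome
es_hour1 = rng.poisson(rate)
es_hour2 = rng.poisson(rate)
es_30min = rng.poisson(rate / 2)               # single 30-minute recording

auc_two_hours = roc_auc_score(event, es_hour1 + es_hour2)
auc_one_hour = roc_auc_score(event, es_hour1)
auc_30min = roc_auc_score(event, es_30min)
print(f"AUC two 1-h recordings: {auc_two_hours:.3f}")
print(f"AUC single 1-h:         {auc_one_hour:.3f}")
print(f"AUC single 30-min:      {auc_30min:.3f}")
```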

  3. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees

  4. [Analysis of lifestyle and risk factors of atherosclerosis in students of selected universities in Krakow].

    PubMed

    Skrzypek, Agnieszka; Szeliga, Marta; Stalmach-Przygoda, Agata; Kowalska, Bogumila; Jabłoński, Konrad; Nowakowski, Michal

    Reduction of atherosclerosis risk factors and lifestyle modification significantly reduce the incidence, morbidity and mortality of cardiovascular diseases (CVDs). Objective: To evaluate cardiovascular risk factors and analyze the lifestyle of students finishing the first year of studies at selected universities in Krakow. The study was performed in 2015. 566 students finishing the first year of study were examined, including 319 (56.4%) men and 247 (43.6%) women. The students were aged 18 to 27 years (mean 20.11 ± 1.15 years) and represented 6 different universities in Krakow. Eating habits, lifestyle and cardiovascular risk factors were assessed by means of a diagnostic survey using the questionnaire technique. BMI was calculated from anthropometric measurements. Statistica 12.0 was used for the statistical analysis. The analysis showed that students of UR and AWF consumed the most fruits and vegetables, and students of AGH the least. Only 34.8% of students regularly consumed sea fish, with no significant differences between universities. Sports were practised most frequently by students of AWF (93% of the students of this university). Academy of Fine Arts students drank the most coffee. Students of AGH consumed alcohol most frequently. 60% of all students had never tried drugs, but only 25.7% of Fine Arts students had never tried drugs. Overweight occurred in 12.6% of students, and obesity in 1.1%. Risk factors of atherosclerosis were most prevalent among students of AGH and ASP. The results of the study clearly indicate the necessity of implementing prevention measures and improving health behaviors among students of the AGH and ASP universities.

  5. A comparison of partial order technique with three methods of multi-criteria analysis for ranking of chemical substances.

    PubMed

    Lerche, Dorte; Brüggemann, Rainer; Sørensen, Peter; Carlsen, Lars; Nielsen, Ole John

    2002-01-01

    More reliable and advanced priority-setting methods could offer an alternative to the often cumbersome and time-consuming risk assessment of chemical substances. An elaboration of the simple scoring methods is provided by the Hasse Diagram Technique (HDT) and/or Multi-Criteria Analysis (MCA). The present study provides an in-depth evaluation of HDT relative to three MCA techniques. The new and main methodological step in the comparison is the use of probability concepts based on mathematical tools such as linear extensions of partially ordered sets and Monte Carlo simulations. A data set consisting of 12 High Production Volume Chemicals (HPVCs) is used for illustration. A guiding premise of this investigation is that the need for external input (often subjective weightings of criteria) should be minimized and that transparency should be maximized in any multicriteria prioritisation. The study illustrates that the Hasse Diagram Technique (HDT) needs the least external input, is most transparent and is least subjective. However, HDT has some weaknesses if there are criteria which exclude each other; then weighting is needed. Multi-Criteria Analysis (i.e. the Utility Function approach, PROMETHEE and concordance analysis) can deal with such mutual exclusions because their formalisms for quantifying preferences allow participation, e.g. weighting of criteria. Consequently, MCA includes more subjectivity and loses transparency. The recommendation arising from this study is to run HDT as the first step in decision making and, as a possible second step, one of the MCA algorithms.
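
    The heart of HDT is the weighting-free order relation: one chemical ranks below another only if it scores no higher on every criterion, otherwise the pair is incomparable. A minimal sketch with invented scores:

```python
import numpy as np

def hasse_order(scores):
    """Pairwise order relation for the Hasse Diagram Technique.

    scores: (items x criteria), higher = more hazardous on every criterion.
    Item a is below item b iff a <= b on all criteria and < on at least one;
    otherwise the pair is incomparable (no subjective weighting involved).
    """
    s = np.asarray(scores, float)
    below = ((s[:, None, :] <= s[None, :, :]).all(-1)
             & (s[:, None, :] < s[None, :, :]).any(-1))
    return below

# Hypothetical scores for 4 chemicals on 3 criteria
# (e.g., persistence, bioaccumulation, toxicity)
chem = np.array([[1, 2, 1],
                 [2, 3, 2],
                 [3, 1, 3],
                 [3, 3, 3]])
below = hasse_order(chem)
for a in range(len(chem)):
    for b in range(len(chem)):
        if below[a, b]:
            print(f"chemical {a} < chemical {b}")
```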

  6. Postmortem validation of breast density using dual-energy mammography

    PubMed Central

    Molloi, Sabee; Ducote, Justin L.; Ding, Huanjun; Feig, Stephen A.

    2014-01-01

    Purpose: Mammographic density has been shown to be an indicator of breast cancer risk and also reduces the sensitivity of screening mammography. Currently, there is no accepted standard for measuring breast density. Dual energy mammography has been proposed as a technique for accurate measurement of breast density. The purpose of this study is to validate its accuracy in postmortem breasts and compare it with other existing techniques. Methods: Forty postmortem breasts were imaged using a dual energy mammography system. Glandular and adipose equivalent phantoms of uniform thickness were used to calibrate a dual energy basis decomposition algorithm. Dual energy decomposition was applied after scatter correction to calculate breast density. Breast density was also estimated using radiologist reader assessment, standard histogram thresholding and a fuzzy C-mean algorithm. Chemical analysis was used as the reference standard to assess the accuracy of different techniques to measure breast composition. Results: Breast density measurements using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean algorithm, and dual energy were in good agreement with the measured fibroglandular volume fraction using chemical analysis. The standard error estimates using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean, and dual energy were 9.9%, 8.6%, 7.2%, and 4.7%, respectively. Conclusions: The results indicate that dual energy mammography can be used to accurately measure breast density. The variability in breast density estimation using dual energy mammography was lower than reader assessment rankings, standard histogram thresholding, and fuzzy C-mean algorithm. Improved quantification of breast density is expected to further enhance its utility as a risk factor for breast cancer. PMID:25086548

  7. The Effect of Preoperative Antimicrobial Prophylaxis on Intraoperative Culture Results in Patients with a Suspected or Confirmed Prosthetic Joint Infection: a Systematic Review

    PubMed Central

    Benito, Natividad; Soriano, Alex

    2017-01-01

    Obtaining reliable cultures during revision arthroplasty is important to adequately diagnose and treat a prosthetic joint infection (PJI). The influence of antimicrobial prophylaxis on culture results remains unclear. Since withholding prophylaxis increases the risk for surgical site infections, clarification on this topic is critical. A systematic review was performed with the following research question: in patients who undergo revision surgery of a prosthetic joint, does preoperative antimicrobial prophylaxis affect the culture yield of intraoperative samples in comparison with no preoperative antimicrobial prophylaxis? Seven articles were included in the final analysis. In most studies, standard diagnostic culture techniques were used. In patients with a PJI, pooled analysis showed a culture yield of 88% (145/165) in the prophylaxis group versus 95% (344/362) in the nonprophylaxis group (P = 0.004). Subanalysis of patients with chronic PJIs showed positive cultures in 88% (78/89) versus 91% (52/57), respectively (P = 0.59). In patients with a suspected chronic infection, a maximum difference of 4% in culture yield between the prophylaxis and nonprophylaxis groups was observed. With the use of standard culture techniques, antimicrobial prophylaxis seems to affect cultures in a minority of patients. Along with the known risk of surgical site infections due to inadequate timing of antimicrobial prophylaxis, we discourage the postponement of prophylaxis until tissue samples are obtained in revision surgery. Future studies are necessary to conclude whether the small percentage of false-negative cultures after prophylaxis can be further reduced with the use of more-sensitive culture techniques, like sonication. PMID:28659322
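
    The pooled comparison can be reproduced with a standard two-proportion test; a minimal sketch (the exact p value depends on the test variant chosen):

        from scipy.stats import chi2_contingency, fisher_exact

        # Pooled culture yields reported above: prophylaxis 145/165 positive,
        # no prophylaxis 344/362 positive.
        table = [[145, 165 - 145],
                 [344, 362 - 344]]

        chi2, p, dof, _ = chi2_contingency(table, correction=False)
        print(f"chi-square p = {p:.3f}")          # approximately the reported P = 0.004

        odds_ratio, p_fisher = fisher_exact(table)
        print(f"Fisher exact p = {p_fisher:.3f}")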

  8. Mortality in patients with acute aortic dissection type A: analysis of pre- and intraoperative risk factors from the German Registry for Acute Aortic Dissection Type A (GERAADA).

    PubMed

    Conzelmann, Lars Oliver; Weigang, Ernst; Mehlhorn, Uwe; Abugameh, Ahmad; Hoffmann, Isabell; Blettner, Maria; Etz, Christian D; Czerny, Martin; Vahl, Christian F

    2016-02-01

    Acute aortic dissection type A (AADA) is an emergency with excessive mortality if surgery is delayed. Knowledge about independent predictors of mortality in surgically treated AADA patients is scarce. Therefore, this study was conducted to identify pre- and intraoperative risk factors for death. Between July 2006 and June 2010, 2137 surgically treated patients with AADA were enrolled in a multicentre, prospective German Registry for Acute Aortic Dissection type A (GERAADA), presenting perioperative status, operative strategies, postoperative outcomes and AADA-related risk factors for death. Multiple logistic regression analysis was performed to identify the influence of different parameters on 30-day mortality. Overall 30-day mortality (16.9%) increased with age [adjusted odds ratio (OR) = 1.121] and among patients who were comatose (adjusted OR = 3.501) or those who underwent cardiopulmonary resuscitation (adjusted OR = 3.751; all P < 0.0001). The higher the number of organs that were malperfused, the higher the risk for death (adjusted OR for one organ = 1.651, two organs = 2.440, three organs or more = 3.393; P < 0.0001). Mortality increased with longer operating times (total, cardiopulmonary bypass, cardiac ischaemia and circulatory arrest; all P < 0.02). Arterial cannulation site for extracorporeal circulation, operative techniques and arch interventions had no significant impact on 30-day mortality (all P > 0.1). No significant risk factors, but relevant increases in mortality, were found in patients suffering from hemiparesis pre- and postoperatively (each P < 0.01) and in patients experiencing paraparesis after surgery (P < 0.02). GERAADA detected significant disease- and surgery-related risk factors for death in AADA that influence the outcome of surgically treated AADA patients. Comatose and resuscitated patients have the poorest outcomes. Cannulation sites and operative techniques did not seem to affect mortality. Short operative times are associated with better outcomes. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
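
    As a rough illustration of how adjusted odds ratios of this kind are obtained, the sketch below fits a multiple logistic regression on synthetic data only; the variables and effect sizes are hypothetical stand-ins that loosely echo the reported ORs, not the GERAADA data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 2000

        # Synthetic illustration only -- not the registry data.
        age = rng.normal(62, 12, n)
        coma = rng.binomial(1, 0.08, n)
        organs = rng.integers(0, 4, n)          # number of malperfused organs

        # Assumed true effects, loosely echoing the reported odds ratios.
        logit = -9.5 + 0.11 * age + np.log(3.5) * coma + np.log(1.6) * organs
        died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        X = sm.add_constant(np.column_stack([age, coma, organs]))
        fit = sm.Logit(died, X).fit(disp=0)
        print(np.exp(fit.params[1:]))  # adjusted ORs for age, coma, organ count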

  9. Critical review of methods for risk ranking of food-related hazards, based on risks for human health.

    PubMed

    Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2018-01-22

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.

  10. Assessment of skin exposure to nickel, chromium and cobalt by acid wipe sampling and ICP-MS.

    PubMed

    Lidén, Carola; Skare, Lizbet; Lind, Birger; Nise, Gun; Vahter, Marie

    2006-05-01

    There is a great need to accurately assess skin exposure to contact allergens. We have developed a technique for assessment of skin exposure to nickel, chromium and cobalt using acid wipe sampling by cellulose wipes with 1% nitric acid. Chemical analysis was performed by inductively coupled plasma mass spectrometry (ICP-MS). The recovery of nickel, chromium and cobalt from arms and palms was 93%. The analytical result is expressed in terms of mass per unit area (µg/cm²). The developed acid wipe sampling technique is suitable for determination of nickel, chromium and cobalt deposited on the skin. The technique may be used in workplace studies, in studies of individuals in the general population, in dermatitis patients, in identification of risk groups, as well as in developing preventive strategies and in follow-up after intervention.
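
    Expressing a wipe result as a recovery-corrected areal mass is a one-line conversion; a minimal sketch, assuming results are normalized by the 93% recovery (the sample values are hypothetical):

        def skin_loading_ug_per_cm2(mass_ng, area_cm2, recovery=0.93):
            # Convert an ICP-MS mass reading (ng) from one wipe sample into a
            # recovery-corrected skin loading in ug/cm^2; 93% is the recovery
            # reported for nickel, chromium and cobalt on arms and palms.
            return (mass_ng / 1000.0) / (area_cm2 * recovery)

        # Example: 465 ng nickel recovered from a 4 cm^2 sampling area.
        print(f"{skin_loading_ug_per_cm2(465, 4):.3f} ug/cm^2")  # ~0.125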

  11. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994 on the topic of risk assessment of medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  12. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipeline facilities are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above-mentioned Directive. In the field of risk assessment, there is a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis step and then a detailed analysis step. The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all of them in the detailed risk assessment, which can be time- and resource-consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn related to the overall risk characteristics of the site. The proposed methodology can thus be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of Critical Infrastructure, mainly the sub-sectors oil and gas. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements

  13. Quantifying risk and accuracy in cancer risk assessment: the process and its role in risk management problem-solving.

    PubMed

    Turturro, A; Hart, R W

    1987-01-01

    A better understanding of chemical-induced cancer has led to appreciation of similarities to problems addressed by risk management of radiation-induced toxicity. Techniques developed for cancer risk assessment of toxic substances can be generalized to toxic agents. This paper discusses a recent problem-solving approach for risk management of toxic substances developed for the U.S. Department of Health and Human Services, the role of risk assessment within that approach, and how uncertainty should be treated. Finally, two different methods, research into the assumptions underlying risk assessment and the modification of risk assessment/risk management documents, are used to illustrate how the technique can be applied.

  14. Threats and risks to information security: a practical analysis of free access wireless networks

    NASA Astrophysics Data System (ADS)

    Quirumbay, Daniel I.; Coronel, Iván. A.; Bayas, Marcia M.; Rovira, Ronald H.; Gromaszek, Konrad; Tleshova, Akmaral; Kozbekova, Ainur

    2017-08-01

    Nowadays, there is an ever-growing need to investigate, consult and communicate through the internet. This need leads to the expansion of free web access at strategic and functional points for the benefit of the community. However, this open access is also related to an increase in information insecurity. The existing works on computer security primarily focus on the development of techniques to reduce cyber-attacks. However, these approaches do not address the sector of inexperienced users who have difficulty understanding browser settings. Two approaches can address this problem: first, the development of friendly browsers with intuitive setups for new users; second, the implementation of awareness programs on essential security that do not require deep technical knowledge. This article addresses an analysis of the vulnerabilities of wireless equipment that provides internet service in open access zones and the potential risks that could be encountered when using these means.

  15. Group decision making with the analytic hierarchy process in benefit-risk assessment: a tutorial.

    PubMed

    Hummel, J Marjan; Bridges, John F P; IJzerman, Maarten J

    2014-01-01

    The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria. This tutorial illustrates the procedural steps of the AHP in supporting group decision making about new healthcare technology, including (1) identifying the decision goal, decision criteria, and alternative healthcare technologies to compare, (2) structuring the decision criteria, (3) judging the value of the alternative technologies on each decision criterion, (4) judging the importance of the decision criteria, (5) calculating group judgments, (6) analyzing the inconsistency in judgments, (7) calculating the overall value of the technologies, and (8) conducting sensitivity analyses. The AHP is illustrated via a hypothetical example, adapted from an empirical AHP analysis on the benefits and risks of tissue regeneration to repair small cartilage lesions in the knee.
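
    A minimal sketch of the core AHP computations in steps (4) to (7) is given below, assuming Saaty's classic eigenvector formulation with hypothetical pairwise judgments for three criteria.

        import numpy as np

        def ahp_priorities(pairwise):
            # Priority weights from a pairwise-comparison matrix via the
            # principal eigenvector, plus Saaty's consistency ratio.
            A = np.asarray(pairwise, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            RI = [0.0, 0.0, 0.58, 0.90, 1.12][n - 1]   # Saaty's random index, n = 1..5
            CI = (eigvals[k].real - n) / (n - 1)
            CR = CI / RI if RI else 0.0
            return w, CR

        # Hypothetical judgments for three criteria: benefit, risk, cost.
        A = [[1,   3,   5],
             [1/3, 1,   3],
             [1/5, 1/3, 1]]
        w, CR = ahp_priorities(A)
        print(w, CR)  # weights ~[0.63, 0.26, 0.11]; CR < 0.1 indicates acceptable consistency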

  16. Pneumothorax Complicating Coaxial and Non-coaxial CT-Guided Lung Biopsy: Comparative Analysis of Determining Risk Factors and Management of Pneumothorax in a Retrospective Review of 650 Patients.

    PubMed

    Nour-Eldin, Nour-Eldin A; Alsubhi, Mohammed; Emam, Ahmed; Lehnert, Thomas; Beeres, Martin; Jacobi, Volkmar; Gruber-Rouh, Tatjana; Scholtz, Jan-Erik; Vogl, Thomas J; Naguib, Nagy N

    2016-02-01

    To assess the scope of and the determining risk factors for the development of pneumothorax during CT-guided biopsy of pulmonary lesions with coaxial and non-coaxial techniques, and the outcome of its management. The study included CT-guided percutaneous lung biopsies in 650 consecutive patients (407 males, 243 females; mean age 54.6 years, SD 5.2) from November 2008 to June 2013 in a retrospective design. Patients were classified according to lung biopsy technique into a coaxial group (318 lesions) and a non-coaxial group (332 lesions). Exclusion criteria for biopsy were lesions <5 mm in diameter, uncorrectable coagulopathy, positive-pressure ventilation, severe respiratory compromise, pulmonary arterial hypertension, or refusal of the procedure. Risk factors related to the occurrence of pneumothorax were classified into: (a) technical risk factors, (b) patient-related risk factors, and (c) lesion-associated risk factors. Radiological assessments were performed by two radiologists in consensus. Mann-Whitney U test and Fisher's exact tests were used for statistical analysis. p values <0.05 were considered statistically significant. The incidence of pneumothorax complicating CT-guided lung biopsy was lower in the non-coaxial group (23.2%, 77 out of 332) than in the coaxial group (27%, 86 out of 318). However, the difference in incidence between the groups was statistically insignificant (p = 0.14). Significant risk factors for the development of pneumothorax in both groups were emphysema (p < 0.001 in both groups), traversing a fissure with the biopsy needle (p = 0.005 in the non-coaxial group and 0.001 in the coaxial group), small lesions of less than 2 cm in diameter (p = 0.02 in both groups), location of the lesion in the basal or mid sections of the lung (p = 0.003 and <0.001 in the non-coaxial and coaxial groups, respectively), and a needle track path of more than 2.5 cm within the lung tissue (p = 0.01 in both groups). The incidence of pneumothorax in the non-coaxial group was significantly correlated with the number of specimens obtained (p = 0.006). This factor was statistically insignificant in the coaxial group (p = 0.45). The biopsy yield was more diagnostic and conclusive in the coaxial group in comparison to the non-coaxial group (p = 0.008). Simultaneous pneumothorax and pulmonary hemorrhage occurred in 27.3% (21/77) in the non-coaxial group and in 30.2% (26/86) in the coaxial group. Conservative management was sufficient for 91 of the 101 treated patients with pneumothorax in both groups (90.1%). Manual evacuation of pneumothorax was efficient in 44/51 patients (86.3%) in both groups, and an intercostal chest tube was applied after failure of manual evacuation (7 patients; 13.7%), of whom one patient developed a persistent air leakage necessitating pleurodesis. The incidence of pneumothorax complicating CT-guided core biopsy of pulmonary lesions showed no significant difference between the coaxial and non-coaxial techniques. However, both techniques have the same significant risk factors, including small and basally located lesions, increased lesion depth from the pleural surface, increased length of aerated lung parenchyma crossed by the biopsy needle, and needle tracks passing through pulmonary fissures.

  17. Security risk assessment: applying the concepts of fuzzy logic.

    PubMed

    Bajpai, Shailendra; Sachdeva, Anish; Gupta, J P

    2010-01-15

    Chemical process industries (CPI) handling hazardous chemicals in bulk can be attractive targets for deliberate adversarial actions by terrorists, criminals and disgruntled employees. It is therefore imperative to have a comprehensive security risk management programme, including effective security risk assessment techniques. In an earlier work, it has been shown that security risk assessment can be done by conducting threat and vulnerability analysis or by developing a Security Risk Factor Table (SRFT). HAZOP-type vulnerability assessment sheets can be developed that are scenario based. In the SRFT model, important security risk bearing factors such as location, ownership, visibility, inventory, etc., have been used. In this paper, the earlier developed SRFT model has been modified using the concepts of fuzzy logic. In the modified SRFT model, two linguistic fuzzy scales (three-point and four-point) are devised based on trapezoidal fuzzy numbers. The human subjectivity of the different experts associated with the previous SRFT model is tackled by mapping their scores to the newly devised fuzzy scale. Finally, the fuzzy score thus obtained is defuzzified to get the results. A test case of a refinery is used to explain the method and compared with the earlier work.
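
    The defuzzification step can be illustrated with the standard centroid formula for a trapezoidal fuzzy number; the three-point scale below is a hypothetical example, not the scale devised in the paper.

        def centroid_defuzzify(a, b, c, d):
            # Centroid of a trapezoidal fuzzy number with a <= b <= c <= d
            # (a standard defuzzification formula).
            num = (d**2 + c**2 + c*d) - (a**2 + b**2 + a*b)
            return num / (3 * ((d + c) - (a + b)))

        # Hypothetical three-point linguistic scale for a risk factor score.
        scale = {"low": (0, 0, 2, 4), "medium": (2, 4, 6, 8), "high": (6, 8, 10, 10)}
        print({k: round(centroid_defuzzify(*v), 2) for k, v in scale.items()})
        # e.g. medium -> 5.0, the midpoint of the symmetric trapezoid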

  18. Long-Term Marine Traffic Monitoring for Environmental Safety in the Aegean Sea

    NASA Astrophysics Data System (ADS)

    Giannakopoulos, T.; Gyftakis, S.; Charou, E.; Perantonis, S.; Nivolianitou, Z.; Koromila, I.; Makrygiorgos, A.

    2015-04-01

    The Aegean Sea is characterized by an extremely high marine safety risk, mainly due to the significant increase in the traffic of tankers to and from the Black Sea, which pass through narrow straits formed by the 1600 Greek islands. Reducing the risk of a ship accident is therefore vital to all socio-economic and environmental sectors. This paper presents an online long-term marine traffic monitoring workflow that focuses on extracting aggregated vessel risks using spatiotemporal analysis of multilayer information: vessel trajectories, vessel data, meteorological data, bathymetric/hydrographic data, as well as information regarding environmentally important areas (e.g. protected high-risk areas). A web interface that enables user-friendly spatiotemporal queries is implemented at the frontend, while a series of data mining functionalities extracts aggregated statistics regarding: (a) marine risks and accident probabilities for particular areas, (b) trajectory clustering information, (c) general marine statistics (cargo types, etc.) and (d) correlation between spatial environmental importance and marine traffic risk. Towards this end, a set of data clustering and probabilistic graphical modelling techniques has been adopted.

  1. A systematic comparison of the closed shoulder reduction techniques.

    PubMed

    Alkaduhimi, H; van der Linde, J A; Willigenburg, N W; van Deurzen, D F P; van den Bekerom, M P J

    2017-05-01

    To identify the optimal technique for closed reduction for shoulder instability, based on success rates, reduction time, complication risks, and pain level. A PubMed and EMBASE query was performed, screening all relevant literature of closed reduction techniques mentioning the success rate written in English, Dutch, German, and Arabic. Studies with a fracture dislocation or lacking information on success rates for closed reduction techniques were excluded. We used the modified Coleman Methodology Score (CMS) to assess the quality of included studies and excluded studies with a poor methodological quality (CMS < 50). Finally, a meta-analysis was performed on the data from all studies combined. 2099 studies were screened for their title and abstract, of which 217 studies were screened full-text and finally 13 studies were included. These studies included 9 randomized controlled trials, 2 retrospective comparative studies, and 2 prospective non-randomized comparative studies. A combined analysis revealed that scapular manipulation is the most successful (97%), fastest (1.75 min), and least painful reduction technique (VAS 1.47); the "Fast, Reliable, and Safe" (FARES) method also scores high in terms of successful reduction (92%), reduction time (2.24 min), and intra-reduction pain (VAS 1.59); the traction-countertraction technique is highly successful (95%), but slower (6.05 min) and more painful (VAS 4.75). For closed reduction of anterior shoulder dislocations, the combined data from the selected studies indicate that scapular manipulation is the most successful and fastest technique, with the shortest mean hospital stay and least pain during reduction. The FARES method seems the best alternative.

  2. Frontobasal Midline Meningiomas: Is It Right To Shed Doubt on the Transcranial Approaches? Updates and Review of the Literature.

    PubMed

    Ruggeri, Andrea Gennaro; Cappelletti, Martina; Fazzolari, Benedetta; Marotta, Nicola; Delfini, Roberto

    2016-04-01

    Traditionally, the surgical removal of tuberculum sellae meningioma (TSM) and olfactory groove meningioma (OGM) requires transcranial approaches and microsurgical techniques, but in the last decade endoscopic expanded endonasal approaches have been introduced: transcribriform for OGMs and transtuberculum-transplanum for TSM. A comparative analysis of the literature concerning the two types of surgical treatment of OGMs and TSM is, however, difficult. We conducted a literature search using the PubMed database to compare data for endoscopic and microsurgical techniques in the literature. We also conducted a retrospective analysis of selected cases from our series presenting favorable characteristics for an endoscopic approach, based on the criteria of operability of these lesions as generally accepted in the literature, and we compared the results obtained in these patients with those in the endoscopic literature. We believe that, once the sample is made more homogeneous, the difference between the microsurgical and endoscopic techniques is no longer so striking. A greater radical removal rate, a reduced incidence of cerebrospinal fluid fistula and, especially, the possibility of removing lesions of any size are advantages of transcranial surgery; a higher percentage of improvement in visual outcome and a lower risk of a worsening of a pre-existing deficit or onset of a new deficit are advantages of the endoscopic technique. At present, the microsurgical technique is still the gold standard for the removal of the anterior cranial fossa meningiomas of all sizes, and the endoscopic technique remains a second option in certain cases. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Classroom Techniques for Improving Black Male Student Retention.

    ERIC Educational Resources Information Center

    Gardenhire, John Fouts

    Institutions of higher learning must focus on new ways to serve the at-risk student and the black male at-risk student in particular. By developing and implementing a plan, any teacher can foster retention of at-risk students, even in the absence of institutional support. Twenty effective techniques are: (1) learn students' names; (2) assign…

  4. [Prevalence and risk factors of Enterobius vermicularis among preschool children in kindergartens in Luohu District, Shenzhen City].

    PubMed

    Kuang, Cui-ping; Wu, Xiao-liang; Chen, Wu-shen; Wu, Fei-fei; Zhuo, Fei

    2015-02-01

    To understand the prevalence and risk factors of Enterobius vermicularis among preschool children in kindergartens in Luohu District, Shenzhen City. A total of 489 children in 6 kindergartens were selected by the stratified sampling method and investigated for E. vermicularis infection by the cellophane anal swab technique. Information on the sanitary condition of the kindergartens, personal hygiene, and family hygiene was collected by questionnaire. The infection rate of E. vermicularis was 10.2% (50/489). The single-factor analysis indicated that the following factors might be related to the infection: the class and grade of the kindergarten, bedroom flooring, private toilet, types of taps and beds, bed management, education levels of parents, frequency of showering and washing the anus, and washing hands before meals and after using the toilet. The multivariate logistic analysis indicated that bed management, education level of the mother, frequency of washing the anus, and private toilet were independent risk factors for E. vermicularis infection. To control E. vermicularis infection, the environment and management of the kindergartens, parents' knowledge of E. vermicularis infection, and children's hygiene habits need to be improved.

  5. Digression and Value Concatenation to Enable Privacy-Preserving Regression.

    PubMed

    Li, Xiao-Bai; Sarkar, Sumit

    2014-09-01

    Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression, which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.

  6. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  7. Biotechnological advances in the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp.

    PubMed

    Zhao, Guang-Hui; Li, Juan; Blair, David; Li, Xiao-Yan; Elsheikha, Hany M; Lin, Rui-Qing; Zou, Feng-Cai; Zhu, Xing-Quan

    2012-01-01

    Schistosomiasis is a serious parasitic disease caused by blood-dwelling flukes of the genus Schistosoma. Throughout the world, schistosomiasis is associated with high rates of morbidity and mortality, with close to 800 million people at risk of infection. Precise methods for identification of Schistosoma species and diagnosis of schistosomiasis are crucial for an enhanced understanding of parasite epidemiology that informs effective antiparasitic treatment and preventive measures. Traditional approaches for the diagnosis of schistosomiasis include etiological, immunological and imaging techniques. Diagnosis of schistosomiasis has been revolutionized by the advent of new molecular technologies to amplify parasite nucleic acids. Among these, polymerase chain reaction-based methods have been useful in the analysis of genetic variation among Schistosoma spp. Mass spectrometry is now extending the range of biological molecules that can be detected. In this review, we summarize traditional, non-DNA-based diagnostic methods and then describe and discuss the current and developing molecular techniques for the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp. These exciting techniques provide foundations for further development of more effective and precise approaches to differentiate schistosomes and diagnose schistosomiasis in the clinic, and also have important implication for exploring novel measures to control schistosomiasis in the near future. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Cost-Loss Analysis of Ensemble Solar Wind Forecasting: Space Weather Use of Terrestrial Weather Tools

    NASA Astrophysics Data System (ADS)

    Henley, E. M.; Pope, E. C. D.

    2017-12-01

    This commentary concerns recent work on solar wind forecasting by Owens and Riley (2017). The approach taken makes effective use of tools commonly used in terrestrial weather: the generation of an "ensemble" forecast via a simple model, and the application of a "cost-loss" analysis to the resulting probabilistic information to explore the benefit of this forecast to users with different risk appetites. This commentary aims to highlight these useful techniques to the wider space weather audience and to briefly discuss the general context of application of terrestrial weather approaches to space weather.
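
    The cost-loss decision rule itself is compact enough to sketch: protect whenever the forecast probability of the event exceeds the ratio of protection cost to avoidable loss. The ensemble and cost figures below are hypothetical.

        def should_act(p_event, cost, loss):
            # Classic cost-loss rule: taking protective action costs `cost`;
            # an unprotected event incurs `loss`.  Acting is worthwhile
            # whenever the forecast probability exceeds cost/loss.
            return p_event > cost / loss

        # Hypothetical 20-member ensemble of solar-wind forecasts; the event
        # is, say, a high-speed stream arriving at Earth.
        ensemble_hits = 7
        p = ensemble_hits / 20
        print(should_act(p, cost=1.0, loss=10.0))  # True: 0.35 > 0.1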

  9. Arthrodesis following failed total knee arthroplasty: comprehensive review and meta-analysis of recent literature.

    PubMed

    Damron, T A; McBeath, A A

    1995-04-01

    With the increasing duration of follow-up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram-negative and mixed infections are found to be significant risk factors for failure of arthrodesis.

  10. Laparoscopic cholecystectomy poses physical injury risk to surgeons: analysis of hand technique and standing position.

    PubMed

    Youssef, Yassar; Lee, Gyusung; Godinez, Carlos; Sutton, Erica; Klein, Rosemary V; George, Ivan M; Seagull, F Jacob; Park, Adrian

    2011-07-01

    This study compares surgical techniques and the surgeon's standing position during laparoscopic cholecystectomy (LC), investigating each with respect to surgeons' learning, performance, and ergonomics. Little homogeneity exists in LC performance and training. Variations in standing position (side-standing technique vs. between-standing technique) and hand technique (one-handed vs. two-handed) exist. Thirty-two LC procedures performed on a virtual reality simulator were video-recorded and analyzed. Each subject performed four different procedures: one-handed/side-standing, one-handed/between-standing, two-handed/side-standing, and two-handed/between-standing. Physical ergonomics were evaluated using Rapid Upper Limb Assessment (RULA). Mental workload assessment was acquired with the National Aeronautics and Space Administration-Task Load Index (NASA-TLX). Virtual reality (VR) simulator-generated performance evaluations and a subjective survey were analyzed. RULA scores were consistently lower (indicating better ergonomics) for the between-standing technique and higher (indicating worse ergonomics) for the side-standing technique, regardless of whether one- or two-handed. Anatomical scores overall showed side-standing to have a detrimental effect on the upper arms and trunk. The NASA-TLX showed a significant association between the side-standing position and high physical demand, effort, and frustration (p<0.05). The two-handed technique in the side-standing position required more effort than the one-handed (p<0.05). No difference in operative time or complication rate was demonstrated among the four procedures. The two-handed/between-standing method was chosen as the best procedure to teach and standardize. Laparoscopic cholecystectomy poses a risk of physical injury to the surgeon. In the left side-standing position, as LC is currently commonly performed in the United States, the surgeon may face increased physical demand and effort, resulting in ergonomically unsound conditions. Though further investigations should be conducted, adopting the between-standing position deserves serious consideration as it may be the best short-term ergonomic alternative.

  11. Complications after pectus excavatum repair using pectus bars in adolescents and adults: risk comparisons between age and technique groups.

    PubMed

    Choi, Soohwan; Park, Hyung Joo

    2017-10-01

    To compare the complications associated with age and technique groups in patients undergoing pectus excavatum (PE) repair. The data of 994 patients who underwent PE repair from March 2011 to December 2015 were retrospectively reviewed. Mean age was 9.59 years (range 31 months-55 years), and 756 patients were men (76.1%). The age groups were defined as follows: Group 1, <5 years; Group 2, 5-9 years; Group 3, 10-14 years; Group 4, 15-17 years; Group 5, 18-19 years; Group 6, 20-24 years; and Group 7, >24 years. The technique groups were defined as follows: Group 1, patients who underwent repair with claw fixators and hinge plates; Group 2, patients who underwent repair with our 'bridge' technique. Complications were compared between age groups and technique groups. No cases of mortality occurred. Complication rates in age Groups 1-7 were 5.4%, 3.6%, 12.1%, 18.2%, 17.3%, 13.9% and 16.7%, respectively. The complication rate tripled after the age of 10. In multivariable analysis, the odds ratios for age Groups 4, 5 and 7 and for asymmetric types were 3.04, 2.81, 2.97 and 1.70 (P < 0.01, P = 0.02, P = 0.03 and P = 0.03, respectively). The bar dislocation rate in technique Group 1 was 0.8% (6 of 780). No bar dislocations occurred in technique Group 2. Older patients have more asymmetric pectus deformities, and older age and asymmetry are risk factors for complications following PE repair. The bridge technique provides a bar dislocation rate of 0%, even in adult patients. This procedure seems to reduce or prevent major complications following PE repair. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  12. Cost-Effectiveness Research in Neurosurgery: We Can and We Must.

    PubMed

    Stein, Sherman C

    2018-01-05

    Rapid advancement of medical and surgical therapies, coupled with the recent preoccupation with limiting healthcare costs, makes a collision of the 2 objectives imminent. This article explains the value of cost-effectiveness analysis (CEA) in reconciling the 2 competing goals, and provides a brief introduction to evidence-based CEA techniques. The historical role of CEA in determining whether new neurosurgical strategies provide value for cost is summarized briefly, as are the limitations of the technique. Finally, the unique ability of the neurosurgical community to provide input to the CEA process is emphasized, as are the potential risks of leaving these important decisions in the hands of others. Copyright © 2018 by the Congress of Neurological Surgeons.
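
    The central quantity in CEA, the incremental cost-effectiveness ratio (ICER), can be sketched in a few lines; the costs and effects below are hypothetical.

        def icer(cost_new, effect_new, cost_old, effect_old):
            # Incremental cost-effectiveness ratio: extra cost per extra unit
            # of health effect (e.g. per quality-adjusted life year, QALY).
            return (cost_new - cost_old) / (effect_new - effect_old)

        # Hypothetical comparison of a new surgical strategy with standard care.
        print(f"${icer(60000, 6.2, 45000, 5.7):,.0f} per QALY")  # $30,000 per QALY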

  13. Biomagnetic separation of Salmonella Typhimurium with high affine and specific ligand peptides isolated by phage display technique

    NASA Astrophysics Data System (ADS)

    Steingroewer, Juliane; Bley, Thomas; Bergemann, Christian; Boschke, Elke

    2007-04-01

    Analyses of food-borne pathogens are of great importance in order to minimize the health risk for consumers. Thus, very sensitive and rapid detection methods are required. Current conventional culture techniques are very time consuming. Modern immunoassays and biochemical analysis also require pre-enrichment steps, resulting in a turnaround time of at least 24 h. Biomagnetic separation (BMS) is a promising, more rapid method. In this study we describe the isolation of highly affine and specific peptides from a phage-peptide library, which, combined with BMS, allows the detection of Salmonella spp. with a sensitivity similar to that of immunomagnetic separation using antibodies.

  14. Measuring and managing risk improves strategic financial planning.

    PubMed

    Kleinmuntz, D N; Kleinmuntz, C E; Stephen, R G; Nordlund, D S

    1999-06-01

    Strategic financial risk assessment is a practical technique that can enable healthcare strategic decision makers to perform quantitative analyses of the financial risks associated with a given strategic initiative. The technique comprises six steps: (1) list risk factors that might significantly influence the outcomes, (2) establish best-guess estimates for assumptions regarding how each risk factor will affect its financial outcomes, (3) identify risk factors that are likely to have the greatest impact, (4) assign probabilities to assumptions, (5) determine potential scenarios associated with combined assumptions, and (6) determine the probability-weighted average of the potential scenarios.
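
    Step (6) reduces to a simple expected-value calculation; a minimal sketch with hypothetical scenarios:

        # Hypothetical scenarios for one strategic initiative: each pairs a
        # financial outcome (net present value, $M) with its probability.
        scenarios = [
            ("optimistic",  12.0, 0.25),
            ("base case",    5.0, 0.50),
            ("pessimistic", -4.0, 0.25),
        ]

        expected_npv = sum(npv * p for _, npv, p in scenarios)
        assert abs(sum(p for *_, p in scenarios) - 1.0) < 1e-9  # probabilities sum to 1
        print(f"probability-weighted NPV: ${expected_npv:.1f}M")  # $4.5M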

  15. Fire Risk Assessment of Some Indian Coals Using Radial Basis Function (RBF) Technique

    NASA Astrophysics Data System (ADS)

    Nimaje, Devidas; Tripathy, Debi Prasad

    2017-04-01

    Fires, whether surface or underground, pose serious safety and environmental problems in the global coal mining industry. They cause huge losses of coal due to burning, loss of lives, sterilization of coal reserves and environmental pollution. Most instances of coal mine fires worldwide are mainly due to spontaneous combustion. Hence, attention must be paid to appropriate measures to prevent the occurrence and spread of fire. In this paper, to evaluate the different properties of coals for fire risk assessment, forty-nine in situ coal samples were collected from major coalfields of India. Intrinsic properties (proximate and ultimate analysis) and susceptibility indices (crossing point temperature, flammability temperature, Olpinski index and the wet oxidation potential method) were determined to ascertain the liability of the Indian coals to spontaneous combustion. Statistical regression analysis showed that the parameters of ultimate analysis correlate significantly with all investigated susceptibility indices, in contrast to the parameters of proximate analysis. The best-correlated parameters (ultimate analysis) were used as inputs to the radial basis function network model. The model revealed that the Olpinski index can be used as a reliable method to assess the liability of Indian coals to spontaneous combustion.
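
    A minimal sketch of an RBF network of the kind described, with Gaussian hidden units and a linear output layer, is given below; the centers, width parameter, and synthetic data are assumptions for illustration, not the study's model.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import Ridge

        def rbf_features(X, centers, gamma):
            # Gaussian radial-basis activations for each sample/center pair.
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        rng = np.random.default_rng(1)
        # Synthetic stand-ins for ultimate-analysis inputs (e.g. C, H, N, S, O %)
        # and a susceptibility index such as the Olpinski index.
        X = rng.uniform(0, 1, (49, 5))
        y = X @ np.array([2.0, -1.0, 0.5, 1.5, -0.5]) + rng.normal(0, 0.05, 49)

        centers = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X).cluster_centers_
        Phi = rbf_features(X, centers, gamma=2.0)
        model = Ridge(alpha=1e-3).fit(Phi, y)   # linear output layer
        print("train R^2:", model.score(Phi, y))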

  16. Periodontitis and risk of psoriasis: a systematic review and meta-analysis.

    PubMed

    Ungprasert, P; Wijarnpreecha, K; Wetter, D A

    2017-05-01

    The association between periodontitis and systemic diseases has been increasingly recognized. However, the data on the association between periodontitis and psoriasis are still limited. To summarize all available data on the association between periodontitis and the risk of psoriasis. Two investigators independently searched published studies indexed in MEDLINE and EMBASE databases from inception to July 2016 using a search strategy that included terms for psoriasis and periodontitis. Studies were included if the following criteria were met: (i) case-control or cohort study comparing the risk of psoriasis in subjects with and without periodontitis; (ii) subjects without periodontitis were used as comparators in cohort studies while participants without psoriasis were used as controls in case-control studies; and (iii) effect estimates and 95% confidence intervals (CI) were provided. Point estimates and standard errors from each study were extracted and combined together using the generic inverse variance technique described by DerSimonian and Laird. Two cohort studies and three case-control studies met the inclusion criteria and were included in the meta-analysis. The pooled risk ratio of psoriasis in patients with periodontitis versus comparators was 1.55 (95% CI, 1.35-1.77). The statistical heterogeneity was insignificant with an I² of 18%. Subgroup analysis according to study design revealed a significantly higher risk among patients with periodontitis with a pooled RR of 1.50 (95% CI, 1.37-1.64) for cohort studies and a pooled RR of 2.33 (95% CI, 1.51-3.60) for case-control studies. Patients with periodontitis have a significantly elevated risk of psoriasis. © 2016 European Academy of Dermatology and Venereology.
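
    The DerSimonian-Laird pooling step can be sketched directly from the generic inverse variance formulas; the study-level risk ratios and standard errors below are hypothetical, not the five studies pooled here.

        import numpy as np

        def dersimonian_laird(log_rr, se):
            # Random-effects pooling of log risk ratios by the DerSimonian-Laird
            # (generic inverse variance) method; returns pooled RR and 95% CI.
            w = 1 / se**2                                  # fixed-effect weights
            mu_fe = (w * log_rr).sum() / w.sum()
            Q = (w * (log_rr - mu_fe) ** 2).sum()          # Cochran's Q
            df = len(log_rr) - 1
            tau2 = max(0.0, (Q - df) / (w.sum() - (w**2).sum() / w.sum()))
            w_re = 1 / (se**2 + tau2)                      # random-effects weights
            mu = (w_re * log_rr).sum() / w_re.sum()
            se_mu = np.sqrt(1 / w_re.sum())
            return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)

        # Hypothetical inputs: per-study risk ratios and standard errors of log RR.
        log_rr = np.log(np.array([1.50, 1.62, 1.40, 2.33, 1.55]))
        se = np.array([0.10, 0.15, 0.20, 0.25, 0.12])
        print(dersimonian_laird(log_rr, se))  # (pooled RR, lower CI, upper CI)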

  17. Risk of Gonadoblastoma Development in Patients with Turner Syndrome with Cryptic Y Chromosome Material.

    PubMed

    Kwon, Ahreum; Hyun, Sei Eun; Jung, Mo Kyung; Chae, Hyun Wook; Lee, Woo Jung; Kim, Tae Hyuk; Kim, Duk Hee; Kim, Ho-Seong

    2017-06-01

    Current guidelines recommend that testing for Y chromosome material should be performed only in patients with Turner syndrome harboring a marker chromosome and exhibiting virilization in order to detect individuals who are at high risk of gonadoblastoma. However, cryptic Y chromosome material is suggested to be a risk factor for gonadoblastoma in patients with Turner syndrome. Here, we aimed to estimate the frequency of cryptic Y chromosome material in patients with Turner syndrome and determine whether Y chromosome material increased the risk for development of gonadoblastoma. A total of 124 patients who were diagnosed with Turner syndrome by conventional cytogenetic techniques underwent additional molecular analysis to detect cryptic Y chromosome material. In addition, patients with Turner syndrome harboring Y chromosome cell lines had their ovaries removed prophylactically. Finally, we assessed the occurrence of gonadoblastoma in patients with Turner syndrome. Molecular analysis demonstrated that 10 patients had Y chromosome material among 118 patients without overt Y chromosome (8.5%). Six patients with overt Y chromosome and four patients with cryptic Y chromosome material underwent oophorectomy. Histopathological analysis revealed that the occurrence of gonadoblastoma in the total group was 2.4%, and gonadoblastoma occurred in one of six patients with an overt Y chromosome (16.7%) and 2 of 10 patients with cryptic Y chromosome material (20.0%). The risk of developing gonadoblastoma in patients with cryptic Y chromosome material was similar to that in patients with overt Y chromosome. Therefore, molecular screening for Y chromosome material should be recommended for all patients with Turner syndrome to detect individuals at a high risk of gonadoblastoma and to facilitate proper management of the disease.

  18. Statistical methods for environmental pollution monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedure techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.

  19. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    NASA Astrophysics Data System (ADS)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories at high risk from geomorphologic processes with a high degree of occurrence, and represent a useful tool in the process of spatial planning.
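
    The USLE component of the vulnerability modeling is a per-cell raster product; a minimal sketch on synthetic rasters (the factor ranges and class thresholds below are hypothetical):

        import numpy as np

        # Minimal USLE sketch: A = R * K * LS * C * P, with A the predicted
        # soil loss (t/ha/yr).  Real inputs would come from rainfall, soil,
        # terrain and land-cover layers in a GIS.
        rng = np.random.default_rng(42)
        shape = (100, 100)
        R  = rng.uniform(50, 120, shape)    # rainfall erosivity
        K  = rng.uniform(0.2, 0.5, shape)   # soil erodibility
        LS = rng.uniform(0.5, 4.0, shape)   # slope length/steepness
        C  = rng.uniform(0.01, 0.3, shape)  # cover management
        P  = np.ones(shape)                 # support practice (none)

        A = R * K * LS * C * P
        # Classify into vulnerability classes by fixed (hypothetical) thresholds.
        classes = np.digitize(A, bins=[2, 8, 20])   # 0 = low .. 3 = very high
        print(np.bincount(classes.ravel(), minlength=4))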
