Sample records for complication probability modeling

  1. Normal tissue complication probability modelling of tissue fibrosis following breast radiotherapy

    NASA Astrophysics Data System (ADS)

    Alexander, M. A. R.; Brooks, W. A.; Blake, S. W.

    2007-04-01

    Cosmetic late effects of radiotherapy such as tissue fibrosis are increasingly regarded as being of importance. It is generally considered that the complication probability of a radiotherapy plan is dependent on the dose uniformity, and can be reduced by using better compensation to remove dose hotspots. This work aimed to model the effects of improved dose homogeneity on complication probability. The Lyman and relative seriality NTCP models were fitted to clinical fibrosis data for the breast collated from the literature. Breast outlines were obtained from a commercially available Rando phantom using the Osiris system. Multislice breast treatment plans were produced using a variety of compensation methods. Dose-volume histograms (DVHs) obtained for each treatment plan were reduced to simple numerical parameters using the equivalent uniform dose and effective volume DVH reduction methods. These parameters were input into the models to obtain complication probability predictions. The fitted model parameters were consistent with a parallel tissue architecture. Conventional clinical plans generally showed decreasing complication probabilities with increasing compensation sophistication. Extremely homogeneous plans representing idealized IMRT treatments showed increased complication probabilities compared to conventional planning methods, as a result of increased dose to areas that receive sub-prescription doses with conventional techniques.
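    The Lyman model with the effective-volume DVH reduction mentioned above has a compact closed form. The following is a minimal sketch under standard LKB assumptions; the two-bin DVH and the parameter values (n, m, TD50) are placeholders for illustration, not the fitted values from this study.

```python
import numpy as np
from scipy.stats import norm

def effective_volume(doses, volumes, n):
    """Kutcher-Burman effective volume of a differential DVH, referenced to the max dose."""
    d_ref = np.max(doses)
    v_eff = np.sum(volumes * (doses / d_ref) ** (1.0 / n))
    return v_eff, d_ref

def lyman_ntcp(dose, v_eff, td50_1, m, n):
    """Lyman NTCP for a uniform dose delivered to the partial volume v_eff."""
    td50_v = td50_1 * v_eff ** (-n)            # volume-scaled tolerance dose
    return norm.cdf((dose - td50_v) / (m * td50_v))

# Toy two-bin differential DVH (fractional volumes sum to 1)
doses = np.array([25.0, 50.0])
volumes = np.array([0.6, 0.4])
v_eff, d_ref = effective_volume(doses, volumes, n=0.5)
print(lyman_ntcp(d_ref, v_eff, td50_1=55.0, m=0.15, n=0.5))
```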

  2. Effect of Cisplatin on Parotid Gland Function in Concomitant Radiochemotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hey, Jeremias; Setz, Juergen; Gerlach, Reinhard

    2009-12-01

    Purpose: To determine the influence of concomitant radiochemotherapy with cisplatin on parotid gland tissue complication probability. Methods and Materials: Patients treated with either radiotherapy (n = 61) or concomitant radiochemotherapy with cisplatin (n = 36) for head-and-neck cancer were prospectively evaluated. The dose and volume distributions of the parotid glands were noted in dose-volume histograms. Stimulated salivary flow rates were measured before, during the 2nd and 6th weeks, and at 4 weeks and 6 months after the treatment. The data were fit using the normal tissue complication probability model of Lyman. Complication was defined as a reduction of the salivary flow rate to less than 25% of the pretreatment flow rate. Results: The normal tissue complication probability model parameter TD50 (the dose leading to a complication probability of 50%) was found to be 32.2 Gy at 4 weeks and 32.1 Gy at 6 months for concomitant radiochemotherapy and 41.1 Gy at 4 weeks and 39.6 Gy at 6 months for radiotherapy. The tolerated dose for concomitant radiochemotherapy was at least 7 to 8 Gy lower than for radiotherapy alone at TD50. Conclusions: In this study, concomitant radiochemotherapy tended to cause a higher probability of parotid gland tissue damage. Advanced radiotherapy planning approaches such as intensity-modulated radiotherapy may be particularly important for parotid sparing in radiochemotherapy because of cisplatin-related increased radiosensitivity of the glands.

  3. A Probabilistic Model of Illegal Drug Trafficking Operations in the Eastern Pacific and Caribbean Sea

    DTIC Science & Technology

    2013-09-01

    partner agencies and nations, detects, tracks, and interdicts illegal drug-trafficking in this region. In this thesis, we develop a probability model based on intelligence inputs to generate a spatial-temporal heat map specifying the ... complement and vet such complicated simulation by developing more analytically tractable models. We develop probability models to generate a heat map

  4. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
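    As a rough illustration of the kind of workflow described above, the sketch below fits an L1-penalized (LASSO-type) logistic model on synthetic data and assesses the significance of its cross-validated AUC with a permutation test. It uses a single cross-validation loop rather than the repeated double cross-validation of the paper, and the data, feature count, and settings are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))        # e.g. dose-volume and clinical covariates
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)  # toxicity labels

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
observed_auc = cross_val_score(lasso, X, y, cv=5, scoring="roc_auc").mean()

# Permutation test: refit on label-shuffled data to build a null AUC distribution
null_aucs = [cross_val_score(lasso, X, rng.permutation(y), cv=5,
                             scoring="roc_auc").mean() for _ in range(200)]
p_value = np.mean(np.array(null_aucs) >= observed_auc)
print(f"CV AUC = {observed_auc:.2f}, permutation p = {p_value:.3f}")
```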

  5. Exact Tests for the Rasch Model via Sequential Importance Sampling

    ERIC Educational Resources Information Center

    Chen, Yuguo; Small, Dylan

    2005-01-01

    Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…

  6. Survival analysis of cervical cancer using stratified Cox regression

    NASA Astrophysics Data System (ADS)

    Purnami, S. W.; Inayati, K. D.; Sari, N. W. Wulan; Chosuvivatwong, V.; Sriplung, H.

    2016-04-01

    Cervical cancer is one of the most common causes of cancer death among women worldwide, including in Indonesia. Most cervical cancer patients come to the hospital already at an advanced stage. As a result, treatment of cervical cancer becomes more difficult and the risk of death can even increase. One parameter that can be used to assess the success of treatment is the probability of survival. This study examines the survival of cervical cancer patients at Dr. Soetomo Hospital using stratified Cox regression based on six factors: age, stage, treatment initiation, companion disease, complication, and anemia. The stratified Cox model is used because one independent variable, stage, does not satisfy the proportional hazards assumption. The results of the stratified Cox model show that the complication variable is a significant factor influencing the survival probability of cervical cancer patients. The obtained hazard ratio is 7.35, meaning that a cervical cancer patient with a complication is at 7.35 times greater risk of dying than a patient without a complication. The adjusted survival curves showed that stage IV had the lowest probability of survival.
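    A stratified Cox fit of this kind can be sketched with the lifelines package. Everything below (the synthetic cohort, covariate names, and effect sizes) is assumed for illustration and is not the Dr. Soetomo Hospital data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
stadium = rng.choice(["II", "III", "IV"], size=n)
complication = rng.integers(0, 2, size=n)
age = rng.normal(50, 10, size=n)

# Hypothetical survival times: higher stage and complications shorten survival
baseline = {"II": 60.0, "III": 40.0, "IV": 20.0}
scale = np.array([baseline[s] for s in stadium]) / np.exp(0.8 * complication)
time = rng.exponential(scale)
event = (rng.uniform(size=n) < 0.8).astype(int)   # ~20% censored

df = pd.DataFrame({"time": time, "event": event, "age": age,
                   "complication": complication, "stadium": stadium})

cph = CoxPHFitter()
# Stratifying on stage sidesteps its violated proportional-hazards assumption
cph.fit(df, duration_col="time", event_col="event", strata=["stadium"])
cph.print_summary()   # exp(coef) for "complication" is the hazard ratio of interest
```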

  7. Optimizing the parameters of the Lyman-Kutcher-Burman, Källman, and Logit+EUD models for the rectum - a comparison between normal tissue complication probability and clinical data

    NASA Astrophysics Data System (ADS)

    Trojková, Darina; Judas, Libor; Trojek, Tomáš

    2014-11-01

    Minimizing the late rectal toxicity of prostate cancer patients is a very important and widely discussed topic. Normal tissue complication probability (NTCP) models can be used to evaluate competing treatment plans. In our work, the parameters of the Lyman-Kutcher-Burman (LKB), Källman, and Logit+EUD models are optimized by minimizing the Brier score for a group of 302 prostate cancer patients. The NTCP values are calculated and compared with the values obtained using previously published parameter values. χ2 statistics were calculated as a check of the goodness of the optimization.
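    Parameter optimization by Brier-score minimization, as described above, can be sketched for the LKB model alone as follows. The toy DVHs, outcomes, starting values, and bounds are assumptions for illustration; real fits use hundreds of patients.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def geud(doses, volumes, n):
    """Generalized EUD used as the LKB DVH reduction (a = 1/n)."""
    return np.sum(volumes * doses ** (1.0 / n)) ** n

def lkb_ntcp(eud, td50, m):
    return norm.cdf((eud - td50) / (m * td50))

def brier(params, dvhs, outcomes):
    """Mean squared difference between predicted NTCP and observed toxicity."""
    n, m, td50 = params
    p = np.array([lkb_ntcp(geud(d, v, n), td50, m) for d, v in dvhs])
    return np.mean((p - outcomes) ** 2)

# Toy cohort: (dose bins, fractional volumes) per patient plus observed toxicity
dvhs = [(np.array([30.0, 60.0]), np.array([0.7, 0.3])),
        (np.array([40.0, 70.0]), np.array([0.5, 0.5])),
        (np.array([20.0, 50.0]), np.array([0.8, 0.2]))]
outcomes = np.array([0, 1, 0])

fit = minimize(brier, x0=[0.1, 0.15, 75.0], args=(dvhs, outcomes),
               bounds=[(0.05, 2.0), (0.01, 1.0), (10.0, 120.0)],
               method="L-BFGS-B")
print(fit.x)   # fitted (n, m, TD50)
```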

  8. Normal tissue complication probability modeling of radiation-induced hypothyroidism after head-and-neck radiation therapy.

    PubMed

    Bakhshandeh, Mohsen; Hashemi, Bijan; Mahdavi, Seied Rabi Mehdi; Nikoofar, Alireza; Vasheghani, Maryam; Kazemnejad, Anoshirvan

    2013-02-01

    To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D(50) estimated from the models was approximately 44 Gy. The implemented normal tissue complication probability models showed a parallel architecture for the thyroid. The mean dose model can be used as the best model to describe the dose-response relationship for hypothyroidism complication. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. A stochastic model for the normal tissue complication probability (NTCP) and applications.

    PubMed

    Stocks, Theresa; Hillen, Thomas; Gong, Jiafen; Burger, Martin

    2017-12-11

    The normal tissue complication probability (NTCP) is a measure for the estimated side effects of a given radiation treatment schedule. Here we use a stochastic logistic birth-death process to define an organ-specific and patient-specific NTCP. We emphasize an asymptotic simplification which relates the NTCP to the solution of a logistic differential equation. The framework is based on simple modelling assumptions and prepares the ground for use of the NTCP model in clinical practice. As an example, we consider side effects of prostate cancer brachytherapy such as increased urinary frequency, urinary retention and acute rectal dysfunction. © The authors 2016. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
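    The abstract describes an NTCP built from a stochastic logistic birth-death process. The sketch below is a generic Gillespie-type simulation of such a process, with the NTCP read off as the extinction probability of the surviving cell population; the rates, carrying capacity, initial count, and complication criterion are illustrative assumptions, not the authors' parameterization.

```python
import numpy as np

rng = np.random.default_rng(2)

def surviving_cells(n0, K, b, d, t_end):
    """One Gillespie realization of a logistic birth-death process.

    Per-capita birth rate b; per-capita death rate d + (b - d) * n / K,
    so the population equilibrates near K. Returns the final cell count.
    """
    n, t = n0, 0.0
    while 0 < n and t < t_end:
        birth = b * n
        death = n * (d + (b - d) * n / K)
        total = birth + death
        t += rng.exponential(1.0 / total)
        n += 1 if rng.uniform() < birth / total else -1
    return n

# NTCP read off as the probability that the organ's surviving functional
# cells die out instead of regrowing (illustrative parameters)
finals = [surviving_cells(n0=5, K=200, b=0.4, d=0.3, t_end=10.0) for _ in range(500)]
print("estimated NTCP (extinction probability):", np.mean([n == 0 for n in finals]))
```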

  10. Combining the LKB NTCP model with radiosensitivity parameters to characterize toxicity of radionuclides based on a multiclonogen kidney model: a theoretical assessment.

    PubMed

    Lin, Hui; Jing, Jia; Xu, Liangfeng; Wu, Dongsheng; Xu, Yuanying

    2012-06-01

    The Lyman-Kutcher-Burman (LKB) normal tissue complication probability (NTCP) model is often used to estimate the damage level to normal tissue. However, it does not manifestly involve the influence of radiosensitivity parameters. This work replaces the generalized mean equivalent uniform dose (gEUD) with the equivalent uniform dose (EUD) in the LKB model to investigate the effect of a variety of radiobiological parameters on the NTCP to characterize the toxicity of five types of radionuclides. The dose for 50% complication probability (D(50)) is replaced by the corresponding EUD for 50% complication probability (EUD(50)). The properties of a variety of radiobiological characteristics, such as biologically effective dose (BED), NTCP, and EUD, for five types of radioisotope ((131)I, (186)Re, (188)Re, (90)Y, and (67)Cu) are investigated for various radiosensitivity parameters such as intrinsic radiosensitivity α, alpha-beta ratio α/β, cell repair half-time, cell mean clonogen doubling time, etc. The high-energy beta emitters ((90)Y and (188)Re) have high initial dose rate and mean absorbed dose per injected activity in kidney, and their kidney toxicity should be of greater concern if they are excreted through the kidneys. The radiobiological effect of (188)Re changes most sharply with the radiobiological parameters due to its high-energy electrons and very short physical half-life. The dose for a probability of 50% injury within 5 years (D(50/5)), 28 Gy for whole-kidney irradiation, should be adjusted according to different radionuclides and the different radiosensitivity of individuals. The D(50/5) of individuals with low α/β, low α, or low biological clearance half-time will be less than 28 Gy. The 50% complication probability dose for (67)Cu and (188)Re could be 25 Gy and 22 Gy, respectively. The same mean absorbed dose generally corresponds to different degrees of damage for tissues of different radiosensitivity and different radionuclides. The influence of various radiobiological parameters should be taken into consideration in the NTCP model.

  11. Prediction of radiation-induced liver disease by Lyman normal-tissue complication probability model in three-dimensional conformal radiation therapy for primary liver carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu ZhiYong; Department of Oncology, Shanghai Medical School, Fudan University, Shanghai; Liang Shixiong

    Purpose: To describe the probability of RILD by application of the Lyman-Kutcher-Burman normal-tissue complication probability (NTCP) model for primary liver carcinoma (PLC) treated with hypofractionated three-dimensional conformal radiotherapy (3D-CRT). Methods and Materials: A total of 109 PLC patients treated by 3D-CRT were followed for RILD. Of these patients, 93 were in liver cirrhosis of Child-Pugh Grade A, and 16 were in Child-Pugh Grade B. The Michigan NTCP model was used to predict the probability of RILD, and then the modified Lyman NTCP model was generated for Child-Pugh A and Child-Pugh B patients by maximum-likelihood analysis. Results: Of all patients, 17 developed RILD, of whom 8 were of Child-Pugh Grade A and 9 were of Child-Pugh Grade B. The prediction of RILD by the Michigan model was underestimated for PLC patients. The modified n, m, and TD5(1) were 1.1, 0.28, and 40.5 Gy and 0.7, 0.43, and 23 Gy for patients with Child-Pugh A and B, respectively, which yielded better estimations of RILD probability. The hepatic tolerable doses (TD5) would be MDTNL of 21 Gy and 6 Gy, respectively, for Child-Pugh A and B patients. Conclusions: The Michigan model was probably not fit to predict RILD in PLC patients. A modified Lyman NTCP model for RILD was recommended.

  12. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    PubMed

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eye retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretation of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  13. Volume effects of late term normal tissue toxicity in prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Bonta, Dacian Viorel

    Modeling of volume effects for treatment toxicity is paramount for optimization of radiation therapy. This thesis proposes a new model for calculating volume effects in gastro-intestinal and genito-urinary normal tissue complication probability (NTCP) following radiation therapy for prostate carcinoma. The radiobiological and the pathological basis for this model and its relationship to other models are detailed. A review of the radiobiological experiments and published clinical data identified salient features and specific properties a biologically adequate model has to conform to. The new model was fit to a set of actual clinical data. In order to verify the goodness of fit, two established NTCP models and a non-NTCP measure for complication risk were fitted to the same clinical data. The method of fit for the model parameters was maximum likelihood estimation. Within the framework of the maximum likelihood approach I estimated the parameter uncertainties for each complication prediction model. The quality of fit was determined using the Akaike Information Criterion. Based on the model that provided the best fit, I identified the volume effects for both types of toxicities. Computer-based bootstrap resampling of the original dataset was used to estimate the bias and variance of the fitted parameter values. Computer simulation was also used to estimate the population size that generates a specific uncertainty level (3%) in the value of predicted complication probability. The same method was used to estimate the size of the patient population needed for an accurate choice of the model underlying the NTCP. The results indicate that, depending on the number of parameters of a specific NTCP model, 100 (for two-parameter models) and 500 patients (for three-parameter models) are needed for an accurate parameter fit. Correlation of complication occurrence in patients was also investigated. The results suggest that complication outcomes are correlated within a patient, although the correlation coefficient is rather small.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, Susan L.; Liu, H. Helen; Wang, Shulian

    Purpose: The aim of this study was to investigate the effect of radiation dose distribution in the lung on the risk of postoperative pulmonary complications among esophageal cancer patients. Methods and Materials: We analyzed data from 110 patients with esophageal cancer treated with concurrent chemoradiotherapy followed by surgery at our institution from 1998 to 2003. The endpoint for analysis was postsurgical pneumonia or acute respiratory distress syndrome. Dose-volume histograms (DVHs) and dose-mass histograms (DMHs) for the whole lung were used to fit normal-tissue complication probability (NTCP) models, and the quality of the fits was compared using bootstrap analysis. Results: Normal-tissue complication probability modeling identified that the risk of postoperative pulmonary complications was most significantly associated with small absolute volumes of lung spared from doses ≥5 Gy (VS5), that is, exposed to doses <5 Gy. However, bootstrap analysis found no significant difference between the quality of this model and fits based on other dosimetric parameters, including mean lung dose, effective dose, and relative volume of lung receiving ≥5 Gy, probably because of correlations among these factors. The choice of DVH vs. DMH or the use of fractionation correction did not significantly affect the results of the NTCP modeling. The parameter values estimated for the Lyman NTCP model were as follows (with 95% confidence intervals in parentheses): n = 1.85 (0.04, ∞), m = 0.55 (0.22, 1.02), and D5 = 17.5 Gy (9.4 Gy, 102 Gy). Conclusions: In this cohort of esophageal cancer patients, several dosimetric parameters, including mean lung dose, effective dose, and absolute volume of lung receiving <5 Gy, provided similar descriptions of the risk of postoperative pulmonary complications as a function of radiation dose distribution in the lung.

  15. Tailoring the operative approach for appendicitis to the patient: a prediction model from national surgical quality improvement program data.

    PubMed

    Senekjian, Lara; Nirula, Raminder

    2013-01-01

    Laparoscopic appendectomy (LA) is increasingly being performed in the United States, despite controversy about differences in infectious complication rates compared with open appendectomy (OA). Subpopulations exist in which infectious complication rates, both surgical site and organ space, differ with respect to LA compared with OA. All appendectomies in the National Surgical Quality Improvement Program database were analyzed with respect to surgical site infection (SSI) and organ space infection (OSI). Multivariate logistic regression analysis identified independent predictors of SSI or OSI. Probabilities of SSI or OSI were determined for subpopulations to identify when LA was superior to OA. From 2005 to 2009, there were 61,830 appendectomies performed (77.5% LA), of which 9,998 (16.2%) were complicated (58.7% LA). The risk of SSI was considerably lower for LA in both noncomplicated and complicated appendicitis. Across all ages, body mass index, renal function, and WBCs, LA was associated with a lower probability of SSI. The risk of OSI was considerably greater for LA in both noncomplicated and complicated appendicitis. In complicated appendicitis, OA was associated with a lower probability of OSI in patients with WBC >12 × 10(3) cells/μL. In noncomplicated appendicitis, OA was associated with a lower probability of OSI in patients with a body mass index <37.5 when compared with LA. Subpopulations exist in which OA is superior to LA in terms of OSI; however, SSI is consistently lower in LA patients. Copyright © 2013 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  16. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD).

    PubMed

    Luxton, Gary; Keall, Paul J; King, Christopher R

    2008-01-07

    To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within approximately 0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD(50), and conversely m and TD(50) are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d(ref), n, v(eff) and the Niemierko equivalent uniform dose (EUD), where d(ref) and v(eff) are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.
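    The EUD-based reformulation described above is easy to sketch: compute the Niemierko EUD from a differential DVH, evaluate the LKB NTCP at that uniform dose, and invert the relation to tabulate EUD against NTCP. The paper's single-parameter exponential approximation is not reproduced here; the DVH and the (n, m, TD50) values below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def niemierko_eud(doses, volumes, n):
    """Generalized EUD of a differential DVH with LKB volume parameter n (a = 1/n)."""
    a = 1.0 / n
    return np.sum(volumes * doses ** a) ** (1.0 / a)

def ntcp_from_eud(eud, td50, m):
    """LKB NTCP evaluated at a uniform dose equal to the EUD."""
    return norm.cdf((eud - td50) / (m * td50))

def eud_for_ntcp(ntcp, td50, m):
    """Inverse relation: the uniform EUD that yields a given NTCP."""
    return td50 * (1.0 + m * norm.ppf(ntcp))

doses = np.array([10.0, 40.0, 65.0])         # toy differential DVH
volumes = np.array([0.5, 0.3, 0.2])
eud = niemierko_eud(doses, volumes, n=0.25)
print("EUD:", eud, "NTCP:", ntcp_from_eud(eud, td50=68.0, m=0.14))
for p in (0.05, 0.20, 0.50):                  # small EUD-versus-NTCP table
    print(p, eud_for_ntcp(p, td50=68.0, m=0.14))
```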

  17. Normal Tissue Complication Probability Modeling of Radiation-Induced Hypothyroidism After Head-and-Neck Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakhshandeh, Mohsen; Hashemi, Bijan, E-mail: bhashemi@modares.ac.ir; Mahdavi, Seied Rabi Mehdi

    Purpose: To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Methods and Materials: Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Results: Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D50 estimated from the models was approximately 44 Gy. Conclusions: The implemented normal tissue complication probability models showed a parallel architecture for the thyroid. The mean dose model can be used as the best model to describe the dose-response relationship for hypothyroidism complication.

  18. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose volume outcome relationships

    NASA Astrophysics Data System (ADS)

    El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.

    2006-11-01

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.

  20. Prediction of major complications after hepatectomy using liver stiffness values determined by magnetic resonance elastography.

    PubMed

    Sato, N; Kenjo, A; Kimura, T; Okada, R; Ishigame, T; Kofunato, Y; Shimura, T; Abe, K; Ohira, H; Marubashi, S

    2018-04-23

    Liver fibrosis is a risk factor for hepatectomy but cannot be determined accurately before hepatectomy because diagnostic procedures are too invasive. Magnetic resonance elastography (MRE) can determine liver stiffness (LS), a surrogate marker for assessing liver fibrosis, non-invasively. The aim of this study was to investigate whether the LS value determined by MRE is predictive of major complications after hepatectomy. This prospective study enrolled consecutive patients who underwent hepatic resection between April 2013 and August 2016. LS values were measured by imaging shear waves by MRE in the liver before hepatectomy. The primary endpoint was major complications, defined as Clavien-Dindo grade IIIa or above. Logistic regression analysis identified independent predictive factors, from which a logistic model to estimate the probability of major complications was constructed. A total of 96 patients were included in the study. Major complications were observed in 15 patients (16 per cent). Multivariable logistic analysis confirmed that higher LS value (P = 0·021) and serum albumin level (P = 0·009) were independent predictive factors for major complications after hepatectomy. Receiver operating characteristic (ROC) analysis showed that the best LS cut-off value was 4·3 kPa for detecting major complications, comparable to liver fibrosis grade F4, with a sensitivity of 80 per cent and specificity of 82 per cent. A logistic model using the LS value and serum albumin level to estimate the probability of major complications was constructed; the area under the ROC curve for predicting major complications was 0·84. The LS value determined by MRE in patients undergoing hepatectomy was an independent predictive factor for major complications. © 2018 BJS Society Ltd Published by John Wiley & Sons Ltd.
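    The cut-off selection described above can be sketched with a standard ROC analysis, picking the threshold that maximizes the Youden index. The liver-stiffness values below are synthetic, so the resulting AUC, cut-off, sensitivity, and specificity will not match the study's figures.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
# Synthetic liver-stiffness values (kPa); complications tend to have higher LS
ls_no_complication = rng.normal(3.0, 0.8, size=80)
ls_complication = rng.normal(4.8, 1.0, size=15)
ls = np.concatenate([ls_no_complication, ls_complication])
y = np.concatenate([np.zeros(80), np.ones(15)])

fpr, tpr, thresholds = roc_curve(y, ls)
best = np.argmax(tpr - fpr)                 # Youden index J = sensitivity + specificity - 1
print("AUC:", roc_auc_score(y, ls))
print("best cutoff (kPa):", thresholds[best],
      "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```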

  1. Prediction of radiation-induced normal tissue complications in radiotherapy using functional image data

    NASA Astrophysics Data System (ADS)

    Nioutsikou, Elena; Partridge, Mike; Bedford, James L.; Webb, Steve

    2005-03-01

    The aim of this study has been to explicitly include the functional heterogeneity of an organ as a factor that contributes to the probability of complication of normal tissues following radiotherapy. Situations for which the inclusion of this information can be advantageous to the design of treatment plans are then investigated. A Java program has been implemented for this purpose. This makes use of a voxelated model of a patient, which is based on registered anatomical and functional data in order to enable functional voxel weighting. Using this model, the functional dose-volume histogram (fDVH) and the functional normal tissue complication probability (fNTCP) are then introduced as extensions to the conventional dose-volume histogram (DVH) and normal tissue complication probability (NTCP). In the presence of functional heterogeneity, these tools are physically more meaningful for plan evaluation than the traditional indices, as they incorporate additional information and are anticipated to show a better correlation with outcome. New parameters mf, nf and TD50f are required to replace the m, n and TD50 parameters. A range of plausible values was investigated, awaiting fitting of these new parameters to patient outcomes where functional data have been measured. As an example, the model is applied to two lung datasets utilizing accurately registered computed tomography (CT) and single photon emission computed tomography (SPECT) perfusion scans. Assuming a linear perfusion-function relationship, the biological index mean perfusion weighted lung dose (MPWLD) has been extracted from integration over outlined regions of interest. In agreement with the MPWLD ranking, the fNTCP predictions reveal that incorporation of functional imaging in radiotherapy treatment planning is most beneficial for organs with a large volume effect and large focal areas of dysfunction. There is, however, no additional advantage in cases presenting with homogeneous function. Although presented for lung radiotherapy, this model is general. It can also be applied to positron emission tomography (PET)-CT or functional magnetic resonance imaging (fMRI)-CT registered data and extended to the functional description of tumour control probability.

  2. A representation of an NTCP function for local complication mechanisms

    NASA Astrophysics Data System (ADS)

    Alber, M.; Nüsslin, F.

    2001-02-01

    A mathematical formalism was tailored for the description of mechanisms complicating radiation therapy with a predominantly local component. The functional representation of an NTCP function was developed based on the notion that it has to be robust against population averages in order to be applicable to experimental data. The model was required to be invariant under scaling operations of the dose and the irradiated volume. The NTCP function was derived from the model assumptions that the complication is a consequence of local tissue damage and that the probability of local damage in a small reference volume is independent of the neighbouring volumes. The performance of the model was demonstrated with an animal model which has been published previously (Powers et al 1998 Radiother. Oncol. 46 297-306).

  3. Functional Data Analysis in NTCP Modeling: A New Method to Explore the Radiation Dose-Volume Effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benadjaoud, Mohamed Amine, E-mail: mohamedamine.benadjaoud@gustaveroussy.fr; Université Paris sud, Le Kremlin-Bicêtre; Institut Gustave Roussy, Villejuif

    2014-11-01

    Purpose/Objective(s): To describe a novel method to explore radiation dose-volume effects. Functional data analysis is used to investigate the information contained in differential dose-volume histograms. The method is applied to the normal tissue complication probability modeling of rectal bleeding (RB) for patients irradiated in the prostatic bed by 3-dimensional conformal radiation therapy. Methods and Materials: Kernel density estimation was used to estimate the individual probability density functions from each of the 141 rectum differential dose-volume histograms. Functional principal component analysis was performed on the estimated probability density functions to explore the variation modes in the dose distribution. The functional principal components were then tested for association with RB using logistic regression adapted to functional covariates (FLR). For comparison, 3 other normal tissue complication probability models were considered: the Lyman-Kutcher-Burman model, logistic model based on standard dosimetric parameters (LM), and logistic model based on multivariate principal component analysis (PCA). Results: The incidence rate of grade ≥2 RB was 14%. V65Gy was the most predictive factor for the LM (P=.058). The best fit for the Lyman-Kutcher-Burman model was obtained with n = 0.12, m = 0.17, and TD50 = 72.6 Gy. In PCA and FLR, the components that describe the interdependence between the relative volumes exposed at intermediate and high doses were the most correlated to the complication. The FLR parameter function leads to a better understanding of the volume effect by including the treatment specificity in the delivered mechanistic information. For RB grade ≥2, patients with advanced age are significantly at risk (odds ratio, 1.123; 95% confidence interval, 1.03-1.22), and the fits of the LM, PCA, and functional principal component analysis models are significantly improved by including this clinical factor. Conclusion: Functional data analysis provides an attractive method for flexibly estimating the dose-volume effect for normal tissues in external radiation therapy.

  4. Radiobiological Impact of Reduced Margins and Treatment Technique for Prostate Cancer in Terms of Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Ingelise, E-mail: inje@rn.d; Carl, Jesper; Lund, Bente

    2011-07-01

    Dose escalation in prostate radiotherapy is limited by normal tissue toxicities. The aim of this study was to assess the impact of margin size on tumor control and side effects for intensity-modulated radiation therapy (IMRT) and 3D conformal radiotherapy (3DCRT) treatment plans with increased dose. Eighteen patients with localized prostate cancer were enrolled. 3DCRT and IMRT plans were compared for a variety of margin sizes. A marker detectable on daily portal images was presupposed for narrow margins. Prescribed dose was 82 Gy within 41 fractions to the prostate clinical target volume (CTV). Tumor control probability (TCP) calculations based on the Poisson model including the linear quadratic approach were performed. Normal tissue complication probability (NTCP) was calculated for bladder, rectum and femoral heads according to the Lyman-Kutcher-Burman method. All plan types presented essentially identical TCP values and very low NTCP for bladder and femoral heads. Mean doses for these critical structures reached a minimum for IMRT with reduced margins. Two endpoints for rectal complications were analyzed. A marked decrease in NTCP for IMRT plans with narrow margins was seen for mild RTOG grade 2/3 as well as for proctitis/necrosis/stenosis/fistula, for which NTCP <7% was obtained. For equivalent TCP values, sparing of normal tissue was demonstrated with the narrow margin approach. The effect was more pronounced for IMRT than 3DCRT, with respect to NTCP for mild, as well as severe, rectal complications.
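    The TCP calculation referenced above (Poisson model with the linear-quadratic approach) has a compact closed form; a minimal sketch follows, with the clonogen number and radiosensitivity values assumed for illustration rather than taken from the study.

```python
import numpy as np

def poisson_tcp(dose_per_fx, n_fx, alpha, beta, clonogens):
    """Poisson TCP with linear-quadratic cell survival per fraction."""
    sf_per_fraction = np.exp(-alpha * dose_per_fx - beta * dose_per_fx ** 2)
    surviving = clonogens * sf_per_fraction ** n_fx   # expected surviving clonogens
    return np.exp(-surviving)

# Illustrative prostate-like parameters (assumed, not taken from the study)
print(poisson_tcp(dose_per_fx=2.0, n_fx=41, alpha=0.15, beta=0.05, clonogens=1e6))
```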

  5. A NEW METHOD OF PEAK DETECTION FOR ANALYSIS OF COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY MASS SPECTROMETRY DATA.

    PubMed

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-06-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distribution to deal with the co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is introduced using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect the peaks with lower false discovery rates than the existing algorithms, and a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models.

  6. Rectal bleeding, fecal incontinence, and high stool frequency after conformal radiotherapy for prostate cancer: Normal tissue complication probability modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeters, Stephanie; Hoogeman, Mischa S.; Heemsbergen, Wilma D.

    2006-09-01

    Purpose: To analyze whether inclusion of predisposing clinical features in the Lyman-Kutcher-Burman (LKB) normal tissue complication probability (NTCP) model improves the estimation of late gastrointestinal toxicity. Methods and Materials: This study includes 468 prostate cancer patients participating in a randomized trial comparing 68 with 78 Gy. We fitted the probability of developing late toxicity within 3 years (rectal bleeding, high stool frequency, and fecal incontinence) with the original, and a modified LKB model, in which a clinical feature (e.g., history of abdominal surgery) was taken into account by fitting subset-specific TD50s. The ratio of these TD50s is the dose-modifying factor for that clinical feature. Dose distributions of anorectal (bleeding and frequency) and anal wall (fecal incontinence) were used. Results: The modified LKB model gave significantly better fits than the original LKB model. Patients with a history of abdominal surgery had a lower tolerance to radiation than did patients without previous surgery, with a dose-modifying factor of 1.1 for bleeding and of 2.5 for fecal incontinence. The dose-response curve for bleeding was approximately two times steeper than that for frequency and three times steeper than that for fecal incontinence. Conclusions: Inclusion of predisposing clinical features significantly improved the estimation of the NTCP. For patients with a history of abdominal surgery, more severe dose constraints should therefore be used during treatment plan optimization.
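    The modified LKB fit described above, with a subset-specific TD50 whose ratio gives the dose-modifying factor, can be sketched as a maximum-likelihood fit. The cohort below is synthetic (EUDs, surgery flags, and outcomes are simulated), so the recovered dose-modifying factor is purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def ntcp(eud, td50, m):
    return norm.cdf((eud - td50) / (m * td50))

def neg_log_likelihood(params, eud, toxicity, surgery):
    m, td50_base, dmf = params
    td50 = np.where(surgery == 1, td50_base / dmf, td50_base)   # subset-specific TD50
    p = np.clip(ntcp(eud, td50, m), 1e-9, 1 - 1e-9)
    return -np.sum(toxicity * np.log(p) + (1 - toxicity) * np.log(1 - p))

# Synthetic cohort: anorectal EUDs, prior-surgery flags and observed bleeding
rng = np.random.default_rng(4)
eud = rng.normal(62, 6, size=300)
surgery = rng.integers(0, 2, size=300)
true_td50 = np.where(surgery == 1, 75.0 / 1.1, 75.0)
toxicity = (rng.uniform(size=300) < ntcp(eud, true_td50, 0.15)).astype(int)

fit = minimize(neg_log_likelihood, x0=[0.2, 70.0, 1.0], args=(eud, toxicity, surgery),
               bounds=[(0.05, 1.0), (40.0, 120.0), (0.5, 3.0)], method="L-BFGS-B")
print("fitted m, TD50, dose-modifying factor:", fit.x)
```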

  7. Radiobiological impact of reduced margins and treatment technique for prostate cancer in terms of tumor control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Jensen, Ingelise; Carl, Jesper; Lund, Bente; Larsen, Erik H; Nielsen, Jane

    2011-01-01

    Dose escalation in prostate radiotherapy is limited by normal tissue toxicities. The aim of this study was to assess the impact of margin size on tumor control and side effects for intensity-modulated radiation therapy (IMRT) and 3D conformal radiotherapy (3DCRT) treatment plans with increased dose. Eighteen patients with localized prostate cancer were enrolled. 3DCRT and IMRT plans were compared for a variety of margin sizes. A marker detectable on daily portal images was presupposed for narrow margins. Prescribed dose was 82 Gy within 41 fractions to the prostate clinical target volume (CTV). Tumor control probability (TCP) calculations based on the Poisson model including the linear quadratic approach were performed. Normal tissue complication probability (NTCP) was calculated for bladder, rectum and femoral heads according to the Lyman-Kutcher-Burman method. All plan types presented essentially identical TCP values and very low NTCP for bladder and femoral heads. Mean doses for these critical structures reached a minimum for IMRT with reduced margins. Two endpoints for rectal complications were analyzed. A marked decrease in NTCP for IMRT plans with narrow margins was seen for mild RTOG grade 2/3 as well as for proctitis/necrosis/stenosis/fistula, for which NTCP <7% was obtained. For equivalent TCP values, sparing of normal tissue was demonstrated with the narrow margin approach. The effect was more pronounced for IMRT than 3DCRT, with respect to NTCP for mild, as well as severe, rectal complications. Copyright © 2011 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  8. Prevalence of cardiovascular and respiratory complications following trauma in patients with obesity.

    PubMed

    Bell, Teresa; Stokes, Samantha; Jenkins, Peter C; Hatcher, LeRanna; Fecher, Alison M

    It is generally accepted that obesity puts patients at an increased risk for cardiovascular and respiratory complications after surgical procedures. However, in the setting of trauma, there have been mixed findings as to whether obesity increases the risk for additional complications. The aim of this study was to identify whether obese patients suffer an increased risk of cardiac and respiratory complications following traumatic injury. A retrospective analysis of 275,393 patients was conducted using the 2012 National Trauma Data Bank. Hierarchical regression modeling was performed to determine the probability of experiencing a cardiac or respiratory complication. Patients with obesity were at a significantly higher risk of cardiac and respiratory complications compared to patients without obesity [OR: 1.81; CI: 1.72-1.91]. The prevalence of cardiovascular and respiratory complications for patients with obesity was 12.6%, compared to 5.2% for non-obese patients. Obesity is predictive of an increased risk for cardiovascular and respiratory complications following trauma. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Development of a normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism in nasopharyngeal carcinoma patients.

    PubMed

    Luo, Ren; Wu, Vincent W C; He, Binghui; Gao, Xiaoying; Xu, Zhenxi; Wang, Dandan; Yang, Zhining; Li, Mei; Lin, Zhixiong

    2018-05-18

    The objectives of this study were to build a normal tissue complication probability (NTCP) model of radiation-induced hypothyroidism (RHT) for nasopharyngeal carcinoma (NPC) patients and to compare it with four other published NTCP models to evaluate its efficacy. Medical notes of 174 NPC patients after radiotherapy were reviewed. Biochemical hypothyroidism was defined as an elevated level of serum thyroid-stimulating hormone (TSH) with a normal or decreased level of serum free thyroxine (fT4) after radiotherapy. Logistic regression with leave-one-out cross-validation was performed to establish the NTCP model. Model performance was evaluated and compared by the area under the receiver operating characteristic curve (AUC) in our NPC cohort. With a median follow-up of 24 months, 39 (22.4%) patients developed biochemical hypothyroidism. Gender, chemotherapy, the percentage of thyroid volume receiving more than 50 Gy (V50), and the maximum dose to the pituitary (Pmax) were identified as the most predictive factors for RHT. An NTCP model based on these four parameters was developed. The model comparison was made in our NPC cohort, and our NTCP model performed better in RHT prediction than the other four models. This study developed a four-variable NTCP model for biochemical hypothyroidism in NPC patients post-radiotherapy. Our NTCP model for RHT shows high predictive capability. This is a retrospective study without registration.

  10. Normal Tissue Complication Probability (NTCP) modeling of late rectal bleeding following external beam radiotherapy for prostate cancer: A Test of the QUANTEC-recommended NTCP model.

    PubMed

    Liu, Mitchell; Moiseenko, Vitali; Agranovich, Alexander; Karvat, Anand; Kwan, Winkle; Saleh, Ziad H; Apte, Aditya A; Deasy, Joseph O

    2010-10-01

    Validating a predictive model for late rectal bleeding following external beam treatment for prostate cancer would enable safer treatments or dose escalation. We tested the normal tissue complication probability (NTCP) model recommended in the recent QUANTEC review (quantitative analysis of normal tissue effects in the clinic). One hundred and sixty-one prostate cancer patients were treated with 3D conformal radiotherapy for prostate cancer at the British Columbia Cancer Agency in a prospective protocol. The total prescription dose for all patients was 74 Gy, delivered in 2 Gy/fraction. 159 3D treatment planning datasets were available for analysis. Rectal dose-volume histograms were extracted and fitted to a Lyman-Kutcher-Burman NTCP model. Late rectal bleeding (>grade 2) was observed in 12/159 patients (7.5%). Multivariate logistic regression with dose-volume parameters (V50, V60, V70, etc.) was non-significant. Among clinical variables, only age was significant on a Kaplan-Meier log-rank test (p = 0.007, with an optimal cut point of 77 years). Best-fit Lyman-Kutcher-Burman model parameters (with 95% confidence intervals) were: n = 0.068 (0.01, +infinity); m = 0.14 (0.0, 0.86); and TD50 = 81 (27, 136) Gy. The peak values fall within the 95% QUANTEC confidence intervals. On this dataset, both models had only modest ability to predict complications: the best-fit model had a Spearman's rank correlation coefficient of rs = 0.099 (p = 0.11) and area under the receiver operating characteristic curve (AUC) of 0.62; the QUANTEC model had rs = 0.096 (p = 0.11) and a corresponding AUC of 0.61. Although the QUANTEC model consistently predicted higher NTCP values, it could not be rejected according to the χ² test (p = 0.44). Observed complications, and best-fit parameter estimates, were consistent with the QUANTEC-preferred NTCP model. However, predictive power was low, at least partly because the rectal dose distribution characteristics do not vary greatly within this patient cohort.

  11. Genetic risk profiling and gene signature modeling to predict risk of complications after IPAA.

    PubMed

    Sehgal, Rishabh; Berg, Arthur; Polinski, Joseph I; Hegarty, John P; Lin, Zhenwu; McKenna, Kevin J; Stewart, David B; Poritz, Lisa S; Koltun, Walter A

    2012-03-01

    Severe pouchitis and Crohn's disease-like complications are 2 adverse postoperative complications that confound the success of the IPAA in patients with ulcerative colitis. To date, approximately 83 single nucleotide polymorphisms within 55 genes have been associated with IBD. The aim of this study was to identify single-nucleotide polymorphisms that correlate with complications after IPAA that could be utilized in a gene signature fashion to predict postoperative complications and aid in preoperative surgical decision making. One hundred forty-two IPAA patients were retrospectively classified as "asymptomatic" (n = 104, defined as no Crohn's disease-like complications or severe pouchitis for at least 2 years after IPAA) and compared with a "severe pouchitis" group (n = 12, ≥ 4 episodes pouchitis per year for 2 years including the need for long-term therapy to maintain remission) and a "Crohn's disease-like" group (n = 26, presence of fistulae, pouch inlet stricture, proximal small-bowel disease, or pouch granulomata, occurring at least 6 months after surgery). Genotyping for 83 single-nucleotide polymorphisms previously associated with Crohn's disease and/or ulcerative colitis was performed on a customized Illumina genotyping platform. The top 2 single-nucleotide polymorphisms statistically identified as being independently associated with each of Crohn's disease-like and severe pouchitis were used in a multivariate logistic regression model. These single-nucleotide polymorphisms were then used to create probability equations to predict overall chance of a positive or negative outcome for that complication. The top 2 single-nucleotide polymorphisms for Crohn's disease-like complications were in the 10q21 locus and the gene for PTGER4 (p = 0.006 and 0.007), whereas for severe pouchitis it was NOD2 and TNFSF15 (p = 0.003 and 0.011). Probability equations suggested that the risk of these 2 complications greatly increased with increasing number of risk alleles, going as high as 92% for severe pouchitis and 65% for Crohn's disease-like complications. In this IPAA patient cohort, mutations in the 10q21 locus and the PTGER4 gene were associated with Crohn's disease-like complications, whereas mutations in NOD2 and TNFSF15 correlated with severe pouchitis. Preoperative genetic analysis and use of such gene signatures hold promise for improved preoperative surgical patient selection to minimize these IPAA complications.

  12. A NEW METHOD OF PEAK DETECTION FOR ANALYSIS OF COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY MASS SPECTROMETRY DATA*

    PubMed Central

    Kim, Seongho; Ouyang, Ming; Jeong, Jaesik; Shen, Changyu; Zhang, Xiang

    2014-01-01

    We develop a novel peak detection algorithm for the analysis of comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GC×GC-TOF MS) data using normal-exponential-Bernoulli (NEB) and mixture probability models. The algorithm first performs baseline correction and denoising simultaneously using the NEB model, which also defines peak regions. Peaks are then picked using a mixture of probability distribution to deal with the co-eluting peaks. Peak merging is further carried out based on the mass spectral similarities among the peaks within the same peak group. The algorithm is evaluated using experimental data to study the effect of different cut-offs of the conditional Bayes factors and the effect of different mixture models including Poisson, truncated Gaussian, Gaussian, Gamma, and exponentially modified Gaussian (EMG) distributions, and the optimal version is introduced using a trial-and-error approach. We then compare the new algorithm with two existing algorithms in terms of compound identification. Data analysis shows that the developed algorithm can detect the peaks with lower false discovery rates than the existing algorithms, and a less complicated peak picking model is a promising alternative to the more complicated and widely used EMG mixture models. PMID:25264474

  13. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
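    The decomposition idea above (solving subsystems separately and combining their state probabilities, including combined failure states built from non-failed subsystem states) can be illustrated with two small independent continuous-time Markov chains. This is a generic sketch, not the SHARPE tool's symbolic closed-form machinery; the generator matrix, time point, and failure rule are assumptions chosen for the example.

```python
import numpy as np
from scipy.linalg import expm

def state_probs(Q, p0, t):
    """Transient state probabilities of a CTMC with generator Q at time t."""
    return p0 @ expm(Q * t)

# Two identical, independent 3-state subsystems: states = (up, degraded, failed)
Q = np.array([[-2e-3, 1.5e-3, 0.5e-3],
              [0.0,   -1e-3,  1e-3],
              [0.0,    0.0,   0.0]])
p0 = np.array([1.0, 0.0, 0.0])
t = 100.0
pA = state_probs(Q, p0, t)
pB = state_probs(Q, p0, t)

# Combined-model probabilities are products of subsystem probabilities (independence).
# The system counts as failed if either subsystem fails outright, or if both are
# merely degraded -- a combined failure state built from non-failed subsystem states.
p_fail = (pA[2] + pB[2] - pA[2] * pB[2]) + pA[1] * pB[1]
print("system failure probability at t = 100:", p_fail)
```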

  14. Evaluation of high myopia complications prevention program in university freshmen

    PubMed Central

    Tseng, Gow-Lieng; Chen, Cheng-Yu

    2016-01-01

    High myopia is a global eye health problem because of its high incidence of sight-threatening complications. Due to the role of awareness, self-examination, and preventive behavior in prevention of morbidity of high myopia complications, promoting knowledge, capabilities, and attitude of high myopic personnel is required in this regard. In this quasi-experimental study, 31 freshmen with high myopia in a national university were enrolled in 2014. The data were collected by validated and reliable questionnaire based on health belief model (HBM) and self-efficacy theory. The intervention program consisted of 1 educational session lasting 150 minutes by lecturing of high myopia complications, virtual reality experiencing, similarity modeling, and quibbling a film made on high myopia complications preventive concepts. Implementing the educational program showed immediate effect in knowledge, perceived susceptibility, perceived severity, self-efficacy, and preventive behavior intention. Six weeks after the educational program, significant increases were observed in cues to action, self-efficacy, and preventive behavior intention. This article showed that, after a single session, there was positive improvement in high myopia complication prevention behavior intention among participants. These positive effects confirmed the efficacy of the education program and will probably induce behavior change. PMID:27749586

  15. Evaluation of high myopia complications prevention program in university freshmen.

    PubMed

    Tseng, Gow-Lieng; Chen, Cheng-Yu

    2016-10-01

    High myopia is a global eye health problem because of its high incidence of sight-threatening complications. Due to the role of awareness, self-examination, and preventive behavior in prevention of morbidity of high myopia complications, promoting knowledge, capabilities, and attitude of high myopic personnel is required in this regard. In this quasi-experimental study, 31 freshmen with high myopia in a national university were enrolled in 2014. The data were collected by validated and reliable questionnaire based on health belief model (HBM) and self-efficacy theory. The intervention program consisted of 1 educational session lasting 150 minutes by lecturing of high myopia complications, virtual reality experiencing, similarity modeling, and quibbling a film made on high myopia complications preventive concepts. Implementing the educational program showed immediate effect in knowledge, perceived susceptibility, perceived severity, self-efficacy, and preventive behavior intention. Six weeks after the educational program, significant increases were observed in cues to action, self-efficacy, and preventive behavior intention. This article showed that, after a single session, there was positive improvement in high myopia complication prevention behavior intention among participants. These positive effects confirmed the efficacy of the education program and will probably induce behavior change.

  16. Cost-Effectiveness Analysis of Universal Influenza Vaccination: Application of Susceptible-Infectious-Complication-Recovery Model.

    PubMed

    Yang, Kuen-Cheh; Hung, Hui-Fang; Chen, Meng-Kan; Chen, Li-Sheng; Fann, Jean Ching-Yuan; Chiu, Sherry Yueh-Hsia; Yen, Amy-Ming Fang; Huang, Kuo-Chin; Chen, Hsiu-Hsi; Wang, Sen-Te

    2018-06-12

    Despite the fact that vaccination is an effective primary prevention strategy for containing influenza outbreak, health policymakers show great concern over enormous costs involved in universal immunization particularly when resources are limited. We conducted a two-arm cost-effectiveness analysis (CEA) that takes into account the aspect of herd immunity, using a study cohort that was composed of 100,000 residents with the make-up of demographic characteristics identical to those of the underlying population in Taipei County, Taiwan, during the epidemic influenza season of 2001-2002. The parameters embedded in the dynamic process of infection were estimated by the application of the newly proposed susceptible-infectious-complication-recovery model to the empirical data in order to compute the number of deaths and complications averted due to universal vaccination compared to non-vaccination. Incremental cost-effectiveness ratios (ICERs) and cost-effectiveness acceptability curve (CEAC) given maximum amount of willingness-to-pay (WTP) were calculated to delineate the results of the two-arm CEA. The incremental cost of the vaccinated group as opposed to the unvaccinated group was $1,195 for reducing one additional complication and $805 for averting one additional death, allowing for herd immunity. The corresponding figures were higher for the results without considering herd immunity. Given the ceiling ratio of willingness-to-pay (WTP) equal to $10,000 (approximately two-thirds of the GDP), the probability of being cost-effective for vaccination was 100% and 96.7% for averting death and complications, respectively. Universal vaccination against seasonal influenza was very cost-effective particularly when herd immunity is considered. The probability of being cost-effective was almost certain given the maximum amount of WTP within two-thirds of the GDP. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
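
    A minimal sketch of the two-arm cost-effectiveness bookkeeping described above (the ICER plus one point on the cost-effectiveness acceptability curve at a given willingness-to-pay) is given below; the incremental cost and effect replicates are hypothetical placeholders, not the Taipei County estimates.

      import numpy as np

      rng = np.random.default_rng(1)
      n_boot = 10_000

      # Hypothetical bootstrap replicates of the incremental cost and incremental effect
      # (complications averted) of vaccination versus non-vaccination.
      d_cost = rng.normal(1195.0, 150.0, n_boot)
      d_effect = rng.normal(1.0, 0.15, n_boot)

      icer = d_cost.mean() / d_effect.mean()
      wtp = 10_000.0                                   # willingness-to-pay ceiling
      net_benefit = wtp * d_effect - d_cost
      prob_cost_effective = (net_benefit > 0).mean()   # one point on the CEAC

      print(f"ICER = ${icer:,.0f} per complication averted")
      print(f"P(cost-effective at WTP ${wtp:,.0f}) = {prob_cost_effective:.3f}")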

  17. WE-AB-202-01: Evaluating the Toxicity Reduction with CT-Ventilation Functional Avoidance Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinogradskiy, Y; Miyasaka, Y; Kadoya, N

    Purpose: CT-ventilation is an exciting new imaging modality that uses 4DCTs to calculate lung ventilation. Studies have proposed to use 4DCT-ventilation imaging for functional avoidance radiotherapy which implies designing treatment plans to spare functional portions of the lung. Although retrospective studies have been performed to evaluate the dosimetric gains to functional lung, no work has been done to translate the dosimetric gains to an improvement in pulmonary toxicity. The purpose of our work was to evaluate the potential reduction in toxicity for 4DCT-ventilation based functional avoidance. Methods: 70 lung cancer patients with 4DCT imaging were used for the study. CT-ventilation maps were calculated using the patient’s 4DCT, deformable image registrations, and a density-change-based algorithm. Radiation pneumonitis was graded using imaging and clinical information. Log-likelihood methods were used to fit a normal-tissue-complication-probability (NTCP) model predicting grade 2+ radiation pneumonitis as a function of doses (mean and V20) to functional lung (>15% ventilation). For 20 patients a functional plan was generated that reduced dose to functional lung while meeting RTOG 0617-based constraints. The NTCP model was applied to the functional plan to determine the reduction in toxicity with functional planning. Results: The mean dose to functional lung was 16.8 and 17.7 Gy with the functional and clinical plans, respectively. The corresponding grade 2+ pneumonitis probability was 26.9% with the clinically-used plan and 24.6% with the functional plan (8.5% reduction). The V20-based grade 2+ pneumonitis probability was 23.7% with the clinically-used plan and reduced to 19.6% with the functional plan (20.9% reduction). Conclusion: Our results revealed a reduction of 9–20% in complication probability with functional planning. To our knowledge this is the first study to apply complication probability to convert dosimetric results to toxicity improvement. The results presented in the current work provide seminal data for prospective clinical trials in functional avoidance. YV discloses funding from State of Colorado. TY discloses National Lung Cancer Partnership; Young Investigator Research grant.
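
    A hedged sketch of the log-likelihood fitting step is shown below: a logistic NTCP model for grade 2+ pneumonitis as a function of mean functional-lung dose, maximized with scipy. The 70 dose/outcome pairs are synthetic, not the study cohort, and the logistic form is one common choice rather than necessarily the authors' exact model.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      # Synthetic (dose, outcome) pairs: mean functional-lung dose in Gy,
      # outcome 1 = grade 2+ pneumonitis. Real values would come from the cohort.
      rng = np.random.default_rng(2)
      dose = rng.uniform(5.0, 30.0, 70)
      y = rng.binomial(1, expit(0.15 * (dose - 20.0)))

      def neg_log_likelihood(params):
          b0, b1 = params
          p = np.clip(expit(b0 + b1 * dose), 1e-12, 1 - 1e-12)
          return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

      fit = minimize(neg_log_likelihood, x0=[0.0, 0.0])
      b0, b1 = fit.x
      print("NTCP at mean functional-lung dose of 17 Gy:", expit(b0 + b1 * 17.0))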

  18. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    PubMed Central

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL data, which gave a NPV of 100% with each dataset, and the QUANTEC guidelines, thus validating the cut-off values of 20 and 25 Gy. Based on these results, we believe that the QUANTEC 25/20-Gy spared-gland mean-dose guidelines are clinically useful for avoiding xerostomia in the HN cohort. PMID:23206972

  19. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments.

    PubMed

    Lee, Tsair-Fwu; Chao, Pei-Ju; Wang, Hung-Yu; Hsu, Hsuan-Chih; Chang, PaoShu; Chen, Wen-Cheng

    2012-12-04

    With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman-Kutcher-Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson's chi-squared test, Nagelkerke's R2, the area under the receiver operating characteristic curve, and the Hosmer-Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson's chi-squared test was used to test the goodness of fit and association. Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose-response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL data, which gave a NPV of 100% with each dataset, and the QUANTEC guidelines, thus validating the cut-off values of 20 and 25 Gy. Based on these results, we believe that the QUANTEC 25/20-Gy spared-gland mean-dose guidelines are clinically useful for avoiding xerostomia in the HN cohort.
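
    For reference, a minimal sketch of the standard LKB formulation with the SEF-fitted parameters reported above (TD50 = 43.6 Gy, m = 0.18, and n = 1, so the EUD reduces to the mean dose) follows; the differential DVH is hypothetical.

      import numpy as np
      from scipy.stats import norm

      def lkb_ntcp(doses, volumes, td50, m, n):
          # Lyman-Kutcher-Burman NTCP from a differential DVH (dose bins, fractional volumes).
          v = np.asarray(volumes, dtype=float)
          v = v / v.sum()
          eud = np.sum(v * np.asarray(doses, dtype=float) ** (1.0 / n)) ** n
          t = (eud - td50) / (m * td50)
          return norm.cdf(t)

      # Hypothetical parotid differential DVH; parameters from the SEF-based fit above.
      dose_bins = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])
      frac_vol  = np.array([0.25, 0.20, 0.20, 0.15, 0.12, 0.08])
      print("NTCP =", lkb_ntcp(dose_bins, frac_vol, td50=43.6, m=0.18, n=1.0))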

  20. Normal Tissue Complication Probability (NTCP) Modelling of Severe Acute Mucositis using a Novel Oral Mucosal Surface Organ at Risk.

    PubMed

    Dean, J A; Welsh, L C; Wong, K H; Aleksic, A; Dunne, E; Islam, M R; Patel, A; Patel, P; Petkar, I; Phillips, I; Sham, J; Schick, U; Newbold, K L; Bhide, S A; Harrington, K J; Nutting, C M; Gulliford, S L

    2017-04-01

    A normal tissue complication probability (NTCP) model of severe acute mucositis would be highly useful to guide clinical decision making and inform radiotherapy planning. We aimed to improve upon our previous model by using a novel oral mucosal surface organ at risk (OAR) in place of an oral cavity OAR. Predictive models of severe acute mucositis were generated using radiotherapy dose to the oral cavity OAR or mucosal surface OAR and clinical data. Penalised logistic regression and random forest classification (RFC) models were generated for both OARs and compared. Internal validation was carried out with 100-iteration stratified shuffle split cross-validation, using multiple metrics to assess different aspects of model performance. Associations between treatment covariates and severe mucositis were explored using RFC feature importance. Penalised logistic regression and RFC models using the oral cavity OAR performed at least as well as the models using mucosal surface OAR. Associations between dose metrics and severe mucositis were similar between the mucosal surface and oral cavity models. The volumes of oral cavity or mucosal surface receiving intermediate and high doses were most strongly associated with severe mucositis. The simpler oral cavity OAR should be preferred over the mucosal surface OAR for NTCP modelling of severe mucositis. We recommend minimising the volume of mucosa receiving intermediate and high doses, where possible. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
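
    A hedged scikit-learn sketch of the internal-validation scheme described (penalized logistic regression with 100-iteration stratified shuffle-split cross-validation and ROC AUC) follows; the feature matrix and outcome labels are synthetic placeholders rather than the trial dose metrics and mucositis outcomes.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import StratifiedShuffleSplit
      from sklearn.metrics import roc_auc_score

      # Synthetic stand-in for OAR dose metrics plus clinical covariates.
      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 5))
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] + 0.5 * X[:, 1]))))

      splitter = StratifiedShuffleSplit(n_splits=100, test_size=0.3, random_state=0)
      aucs = []
      for train, test in splitter.split(X, y):
          model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
          model.fit(X[train], y[train])
          aucs.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))
      print("cross-validated AUC:", np.mean(aucs))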

  1. Male greater sage-grouse movements among leks

    Treesearch

    Aleshia L. Fremgen; Christopher T. Rota; Christopher P. Hansen; Mark A. Rumble; R. Scott Gamo; Joshua J. Millspaugh

    2017-01-01

    Movements among leks by breeding birds (i.e., interlek movements) could affect the population's genetic flow, complicate use of lek counts as a population index, and indicate a change in breeding behavior following a disturbance. We used a Bayesian multi-state mark-recapture model to assess the daily probability of male greater sage-grouse (Centrocercus...

  2. Discounting of Monetary Rewards that are Both Delayed and Probabilistic: Delay and Probability Combine Multiplicatively, not Additively

    PubMed Central

    Vanderveldt, Ariana; Green, Leonard; Myerson, Joel

    2014-01-01

    The value of an outcome is affected both by the delay until its receipt (delay discounting) and by the likelihood of its receipt (probability discounting). Despite being well-described by the same hyperboloid function, delay and probability discounting involve fundamentally different processes, as revealed, for example, by the differential effects of reward amount. Previous research has focused on the discounting of delayed and probabilistic rewards separately, with little research examining more complex situations in which rewards are both delayed and probabilistic. In two experiments, participants made choices between smaller rewards that were both immediate and certain and larger rewards that were both delayed and probabilistic. Analyses revealed significant interactions between delay and probability factors inconsistent with an additive model. In contrast, a hyperboloid discounting model in which delay and probability were combined multiplicatively provided an excellent fit to the data. These results suggest that the hyperboloid is a good descriptor of decision making in complicated monetary choice situations like those people encounter in everyday life. PMID:24933696
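
    The fitted equations are not reproduced in this record; as a hedged sketch, one common way to write the multiplicative hyperboloid combination in LaTeX notation, with Theta the odds against receiving the reward, is (the parameter names k, h, s_D, s_P are illustrative, not the authors' notation):

      V = \frac{A}{(1 + k D)^{s_D}\,(1 + h \Theta)^{s_P}}, \qquad \Theta = \frac{1 - p}{p}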

  3. Probabilistic sensitivity analysis for decision trees with multiple branches: use of the Dirichlet distribution in a Bayesian framework.

    PubMed

    Briggs, Andrew H; Ades, A E; Price, Martin J

    2003-01-01

    In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to assign probability distributions over multiple branch probabilities that appropriately represent uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
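
    A minimal numpy sketch of the approach, assuming hypothetical transition counts out of a single Markov state and a uniform Dirichlet prior, is shown below; every sampled row automatically sums to 1, which is the property motivating the Dirichlet choice, and the prior handles the zero-count branch.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical observed transitions out of one Markov health state,
      # including a branch with zero observed counts.
      counts = np.array([37, 12, 0, 3])      # e.g. stay well, complication, rare event, death
      prior = np.ones_like(counts)           # uniform Dirichlet prior

      samples = rng.dirichlet(counts + prior, size=5000)
      print("posterior mean branch probabilities:", samples.mean(axis=0))
      print("each draw sums to 1:", np.allclose(samples.sum(axis=1), 1.0))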

  4. A spatial model of bird abundance as adjusted for detection probability

    USGS Publications Warehouse

    Gorresen, P.M.; Mcmillan, G.P.; Camp, R.J.; Pratt, T.K.

    2009-01-01

    Modeling the spatial distribution of animals can be complicated by spatial and temporal effects (i.e. spatial autocorrelation and trends in abundance over time) and other factors such as imperfect detection probabilities and observation-related nuisance variables. Recent advances in modeling have demonstrated various approaches that handle most of these factors but which require a degree of sampling effort (e.g. replication) not available to many field studies. We present a two-step approach that addresses these challenges to spatially model species abundance. Habitat, spatial and temporal variables were handled with a Bayesian approach which facilitated modeling hierarchically structured data. Predicted abundance was subsequently adjusted to account for imperfect detection and the area effectively sampled for each species. We provide examples of our modeling approach for two endemic Hawaiian nectarivorous honeycreepers: 'i'iwi Vestiaria coccinea and 'apapane Himatione sanguinea. ?? 2009 Ecography.

  5. Modeling late rectal toxicities based on a parameterized representation of the 3D dose distribution

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike

    2011-04-01

    Many models exist for predicting toxicities based on dose-volume histograms (DVHs) or dose-surface histograms (DSHs). This approach has several drawbacks as firstly the reduction of the dose distribution to a histogram results in the loss of spatial information and secondly the bins of the histograms are highly correlated with each other. Furthermore, some of the complex nonlinear models proposed in the past lack a direct physical interpretation and the ability to predict probabilities rather than binary outcomes. We propose a parameterized representation of the 3D distribution of the dose to the rectal wall which explicitly includes geometrical information in the form of the eccentricity of the dose distribution as well as its lateral and longitudinal extent. We use a nonlinear kernel-based probabilistic model to predict late rectal toxicity based on the parameterized dose distribution and assessed its predictive power using data from the MRC RT01 trial (ISCTRN 47772397). The endpoints under consideration were rectal bleeding, loose stools, and a global toxicity score. We extract simple rules identifying 3D dose patterns related to a specifically low risk of complication. Normal tissue complication probability (NTCP) models based on parameterized representations of geometrical and volumetric measures resulted in areas under the curve (AUCs) of 0.66, 0.63 and 0.67 for predicting rectal bleeding, loose stools and global toxicity, respectively. In comparison, NTCP models based on standard DVHs performed worse and resulted in AUCs of 0.59 for all three endpoints. In conclusion, we have presented low-dimensional, interpretable and nonlinear NTCP models based on the parameterized representation of the dose to the rectal wall. These models had a higher predictive power than models based on standard DVHs and their low dimensionality allowed for the identification of 3D dose patterns related to a low risk of complication.

  6. Towards a model-based patient selection strategy for proton therapy: External validation of photon-derived Normal Tissue Complication Probability models in a head and neck proton therapy cohort

    PubMed Central

    Blanchard, P; Wong, AJ; Gunn, GB; Garden, AS; Mohamed, ASR; Rosenthal, DI; Crutison, J; Wu, R; Zhang, X; Zhu, XR; Mohan, R; Amin, MV; Fuller, CD; Frank, SJ

    2017-01-01

    Objective To externally validate head and neck cancer (HNC) photon-derived normal tissue complication probability (NTCP) models in patients treated with proton beam therapy (PBT). Methods This prospective cohort consisted of HNC patients treated with PBT at a single institution. NTCP models were selected based on the availability of data for validation and evaluated using the leave-one-out cross-validated area under the curve (AUC) for the receiver operating characteristics curve. Results 192 patients were included. The most prevalent tumor site was oropharynx (n=86, 45%), followed by sinonasal (n=28), nasopharyngeal (n=27) or parotid (n=27) tumors. Apart from the prediction of acute mucositis (reduction of AUC of 0.17), the models overall performed well. The validation (PBT) AUC and the published AUC were respectively 0.90 versus 0.88 for feeding tube 6 months post-PBT; 0.70 versus 0.80 for physician rated dysphagia 6 months post-PBT; 0.70 versus 0.80 for dry mouth 6 months post-PBT; and 0.73 versus 0.85 for hypothyroidism 12 months post-PBT. Conclusion While the drop in NTCP model performance was expected in PBT patients, the models showed robustness and remained valid. Further work is warranted, but these results support the validity of the model-based approach for treatment selection for HNC patients. PMID:27641784

  7. A statistical method to estimate low-energy hadronic cross sections

    NASA Astrophysics Data System (ADS)

    Balassa, Gábor; Kovács, Péter; Wolf, György

    2018-02-01

    In this article we propose a model based on the Statistical Bootstrap approach to estimate the cross sections of different hadronic reactions up to a few GeV in c.m.s. energy. The method is based on the idea that, when two particles collide, a so-called fireball is formed, which after a short time decays statistically into a specific final state. To calculate the probabilities, we use a phase space description extended with quark combinatorial factors and the possibility of more than one fireball formation. In a few simple cases the probability of a specific final state can be calculated analytically, and we show that the model is able to reproduce the ratios of the considered cross sections. We also show that the model is able to describe proton-antiproton annihilation at rest. In the latter case we used a numerical method to calculate the more complicated final state probabilities. Additionally, we examined the formation of strange and charmed mesons as well, where we used existing data to fit the relevant model parameters.

  8. On Voxel based Iso-Tumor Control Probability and Iso-Complication Maps for Selective Boosting and Selective Avoidance Intensity Modulated Radiotherapy.

    PubMed

    Kim, Yusung; Tomé, Wolfgang A

    2008-01-01

    Voxel based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an EUD = 84 Gy (equivalent uniform dose) to the entire PTV and selective boosting delivering an EUD = 82 Gy to the entire PTV, are investigated, to illustrate the advantages of this approach over iso-dose maps. Conventional uniform IMRT did yield a more uniform isodose map to the entire PTV while selective boosting did result in a nonuniform isodose map. However, when employing voxel based iso-TCP maps selective boosting exhibited a more uniform tumor control probability map compared to what could be achieved using conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel based iso-Complication maps are presented for rectum and bladder, and their utilization for selective avoidance IMRT strategies are discussed. We believe as the need for functional image guided treatment planning grows, voxel based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans.
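
    A hedged sketch of a voxel-based TCP map follows, using a generic Poisson linear-quadratic voxel response rather than the phenomenological models employed by the authors; the dose slice, clonogen density and radiosensitivity values are illustrative only.

      import numpy as np

      def voxel_tcp(dose, alpha=0.107, beta=0.0107, n_frac=39, rho=1e7, voxel_cc=0.001):
          # Poisson-LQ tumor control probability per voxel for a fractionated dose (Gy).
          d = dose / n_frac                                        # dose per fraction
          sf = np.exp(-n_frac * (alpha * d + beta * d ** 2))       # surviving fraction
          return np.exp(-rho * voxel_cc * sf)                      # Poisson TCP per voxel

      # Hypothetical 2D dose slice through a PTV (Gy); cold spots show up as low-TCP voxels.
      dose_slice = 78.0 + 4.0 * np.random.default_rng(5).normal(size=(64, 64))
      tcp_map = voxel_tcp(dose_slice)
      print("voxel TCP range:", tcp_map.min(), "-", tcp_map.max())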

  9. Biological mechanisms of normal tissue damage: importance for the design of NTCP models.

    PubMed

    Trott, Klaus-Rüdiger; Doerr, Wolfgang; Facoetti, Angelica; Hopewell, John; Langendijk, Johannes; van Luijk, Peter; Ottolenghi, Andrea; Smyth, Vere

    2012-10-01

    The normal tissue complication probability (NTCP) models that are currently being proposed for estimation of risk of harm following radiotherapy are mainly based on simplified empirical models, consisting of dose distribution parameters, possibly combined with clinical or other treatment-related factors. These are fitted to data from retrospective or prospective clinical studies. Although these models sometimes provide useful guidance for clinical practice, their predictive power on individuals seems to be limited. This paper examines the radiobiological mechanisms underlying the most important complications induced by radiotherapy, with the aim of identifying the essential parameters and functional relationships needed for effective predictive NTCP models. The clinical features of the complications are identified and reduced as much as possible into component parts. In a second step, experimental and clinical data are considered in order to identify the gross anatomical structures involved, and which dose distributions lead to these complications. Finally, the pathogenic pathways and cellular and more specific anatomical parameters that have to be considered in this pathway are determined. This analysis is carried out for some of the most critical organs and sites in radiotherapy, i.e. spinal cord, lung, rectum, oropharynx and heart. Signs and symptoms of severe late normal tissue complications present a very variable picture in the different organs at risk. Only in rare instances is the entire organ the critical target which elicits the particular complication. Moreover, the biological mechanisms that are involved in the pathogenesis differ between the different complications, even in the same organ. Different mechanisms are likely to be related to different shapes of dose effect relationships and different relationships between dose per fraction, dose rate, and overall treatment time and effects. There is good reason to conclude that each type of late complication after radiotherapy depends on its own specific mechanism which is triggered by the radiation exposure of particular structures or sub-volumes of (or related to) the respective organ at risk. Hence each complication will need the development of an NTCP model designed to accommodate this structure. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. Dependence of normal brain integral dose and normal tissue complication probability on the prescription isodose values for γ-knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Ma, Lijun

    2001-11-01

    A recent multi-institutional clinical study suggested possible benefits of lowering the prescription isodose lines for stereotactic radiosurgery procedures. In this study, we investigate the dependence of the normal brain integral dose and the normal tissue complication probability (NTCP) on the prescription isodose values for γ-knife radiosurgery. An analytical dose model was developed for γ-knife treatment planning. The dose model was commissioned by fitting the measured dose profiles for each helmet size. The dose model was validated by comparing its results with the Leksell gamma plan (LGP, version 5.30) calculations. The normal brain integral dose and the NTCP were computed and analysed for an ensemble of treatment cases. The functional dependence of the normal brain integral dose and the NTCP versus the prescribing isodose values was studied for these cases. We found that the normal brain integral dose and the NTCP increase significantly when lowering the prescription isodose lines from 50% to 35% of the maximum tumour dose. Alternatively, the normal brain integral dose and the NTCP decrease significantly when raising the prescribing isodose lines from 50% to 65% of the maximum tumour dose. The results may be used as a guideline for designing future dose escalation studies for γ-knife applications.

  11. The role of high fat diet in the development of complications of chronic pancreatitis.

    PubMed

    Castiñeira-Alvariño, M; Lindkvist, B; Luaces-Regueira, M; Iglesias-García, J; Lariño-Noia, J; Nieto-García, L; Domínguez-Muñoz, J E

    2013-10-01

    Little is known about risk factors for complications in chronic pancreatitis (CP). High fat diet (HFD) has been demonstrated to aggravate pancreatic injury in animal models. The aim of this study was to investigate the role of HFD in age at diagnosis of CP and probability of CP related complications. A cross-sectional case-case study was performed within a prospectively collected cohort of patients with CP. Diagnosis and morphological severity of CP was established by endoscopic ultrasound. Pancreatic exocrine insufficiency (PEI) was diagnosed by ¹³C mixed triglyceride breath test. Fat intake was assessed by a specific nutritional questionnaire. Odds ratios (OR) for CP related complications were estimated by multivariate logistic regression analysis. 168 patients were included (128 (76.2%) men, mean age 44 years (SD 13.5)). Etiology of CP was alcohol abuse in 89 patients (53.0%), other causes in 30 (17.9%) and idiopathic in the remaining 49 subjects (29.2%). 24 patients (14.3%) had a HFD. 68 patients (40.5%) had continuous abdominal pain, 39 (23.2%) PEI and 43 (25.7%) morphologically severe CP. HFD was associated with an increased probability for continuous abdominal pain (OR = 2.84 (95% CI, 1.06-7.61)), and a younger age at diagnosis (37.0 ± 13.9 versus 45.8 ± 13.0 years, p = 0.03) but not with CP related complications after adjusting for sex, years of follow-up, alcohol and tobacco consumption, etiology and body mass index. Compared with a normal fat diet, HFD is associated with a younger age at diagnosis of CP and continuous abdominal pain, but not with severity and complications of the disease. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  12. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases when the process is subcritical, critical, and supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  13. Cost-effectiveness of endovascular versus open repair of acute complicated type B aortic dissections.

    PubMed

    Luebke, Thomas; Brunkwall, Jan

    2014-05-01

    This study weighed the cost and benefit of thoracic endovascular aortic repair (TEVAR) vs open repair (OR) in the treatment of an acute complicated type B aortic dissection (TBAD) by estimating the cost-effectiveness to determine an optimal treatment strategy based on the best currently available evidence. A cost-utility analysis from the perspective of the health system payer was performed using a decision analytic model. Within this model, the 1-year survival, quality-adjusted life-years (QALYs), and costs for a hypothetical cohort of patients with an acute complicated TBAD managed with TEVAR or OR were evaluated. Clinical effectiveness data, cost data, and transitional probabilities of different health states were derived from previously published high-quality studies or meta-analyses. Probabilistic sensitivity analyses were performed on uncertain model parameters. The base-case analysis showed, in terms of QALYs, that OR appeared to be more expensive (incremental cost of €17,252.60) and less effective (-0.19 QALYs) compared with TEVAR; hence, in terms of the incremental cost-effectiveness ratio, OR was dominated by TEVAR. As a result, the incremental cost-effectiveness ratio (ie, the cost per life-year saved) was not calculated. The average cost-effectiveness ratio of TEVAR and OR per QALY gained was €56,316.79 and €108,421.91, respectively. In probabilistic sensitivity analyses, TEVAR was economically dominant in 100% of cases. The probability that TEVAR was economically attractive at a willingness-to-pay threshold of €50,000/QALY gained was 100%. The present results suggest that TEVAR yielded more QALYs and was associated with lower 1-year costs compared with OR in patients with an acute complicated TBAD. As a result, from the cost-effectiveness point of view, TEVAR is the dominant therapy over OR for this disease under the predefined conditions. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  14. Radiotherapy of soft tissue sarcomas in dogs.

    PubMed

    McChesney, S L; Withrow, S J; Gillette, E L; Powers, B E; Dewhirst, M W

    1989-01-01

    Megavoltage radiotherapy was administered to 42 dogs with soft tissue sarcoma. Acceptable local control of these aggressive tumors was achieved at one year after treatment. Control rates of 48 and 67% were obtained at doses of 45 and 50 gray (Gy), respectively. At 2 years, control rates decreased to 33% at the dose of 50 Gy. Serious complications developed in 4 of 42 dogs at doses of 40 to 50 Gy. The estimated dose with a 50% probability for causing serious complications was 54 Gy, given in 10 fractions. We believe that the large doses per fraction used in this study probably led to an increased probability for necrosis. Hemangiopericytomas seemed to be more responsive than fibrosarcomas. Only 2 of 11 recurrent tumors were controlled with surgery. Good local control was achieved with radiation alone for one year at doses with a low probability for serious complications; however, higher total radiation doses or combined modalities, such as surgery and radiation or radiation and hyperthermia, may be needed for longer-term control.

  15. On the radiobiological impact of metal artifacts in head-and-neck IMRT in terms of tumor control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Kim, Yusung; Tomé, Wolfgang A

    2007-11-01

    To investigate the effects of distorted head-and-neck (H&N) intensity-modulated radiation therapy (IMRT) dose distributions (hot and cold spots) on normal tissue complication probability (NTCP) and tumor control probability (TCP) due to dental-metal artifacts. Five patients' IMRT treatment plans have been analyzed, employing five different planning image data-sets: (a) uncorrected (UC); (b) homogeneous uncorrected (HUC); (c) sinogram completion corrected (SCC); (d) minimum-value-corrected (MVC); and (e) streak-artifact-reduction including minimum-value-correction (SAR-MVC), which has been taken as the reference data-set. The effects on NTCP and TCP were evaluated using the Lyman-NTCP model and the Logistic-TCP model, respectively. When compared to the predicted NTCP obtained using the reference data-set, the treatment plan based on the original CT data-set (UC) yielded an increase in NTCP of 3.2 and 2.0% for the spared parotid gland and the spinal cord, respectively. For the treatment plans based on the MVC CT data-set, the NTCP increased by 1.1% and 0.1% for the spared parotid glands and the spinal cord, respectively. In addition, the MVC correction method showed a reduction in TCP for target volumes (MVC: delta TCP = -0.6% vs. UC: delta TCP = -1.9%) with respect to that of the reference CT data-set. Our results indicate that the presence of dental-metal artifacts in H&N planning CT data-sets has an impact on the estimates of TCP and NTCP. In particular, dental-metal artifacts lead to an increase in NTCP for the spared parotid glands and a slight decrease in TCP for target volumes.

  16. Reduction of cardiac and pulmonary complication probabilities after breathing adapted radiotherapy for breast cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korreman, Stine S.; Pedersen, Anders N.; Aarup, Lasse Rye

    Purpose: Substantial reductions of cardio-pulmonary radiation doses can be achieved using voluntary deep inspiration breath-hold (DIBH) or free breathing inspiration gating (IG) in radiotherapy after conserving surgery for breast cancer. The purpose of this study is to evaluate the radiobiological implications of such dosimetric benefits. Methods and Materials: Patients from previously reported studies were pooled for a total of 33 patients. All patients underwent DIBH and free breathing (FB) scans, and 17 patients underwent an additional IG scan. Tangential conformal treatment plans covering the remaining breast, internal mammary, and periclavicular nodes were optimized for each scan, prescription dose 48 Gy. Normal tissue complication probabilities were calculated using the relative seriality model for the heart, and the model proposed by Burman et al. for the lung. Results: Previous computed tomography studies showed that both voluntary DIBH and IG provided reduction of the lung V5 (relative volume receiving more than 50% of prescription dose) on the order of 30-40%, and an 80-90% reduction of the heart V5 for left-sided cancers. Corresponding pneumonitis probability of 28.1% (range, 0.7-95.6%) for FB could be reduced to 2.6% (range, 0.1-40.1%) for IG, and 4.3% (range, 0.1-59%) for DIBH. The cardiac mortality probability could be reduced from 4.8% (range, 0.1-23.4%) in FB to 0.5% (range, 0.1-2.6%) for IG and 0.1% (range, 0-3.0%) for DIBH. Conclusions: Remarkable potential is shown for simple voluntary DIBH and free breathing IG to reduce the risk of both cardiac mortality and pneumonitis for the common technique of adjuvant tangential breast irradiation.
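
    A hedged sketch of the relative seriality (Källman-type) NTCP formulation used above for the heart follows; the differential DVH and the parameter values (D50, gamma, s) are illustrative placeholders, not the study's inputs.

      import numpy as np

      def poisson_response(dose, d50, gamma):
          # Poisson dose-response for uniform whole-organ irradiation.
          return 2.0 ** (-np.exp(np.e * gamma * (1.0 - dose / d50)))

      def relative_seriality_ntcp(doses, volumes, d50, gamma, s):
          # Relative seriality NTCP from a differential DVH (dose bins, fractional volumes).
          v = np.asarray(volumes, dtype=float)
          v = v / v.sum()
          p = poisson_response(np.asarray(doses, dtype=float), d50, gamma)
          return (1.0 - np.prod((1.0 - p ** s) ** v)) ** (1.0 / s)

      # Hypothetical cardiac differential DVH (dose bins in Gy, fractional volumes).
      dose_bins = np.array([2.0, 10.0, 20.0, 30.0, 40.0])
      frac_vol  = np.array([0.70, 0.15, 0.08, 0.05, 0.02])
      print("NTCP =", relative_seriality_ntcp(dose_bins, frac_vol, d50=52.3, gamma=1.28, s=1.0))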

  17. The effects of small field dosimetry on the biological models used in evaluating IMRT dose distributions

    NASA Astrophysics Data System (ADS)

    Cardarelli, Gene A.

    The primary goal in radiation oncology is to deliver lethal radiation doses to tumors, while minimizing dose to normal tissue. IMRT has the capability to increase the dose to the targets and decrease the dose to normal tissue, increasing local control, decrease toxicity and allow for effective dose escalation. This advanced technology does present complex dose distributions that are not easily verified. Furthermore, the dose inhomogeneity caused by non-uniform dose distributions seen in IMRT treatments has caused the development of biological models attempting to characterize the dose-volume effect in the response of organized tissues to radiation. Dosimetry of small fields can be quite challenging when measuring dose distributions for high-energy X-ray beams used in IMRT. The proper modeling of these small field distributions is essential in reproducing accurate dose for IMRT. This evaluation was conducted to quantify the effects of small field dosimetry on IMRT plan dose distributions and the effects on four biological model parameters. The four biological models evaluated were: (1) the generalized Equivalent Uniform Dose (gEUD), (2) the Tumor Control Probability (TCP), (3) the Normal Tissue Complication Probability (NTCP) and (4) the Probability of uncomplicated Tumor Control (P+). These models are used to estimate local control, survival, complications and uncomplicated tumor control. This investigation compares three distinct small field dose algorithms. Dose algorithms were created using film, small ion chamber, and a combination of ion chamber measurements and small field fitting parameters. Due to the nature of uncertainties in small field dosimetry and the dependence of biological models on dose volume information, this examination quantifies the effects of small field dosimetry techniques on radiobiological models and recommends pathways to reduce the errors in using these models to evaluate IMRT dose distributions. This study demonstrates the importance of valid physical dose modeling prior to the use of biological modeling. The success of using biological function data, such as hypoxia, in clinical IMRT planning will greatly benefit from the results of this study.
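
    As background for the gEUD listed as model (1) above, a minimal sketch is given below; the DVH and the volume-effect parameter a are illustrative, not values from the dissertation.

      import numpy as np

      def geud(doses, volumes, a):
          # Generalized equivalent uniform dose from a differential DVH.
          # Large a approaches the maximum dose (serial organs); a = 1 gives the mean dose.
          v = np.asarray(volumes, dtype=float)
          v = v / v.sum()
          return np.sum(v * np.asarray(doses, dtype=float) ** a) ** (1.0 / a)

      dose_bins = np.array([10.0, 30.0, 50.0, 66.0])   # Gy, hypothetical
      frac_vol  = np.array([0.40, 0.30, 0.20, 0.10])
      for a in (1.0, 4.0, 10.0):
          print(f"a = {a:4.1f}  gEUD = {geud(dose_bins, frac_vol, a):.1f} Gy")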

  18. Multivariate Normal Tissue Complication Probability Modeling of Heart Valve Dysfunction in Hodgkin Lymphoma Survivors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cella, Laura, E-mail: laura.cella@cnr.it; Department of Advanced Biomedical Sciences, Federico II University School of Medicine, Naples; Liuzzi, Raffaele

    Purpose: To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced asymptomatic heart valvular defects (RVD). Methods and Materials: Fifty-six patients treated with sequential chemoradiation therapy for Hodgkin lymphoma (HL) were retrospectively reviewed for RVD events. Clinical information along with whole heart, cardiac chambers, and lung dose distribution parameters was collected, and the correlations to RVD were analyzed by means of Spearman's rank correlation coefficient (Rs). For the selection of the model order and parameters for NTCP modeling, a multivariate logistic regression method using resampling techniques (bootstrapping) was applied. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC). Results: When we analyzed the whole heart, a 3-variable NTCP model including the maximum dose, whole heart volume, and lung volume was shown to be the optimal predictive model for RVD (Rs = 0.573, P<.001, AUC = 0.83). When we analyzed the cardiac chambers individually, for the left atrium and for the left ventricle, an NTCP model based on 3 variables including the percentage volume exceeding 30 Gy (V30), cardiac chamber volume, and lung volume was selected as the most predictive model (Rs = 0.539, P<.001, AUC = 0.83; and Rs = 0.557, P<.001, AUC = 0.82, respectively). The NTCP values increase as heart maximum dose or cardiac chambers V30 increase. They also increase with larger volumes of the heart or cardiac chambers and decrease when lung volume is larger. Conclusions: We propose logistic NTCP models for RVD considering not only heart irradiation dose but also the combined effects of lung and heart volumes. Our study establishes the statistical evidence of the indirect effect of lung size on radio-induced heart toxicity.

  19. Parotid Gland Function After Radiotherapy: The Combined Michigan and Utrecht Experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dijkema, Tim, E-mail: T.Dijkema@umcutrecht.n; Raaijmakers, Cornelis P.J.; Ten Haken, Randall K.

    2010-10-01

    Purpose: To analyze the combined and updated results from the University of Michigan and University Medical Center Utrecht on normal tissue complication probability (NTCP) of the parotid gland 1 year after radiotherapy (RT) for head-and-neck (HN) cancer. Patients and Methods: A total of 222 prospectively analyzed patients with various HN malignancies were treated with conventional and intensity-modulated RT. Stimulated individual parotid gland flow rates were measured before RT and 1 year after RT using Lashley cups at both centers. A flow ratio <25% of pretreatment was defined as a complication. The data were fitted to the Lyman-Kutcher-Burman model. Results: A total of 384 parotid glands (Michigan: 157; Utrecht: 227 glands) was available for analysis 1 year after RT. Combined NTCP analysis based on mean dose resulted in a TD50 (uniform dose leading to 50% complication probability) of 39.9 Gy and m (steepness of the curve) of 0.40. The resulting NTCP curve had good qualitative agreement with the combined clinical data. Mean doses of 25-30 Gy were associated with 17-26% NTCP. Conclusions: A definite NTCP curve for parotid gland function 1 year after RT is presented, based on mean dose. No threshold dose was observed, and TD50 was equal to 40 Gy.

  20. On Voxel based Iso-Tumor Control Probability and Iso-Complication Maps for Selective Boosting and Selective Avoidance Intensity Modulated Radiotherapy

    PubMed Central

    Kim, Yusung; Tomé, Wolfgang A.

    2010-01-01

    Summary Voxel based iso-Tumor Control Probability (TCP) maps and iso-Complication maps are proposed as a plan-review tool especially for functional image-guided intensity-modulated radiotherapy (IMRT) strategies such as selective boosting (dose painting) and conformal avoidance IMRT. The maps employ voxel-based phenomenological biological dose-response models for target volumes and normal organs. Two IMRT strategies for prostate cancer, namely conventional uniform IMRT delivering an EUD = 84 Gy (equivalent uniform dose) to the entire PTV and selective boosting delivering an EUD = 82 Gy to the entire PTV, are investigated, to illustrate the advantages of this approach over iso-dose maps. Conventional uniform IMRT did yield a more uniform isodose map to the entire PTV while selective boosting did result in a nonuniform isodose map. However, when employing voxel based iso-TCP maps selective boosting exhibited a more uniform tumor control probability map compared to what could be achieved using conventional uniform IMRT, which showed TCP cold spots in high-risk tumor subvolumes despite delivering a higher EUD to the entire PTV. Voxel based iso-Complication maps are presented for rectum and bladder, and their utilization for selective avoidance IMRT strategies are discussed. We believe as the need for functional image guided treatment planning grows, voxel based iso-TCP and iso-Complication maps will become an important tool to assess the integrity of such treatment plans. PMID:21151734

  1. Derivation of the expressions for γ50 and D50 for different individual TCP and NTCP models

    NASA Astrophysics Data System (ADS)

    Stavreva, N.; Stavrev, P.; Warkentin, B.; Fallone, B. G.

    2002-10-01

    This paper presents a complete set of formulae for the position (D50) and the normalized slope (γ50) of the dose-response relationship based on the most commonly used radiobiological models for tumours as well as for normal tissues. The functional subunit response models (critical element and critical volume) are used in the derivation of the formulae for the normal tissue. Binomial statistics are used to describe the tumour control probability, the functional subunit response as well as the normal tissue complication probability. The formulae are derived for the single hit and linear quadratic models of cell kill in terms of the number of fractions and dose per fraction. It is shown that the functional subunit models predict very steep, almost step-like, normal tissue individual dose-response relationships. Furthermore, the formulae for the normalized gradient depend on the cellular parameters α and β when written in terms of number of fractions, but not when written in terms of dose per fraction.
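
    For orientation only, the quantities being derived can be written in one common logistic parameterization (LaTeX notation; this is background notation, not the paper's derived expressions):

      \gamma_{50} = D_{50} \left.\frac{dP}{dD}\right|_{D = D_{50}}, \qquad P(D) \approx \frac{1}{1 + (D_{50}/D)^{4\gamma_{50}}}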

  2. Biological Modeling Based Outcome Analysis (BMOA) in 3D Conformal Radiation Therapy (3DCRT) Treatments for Lung and Breast Cancers

    NASA Astrophysics Data System (ADS)

    Pyakuryal, Anil; Chen, Chiu-Hao; Dhungana, Sudarshan

    2010-03-01

    3DCRT treatments are the most commonly used techniques in the treatment of lung and breast cancers. The purpose of this study was to perform the BMOA of the 3DCRT plans designed for the treatment of breast and lung cancers utilizing HART program (Med. Phys. 36, p.2547(2009)). The BMOA parameters include normal tissue complication probability (NTCP), tumor control probability (TCP), and the complication-free tumor control probability (P+). The 3DCRT plans were designed for (i) the palliative treatment of 8 left lung cancer patients (CPs) at early stage (m=8), (ii) the curative treatment of 8 left lung CPs at stages II and III (k=8), and (iii) the curative treatment of 8 left breast CPs (n=8). The NTCPs were noticeably small (<2%) for heart, lungs and cord in both types of treatments except for the esophagus in lung CPs (k=8). Assessments of the TCPs and P+s also indicated good improvements in local tumor control in all plans. Homogeneous target coverage and improved dose conformality were the major advantages of such techniques in the treatment of breast cancer. These achievements support the efficacy of the 3DCRT techniques for the efficient treatment of various types of cancer.
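
    One common definition of the complication-free tumor control probability, assuming statistically independent tumor and normal-tissue responses (the HART program may use a different formulation), is:

      P^{+} = \mathrm{TCP}\,(1 - \mathrm{NTCP})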

  3. Assessing the uncertainty in a normal tissue complication probability difference (∆NTCP): radiation-induced liver disease (RILD) in liver tumour patients treated with proton vs X-ray therapy.

    PubMed

    Kobashi, Keiji; Prayongrat, Anussara; Kimoto, Takuya; Toramatsu, Chie; Dekura, Yasuhiro; Katoh, Norio; Shimizu, Shinichi; Ito, Yoichi M; Shirato, Hiroki

    2018-03-01

    Modern radiotherapy technologies such as proton beam therapy (PBT) permit dose escalation to the tumour and minimize unnecessary doses to normal tissues. To achieve appropriate patient selection for PBT, a normal tissue complication probability (NTCP) model can be applied to estimate the risk of treatment-related toxicity relative to X-ray therapy (XRT). A methodology for estimating the difference in NTCP (∆NTCP), including its uncertainty as a function of dose to normal tissue, is described in this study using the Delta method, a statistical method for evaluating the variance of functions, considering the variance-covariance matrix. We used a virtual individual patient dataset of radiation-induced liver disease (RILD) in liver tumour patients who were treated with XRT as a study model. As an alternative option for individual patient data, dose-bin data, which consists of the number of patients who developed toxicity in each dose level/bin and the total number of patients in that dose level/bin, are useful for multi-institutional data sharing. It provides comparable accuracy with individual patient data when using the Delta method. With reliable NTCP models, the ∆NTCP with uncertainty might potentially guide the use of PBT; however, clinical validation and a cost-effectiveness study are needed to determine the appropriate ∆NTCP threshold.
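
    In outline, the Delta method referred to above propagates the variance-covariance matrix of the fitted NTCP parameters into an approximate variance for the NTCP difference between the X-ray and proton dose metrics d_X and d_P (a sketch of the general form in LaTeX notation, not the paper's specific derivation):

      \Delta\mathrm{NTCP} = \mathrm{NTCP}(d_X;\hat{\theta}) - \mathrm{NTCP}(d_P;\hat{\theta}), \qquad \mathrm{Var}\bigl[\Delta\mathrm{NTCP}\bigr] \approx \nabla_{\theta}\Delta\mathrm{NTCP}^{\mathsf{T}}\,\hat{\Sigma}\,\nabla_{\theta}\Delta\mathrm{NTCP}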

  4. Assessing the uncertainty in a normal tissue complication probability difference (∆NTCP): radiation-induced liver disease (RILD) in liver tumour patients treated with proton vs X-ray therapy

    PubMed Central

    Kobashi, Keiji; Kimoto, Takuya; Toramatsu, Chie; Dekura, Yasuhiro; Katoh, Norio; Shimizu, Shinichi; Ito, Yoichi M; Shirato, Hiroki

    2018-01-01

    Modern radiotherapy technologies such as proton beam therapy (PBT) permit dose escalation to the tumour and minimize unnecessary doses to normal tissues. To achieve appropriate patient selection for PBT, a normal tissue complication probability (NTCP) model can be applied to estimate the risk of treatment-related toxicity relative to X-ray therapy (XRT). A methodology for estimating the difference in NTCP (∆NTCP), including its uncertainty as a function of dose to normal tissue, is described in this study using the Delta method, a statistical method for evaluating the variance of functions, considering the variance–covariance matrix. We used a virtual individual patient dataset of radiation-induced liver disease (RILD) in liver tumour patients who were treated with XRT as a study model. As an alternative option for individual patient data, dose-bin data, which consists of the number of patients who developed toxicity in each dose level/bin and the total number of patients in that dose level/bin, are useful for multi-institutional data sharing. It provides comparable accuracy with individual patient data when using the Delta method. With reliable NTCP models, the ∆NTCP with uncertainty might potentially guide the use of PBT; however, clinical validation and a cost-effectiveness study are needed to determine the appropriate ∆NTCP threshold. PMID:29538699

  5. Neyman, Markov processes and survival analysis.

    PubMed

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.

  6. Location identification for indoor instantaneous point contaminant source by probability-based inverse Computational Fluid Dynamics modeling.

    PubMed

    Liu, X; Zhai, Z

    2008-02-01

    Indoor pollution jeopardizes human health and welfare and may even cause serious morbidity and mortality under extreme conditions. Effectively controlling and improving indoor environmental quality requires immediate interpretation of pollutant sensor readings and accurate identification of indoor pollution history and source characteristics (e.g. source location and release time). This procedure is complicated by non-uniform and dynamic indoor contaminant dispersion behaviors as well as diverse sensor network distributions. This paper introduces a probability-concept-based inverse modeling method that is able to identify the source location for an instantaneous point source placed in an enclosed environment with known source release time. The study presents the mathematical models that address three different sensing scenarios: sensors without concentration readings, sensors with spatial concentration readings, and sensors with temporal concentration readings. The paper demonstrates the inverse modeling method and algorithm with two case studies: air pollution in an office space and in an aircraft cabin. The predictions were successfully verified against the forward simulation settings, indicating good capability of the method in finding indoor pollutant sources. The research lays a solid ground for further study of the method for more complicated indoor contamination problems. The method developed can help track indoor contaminant source location with limited sensor outputs. This will ensure an effective and prompt execution of building control strategies and thus achieve a healthy and safe indoor environment. The method can also assist the design of optimal sensor networks.
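    The paper's method is an inverse CFD formulation; as a loose, simplified analogue only, the sketch below performs a discrete Bayesian comparison of candidate source locations against sensor readings, using forward-model predictions and a Gaussian error model. All numbers and the error model are hypothetical.

```python
import numpy as np

def update_source_probabilities(prior, predicted, measured, sigma):
    """Posterior probability that each candidate location is the source,
    given sensor readings and forward-model predictions (Gaussian error model)."""
    prior = np.asarray(prior, dtype=float)
    resid = np.asarray(predicted, dtype=float) - np.asarray(measured, dtype=float)
    loglik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
    post = prior * np.exp(loglik - loglik.max())   # subtract max for numerical stability
    return post / post.sum()

# Three candidate locations, two sensors; every number below is hypothetical
predicted = [[5.0, 1.0],   # forward-model concentrations if the source were at A
             [2.0, 2.0],   # ... at B
             [0.5, 4.0]]   # ... at C
print(update_source_probabilities([1/3, 1/3, 1/3], predicted, [4.6, 1.2], sigma=0.5))
```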

  7. Mathematical Modelling for Patient Selection in Proton Therapy.

    PubMed

    Mee, T; Kirkby, N F; Kirkby, K J

    2018-05-01

    Proton beam therapy (PBT) is still relatively new in cancer treatment and the clinical evidence base is relatively sparse. Mathematical modelling offers assistance when selecting patients for PBT and predicting the demand for service. Discrete event simulation, normal tissue complication probability, quality-adjusted life-years and Markov Chain models are all mathematical and statistical modelling techniques currently used but none is dominant. As new evidence and outcome data become available from PBT, comprehensive models will emerge that are less dependent on the specific technologies of radiotherapy planning and delivery. Copyright © 2018 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  8. Predicting Grade 3 Acute Diarrhea During Radiation Therapy for Rectal Cancer Using a Cutoff-Dose Logistic Regression Normal Tissue Complication Probability Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, John M., E-mail: jrobertson@beaumont.ed; Soehn, Matthias; Yan Di

    Purpose: Understanding the dose-volume relationship of small bowel irradiation and severe acute diarrhea may help reduce the incidence of this side effect during adjuvant treatment for rectal cancer. Methods and Materials: Consecutive patients treated curatively for rectal cancer were reviewed, and the maximum grade of acute diarrhea was determined. The small bowel was outlined on the treatment planning CT scan, and a dose-volume histogram was calculated for the initial pelvic treatment (45 Gy). Logistic regression models were fitted for varying cutoff-dose levels from 5 to 45 Gy in 5-Gy increments. The model with the highest LogLikelihood was used to develop a cutoff-dose normal tissue complication probability (NTCP) model. Results: There were a total of 152 patients (48% preoperative, 47% postoperative, 5% other), predominantly treated prone (95%) with a three-field technique (94%) and a protracted venous infusion of 5-fluorouracil (78%). Acute Grade 3 diarrhea occurred in 21%. The largest LogLikelihood was found for the cutoff-dose logistic regression model with 15 Gy as the cutoff-dose, although the models for 20 Gy and 25 Gy had similar significance. According to this model, highly significant correlations (p <0.001) between small bowel volumes receiving at least 15 Gy and toxicity exist in the considered patient population. Similar findings applied to both the preoperatively (p = 0.001) and postoperatively irradiated groups (p = 0.001). Conclusion: The incidence of Grade 3 diarrhea was significantly correlated with the volume of small bowel receiving at least 15 Gy using a cutoff-dose NTCP model.
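    A schematic version of the cutoff-dose scan described above: for each candidate cutoff, the small-bowel volume receiving at least that dose is used as the single logistic predictor, and the cutoff whose fit has the highest log-likelihood is retained. The DVHs, outcomes, and fitting details below are simulated placeholders, not the study data.

```python
import numpy as np
from scipy.optimize import minimize

def fit_logistic(x, y):
    """Two-parameter logistic fit by maximum likelihood; returns (beta, log-likelihood)."""
    X = np.column_stack([np.ones_like(x, dtype=float), x])
    def negloglik(beta):
        p = np.clip(1.0 / (1.0 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    res = minimize(negloglik, x0=np.zeros(2), method="Nelder-Mead")
    return res.x, -res.fun

def best_cutoff_model(dvh_doses, dvh_volumes, toxicity, cutoffs=range(5, 50, 5)):
    """For each cutoff dose, use the volume receiving >= cutoff as the single
    predictor and keep the cutoff with the highest log-likelihood."""
    toxicity = np.asarray(toxicity, dtype=float)
    best = None
    for dcut in cutoffs:
        v = np.array([np.interp(dcut, d, vol) for d, vol in zip(dvh_doses, dvh_volumes)])
        v_std = (v - v.mean()) / (v.std() + 1e-9)       # standardise for stability
        beta, loglik = fit_logistic(v_std, toxicity)
        if best is None or loglik > best[2]:
            best = (dcut, beta, loglik)
    return best

# Hypothetical cumulative small-bowel DVHs (cc vs Gy) for 30 patients
rng = np.random.default_rng(0)
dose_grid = np.arange(0.0, 50.0, 5.0)
v0, tau = rng.uniform(200, 600, 30), rng.uniform(10, 25, 30)
dvh_doses = [dose_grid] * 30
dvh_volumes = [v0[i] * np.exp(-dose_grid / tau[i]) for i in range(30)]
v15 = np.array([np.interp(15.0, dose_grid, v) for v in dvh_volumes])
toxicity = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.02 * (v15 - 250.0))))
print(best_cutoff_model(dvh_doses, dvh_volumes, toxicity))
```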

  9. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
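    A minimal sketch of one of the compared approaches (stabilised weights from a homoscedastic normal exposure model); the data-generating step is synthetic and a least-squares exposure model is only one of the options examined in the paper.

```python
import numpy as np
from scipy.stats import norm

def stabilized_weights_normal(a, X):
    """Stabilised inverse probability weights for a continuous exposure,
    assuming a normal exposure model with constant (homoscedastic) variance.

    a : exposure vector; X : covariate matrix (intercept added internally)
    """
    Xd = np.column_stack([np.ones(len(a)), X])
    beta, *_ = np.linalg.lstsq(Xd, a, rcond=None)               # E[A | X] by least squares
    mu = Xd @ beta
    sd_cond = (a - mu).std(ddof=Xd.shape[1])
    dens_cond = norm.pdf(a, loc=mu, scale=sd_cond)              # denominator density
    dens_marg = norm.pdf(a, loc=a.mean(), scale=a.std(ddof=1))  # numerator density
    return dens_marg / dens_cond

# Synthetic example: exposure depends on a single confounder
rng = np.random.default_rng(1)
x = rng.normal(size=500)
a = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=500)
w = stabilized_weights_normal(a, x[:, None])
print(round(w.mean(), 3), np.round(np.percentile(w, [1, 99]), 2))  # mean should be near 1
```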

  10. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze each source separately, ignoring the combined impact of multi-source uncertainties. To evaluate this combined uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrated data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the combined uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty acts as a simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution that evaluates the combined uncertainty of the geological model and represents the joint impact of all the uncertain factors on its spatial structure. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach for studying the uncertainty propagation mechanism in geological modeling.

  11. Combining the ASA Physical Classification System and Continuous Intraoperative Surgical Apgar Score Measurement in Predicting Postoperative Risk.

    PubMed

    Jering, Monika Zdenka; Marolen, Khensani N; Shotwell, Matthew S; Denton, Jason N; Sandberg, Warren S; Ehrenfeld, Jesse Menachem

    2015-11-01

    The surgical Apgar score predicts major 30-day postoperative complications using data assessed at the end of surgery. We hypothesized that evaluating the surgical Apgar score continuously during surgery may identify patients at high risk for postoperative complications. We retrospectively identified general, vascular, and general oncology patients at Vanderbilt University Medical Center. Logistic regression methods were used to construct a series of predictive models in order to continuously estimate the risk of major postoperative complications, and to alert care providers during surgery should the risk exceed a given threshold. Area under the receiver operating characteristic curve (AUROC) was used to evaluate the discriminative ability of a model utilizing a continuously measured surgical Apgar score relative to models that use only preoperative clinical factors or continuously monitored individual constituents of the surgical Apgar score (i.e. heart rate, blood pressure, and blood loss). AUROC estimates were validated internally using a bootstrap method. 4,728 patients were included. Combining the ASA PS classification with continuously measured surgical Apgar score demonstrated improved discriminative ability (AUROC 0.80) in the pooled cohort compared to ASA (0.73) and the surgical Apgar score alone (0.74). To optimize the tradeoff between inadequate and excessive alerting with future real-time notifications, we recommend a threshold probability of 0.24. Continuous assessment of the surgical Apgar score is predictive for major postoperative complications. In the future, real-time notifications might allow for detection and mitigation of changes in a patient's accumulating risk of complications during a surgical procedure.
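    A toy illustration of how a continuously updated surgical Apgar score could be combined with ASA class in a logistic risk estimate and checked against the alert threshold of 0.24 mentioned above; the coefficients are invented for illustration and are not the fitted model.

```python
import math

def complication_risk(asa_class, running_apgar, b0=-1.2, b_asa=0.55, b_apgar=-0.35):
    """Illustrative running risk estimate combining ASA class with a continuously
    updated surgical Apgar score (all coefficients are hypothetical)."""
    z = b0 + b_asa * asa_class + b_apgar * running_apgar
    return 1.0 / (1.0 + math.exp(-z))

ALERT_THRESHOLD = 0.24   # threshold probability suggested in the abstract

# Example: ASA III patient whose intraoperative Apgar score drops from 8 to 4
for apgar in (8, 6, 4):
    p = complication_risk(asa_class=3, running_apgar=apgar)
    print(apgar, round(p, 3), "ALERT" if p > ALERT_THRESHOLD else "ok")
```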

  12. Fire-probability maps for the Brazilian Amazonia

    NASA Astrophysics Data System (ADS)

    Cardoso, M.; Nobre, C.; Obregon, G.; Sampaio, G.

    2009-04-01

    Most fires in Amazonia result from the combination between climate and land-use factors. They occur mainly in the dry season and are used as an inexpensive tool for land clearing and management. However, their unintended consequences are of important concern. Fire emissions are the most important sources of greenhouse gases and aerosols in the region, accidental fires are a major threat to protected areas, and frequent fires may lead to permanent conversion of forest areas into savannas. Fire-activity models have thus become important tools for environmental analyses in Amazonia. They are used, for example, in warning systems for monitoring the risk of burnings in protected areas, to improve the description of biogeochemical cycles and vegetation composition in ecosystem models, and to help estimate the long-term potential for savannas in biome models. Previous modeling studies for the whole region were produced in units of satellite fire pixels, which complicate their direct use for environmental applications. By reinterpreting remote-sensing based data using a statistical approach, we were able to calibrate models for the whole region in units of probability, or chance of fires to occur. The application of these models for years 2005 and 2006 provided maps of fire potential at 3-month and 0.25-deg resolution as a function of precipitation and distance from main roads. In both years, the performance of the resulting maps was better for the period July-September. During these months, most of satellite-based fire observations were located in areas with relatively high chance of fire, as determined by the modeled probability maps. In addition to reproduce reasonably well the areas presenting maximum fire activity as detected by remote sensing, the new results in units of probability are easier to apply than previous estimates from fire-pixel models.

  13. Fire-probability maps for the Brazilian Amazonia

    NASA Astrophysics Data System (ADS)

    Cardoso, Manoel; Sampaio, Gilvan; Obregon, Guillermo; Nobre, Carlos

    2010-05-01

    Most fires in Amazonia result from the combination between climate and land-use factors. They occur mainly in the dry season and are used as an inexpensive tool for land clearing and management. However, their unintended consequences are of important concern. Fire emissions are the most important sources of greenhouse gases and aerosols in the region, accidental fires are a major threat to protected areas, and frequent fires may lead to permanent conversion of forest areas into savannas. Fire-activity models have thus become important tools for environmental analyses in Amazonia. They are used, for example, in warning systems for monitoring the risk of burnings in protected areas, to improve the description of biogeochemical cycles and vegetation composition in ecosystem models, and to help estimate the long-term potential for savannas in biome models. Previous modeling studies for the whole region were produced in units of satellite fire pixels, which complicate their direct use for environmental applications. By reinterpreting remote-sensing based data using a statistical approach, we were able to calibrate models for the whole region in units of probability, or chance of fires to occur. The application of these models for years 2005 and 2006 provided maps of fire potential at 3-month and 0.25-deg resolution as a function of precipitation and distance from main roads. In both years, the performance of the resulting maps was better for the period July-September. During these months, most of satellite-based fire observations were located in areas with relatively high chance of fire, as determined by the modeled probability maps. In addition to reproduce reasonably well the areas presenting maximum fire activity as detected by remote sensing, the new results in units of probability are easier to apply than previous estimates from fire-pixel models.
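    The two records above describe the same probability mapping; a toy logistic version with invented coefficients (precipitation and distance to the nearest main road as predictors) might look like the following.

```python
import math

def fire_probability(precip_mm, road_dist_km, b0=1.5, b_precip=-0.015, b_road=-0.04):
    """Illustrative 3-month fire probability from precipitation and distance to the
    nearest main road (logistic form; the coefficients are hypothetical)."""
    z = b0 + b_precip * precip_mm + b_road * road_dist_km
    return 1.0 / (1.0 + math.exp(-z))

# Dry grid cell close to a road vs. wet cell far from roads
print(round(fire_probability(precip_mm=50, road_dist_km=5), 2))    # high chance of fire
print(round(fire_probability(precip_mm=600, road_dist_km=80), 4))  # very low chance
```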

  14. Method for Automatic Selection of Parameters in Normal Tissue Complication Probability Modeling.

    PubMed

    Christophides, Damianos; Appelt, Ane L; Gusnanto, Arief; Lilley, John; Sebag-Montefiore, David

    2018-07-01

    To present a fully automatic method to generate multiparameter normal tissue complication probability (NTCP) models and compare its results with those of a published model, using the same patient cohort. Data were analyzed from 345 rectal cancer patients treated with external radiation therapy to predict the risk of patients developing grade 1 or ≥2 cystitis. In total, 23 clinical factors were included in the analysis as candidate predictors of cystitis. Principal component analysis was used to decompose the bladder dose-volume histogram into 8 principal components, explaining more than 95% of the variance. The data set of clinical factors and principal components was divided into training (70%) and test (30%) data sets, with the training data set used by the algorithm to compute an NTCP model. The first step of the algorithm was to obtain a bootstrap sample, followed by multicollinearity reduction using the variance inflation factor and genetic algorithm optimization to determine an ordinal logistic regression model that minimizes the Bayesian information criterion. The process was repeated 100 times, and the model with the minimum Bayesian information criterion was recorded on each iteration. The most frequent model was selected as the final "automatically generated model" (AGM). The published model and AGM were fitted on the training data sets, and the risk of cystitis was calculated. The 2 models had no significant differences in predictive performance, both for the training and test data sets (P value > .05) and found similar clinical and dosimetric factors as predictors. Both models exhibited good explanatory performance on the training data set (P values > .44), which was reduced on the test data sets (P values < .05). The predictive value of the AGM is equivalent to that of the expert-derived published model. It demonstrates potential in saving time, tackling problems with a large number of parameters, and standardizing variable selection in NTCP modeling. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.

  15. Analysis and selection of magnitude relations for the Working Group on Utah Earthquake Probabilities

    USGS Publications Warehouse

    Duross, Christopher; Olig, Susan; Schwartz, David

    2015-01-01

    Prior to calculating time-independent and -dependent earthquake probabilities for faults in the Wasatch Front region, the Working Group on Utah Earthquake Probabilities (WGUEP) updated a seismic-source model for the region (Wong and others, 2014) and evaluated 19 historical regressions on earthquake magnitude (M). These regressions relate M to fault parameters for historical surface-faulting earthquakes, including linear fault length (e.g., surface-rupture length [SRL] or segment length), average displacement, maximum displacement, rupture area, seismic moment (Mo ), and slip rate. These regressions show that significant epistemic uncertainties complicate the determination of characteristic magnitude for fault sources in the Basin and Range Province (BRP). For example, we found that M estimates (as a function of SRL) span about 0.3–0.4 units (figure 1) owing to differences in the fault parameter used; age, quality, and size of historical earthquake databases; and fault type and region considered.
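    As an example of the kind of regression compared by the WGUEP, the commonly used all-fault-type Wells and Coppersmith (1994) form relating magnitude to surface-rupture length is sketched below; it stands in for the 19 regressions evaluated and is not necessarily the relation the working group adopted.

```python
import math

def magnitude_from_srl(srl_km, a=5.08, b=1.16):
    """Moment magnitude from surface-rupture length, M = a + b*log10(SRL).
    Coefficients shown are the Wells & Coppersmith (1994) all-fault-type values,
    used here purely as an illustration of this class of regression."""
    return a + b * math.log10(srl_km)

# A 35 km Wasatch-type segment rupture
print(round(magnitude_from_srl(35.0), 2))
```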

  16. A Pearson VII distribution function for fast calculation of dechanneling and angular dispersion of beams

    NASA Astrophysics Data System (ADS)

    Shao, Lin; Peng, Luohan

    2009-12-01

    Although multiple scattering theories have been well developed, numerical calculation is complicated and only tabulated values have been available, which has caused inconvenience in practical use. We have found that a Pearson VII distribution function can be used to fit Lugujjo and Mayer's probability curves in describing the dechanneling phenomenon in backscattering analysis, over a wide range of disorder levels. Differentiation of the obtained function gives another function to calculate angular dispersion of the beam in the frameworks by Sigmund and Winterbon. The present work provides an easy calculation of both dechanneling probability and angular dispersion for any arbitrary combination of beam and target having a reduced thickness ⩾0.6, which can be implemented in modeling of channeling spectra. Furthermore, we used a Monte Carlo simulation program to calculate the deflection probability and compared them with previously tabulated data. A good agreement was reached.
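    A sketch of a Pearson VII profile and its analytic derivative, the pair of quantities the abstract uses for dechanneling probability and angular dispersion; the parameterisation and values below are one common convention and may differ from the paper's fitted form.

```python
import numpy as np

def pearson_vii(x, amp, x0, w, m):
    """One common Pearson VII parameterisation (the paper's exact form may differ)."""
    u = (x - x0) / w
    return amp * (1.0 + u * u) ** (-m)

def pearson_vii_derivative(x, amp, x0, w, m):
    """Analytic derivative of the profile above."""
    u = (x - x0) / w
    return -2.0 * m * amp * u / w * (1.0 + u * u) ** (-(m + 1.0))

x = np.linspace(-3, 3, 7)
print(pearson_vii(x, amp=1.0, x0=0.0, w=1.0, m=1.5))
print(pearson_vii_derivative(x, amp=1.0, x0=0.0, w=1.0, m=1.5))
```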

  17. Multivariable normal tissue complication probability model-based treatment plan optimization for grade 2-4 dysphagia and tube feeding dependence in head and neck radiotherapy.

    PubMed

    Kierkels, Roel G J; Wopken, Kim; Visser, Ruurd; Korevaar, Erik W; van der Schaaf, Arjen; Bijl, Hendrik P; Langendijk, Johannes A

    2016-12-01

    Radiotherapy of the head and neck is challenged by the relatively large number of organs-at-risk close to the tumor. Biologically-oriented objective functions (OF) could optimally distribute the dose among the organs-at-risk. We aimed to explore OFs based on multivariable normal tissue complication probability (NTCP) models for grade 2-4 dysphagia (DYS) and tube feeding dependence (TFD). One hundred head and neck cancer patients were studied. In addition to the clinical plan, two more plans (an OF_DYS-plan and an OF_TFD-plan) were optimized per patient. The NTCP models included up to four dose-volume parameters and other non-dosimetric factors. A fully automatic plan optimization framework was used to optimize the OF_NTCP-based plans. All OF_NTCP-based plans were reviewed and classified as clinically acceptable. On average, the Δdose and ΔNTCP were small comparing the OF_DYS-plan, OF_TFD-plan, and clinical plan. For 5% of patients NTCP_TFD reduced >5% using OF_TFD-based planning compared to the OF_DYS-plans. Plan optimization using NTCP_DYS- and NTCP_TFD-based objective functions resulted in clinically acceptable plans. For patients with considerable risk factors of TFD, the OF_TFD steered the optimizer to dose distributions which directly led to slightly lower predicted NTCP_TFD values as compared to the other studied plans. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. A cost analysis of stenting in uncomplicated semirigid ureteroscopic stone removal.

    PubMed

    Seklehner, Stephan; Sievert, Karl-Dietrich; Lee, Richard; Engelhardt, Paul F; Riedl, Claus; Kunit, Thomas

    2017-05-01

    To evaluate the outcome and the costs of stenting in uncomplicated semirigid ureteroscopic stone removal. A decision tree model was created to evaluate the economic impact of routine stenting versus non-stenting strategies in uncomplicated ureteroscopy (URS). Probabilities of complications were extracted from twelve randomized controlled trials. Stone removal costs, costs for complication management, and total costs were calculated using Treeage Pro (TreeAge Pro Healthcare version 2015, Software, Inc, Williamstown Massachusetts, USA). Stone removal costs were higher in stented URS (€1512.25 vs. €1681.21, respectively). Complication management costs were higher in non-stented procedures. Both for complications treated conservatively (€189.43 vs. €109.67) and surgically (€49.26 vs. €24.83). When stone removal costs, costs for stent removal, and costs for complication management were considered, uncomplicated URS with stent placement yielded an overall cost per patient of €1889.15 compared to €1750.94 without stent placement. The incremental costs of stented URS were €138.25 per procedure. Semirigid URS with stent placement leads to higher direct procedural costs. Costs for managing URS-related complications are higher in non-stented procedures. Overall, a standard strategy of deferring routine stenting uncomplicated ureteroscopic stone removal is more cost efficient.
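    The reported totals can be reproduced from the component costs in the abstract; the stent-removal cost is not given separately and is inferred here as the residual needed to reach the reported stented total.

```python
# Per-patient component costs (EUR) reported in the abstract
stented     = {"stone_removal": 1681.21, "conservative_compl": 109.67, "surgical_compl": 24.83}
non_stented = {"stone_removal": 1512.25, "conservative_compl": 189.43, "surgical_compl": 49.26}

total_non_stented = sum(non_stented.values())           # 1750.94
stent_removal = 1889.15 - sum(stented.values())         # residual ~ EUR 73.44 (inferred)
total_stented = sum(stented.values()) + stent_removal   # 1889.15 by construction

print(round(total_non_stented, 2), round(total_stented, 2))
# Incremental cost of routine stenting; the abstract reports EUR 138.25,
# the small difference being rounding of the published components.
print(round(total_stented - total_non_stented, 2))
```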

  19. Assessment of normal tissue complications following prostate cancer irradiation: Comparison of radiation treatment modalities using NTCP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takam, Rungdham; Bezak, Eva; Yeoh, Eric E.

    2010-09-15

    Purpose: Normal tissue complication probabilities (NTCP) of the rectum, bladder, urethra, and femoral heads following several techniques for radiation treatment of prostate cancer were evaluated applying the relative seriality and Lyman models. Methods: Model parameters from literature were used in this evaluation. The treatment techniques included external (standard fractionated, hypofractionated, and dose-escalated) three-dimensional conformal radiotherapy (3D-CRT), low-dose-rate (LDR) brachytherapy (I-125 seeds), and high-dose-rate (HDR) brachytherapy (Ir-192 source). Dose-volume histograms (DVHs) of the rectum, bladder, and urethra retrieved from corresponding treatment planning systems were converted to biological effective dose-based and equivalent dose-based DVHs, respectively, in order to account for differences in radiation treatment modality and fractionation schedule. Results: With hypofractionated 3D-CRT (20 fractions of 2.75 Gy/fraction delivered five times/week to total dose of 55 Gy), NTCP of the rectum, bladder, and urethra were less than those for standard fractionated 3D-CRT using a four-field technique (32 fractions of 2 Gy/fraction delivered five times/week to total dose of 64 Gy) and dose-escalated 3D-CRT. Rectal and bladder NTCPs (5.2% and 6.6%, respectively) following the dose-escalated four-field 3D-CRT (2 Gy/fraction to total dose of 74 Gy) were the highest among analyzed treatment techniques. The average NTCP for the rectum and urethra were 0.6% and 24.7% for LDR-BT and 0.5% and 11.2% for HDR-BT. Conclusions: Although brachytherapy techniques resulted in delivering larger equivalent doses to normal tissues, the corresponding NTCPs were lower than those of external beam techniques (except for the urethra) because of much smaller volumes irradiated to higher doses. Among analyzed normal tissues, the femoral heads were found to have the lowest probability of complications as most of their volume was irradiated to lower equivalent doses compared to other tissues.
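    A compact sketch of the two ingredients named above: conversion of DVH bins to equivalent dose in 2 Gy fractions and a relative seriality (Källman-type) NTCP evaluation. The DVH and the D50/γ/s parameters are placeholders, not the literature values used in the study.

```python
import numpy as np

def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equivalent dose in 2 Gy fractions (linear-quadratic conversion)."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

def poisson_response(dose, d50, gamma):
    """Poisson dose-response used inside the relative seriality model."""
    return 2.0 ** (-np.exp(np.e * gamma * (1.0 - dose / d50)))

def relative_seriality_ntcp(bin_doses, rel_volumes, d50, gamma, s):
    """Källman relative seriality NTCP for a differential DVH given as bin doses
    and relative volumes (all parameters here are placeholders)."""
    p_bin = poisson_response(np.asarray(bin_doses), d50, gamma)
    prod = np.prod((1.0 - p_bin ** s) ** np.asarray(rel_volumes))
    return (1.0 - prod) ** (1.0 / s)

# Hypothetical rectal differential DVH delivered in 32 fractions, bin doses in Gy
physical_doses = [20.0, 40.0, 60.0, 70.0]
rel_volumes = [0.4, 0.3, 0.2, 0.1]
doses_eqd2 = [eqd2(D, D / 32.0, alpha_beta=3.0) for D in physical_doses]
print(round(relative_seriality_ntcp(doses_eqd2, rel_volumes, d50=80.0, gamma=1.8, s=0.75), 3))
```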

  20. Ventricular Catheter Systems with Subcutaneous Reservoirs (Ommaya Reservoirs) in Pediatric Patients with Brain Tumors: Infections and Other Complications.

    PubMed

    Gerber, Nicolas U; Müller, Anna; Bellut, David; Bozinov, Oliver; Berger, Christoph; Grotzer, Michael A

    2015-12-01

    This study aims to describe complications related to ventricular catheter systems with subcutaneous reservoirs (VCSR) (such as Ommaya reservoirs) in pediatric patients with brain tumors. Retrospective analysis of consecutive patients with a total of 31 VCSR treated at the Children's University Hospital of Zurich, Switzerland. A total of 20 patients with a median age of 3.3 years at VCSR implantation received 31 VCSR. Overall, 19 complications in 11 patients were recorded: 7 patients had a VCSR-related infection with coagulase-negative staphylococci, 4 of these probably as a surgical complication and 3 probably related to VCSR use. Systemic perioperative prophylaxis was administered in 22 cases, and intraventricular vancomycin and gentamicin were given in 8 cases (none of which subsequently developed an infection). Other complications included wound dehiscence, catheter malplacement, and leakage of cerebrospinal fluid. Overall, 17 VCSR were explanted due to complications. Infections were the most frequent VCSR-related complication. In our own institution, the high rate of complications led to the definition of a bundle of measures as a standard operating procedure for VCSR placement and use. Prospective studies in larger patient collectives are warranted to better identify risk factors and evaluate preventive measures such as the administration of perioperative antibiotics and the use of antimicrobial coating of catheters. Georg Thieme Verlag KG Stuttgart · New York.

  1. Unimolecular diffusion-mediated reactions with a nonrandom time-modulated absorbing barrier

    NASA Technical Reports Server (NTRS)

    Bashford, D.; Weaver, D. L.

    1986-01-01

    A diffusion-reaction model with time-dependent reactivity is formulated and applied to unimolecular reactions. The model is solved exactly numerically and approximately analytically for the unreacted fraction as a function of time. It is shown that the approximate analytical solution is valid even when the system is far from equilibrium, and when the reactivity probability is more complicated than a square-wave function of time. A discussion is also given of an approach to problems of this type using a stochastically fluctuating reactivity, and the first-passage time for a particular example is derived.

  2. Endoscopic third ventriculostomy in the treatment of childhood hydrocephalus.

    PubMed

    Kulkarni, Abhaya V; Drake, James M; Mallucci, Conor L; Sgouros, Spyros; Roth, Jonathan; Constantini, Shlomi

    2009-08-01

    To develop a model to predict the probability of endoscopic third ventriculostomy (ETV) success in the treatment for hydrocephalus on the basis of a child's individual characteristics. We analyzed 618 ETVs performed consecutively on children at 12 international institutions to identify predictors of ETV success at 6 months. A multivariable logistic regression model was developed on 70% of the dataset (training set) and validated on 30% of the dataset (validation set). In the training set, 305/455 ETVs (67.0%) were successful. The regression model (containing patient age, cause of hydrocephalus, and previous cerebrospinal fluid shunt) demonstrated good fit (Hosmer-Lemeshow, P = .78) and discrimination (C statistic = 0.70). In the validation set, 105/163 ETVs (64.4%) were successful and the model maintained good fit (Hosmer-Lemeshow, P = .45), discrimination (C statistic = 0.68), and calibration (calibration slope = 0.88). A simplified ETV Success Score was devised that closely approximates the predicted probability of ETV success. Children most likely to succeed with ETV can now be accurately identified and spared the long-term complications of CSF shunting.

  3. Cost-effectiveness of cervical spine clearance interventions with litigation and long-term-care implications in obtunded adult patients following blunt injury.

    PubMed

    Ertel, Audrey E; Robinson, Bryce R H; Eckman, Mark H

    2016-11-01

    Recent guidelines from the Eastern Association for the Surgery of Trauma conditionally recommend cervical collar removal after a negative cervical computed tomography in obtunded adult blunt trauma patients. Although the rates of missed injury are extremely low, the impact of chronic care costs and litigation upon decision making remains unclear. We hypothesize that the cost-effectiveness of strategies that include additional imaging may contradict current guidelines. A cost-effectiveness analysis was performed for a base-case 40-year-old, obtunded man with a negative computed tomography. Strategies compared included adjunct imaging with cervical magnetic resonance imaging (MRI), collar maintenance for 6 weeks, or removal. Data on the probability for long-term collar complications, spine injury, imaging costs, complications associated with MRI, acute and chronic care, and litigation were obtained from published and Medicare data. Outcomes were expressed as 2014 US dollars and quality-adjusted life-years. Collar removal was more effective and less costly than collar use or MRI (19.99 vs. 19.35 vs. 18.70 quality-adjusted life-years; $675,359 vs. $685,546 vs. $685,848) in the base-case analysis. When the probability of missed cervical injury was greater than 0.04 adjunct imaging with MRI dominated, however, collar removal remained cost-effective until the probability of missed injury exceeded 0.113 at which point collar removal exceeded the $50,000 threshold. Collar removal remained the most cost-effective approach until the probability of complications from collar use was reduced to less than 0.009, at which point collar maintenance became the most cost-effective strategy. Early collar removal dominates all strategies until the risk of complications from MRI positioning is reduced to 0.03 and remained cost-effective even when the probability of complication was reduced to 0. Early collar removal in obtunded adult blunt trauma patients appears to be the most effective and least costly strategy for cervical clearance based on the current literature available. Economic evaluation, level III; therapeutic study, level IV.

  4. Evaluation of dose response models and parameters predicting radiation induced pneumonitis using clinical data from breast cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Tsougos, Ioannis; Mavroidis, Panayiotis; Rajala, Juha; Theodorou, Kyriaki; Järvenpää, Ritva; Pitkänen, Maunu A.; Holli, Kaija; Ojala, Antti T.; Lind, Bengt K.; Hyödynmaa, Simo; Kappas, Constantin

    2005-08-01

    The purpose of this work is to evaluate the predictive strength of the relative seriality, parallel and LKB normal tissue complication probability (NTCP) models regarding the incidence of radiation pneumonitis, in a large group of patients following breast cancer radiotherapy, and furthermore, to illustrate statistical methods for examining whether certain published radiobiological parameters are compatible with a clinical treatment methodology and patient group characteristics. The study is based on 150 consecutive patients who received radiation therapy for breast cancer. For each patient, the 3D dose distribution delivered to lung and the clinical treatment outcome were available. Clinical symptoms and radiological findings, along with a patient questionnaire, were used to assess the manifestation of radiation-induced complications. Using this material, different methods of estimating the likelihood of radiation effects were evaluated. This was attempted by analysing patient data based on their full dose distributions and associating the calculated complication rates with the clinical follow-up records. Additionally, the need for an update of the criteria that are being used in the current clinical practice was also examined. The patient material was selected without any conscious bias regarding the radiotherapy treatment technique used. The treatment data of each patient were applied to the relative seriality, LKB and parallel NTCP models, using published parameter sets. Of the 150 patients, 15 experienced radiation-induced pneumonitis (grade 2) according to the radiation pneumonitis scoring criteria used. Of the NTCP models examined, the relative seriality model was able to predict the incidence of radiation pneumonitis with acceptable accuracy, although radiation pneumonitis was developed by only a few patients. In the case of modern breast radiotherapy, radiobiological modelling appears to be very sensitive to model and parameter selection giving clinically acceptable results in certain cases selectively (relative seriality model with Seppenwoolde et al (2003 Int. J. Radiat. Oncol. Biol. Phys. 55 724-35) and Gagliardi et al (2000 Int. J. Radiat. Oncol. Biol. Phys. 46 373-81) parameter sets). The use of published parameters should be considered as safe only after their examination using local clinical data. The variation of inter-patient radiosensitivity seems to play a significant role in the prediction of such low incidence rate complications. Scoring grades were combined to give stronger evidence of radiation pneumonitis since their differences could not be strictly associated with dose. This obviously reveals a weakness of the scoring related to this endpoint, and implies that the probability of radiation pneumonitis induction may be too low to be statistically analysed with high accuracy, at least with the latest advances of dose delivery in breast radiotherapy.

  5. Hospital variation and the impact of postoperative complications on the use of perioperative chemo(radio)therapy in resectable gastric cancer. Results from the Dutch Upper GI Cancer Audit.

    PubMed

    Schouwenburg, M G; Busweiler, L A D; Beck, N; Henneman, D; Amodio, S; van Berge Henegouwen, M I; Cats, A; van Hillegersberg, R; van Sandick, J W; Wijnhoven, B P L; Wouters, M W J; Nieuwenhuijzen, G A P

    2018-04-01

    Dutch national guidelines on the diagnosis and treatment of gastric cancer recommend the use of perioperative chemotherapy in patients with resectable gastric cancer. However, adjuvant chemotherapy is often not administered. The aim of this study was to evaluate hospital variation on the probability to receive adjuvant chemotherapy and to identify associated factors with special attention to postoperative complications. All patients who received neoadjuvant chemotherapy and underwent an elective surgical resection for stage IB-IVa (M0) gastric adenocarcinoma between 2011 and 2015 were identified from a national database (Dutch Upper GI Cancer Audit). A multivariable linear mixed model was used to evaluate case-mix adjusted hospital variation and to identify factors associated with adjuvant therapy. Of all surgically treated gastric cancer patients who received neoadjuvant chemotherapy (n = 882), 68% received adjuvant chemo(radio)therapy. After adjusting for case-mix and random variation, a large hospital variation in the administration rates for adjuvant was observed (OR range 0.31-7.1). In multivariable analysis, weight loss, a poor health status and failure of neoadjuvant chemotherapy completion were strongly associated with an increased likelihood of adjuvant therapy omission. Patients with severe postoperative complications had a threefold increased likelihood of adjuvant therapy omission (OR 3.07 95% CI 2.04-4.65). Despite national guidelines, considerable hospital variation was observed in the probability of receiving adjuvant chemo(radio)therapy. Postoperative complications were strongly associated with adjuvant chemo(radio)therapy omission, underlining the need to further reduce perioperative morbidity in gastric cancer surgery. Copyright © 2018 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.

  6. The relationship between diabetes, diabetes-related complications and productive activities among older Europeans.

    PubMed

    Rodriguez-Sanchez, B; Alessie, R J M; Feenstra, T L; Angelini, V

    2018-06-01

    To assess the impact of diabetes and diabetes-related complications on two measures of productivity for people in the labour force and out of it, namely "being afraid health limits ability to work before retirement" and "volunteering". Logistic regressions were run to test the impact of diabetes and its complications on the probability of being afraid health limits work and being a formal volunteer. The longitudinal sample for the former outcome includes 53,631 observations, clustered in 34,393 individuals, aged 50-65 years old whereas the latter consists of 45,384 observations, grouped in 29,104 individuals aged 65 and above across twelve European countries taken from the Survey of Health, Ageing and Retirement in Europe, from 2006 to 2013. Diabetes increased the probability of being afraid health limited work by nearly 11% points, adjusted by clinical complications, and reduced the likelihood of being a formal volunteer by 2.7% points, additionally adjusted by mobility problems. We also found that both the probability of being afraid health limits work and the probability of being a formal volunteer increased during and after the crisis. Moreover, having diabetes had a larger effect on being afraid health limits work during the year 2010, possibly related to the financial crisis. Our findings show that diabetes significantly affects the perception of people regarding the effects of their condition on work, increasing the fear that health limits their ability to work, especially during the crisis year 2010, as well as the participation in volunteering work among retired people.

  7. Evaluation of normal lung tissue complication probability in gated and conventional radiotherapy using the 4D XCAT digital phantom.

    PubMed

    Shahzadeh, Sara; Gholami, Somayeh; Aghamiri, Seyed Mahmood Reza; Mahani, Hojjat; Nabavi, Mansoure; Kalantari, Faraz

    2018-06-01

    The present study was conducted to investigate normal lung tissue complication probability in gated and conventional radiotherapy (RT) as a function of diaphragm motion, lesion size, and its location using 4D-XCAT digital phantom in a simulation study. Different time series of 3D-CT images were generated using the 4D-XCAT digital phantom. The binary data obtained from this phantom were then converted to the digital imaging and communication in medicine (DICOM) format using an in-house MATLAB-based program to be compatible with our treatment planning system (TPS). The 3D-TPS with superposition computational algorithm was used to generate conventional and gated plans. Treatment plans were generated for 36 different XCAT phantom configurations. These included four diaphragm motions of 20, 25, 30 and 35 mm, three lesion sizes of 3, 4, and 5 cm in diameter and each tumor was placed in four different lung locations (right lower lobe, right upper lobe, left lower lobe and left upper lobe). The complication of normal lung tissue was assessed in terms of mean lung dose (MLD), the lung volume receiving ≥20 Gy (V20), and normal tissue complication probability (NTCP). The results showed that the gated RT yields superior outcomes in terms of normal tissue complication compared to the conventional RT. For all cases, the gated radiation therapy technique reduced the mean dose, V20, and NTCP of lung tissue by up to 5.53 Gy, 13.38%, and 23.89%, respectively. The results of this study showed that the gated RT provides significant advantages in terms of the normal lung tissue complication, compared to the conventional RT, especially for the lesions near the diaphragm. Copyright © 2018 Elsevier Ltd. All rights reserved.
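    A minimal example of the three reported lung metrics computed from a differential DVH: mean lung dose, V20, and an LKB-type NTCP via the generalised EUD. The DVH is invented and the LKB parameters shown are commonly quoted pneumonitis values, used here only for illustration.

```python
import numpy as np
from scipy.stats import norm

def dvh_metrics(bin_doses, rel_volumes):
    """Mean dose and V20 (%) from a differential DVH whose relative volumes sum to 1."""
    bin_doses, rel_volumes = np.asarray(bin_doses), np.asarray(rel_volumes)
    mld = np.sum(bin_doses * rel_volumes)
    v20 = 100.0 * np.sum(rel_volumes[bin_doses >= 20.0])
    return mld, v20

def lkb_ntcp(bin_doses, rel_volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP via the generalised EUD (a = 1/n)."""
    a = 1.0 / n
    eud = np.sum(np.asarray(rel_volumes) * np.asarray(bin_doses) ** a) ** (1.0 / a)
    return norm.cdf((eud - td50) / (m * td50))

# Hypothetical lung differential DVH; parameters are illustrative, not the study's
doses = [2.5, 7.5, 15.0, 25.0, 40.0]
vols = [0.45, 0.25, 0.15, 0.10, 0.05]
mld, v20 = dvh_metrics(doses, vols)
print(round(mld, 1), round(v20, 1), round(lkb_ntcp(doses, vols, td50=30.8, m=0.37, n=0.99), 3))
```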

  8. Chaotic attractors and physical measures for some density dependent Leslie population models

    NASA Astrophysics Data System (ADS)

    Ugarcovici, Ilie; Weiss, Howard

    2007-12-01

    Following ecologists' discoveries, mathematicians have begun studying extensions of the ubiquitous age structured Leslie population model that allow some survival probabilities and/or fertility rates to depend on population densities. These nonlinear extensions commonly exhibit very complicated dynamics: through computer studies, some authors have discovered robust Hénon-like strange attractors in several families. Population biologists and demographers frequently wish to average a function over many generations and conclude that the average is independent of the initial population distribution. This type of 'ergodicity' seems to be a fundamental tenet in population biology. In this paper we develop the first rigorous ergodic theoretic framework for density dependent Leslie population models. We study two generation models with Ricker and Hassell (recruitment type) fertility terms. We prove that for some parameter regions these models admit a chaotic (ergodic) attractor which supports a unique physical probability measure. This physical measure, having full Lebesgue measure basin, satisfies in the strongest possible sense the population biologist's requirement for ergodicity in their population models. We use the celebrated work of Wang and Young 2001 Commun. Math. Phys. 218 1-97, and our results are the first applications of their method to biology, ecology or demography.
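    One common form of a two-age-class Leslie map with Ricker-type recruitment, iterated from two different initial conditions to illustrate the ergodic-average property discussed above; the exact map and parameters in the paper may differ.

```python
import numpy as np

def ricker_leslie(state, b1=20.0, b2=20.0, c=1.0, s=0.7):
    """One step of a two-age-class Leslie map with Ricker-type recruitment
    (a common form; the paper's exact map and parameters may differ)."""
    x, y = state
    return np.array([(b1 * x + b2 * y) * np.exp(-c * (x + y)), s * x])

def time_average(initial, n=50_000, burn_in=1_000):
    """Long-run time average of the total population along one orbit."""
    state = np.array(initial, dtype=float)
    total = 0.0
    for i in range(n):
        state = ricker_leslie(state)
        if i >= burn_in:
            total += state.sum()
    return total / (n - burn_in)

# If the attractor carries a physical (SRB) measure, averages started from
# different initial distributions should agree up to finite-sample noise.
print(time_average([0.5, 0.1]), time_average([2.0, 3.0]))
```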

  9. A prospective cohort study on radiation-induced hypothyroidism: development of an NTCP model.

    PubMed

    Boomsma, Marjolein J; Bijl, Hendrik P; Christianen, Miranda E M C; Beetz, Ivo; Chouvalova, Olga; Steenbakkers, Roel J H M; van der Laan, Bernard F A M; Wolffenbuttel, Bruce H R; Oosting, Sjoukje F; Schilstra, Cornelis; Langendijk, Johannes A

    2012-11-01

    To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism. The thyroid-stimulating hormone (TSH) level of 105 patients treated with (chemo-) radiation therapy for head-and-neck cancer was prospectively measured during a median follow-up of 2.5 years. Hypothyroidism was defined as elevated serum TSH with decreased or normal free thyroxin (T4). A multivariate logistic regression model with bootstrapping was used to determine the most important prognostic variables for radiation-induced hypothyroidism. Thirty-five patients (33%) developed primary hypothyroidism within 2 years after radiation therapy. An NTCP model based on 2 variables, including the mean thyroid gland dose and the thyroid gland volume, was most predictive for radiation-induced hypothyroidism. NTCP values increased with higher mean thyroid gland dose (odds ratio [OR]: 1.064/Gy) and decreased with higher thyroid gland volume (OR: 0.826/cm(3)). Model performance was good with an area under the curve (AUC) of 0.85. This is the first prospective study resulting in an NTCP model for radiation-induced hypothyroidism. The probability of hypothyroidism rises with increasing dose to the thyroid gland, whereas it reduces with increasing thyroid gland volume. Copyright © 2012 Elsevier Inc. All rights reserved.
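    Using only the per-unit odds ratios quoted in the abstract (1.064 per Gy mean thyroid dose, 0.826 per cm3 thyroid volume), relative risks between patients can be compared; the model intercept is not reported in the abstract and is left below as an explicit placeholder.

```python
import math

B_DOSE = math.log(1.064)   # ~0.062 per Gy mean thyroid dose (from the reported OR)
B_VOL = math.log(0.826)    # ~-0.191 per cm3 thyroid gland volume

def hypothyroidism_ntcp(mean_dose_gy, volume_cc, intercept):
    """Logistic NTCP built from the abstract's odds ratios; the intercept is NOT
    reported there and must be supplied (placeholder argument)."""
    z = intercept + B_DOSE * mean_dose_gy + B_VOL * volume_cc
    return 1.0 / (1.0 + math.exp(-z))

def odds_ratio(dose_a, vol_a, dose_b, vol_b):
    """Odds ratio between two patients; this quantity is intercept-free."""
    return math.exp(B_DOSE * (dose_a - dose_b) + B_VOL * (vol_a - vol_b))

# A 45 Gy mean dose / 8 cc gland versus a 30 Gy / 12 cc gland
print(round(odds_ratio(45, 8, 30, 12), 2))
# With an assumed (hypothetical) intercept, an absolute NTCP could be read off:
print(round(hypothyroidism_ntcp(45, 8, intercept=-1.0), 2))
```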

  10. Validated Risk Score for Predicting 6-Month Mortality in Infective Endocarditis.

    PubMed

    Park, Lawrence P; Chu, Vivian H; Peterson, Gail; Skoutelis, Athanasios; Lejko-Zupa, Tatjana; Bouza, Emilio; Tattevin, Pierre; Habib, Gilbert; Tan, Ren; Gonzalez, Javier; Altclas, Javier; Edathodu, Jameela; Fortes, Claudio Querido; Siciliano, Rinaldo Focaccia; Pachirat, Orathai; Kanj, Souha; Wang, Andrew

    2016-04-18

    Host factors and complications have been associated with higher mortality in infective endocarditis (IE). We sought to develop and validate a model of clinical characteristics to predict 6-month mortality in IE. Using a large multinational prospective registry of definite IE (International Collaboration on Endocarditis [ICE]-Prospective Cohort Study [PCS], 2000-2006, n=4049), a model to predict 6-month survival was developed by Cox proportional hazards modeling with inverse probability weighting for surgery treatment and was internally validated by the bootstrapping method. This model was externally validated in an independent prospective registry (ICE-PLUS, 2008-2012, n=1197). The 6-month mortality was 971 of 4049 (24.0%) in the ICE-PCS cohort and 342 of 1197 (28.6%) in the ICE-PLUS cohort. Surgery during the index hospitalization was performed in 48.1% and 54.0% of the cohorts, respectively. In the derivation model, variables related to host factors (age, dialysis), IE characteristics (prosthetic or nosocomial IE, causative organism, left-sided valve vegetation), and IE complications (severe heart failure, stroke, paravalvular complication, and persistent bacteremia) were independently associated with 6-month mortality, and surgery was associated with a lower risk of mortality (Harrell's C statistic 0.715). In the validation model, these variables had similar hazard ratios (Harrell's C statistic 0.682), with a similar, independent benefit of surgery (hazard ratio 0.74, 95% CI 0.62-0.89). A simplified risk model was developed by weight adjustment of these variables. Six-month mortality after IE is ≈25% and is predicted by host factors, IE characteristics, and IE complications. Surgery during the index hospitalization is associated with lower mortality but is performed less frequently in the highest risk patients. A simplified risk model may be used to identify specific risk subgroups in IE. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  11. Performance of people with diabetes in the labor market: An empirical approach controlling for complications.

    PubMed

    Rodríguez-Sánchez, Beatriz; Cantarero-Prieto, David

    2017-11-01

    This paper introduces a framework for modelling the impact that diabetes has on employment status and wages, improving the existing literature by controlling for diabetes-related complications. Using the last wave of the Spanish National Health Survey, we find that 1710 adults out of the original sample of 36,087 have diabetes, reporting higher rates of unemployment. Our empirical results suggest that persons with diabetes, compared with non-diabetic persons, have poorer labor outcomes in terms of length of unemployment and lower income. However, diabetes is not significantly associated with unemployment probabilities, suggesting that the burden of diabetes on employment is mediated by lifestyle factors and clinical and functional complications. In addition, there are mixed outcomes to this econometric approach, depending on age and gender, among other factors. This interesting finding has several implications for research and policy on strategies to get lower health inequalities. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Model-based segmentation of abdominal aortic aneurysms in CTA images

    NASA Astrophysics Data System (ADS)

    de Bruijne, Marleen; van Ginneken, Bram; Niessen, Wiro J.; Loog, Marco; Viergever, Max A.

    2003-05-01

    Segmentation of thrombus in abdominal aortic aneurysms is complicated by regions of low boundary contrast and by the presence of many neighboring structures in close proximity to the aneurysm wall. We present an automated method that is similar to the well-known Active Shape Models (ASM), combining a three-dimensional shape model with a one-dimensional boundary appearance model. Our contribution is twofold: we developed a non-parametric appearance modeling scheme that effectively deals with a highly varying background, and we propose a way of generalizing models of curvilinear structures from small training sets. In contrast with the conventional ASM approach, the new appearance model trains on both true and false examples of boundary profiles. The probability that a given image profile belongs to the boundary is obtained using k nearest neighbor (kNN) probability density estimation. The performance of this scheme is compared to that of original ASMs, which minimize the Mahalanobis distance to the average true profile in the training set. The generalizability of the shape model is improved by modeling the object's axis deformation independently of its cross-sectional deformation. A leave-one-out experiment was performed on 23 datasets. Segmentation using the kNN appearance model significantly outperformed the original ASM scheme; average volume errors were 5.9% and 46% respectively.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, L; Soldner, A; Kirk, M

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, “Curr”), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. Tumor control probability (TCP) and normal tissue complication probabilities (NTCPs) for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman [3] formalism and published model parameters (Terahara et al. [4], the QUANTEC report, and Burman et al., Int J Radiat Oncol Biol Phys 21:123). Our plan optimization strategy was to achieve PTV coverage close to prescription while maintaining OAR NTCP values at or better than the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55% and 65%. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.

  14. Protons in head-and-neck cancer: bridging the gap of evidence.

    PubMed

    Ramaekers, Bram L T; Grutters, Janneke P C; Pijls-Johannesma, Madelon; Lambin, Philippe; Joore, Manuela A; Langendijk, Johannes A

    2013-04-01

    To use Normal Tissue Complication Probability (NTCP) models and comparative planning studies to explore the (cost-)effectiveness of swallowing sparing intensity modulated proton radiotherapy (IMPT) compared with swallowing sparing intensity modulated radiotherapy with photons (IMRT) in head and neck cancer (HNC). A Markov model was constructed to examine and compare the costs and quality-adjusted life years (QALYs) of the following strategies: (1) IMPT for all patients; (2) IMRT for all patients; and (3) IMPT if efficient. The assumption of equal survival for IMPT and IMRT in the base case analysis was relaxed in a sensitivity analysis. Intensity modulated proton radiation therapy and IMRT for all patients yielded 6.620 and 6.520 QALYs and cost €50,989 and €41,038, respectively. Intensity modulated proton radiation therapy if efficient yielded 6.563 QALYs and cost €43,650. The incremental cost-effectiveness ratio of IMPT if efficient versus IMRT for all patients was €60,278 per QALY gained. In the sensitivity analysis, IMRT was more effective (0.967 QALYs) and less expensive (€8218) and thus dominated IMPT for all patients. Cost-effectiveness analysis based on normal tissue complication probability models and planning studies proved feasible and informative and enables the analysis of individualized strategies. The increased effectiveness of IMPT does not seem to outweigh the higher costs for all head-and-neck cancer patients. However, when assuming equal survival among both modalities, there seems to be value in identifying those patients for whom IMPT is cost-effective. Copyright © 2013 Elsevier Inc. All rights reserved.
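    The incremental cost-effectiveness ratios follow directly from the costs and QALYs quoted above (small differences from the reported €60,278/QALY reflect rounding of the published figures):

```python
# Costs (EUR) and QALYs per strategy, as reported in the abstract
strategies = {
    "IMRT for all":      {"cost": 41038.0, "qaly": 6.520},
    "IMPT if efficient": {"cost": 43650.0, "qaly": 6.563},
    "IMPT for all":      {"cost": 50989.0, "qaly": 6.620},
}

def icer(a, b):
    """Incremental cost per QALY gained of strategy a over strategy b."""
    d_cost = strategies[a]["cost"] - strategies[b]["cost"]
    d_qaly = strategies[a]["qaly"] - strategies[b]["qaly"]
    return d_cost / d_qaly

print(round(icer("IMPT if efficient", "IMRT for all")))  # ~61,000 EUR/QALY (abstract: 60,278)
print(round(icer("IMPT for all", "IMPT if efficient")))  # next step up the cost-effectiveness frontier
```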

  15. Questioning the Relevance of Model-Based Probability Statements on Extreme Weather and Future Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2007-12-01

    We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect, nevertheless the space and time scales on which they provide decision- support relevant information is expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics based state-of-the-art models are expected to pass if their output is to be judged decision support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False" making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science based decision support, as our models improve, require a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?

  16. Can the ACS-NSQIP surgical risk calculator predict post-operative complications in patients undergoing flap reconstruction following soft tissue sarcoma resection?

    PubMed

    Slump, Jelena; Ferguson, Peter C; Wunder, Jay S; Griffin, Anthony; Hoekstra, Harald J; Bagher, Shaghayegh; Zhong, Toni; Hofer, Stefan O P; O'Neill, Anne C

    2016-10-01

    The ACS-NSQIP surgical risk calculator is an open-access online tool that estimates the risk of adverse post-operative outcomes for a wide range of surgical procedures. Wide surgical resection of soft tissue sarcoma (STS) often requires complex reconstructive procedures that can be associated with relatively high rates of complications. This study evaluates the ability of this calculator to identify patients with STS at risk for post-operative complications following flap reconstruction. Clinical details of 265 patients who underwent flap reconstruction following STS resection were entered into the online calculator. The predicted rates of complications were compared to the observed rates. The calculator model was validated using measures of prediction and discrimination. The mean predicted rate of any complication was 15.35 ± 5.6%, which differed significantly from the observed rate of 32.5% (P = 0.009). The c-statistic was relatively low at 0.626, indicating poor discrimination between patients who are at risk of complications and those who are not. The Brier score of 0.242 was significantly different from 0 (P < 0.001), indicating poor correlation between the predicted and actual probability of complications. The ACS-NSQIP universal risk calculator did not maintain its predictive value in patients undergoing flap reconstruction following STS resection. J. Surg. Oncol. 2016;114:570-575. © 2016 Wiley Periodicals, Inc.
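
    The two validation measures quoted above (c-statistic and Brier score) are straightforward to reproduce for any vector of predicted risks and observed outcomes. The sketch below uses small, made-up arrays purely to show the calculations; it is not the study's data.

      import numpy as np

      def c_statistic(p, y):
          """Probability that a randomly chosen event gets a higher predicted risk
          than a randomly chosen non-event (ties count one half)."""
          p, y = np.asarray(p, float), np.asarray(y, int)
          pos, neg = p[y == 1], p[y == 0]
          greater = (pos[:, None] > neg[None, :]).sum()
          ties = (pos[:, None] == neg[None, :]).sum()
          return (greater + 0.5 * ties) / (len(pos) * len(neg))

      def brier_score(p, y):
          """Mean squared difference between predicted risk and observed outcome."""
          p, y = np.asarray(p, float), np.asarray(y, int)
          return np.mean((p - y) ** 2)

      # Illustrative data: predicted complication risks and observed outcomes (1 = complication).
      pred = np.array([0.10, 0.15, 0.20, 0.12, 0.30, 0.25, 0.18, 0.40])
      obs  = np.array([0,    0,    1,    0,    1,    0,    1,    1   ])
      print("c-statistic:", round(c_statistic(pred, obs), 3))
      print("Brier score:", round(brier_score(pred, obs), 3))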

  17. Distributed Immune Systems for Wireless Network Information Assurance

    DTIC Science & Technology

    2010-04-26

    ... ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the ... using cumulative sum (CUSUM) and Girshick-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ... the more complicated problems, in particular those where no clear mean can be established. We developed algorithms based on the sequential probability ...
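
    The fragments above name standard sequential change-detection tools (SPRT, CUSUM, GRSh statistics). As a reminder of how the simplest of these works, the sketch below implements a one-sided CUSUM test for an upward shift in the mean of a data stream; the shift size and alarm threshold are illustrative choices, not values from the report.

      import numpy as np

      def cusum_alarm(x, mu0=0.0, shift=1.0, threshold=5.0):
          """One-sided CUSUM for an upward mean shift of size `shift` (same units
          as the data, assumed to have roughly unit variance): accumulate
          max(0, S + x - mu0 - shift/2) and alarm when S crosses the threshold.
          Returns the alarm index, or None if no alarm is raised."""
          s = 0.0
          k = shift / 2.0                 # reference value for a shift of this size
          for i, xi in enumerate(x):
              s = max(0.0, s + xi - mu0 - k)
              if s > threshold:
                  return i
          return None

      rng = np.random.default_rng(0)
      stream = np.concatenate([rng.normal(0.0, 1.0, 100),    # in-control data
                               rng.normal(1.0, 1.0, 50)])    # mean shifts upward
      print("alarm raised at sample:", cusum_alarm(stream))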

  18. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be; Van den Bergh, Laura; Al-Mamgani, Abrahim

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefited significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), whereas it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D₅₀. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two end points. Conclusions: Comparable prediction models were obtained with LKB, RS, and logistic NTCP models. Including clinical factors improved the predictive power of all models significantly.
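
    For reference, the dose-only part of the LKB model fitted here has a compact closed form: the DVH is reduced to a generalized EUD with volume parameter n, which is mapped to a probability through a probit function with parameters TD50 and m (the clinical factors enter as additional covariates or dose-modifying factors and are not shown). The sketch below is a generic implementation with illustrative parameter values, not the fitted values of this study.

      import numpy as np
      from math import erf, sqrt

      def geud(dose_bins, vol_fractions, n):
          """Generalized EUD of a differential DVH (volume fractions sum to 1)."""
          a = 1.0 / n
          return (np.sum(vol_fractions * dose_bins ** a)) ** (1.0 / a)

      def lkb_ntcp(dose_bins, vol_fractions, td50, m, n):
          """Lyman-Kutcher-Burman NTCP: probit of t = (gEUD - TD50) / (m * TD50)."""
          eud = geud(np.asarray(dose_bins, float), np.asarray(vol_fractions, float), n)
          t = (eud - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))

      # Illustrative differential DVH (Gy, fraction of organ volume) and parameters.
      dose = [10.0, 30.0, 50.0, 65.0, 75.0]
      vol  = [0.30, 0.25, 0.20, 0.15, 0.10]
      print("NTCP =", round(lkb_ntcp(dose, vol, td50=80.0, m=0.15, n=0.09), 4))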

  19. Influence of different dose calculation algorithms on the estimate of NTCP for lung complications.

    PubMed

    Hedin, Emma; Bäck, Anna

    2013-09-06

    Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose-volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient-specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm-specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction-based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman-Kutcher-Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm-specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types.
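
    The relative seriality model named above combines per-DVH-bin Poisson response probabilities through a seriality parameter s. A minimal sketch, with illustrative parameters rather than any of the published algorithm-specific sets discussed in the abstract, is given below.

      import numpy as np

      def poisson_response(d, d50, gamma):
          """Poisson-based dose response of a small volume receiving uniform dose d."""
          return 2.0 ** (-np.exp(np.e * gamma * (1.0 - d / d50)))

      def relative_seriality_ntcp(dose_bins, vol_fractions, d50, gamma, s):
          """Relative seriality NTCP from a differential DVH."""
          p = poisson_response(np.asarray(dose_bins, float), d50, gamma)
          v = np.asarray(vol_fractions, float)          # fractions of organ volume
          prod = np.prod((1.0 - p ** s) ** v)
          return (1.0 - prod) ** (1.0 / s)

      # Illustrative lung DVH and parameters (not the published sets discussed above).
      dose = [5.0, 15.0, 25.0, 40.0]
      vol  = [0.40, 0.30, 0.20, 0.10]
      print("RS NTCP =", round(relative_seriality_ntcp(dose, vol, d50=30.0, gamma=1.0, s=0.02), 4))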

  20. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    PubMed

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. A discrete-time Markov chain based simulator is implemented in Matlab. The simulator, capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated, and recovered) model, can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed to be used for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show the capability of reproducing different epidemic model behaviors successfully in a computationally efficient manner.
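
    MOSES itself is Matlab code; the sketch below is only a loose Python analogue of the reduction idea mentioned in the abstract: a discrete-time stochastic simulation of a plain SIR model, which is what remains of SEQIJR when the exposure, quarantine, and isolation transitions are set to zero. All parameter values are illustrative.

      import numpy as np

      def simulate_sir(n=1000, i0=5, beta=0.3, gamma=0.1, steps=200, seed=1):
          """Discrete-time stochastic SIR (chain-binomial): at each step every
          susceptible is infected with prob. 1 - exp(-beta * I / N) and every
          infected individual recovers with prob. gamma."""
          rng = np.random.default_rng(seed)
          S, I, R = n - i0, i0, 0
          history = [(S, I, R)]
          for _ in range(steps):
              p_inf = 1.0 - np.exp(-beta * I / n)
              new_inf = rng.binomial(S, p_inf)
              new_rec = rng.binomial(I, gamma)
              S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
              history.append((S, I, R))
              if I == 0:
                  break
          return history

      traj = simulate_sir()
      print("final size of the epidemic:", traj[-1][2], "recovered out of 1000")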

  1. Food, mechanic and septic complications in patients enterally nutritioned in home conditions.

    PubMed

    Kalita, Monika; Majewska, Krystyna; Gradowska, Aleksandra; Karwowska, Katarzyna; Ławiński, Michał

    2015-02-03

    Home enteral nutrition (HEN) allows practically normal living for patients who cannot be fed orally, while sparing them prolonged hospital stays, which are often found to worsen their mental condition and to increase the probability of complications and the costs of medical treatment. The aim of the study was to analyze the frequency of nutritional, mechanical and septic complications in patients fed enterally in home conditions. The study was performed as a retrospective analysis of test results and reports from control visits for patients in the period 2012-2013. 147 patients fed enterally using the HEN method participated in the study, including 70 men and 77 women aged 19 to 99 years (average 65 years). The following types of gastrointestinal tract access were used: PEG in 113 patients (76.5%), feeding jejunostomy in 21 (14.3%), PEG-PEJ in 5 (3.5%), and a nasogastric tube in the remaining 8 patients (5.5%). The most common complications were infections (of the gastrointestinal tract and of the skin and soft tissue around the nutritional fistula entry site; in three cases aspiration pneumonia was diagnosed), found in 55 cases (49.1% of all complications). Mechanical complications were found in 29 cases (25.9% of all complications), and nutritional complications were present 28 times, which constituted 25% of all complications. In the studied group of patients with an implemented HEN procedure, septic complications were the most common problem. The longest average nutrition time with PEG-PEJ probably results from the effective protection of the patient against aspiration pneumonia.

  2. Single-cell-based computer simulation of the oxygen-dependent tumour response to irradiation

    NASA Astrophysics Data System (ADS)

    Harting, Christine; Peschke, Peter; Borkenstein, Klaus; Karger, Christian P.

    2007-08-01

    Optimization of treatment plans in radiotherapy requires the knowledge of tumour control probability (TCP) and normal tissue complication probability (NTCP). Mathematical models may help to obtain quantitative estimates of TCP and NTCP. A single-cell-based computer simulation model is presented, which simulates tumour growth and radiation response on the basis of the response of the constituting cells. The model contains oxic, hypoxic and necrotic tumour cells as well as capillary cells which are considered as sources of a radial oxygen profile. Survival of tumour cells is calculated by the linear quadratic model including the modified response due to the local oxygen concentration. The model additionally includes cell proliferation, hypoxia-induced angiogenesis, apoptosis and resorption of inactivated tumour cells. By selecting different degrees of angiogenesis, the model allows the simulation of oxic as well as hypoxic tumours having distinctly different oxygen distributions. The simulation model showed that poorly oxygenated tumours exhibit an increased radiation tolerance. Inter-tumoural variation of radiosensitivity flattens the dose response curve. This effect is enhanced by proliferation between fractions. Intra-tumoural radiosensitivity variation does not play a significant role. The model may contribute to the mechanistic understanding of the influence of biological tumour parameters on TCP. It can in principle be validated in radiation experiments with experimental tumours.
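
    The oxygen dependence in simulations of this kind is commonly folded into the linear-quadratic survival model by scaling the dose with an oxygen enhancement ratio (OER). The sketch below shows one conventional way of doing this; the OER-versus-pO2 form and all parameter values are illustrative assumptions, not those used in the paper.

      import numpy as np

      def oer(po2_mmhg, oer_max=3.0, k=3.0):
          """Simple oxygen enhancement ratio curve: rises from 1 (anoxic)
          towards oer_max (well oxygenated) with a half-effect constant k."""
          return (oer_max * po2_mmhg + k) / (po2_mmhg + k)

      def lq_survival(dose_gy, po2_mmhg, alpha=0.3, beta=0.03, oer_max=3.0):
          """LQ surviving fraction with the dose scaled by OER/OER_max, so a fully
          oxic cell sees the nominal dose and a hypoxic cell an effectively smaller one."""
          d_eff = dose_gy * oer(po2_mmhg, oer_max) / oer_max
          return np.exp(-alpha * d_eff - beta * d_eff ** 2)

      for po2 in (0.5, 5.0, 40.0):                      # hypoxic to well-oxygenated
          print(f"pO2 = {po2:5.1f} mmHg  ->  SF(2 Gy) = {lq_survival(2.0, po2):.3f}")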

  3. A simple model for prediction postpartum PTSD in high-risk pregnancies.

    PubMed

    Shlomi Polachek, Inbal; Dulitzky, Mordechai; Margolis-Dorfman, Lilia; Simchen, Michal J

    2016-06-01

    This study aimed to examine the prevalence and possible antepartum risk factors of complete and partial post-traumatic stress disorder (PTSD) among women with complicated pregnancies and to define a predictive model for postpartum PTSD in this population. Women attending the high-risk pregnancy outpatient clinics at Sheba Medical Center completed the Edinburgh Postnatal Depression Scale (EPDS) and a questionnaire regarding demographic variables, history of psychological and psychiatric treatment, previous trauma, previous childbirth, current pregnancy medical and emotional complications, fears of childbirth, and expected pain. One month after delivery, women were asked to repeat the EPDS and complete the Post-traumatic Stress Diagnostic Scale (PDS) via telephone interview. The prevalence rates of postpartum PTSD (9.9%) and partial PTSD (11.9%) were relatively high. PTSD and partial PTSD were associated with sadness or anxiety during a past pregnancy or childbirth, previous very difficult birth experiences, preference for cesarean section in future childbirth, emotional crises during pregnancy, increased fear of childbirth, higher expected intensity of pain, and depression during pregnancy. We created a prediction model for postpartum PTSD which shows a linear increase in the probability of developing postpartum PTSD as these seven antenatal risk factors are summed. Postpartum PTSD is extremely prevalent after complicated pregnancies. A simple questionnaire may aid in identifying at-risk women before childbirth. This presents a potential for preventing or minimizing postpartum PTSD in this population.

  4. Radiobiological concepts for treatment planning of schemes that combine external beam radiotherapy and systemic targeted radiotherapy

    NASA Astrophysics Data System (ADS)

    Fabián Calderón Marín, Carlos; González González, Joaquín Jorge; Laguardia, Rodolfo Alfonso

    2017-09-01

    The combination of external beam radiotherapy and systemic targeted radiotherapy (CIERT) could be a reliable alternative for patients with multiple lesions or those for whom treatment planning may be difficult because of organ-at-risk (OAR) constraints. Radiobiological models should have the capacity to predict the biological irradiation response considering the differences in the temporal pattern of dose delivery in the two modalities. Two CIERT scenarios were studied: sequential combination, in which one modality is executed after the other, and concurrent combination, in which both modalities run simultaneously. Expressions are provided for calculation of the dose-response magnitudes Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). General results on radiobiological modeling using the linear-quadratic (LQ) model are also discussed. Inter-subject variation of radiosensitivity and the volume irradiation effect in CIERT are studied. OARs should be kept under control during planning of concurrent CIERT treatment as the administered activity is increased. The formulation presented here may be used for biological evaluation of prescriptions and biological treatment planning of CIERT schemes in clinical situations.
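
    As a worked example of the dose-response magnitudes referred to above: under the LQ model the surviving fractions contributed by the external-beam and the systemic components multiply, and a Poisson TCP follows from the expected number of surviving clonogens. The sketch below uses illustrative parameter values and ignores repopulation and the dose-rate/repair corrections treated in the paper.

      import numpy as np

      def sf_fractionated(n_fx, d_per_fx, alpha=0.35, beta=0.035):
          """LQ surviving fraction after n_fx acute fractions of d_per_fx Gy."""
          return np.exp(-n_fx * (alpha * d_per_fx + beta * d_per_fx ** 2))

      def sf_protracted(total_dose, alpha=0.35):
          """Very-low-dose-rate limit of the LQ model: the quadratic term vanishes."""
          return np.exp(-alpha * total_dose)

      def poisson_tcp(n_clonogens, surviving_fraction):
          """TCP = probability that no clonogen survives, assuming Poisson statistics."""
          return np.exp(-n_clonogens * surviving_fraction)

      # Illustrative sequential combination: 20 x 2 Gy external beam plus 20 Gy systemic.
      sf_total = sf_fractionated(20, 2.0) * sf_protracted(20.0)
      print("combined surviving fraction:", f"{sf_total:.2e}")
      print("TCP for 1e9 clonogens:", round(poisson_tcp(1e9, sf_total), 3))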

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neugent, Kathryn F.; Massey, Philip; Skiff, Brian

    Due to their transitionary nature, yellow supergiants (YSGs) provide a critical challenge for evolutionary modeling. Previous studies within M31 and the Small Magellanic Cloud show that the Geneva evolutionary models do a poor job at predicting the lifetimes of these short-lived stars. Here, we extend this study to the Large Magellanic Cloud (LMC) while also investigating the galaxy's red supergiant (RSG) content. This task is complicated by contamination by Galactic foreground stars that color and magnitude criteria alone cannot weed out. Therefore, we use proper motions and the LMC's large systemic radial velocity (≈278 km s⁻¹) to separate out these foreground dwarfs. After observing nearly 2000 stars, we identified 317 probable YSGs, 6 possible YSGs, and 505 probable RSGs. Foreground contamination of our YSG sample was ≈80%, while that of the RSG sample was only 3%. By placing the YSGs on the Hertzsprung-Russell diagram and comparing them against the evolutionary tracks, we find that new Geneva evolutionary models do an exemplary job at predicting both the locations and the lifetimes of these transitory objects.

  6. Impact of Chemotherapy on Normal Tissue Complication Probability Models of Acute Hematologic Toxicity in Patients Receiving Pelvic Intensity Modulated Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazan, Jose G.; Luxton, Gary; Kozak, Margaret M.

    Purpose: To determine how chemotherapy agents affect radiation dose parameters that correlate with acute hematologic toxicity (HT) in patients treated with pelvic intensity modulated radiation therapy (P-IMRT) and concurrent chemotherapy. Methods and Materials: We assessed HT in 141 patients who received P-IMRT for anal, gynecologic, rectal, or prostate cancers, 95 of whom received concurrent chemotherapy. Patients were separated into 4 groups: mitomycin (MMC) + 5-fluorouracil (5FU, 37 of 141), platinum ± 5FU (Cis, 32 of 141), 5FU (26 of 141), and P-IMRT alone (46 of 141). The pelvic bone was contoured as a surrogate for pelvic bone marrow (PBM) and divided into subsites: ilium, lower pelvis, and lumbosacral spine (LSS). The volumes of each region receiving 5-40 Gy were calculated. The endpoint for HT was grade ≥3 (HT3+) leukopenia, neutropenia or thrombocytopenia. Normal tissue complication probability was calculated using the Lyman-Kutcher-Burman model. Logistic regression was used to analyze the association between HT3+ and dosimetric parameters. Results: Twenty-six patients experienced HT3+: 10 of 37 (27%) MMC, 14 of 32 (44%) Cis, 2 of 26 (8%) 5FU, and 0 of 46 P-IMRT. PBM dosimetric parameters were correlated with HT3+ in the MMC group but not in the Cis group. LSS dosimetric parameters were well correlated with HT3+ in both the MMC and Cis groups. Constrained optimization (0 …

  7. Optimal Symmetric Multimodal Templates and Concatenated Random Forests for Supervised Brain Tumor Segmentation (Simplified) with ANTsR.

    PubMed

    Tustison, Nicholas J; Shrinidhi, K L; Wintermark, Max; Durst, Christopher R; Kandel, Benjamin M; Gee, James C; Grossman, Murray C; Avants, Brian B

    2015-04-01

    Segmenting and quantifying gliomas from MRI is an important task for diagnosis, planning intervention, and tracking tumor changes over time. However, this task is complicated by the lack of prior knowledge concerning tumor location, spatial extent, shape, possible displacement of normal tissue, and intensity signature. To accommodate such complications, we introduce a framework for supervised segmentation based on multiple modality intensity, geometry, and asymmetry feature sets. These features drive a supervised whole-brain and tumor segmentation approach based on random forest-derived probabilities. The asymmetry-related features (based on optimal symmetric multimodal templates) demonstrate excellent discriminative properties within this framework. We also gain performance by generating probability maps from random forest models and using these maps for a refining Markov random field regularized probabilistic segmentation. This strategy allows us to interface the supervised learning capabilities of the random forest model with regularized probabilistic segmentation using the recently developed ANTsR package, a comprehensive statistical and visualization interface between the popular Advanced Normalization Tools (ANTs) and the R statistical project. The reported algorithmic framework was the top-performing entry in the MICCAI 2013 Multimodal Brain Tumor Segmentation challenge. The challenge data were widely varying, consisting of four-modality MRI of both high-grade and low-grade gliomas from five different institutions. Average Dice overlap measures for the final algorithmic assessment were 0.87, 0.78, and 0.74 for "complete", "core", and "enhanced" tumor components, respectively.
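
    The Dice overlap used for the final assessment is easy to compute from two binary masks; a short sketch with a toy example (not the challenge data) is given below.

      import numpy as np

      def dice(mask_a, mask_b):
          """Dice similarity coefficient of two binary segmentation masks."""
          a = np.asarray(mask_a, bool)
          b = np.asarray(mask_b, bool)
          intersection = np.logical_and(a, b).sum()
          denom = a.sum() + b.sum()
          return 2.0 * intersection / denom if denom else 1.0

      # Toy 2D example standing in for a tumor segmentation and its ground truth.
      pred  = np.zeros((10, 10), bool); pred[2:7, 2:7] = True
      truth = np.zeros((10, 10), bool); truth[3:8, 3:8] = True
      print("Dice overlap:", round(dice(pred, truth), 3))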

  8. Dosimetry in nuclear medicine therapy: radiobiology application and results.

    PubMed

    Strigari, L; Benassi, M; Chiesa, C; Cremonesi, M; Bodei, L; D'Andrea, M

    2011-04-01

    The linear quadratic model (LQM) has largely been used to assess the radiobiological damage to tissue by external beam fractionated radiotherapy and more recently has been extended to encompass a general continuous time varying dose rate protocol such as targeted radionuclide therapy (TRT). In this review, we provide the basic aspects of radiobiology, from a theoretical point of view, starting from the "four Rs" of radiobiology and introducing the biologically effective doses, which may be used to quantify the impact of a treatment on both tumors and normal tissues. We also present the main parameters required in the LQM, and illustrate the main models of tumor control probability and normal tissue complication probability and summarize the main dose-effect responses, reported in literature, which demonstrate the tentative link between targeted radiotherapy doses and those used in conventional radiotherapy. A better understanding of the radiobiology and mechanisms of action of TRT could contribute to describe the clinical data and guide the development of future compounds and the designing of prospective clinical trials.
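
    The biologically effective dose mentioned in this review has simple closed forms in the two regimes it compares: acutely delivered fractions and a mono-exponentially decaying dose rate typical of targeted radionuclide therapy. The sketch below evaluates both; the α/β ratio, repair rate, and effective half-life are illustrative values.

      import numpy as np

      def bed_fractionated(n_fx, d_per_fx, ab_ratio=3.0):
          """BED for n_fx acute fractions of d_per_fx Gy: BED = n*d*(1 + d/(a/b))."""
          return n_fx * d_per_fx * (1.0 + d_per_fx / ab_ratio)

      def bed_exponential(total_dose, ab_ratio=3.0, mu_per_h=np.log(2) / 1.5,
                          lam_per_h=np.log(2) / 67.0):
          """BED for a mono-exponentially decaying dose rate delivered to complete
          decay, using the standard protraction factor lambda / (mu + lambda)."""
          g = lam_per_h / (mu_per_h + lam_per_h)
          return total_dose * (1.0 + g * total_dose / ab_ratio)

      print("BED, 20 x 2 Gy EBRT:        ", round(bed_fractionated(20, 2.0), 1), "Gy")
      print("BED, 40 Gy TRT (T_eff 67 h):", round(bed_exponential(40.0), 1), "Gy")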

  9. Parameter Estimation for Geoscience Applications Using a Measure-Theoretic Approach

    NASA Astrophysics Data System (ADS)

    Dawson, C.; Butler, T.; Mattis, S. A.; Graham, L.; Westerink, J. J.; Vesselinov, V. V.; Estep, D.

    2016-12-01

    Effective modeling of complex physical systems arising in the geosciences is dependent on knowing parameters that are often difficult or impossible to measure in situ. In this talk we focus on two such problems: estimating parameters for groundwater flow and contaminant transport, and estimating parameters within a coastal ocean model. The approach we will describe, proposed by collaborators D. Estep, T. Butler, and others, is a novel stochastic inversion technique based on measure theory. In this approach, given a probability space on certain observable quantities of interest, one searches for the sets of highest probability in parameter space which give rise to these observables. When viewed as mappings between sets, the stochastic inversion problem is well-posed in certain settings, but there are computational challenges related to the set construction. We will focus the talk on estimating scalar parameters and fields in a contaminant transport setting, and on estimating bottom friction in a complicated near-shore coastal application.

  10. A hierarchical approach to reliability modeling of fault-tolerant systems. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Gossman, W. E.

    1986-01-01

    A methodology for performing fault-tolerant system reliability analysis is presented. The method decomposes a system into its subsystems, evaluates event rates derived from the subsystem's conditional state probability vector, and incorporates those results into a hierarchical Markov model of the system. This is done in a manner that addresses failure sequence dependence associated with the system's redundancy management strategy. The method is derived for application to a specific system definition. Results are presented that compare the hierarchical model's unreliability prediction to that of a more complicated standard Markov model of the system. The results for the example given indicate that the hierarchical method predicts system unreliability to a desirable level of accuracy while achieving significant computational savings relative to a component-level Markov model of the system.

  11. Radiobiological Impact of Planning Techniques for Prostate Cancer in Terms of Tumor Control Probability and Normal Tissue Complication Probability

    PubMed Central

    Rana, S; Cheng, CY

    2014-01-01

    Background: Radiobiological models describe the effects of radiation treatment on cancer and healthy cells, and the radiobiological effects are generally characterized by the tumor control probability (TCP) and normal tissue complication probability (NTCP). Aim: The purpose of this study was to assess the radiobiological impact of RapidArc planning techniques for prostate cancer in terms of TCP and NTCP. Subjects and Methods: A computed tomography data set of ten cases involving low-risk prostate cancer was selected for this retrospective study. For each case, two RapidArc plans were created in the Eclipse treatment planning system. The double arc (DA) plan was created using two full arcs and the single arc (SA) plan was created using one full arc. All treatment plans were calculated with the anisotropic analytical algorithm. Radiobiological response evaluation was performed by calculating Niemierko's equivalent uniform dose (EUD)-based TCP and NTCP values. Results: For the prostate tumor, the average EUD in the SA plans was slightly higher than in the DA plans (78.10 Gy vs. 77.77 Gy; P = 0.01), but the average TCP was comparable (98.3% vs. 98.3%; P = 0.01). In comparison to the DA plans, the SA plans produced higher average EUD to the bladder (40.71 Gy vs. 40.46 Gy; P = 0.03) and femoral heads (10.39 Gy vs. 9.40 Gy; P = 0.03), whereas both techniques produced NTCP well below 0.1% for the bladder (P = 0.14) and femoral heads (P = 0.26). In contrast, the SA plans produced higher average NTCP for the rectum compared to the DA plans (2.2% vs. 1.9%; P = 0.01). Furthermore, the EUD to the rectum was slightly higher in the SA plans (62.88 Gy vs. 62.22 Gy; P = 0.01). Conclusion: The SA and DA techniques produced similar TCP for low-risk prostate cancer. The NTCP for the femoral heads and bladder was comparable in the SA and DA plans; however, the SA technique resulted in higher NTCP for the rectum in comparison with the DA technique. PMID:24761232
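
    The Niemierko EUD-based response evaluation used here also has a compact closed form. The sketch below is a generic implementation; the DVHs and the parameter values (a, TCD50, TD50, γ50) are illustrative, not those of the study.

      import numpy as np

      def niemierko_eud(dose_bins, vol_fractions, a):
          """EUD = (sum v_i * D_i^a)^(1/a) over a differential DVH; a < 0 for
          tumors, a > 0 for serial-like normal tissues."""
          d = np.asarray(dose_bins, float)
          v = np.asarray(vol_fractions, float)
          return (np.sum(v * d ** a)) ** (1.0 / a)

      def tcp(eud, tcd50, gamma50):
          return 1.0 / (1.0 + (tcd50 / eud) ** (4.0 * gamma50))

      def ntcp(eud, td50, gamma50):
          return 1.0 / (1.0 + (td50 / eud) ** (4.0 * gamma50))

      # Illustrative DVHs (Gy, volume fraction) and parameters.
      target_eud = niemierko_eud([76.0, 78.0, 80.0], [0.2, 0.6, 0.2], a=-10.0)
      rectum_eud = niemierko_eud([20.0, 45.0, 65.0], [0.5, 0.3, 0.2], a=8.3)
      print("TCP :", round(tcp(target_eud, tcd50=62.1, gamma50=2.0), 3))
      print("NTCP:", round(ntcp(rectum_eud, td50=76.9, gamma50=4.0), 3))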

  12. A new plan-scoring method using normal tissue complication probability for personalized treatment plan decisions in prostate cancer

    NASA Astrophysics Data System (ADS)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan

    2018-01-01

    The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7) and the similar group (average QF = 1.058, 33% match rate, n = 12). Therefore, we propose a normal tissue complication probability (NTCP)-based plan-scoring index for the verification of different plans in personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The new NTCP-based QF scoring method was adequate for obtaining biological verification of plan quality and organ-at-risk sparing using the treatment-planning decision-support software we developed for prostate cancer.

  13. Correcting for dependent censoring in routine outcome monitoring data by applying the inverse probability censoring weighted estimator.

    PubMed

    Willems, Sjw; Schat, A; van Noorden, M S; Fiocco, M

    2018-02-01

    Censored data make survival analysis more complicated because exact event times are not observed. Statistical methodology developed to account for censored observations assumes that patients' withdrawal from a study is independent of the event of interest. However, in practice, some covariates might be associated with both the lifetime and the censoring mechanism, inducing dependent censoring. In this case, standard survival techniques, like the Kaplan-Meier estimator, give biased results. The inverse probability censoring weighted estimator was developed to correct for bias due to dependent censoring. In this article, we explore the use of inverse probability censoring weighting methodology and describe why it is effective in removing the bias. Since implementing this method is highly time-consuming and requires programming and mathematical skills, we propose a user-friendly algorithm in R. Applications to a toy example and to a medical data set illustrate how the algorithm works. A simulation study was carried out to investigate the performance of the inverse probability censoring weighted estimators in situations where dependent censoring is present in the data. In the simulation process, different sample sizes, strengths of the censoring model, and percentages of censored individuals were chosen. Results show that in each scenario inverse probability censoring weighting reduces the bias induced in the traditional Kaplan-Meier approach when dependent censoring is ignored.
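
    The paper's algorithm is provided in R; the sketch below is a simplified Python illustration of the same mechanics: each observed event is weighted by the inverse of the estimated probability of remaining uncensored just before its event time. For brevity the censoring distribution is estimated marginally with a Kaplan-Meier estimator, whereas correcting for dependent censoring in practice requires modeling it conditional on covariates.

      import numpy as np

      def kaplan_meier(times, events):
          """Kaplan-Meier survival estimate evaluated just before each time in
          `times` (events == 1 means the event tracked by this estimator occurred)."""
          order = np.argsort(times)
          t_sorted, e_sorted = times[order], events[order]
          surv, at_risk, km_at = 1.0, len(times), {}
          for t, e in zip(t_sorted, e_sorted):
              km_at.setdefault(t, surv)           # value just before t
              if e == 1:
                  surv *= 1.0 - 1.0 / at_risk
              at_risk -= 1
          return np.array([km_at[t] for t in times])

      def ipcw_event_probability(times, events, horizon):
          """IPCW estimate of P(event by `horizon`): observed events are weighted
          by the inverse of the censoring 'survival' just before their event time."""
          times, events = np.asarray(times, float), np.asarray(events, int)
          k_hat = kaplan_meier(times, 1 - events)     # censoring distribution
          contrib = (events == 1) & (times <= horizon)
          return np.mean(contrib / np.clip(k_hat, 1e-12, None))

      t = np.array([2.0, 3.0, 3.5, 5.0, 6.0, 7.0, 8.0, 9.0])
      d = np.array([1,   0,   1,   1,   0,   1,   0,   1  ])   # 1 = event, 0 = censored
      print("P(event by t=7):", round(ipcw_event_probability(t, d, 7.0), 3))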

  14. Cost-Effectiveness of Histamine2 Receptor Antagonists Versus Proton Pump Inhibitors for Stress Ulcer Prophylaxis in Critically Ill Patients.

    PubMed

    Hammond, Drayton A; Kathe, Niranjan; Shah, Anuj; Martin, Bradley C

    2017-01-01

    To determine the cost-effectiveness of stress ulcer prophylaxis with histamine 2 receptor antagonists (H2RAs) versus proton pump inhibitors (PPIs) in critically ill and mechanically ventilated adults. A decision analytic model estimating the costs and effectiveness of stress ulcer prophylaxis (with H2RAs and PPIs) from a health care institutional perspective. Adult mixed intensive care unit (ICU) population who received an H2RA or PPI for up to 9 days. Effectiveness measures were mortality during the ICU stay and complication rate. Costs (2015 U.S. dollars) were combined to include medication regimens and untoward events associated with stress ulcer prophylaxis (pneumonia, Clostridium difficile infection, and stress-related mucosal bleeding). Costs and probabilities for complications and mortality from complications came from randomized controlled trials and observational studies. A base case scenario was developed with pooled data from an observational study and meta-analysis of randomized controlled trials. Scenarios based on observational and meta-analysis data alone were evaluated. Outcomes were expected and incremental costs, mortalities, and complication rates. Univariate sensitivity analyses were conducted to determine the influence of inputs on cost, mortality, and complication rates. Monte Carlo simulations evaluated second-order uncertainty. In the base case scenario, the costs, complication rates, and mortality rates were $9039, 17.6%, and 2.50%, respectively, for H2RAs and $11,249, 22.0%, and 3.34%, respectively, for PPIs, indicating that H2RAs dominated PPIs. The observational study-based model provided similar results; however, in the meta-analysis-based model, H2RAs had a cost of $8364 and mortality rate of 3.2% compared with $7676 and 2.0%, respectively, for PPIs. At a willingness-to-pay threshold of $100,000/death averted, H2RA therapy was superior or preferred 70.3% in the base case and 97.0% in the observational study-based scenario. PPI therapy was preferred 87.2% in the meta-analysis-based scenario. Providing stress ulcer prophylaxis with H2RA therapy may reduce costs, increase survival, and avoid complications compared with PPI therapy. This finding is highly sensitive to the pneumonia and stress-related mucosal bleeding rates and whether observational data are used to inform the model. © 2016 Pharmacotherapy Publications, Inc.

  15. Quantum Probability -- A New Direction for Modeling in Cognitive Science

    NASA Astrophysics Data System (ADS)

    Roy, Sisir

    2014-07-01

    Human cognition is still a puzzling issue in research and its appropriate modeling. It depends on how the brain behaves at that particular instance and identifies and responds to a signal among myriads of noises that are present in the surroundings (called external noise) as well as in the neurons themselves (called internal noise). Thus it is not surprising to assume that the functionality consists of various uncertainties, possibly a mixture of aleatory and epistemic uncertainties. It is also possible that a complicated pathway consisting of both types of uncertainties in continuum play a major role in human cognition. For more than 200 years mathematicians and philosophers have been using probability theory to describe human cognition. Recently in several experiments with human subjects, violation of traditional probability theory has been clearly revealed in plenty of cases. Literature survey clearly suggests that classical probability theory fails to model human cognition beyond a certain limit. While the Bayesian approach may seem to be a promising candidate to this problem, the complete success story of Bayesian methodology is yet to be written. The major problem seems to be the presence of epistemic uncertainty and its effect on cognition at any given time. Moreover the stochasticity in the model arises due to the unknown path or trajectory (definite state of mind at each time point), a person is following. To this end a generalized version of probability theory borrowing ideas from quantum mechanics may be a plausible approach. A superposition state in quantum theory permits a person to be in an indefinite state at each point of time. Such an indefinite state allows all the states to have the potential to be expressed at each moment. Thus a superposition state appears to be able to represent better, the uncertainty, ambiguity or conflict experienced by a person at any moment demonstrating that mental states follow quantum mechanics during perception and cognition of ambiguous figures.

  16. Charge-patterning phase transition on a surface lattice of titratable sites adjacent to an electrolyte solution

    NASA Astrophysics Data System (ADS)

    Shore, Joel; Thurston, George

    We discuss a model for a charge-patterning phase transition on a two-dimensional square lattice of titratable sites, here regarded as protonation sites, placed in a dielectric medium just below the planar interface between this medium and an aqueous salt solution. Within Debye-Hückel theory, the analytical form of the electrostatic repulsion between protonated sites exhibits an approximate inverse cubic power-law decrease beyond short distances. The problem can thus be mapped onto the two-dimensional antiferromagnetic Ising model with this longer-range interaction, which we study with Monte Carlo simulations. As we increase pH, the occupation probability of a site decreases from 1 at low pH to 0 at high pH. For sufficiently strong interactions, a phase transition occurs as the occupation probability of 1/2 is approached: the charges arrange themselves into a checkerboard pattern. This ordered phase persists over a range of pH until a transition occurs back to a disordered state. The ordered phase is the analogue of the Néel state in the antiferromagnetic Ising spin model. More complicated ordered phases are expected for sufficiently strong interactions (with occupation probabilities of 1/4 and 3/4) and if the lattice is triangular rather than square. This work was supported by NIH EY018249 (GMT).
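
    A bare-bones version of such a Monte Carlo simulation is sketched below: Metropolis sampling of a lattice-gas occupation variable with a truncated 1/r^3 repulsion and a chemical-potential term standing in for the pH dependence. The lattice size, coupling, cutoff, and chemical potential are all illustrative choices, not the values used in the study.

      import numpy as np

      def site_energy(occ, i, j, coupling, mu, cutoff=3):
          """Energy contribution of site (i, j): pairwise ~1/r^3 repulsion with all
          occupied neighbours within `cutoff` (open boundaries) minus a chemical
          potential standing in for the pH-dependent protonation free energy."""
          L = occ.shape[0]
          e = 0.0
          for di in range(-cutoff, cutoff + 1):
              for dj in range(-cutoff, cutoff + 1):
                  if di == 0 and dj == 0:
                      continue
                  ni, nj = i + di, j + dj
                  if 0 <= ni < L and 0 <= nj < L and occ[ni, nj]:
                      e += coupling / (di * di + dj * dj) ** 1.5
          return e - mu

      def metropolis(L=16, coupling=2.0, mu=4.0, sweeps=200, beta=1.0, seed=0):
          """Metropolis sampling of the occupation lattice; returns mean occupancy."""
          rng = np.random.default_rng(seed)
          occ = rng.integers(0, 2, size=(L, L))
          for _ in range(sweeps * L * L):
              i, j = rng.integers(0, L, size=2)
              dE = (1 - 2 * occ[i, j]) * site_energy(occ, i, j, coupling, mu)
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  occ[i, j] ^= 1                   # accept the protonation flip
          return occ.mean()

      print("mean occupation probability:", round(metropolis(), 3))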

  17. A Prospective Cohort Study on Radiation-induced Hypothyroidism: Development of an NTCP Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boomsma, Marjolein J.; Bijl, Hendrik P.; Christianen, Miranda E.M.C.

    Purpose: To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism. Methods and Materials: The thyroid-stimulating hormone (TSH) level of 105 patients treated with (chemo-) radiation therapy for head-and-neck cancer was prospectively measured during a median follow-up of 2.5 years. Hypothyroidism was defined as elevated serum TSH with decreased or normal free thyroxin (T4). A multivariate logistic regression model with bootstrapping was used to determine the most important prognostic variables for radiation-induced hypothyroidism. Results: Thirty-five patients (33%) developed primary hypothyroidism within 2 years after radiation therapy. An NTCP model based on 2 variables, including the mean thyroid gland dose and the thyroid gland volume, was most predictive for radiation-induced hypothyroidism. NTCP values increased with higher mean thyroid gland dose (odds ratio [OR]: 1.064/Gy) and decreased with higher thyroid gland volume (OR: 0.826/cm³). Model performance was good, with an area under the curve (AUC) of 0.85. Conclusions: This is the first prospective study resulting in an NTCP model for radiation-induced hypothyroidism. The probability of hypothyroidism rises with increasing dose to the thyroid gland, whereas it reduces with increasing thyroid gland volume.
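
    A two-variable logistic NTCP model of this form can be evaluated directly once its coefficients are known. The sketch below uses the per-Gy and per-cm³ odds ratios quoted in the abstract but an assumed intercept (the abstract does not report one), so the absolute probabilities it prints are illustrative only.

      import numpy as np

      def ntcp_hypothyroidism(mean_dose_gy, thyroid_volume_cc,
                              or_dose=1.064, or_volume=0.826, intercept=-1.0):
          """Logistic NTCP: logit(p) = intercept + ln(OR_dose)*dose + ln(OR_vol)*volume.
          The odds ratios are the per-Gy and per-cm^3 values quoted above; the
          intercept is an assumed placeholder, not a fitted value."""
          logit = (intercept
                   + np.log(or_dose) * mean_dose_gy
                   + np.log(or_volume) * thyroid_volume_cc)
          return 1.0 / (1.0 + np.exp(-logit))

      for dose, vol in [(20.0, 12.0), (40.0, 12.0), (40.0, 25.0)]:
          print(f"mean dose {dose:4.0f} Gy, volume {vol:4.0f} cm^3 -> "
                f"NTCP = {ntcp_hypothyroidism(dose, vol):.2f}")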

  18. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    PubMed

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events-per-coefficient settings, bias and coverage still deviated from nominal levels.

  19. Cost-utility comparison of neoadjuvant chemotherapy versus primary debulking surgery for treatment of advanced-stage ovarian cancer in patients 65 years old or older.

    PubMed

    Rowland, Michelle R; Lesnock, Jamie L; Farris, Coreen; Kelley, Joseph L; Krivak, Thomas C

    2015-06-01

    Treatment for advanced-stage epithelial ovarian cancer (AEOC) includes primary debulking surgery (PDS) or neoadjuvant chemotherapy (NACT). A randomized controlled trial comparing these treatments resulted in comparable overall survival (OS). Studies report more complications and lower chemotherapy completion rates in patients 65 years old or older receiving PDS. We sought to evaluate the cost implications of NACT relative to PDS in AEOC patients 65 years old or older. A 5 year Markov model was created. Arm 1 modeled PDS followed by 6 cycles of carboplatin and paclitaxel (CT). Arm 2 modeled 3 cycles of CT, followed by interval debulking surgery and then 3 additional cycles of CT. Parameters included OS, surgical complications, probability of treatment initiation, treatment cost, and quality of life (QOL). OS was assumed to be equal based on the findings of the international randomized control trial. Differences in surgical complexity were accounted for in base surgical cost plus add-on procedure costs weighted by occurrence rates. Hospital cost was a weighted average of diagnosis-related group costs weighted by composite estimates of complication rates. Sensitivity analyses were performed. Assuming equal survival, NACT produces a cost savings of $5616. If PDS improved median OS by 1.5 months or longer, PDS would be cost effective (CE) at a $100,000/quality-adjusted life-year threshold. If PDS improved OS by 3.2 months or longer, it would be CE at a $50,000 threshold. The model was robust to variation in costs and complication rates. Moderate decreases in the QOL with NACT would result in PDS being CE. A model based on the RCT comparing NACT and PDS showed NACT is a cost-saving treatment compared with PDS for AEOC in patients 65 years old or older. Small increases in OS with PDS or moderate declines in QOL with NACT would result in PDS being CE at the $100,000/quality-adjusted life-year threshold. Our results support further evaluation of the effects of PDS on OS, QOL and complications in AEOC patients 65 years old or older. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Interpretation of the electric fields measured in an ionospheric critical ionization velocity experiment

    NASA Technical Reports Server (NTRS)

    Brenning, N.; Faelthammar, C.-G.; Marklund, G.; Haerendel, G.; Kelley, M. C.; Pfaff, R.

    1991-01-01

    The quasi-dc electric fields measured in the CRIT I ionospheric release experiment are studied. In the experiment, two identical barium shaped charges were fired toward a main payload, and three-dimensional measurements of the electric field inside the streams were made. The relevance of proposed mechanisms for electron heating in the critical ionization velocity (CIV) mechanism is addressed. It is concluded that both the 'homogeneous' and the 'ionizing front' models probably are valid, but in different parts of the streams. It is also possible that electrons are directly accelerated by a magnetic field-aligned component of the electric field. The coupling between the ambient ionosphere and the ionized barium stream is more complicated than is usually assumed in CIV theories, with strong magnetic-field-aligned electric fields and probably current limitation as important processes.

  1. Simulation of target interpretation based on infrared image features and psychology principle

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Gao, Hong-sheng; Wang, Zhan-feng; Wang, Ji-jun; Su, Rong-hua; Huang, Yan-ping

    2009-07-01

    Target feature extraction and identification is an important and complicated process in target interpretation; it directly affects the interpreter's psychosensory response to the target infrared image and ultimately determines target viability. Using statistical decision theory and psychological principles, and designing four psychophysical experiments, an interpretation model for infrared targets is established. The model obtains the target detection probability by calculating the similarity of four features between the target region and the background region delineated on the infrared image. Verified against a large number of practical target interpretations, the model can simulate the target interpretation and detection process effectively and yield objective interpretation results, which can provide technical support for target extraction, identification, and decision-making.

  2. Extrapolation of Normal Tissue Complication Probability for Different Fractionations in Liver Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tai An; Erickson, Beth; Li, X. Allen

    2009-05-01

    Purpose: The ability to predict normal tissue complication probability (NTCP) is essential for NTCP-based treatment planning. The purpose of this work is to estimate the Lyman NTCP model parameters for liver irradiation from published clinical data of different fractionation regimens. A new expression of normalized total dose (NTD) is proposed to convert NTCP data between different treatment schemes. Method and Materials: The NTCP data of radiation-induced liver disease (RILD) from external beam radiation therapy for primary liver cancer patients were selected for analysis. The data were collected from 4 institutions for tumor sizes in the range of 8-10 cm. The dose per fraction ranged from 1.5 Gy to 6 Gy. A modified linear-quadratic model with two components corresponding to radiosensitive and radioresistant cells in the normal liver tissue was proposed to understand the new NTD formalism. Results: There are five parameters in the model: TD₅₀, m, n, α/β and f. With two parameters n and α/β fixed to be 1.0 and 2.0 Gy, respectively, the extracted parameters from the fitting are TD₅₀(1) = 40.3 ± 8.4 Gy, m = 0.36 ± 0.09, f = 0.156 ± 0.074 Gy and TD₅₀(1) = 23.9 ± 5.3 Gy, m = 0.41 ± 0.15, f = 0.0 ± 0.04 Gy for patients with liver cirrhosis scores of Child-Pugh A and Child-Pugh B, respectively. The fitting results showed that the liver cirrhosis score significantly affects the fractional dose dependence of NTD. Conclusion: The Lyman parameters generated presently and the new form of NTD may be used to predict NTCP for treatment planning of innovative liver irradiation with different fractionations, such as hypofractionated stereotactic body radiation therapy.
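
    The conversion between fractionation schemes that underlies such an analysis is usually done through the LQ-based normalized total dose in 2-Gy equivalents. The sketch below implements the standard single-component formula; the two-component extension with the extra parameter f described in the abstract is not reproduced, and the α/β value is illustrative.

      def ntd_eqd2(total_dose_gy, dose_per_fraction_gy, ab_ratio_gy=2.0):
          """Normalized total dose in 2-Gy equivalents:
          NTD = D * (d + a/b) / (2 + a/b), the standard LQ-based EQD2 conversion."""
          return total_dose_gy * (dose_per_fraction_gy + ab_ratio_gy) / (2.0 + ab_ratio_gy)

      # Same physical mean liver dose delivered with different fraction sizes.
      for d_fx in (1.5, 2.0, 4.0, 6.0):
          n_fx = 30.0 / d_fx                       # keep the physical dose at 30 Gy
          print(f"{n_fx:4.1f} x {d_fx} Gy -> NTD = {ntd_eqd2(30.0, d_fx):.1f} Gy (a/b = 2 Gy)")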

  3. Uterine Artery Embolization in 101 Cases of Uterine Fibroids: Do Size, Location, and Number of Fibroids Affect Therapeutic Success and Complications?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firouznia, Kavous, E-mail: k_firouznia@yahoo.com; Ghanaati, Hossein; Sanaati, Mina

    The purpose of this study was to evaluate whether the size, location, or number of fibroids affects therapeutic efficacy or complications of uterine artery embolization (UAE). Patients with symptomatic uterine fibroids (n = 101) were treated by selective bilateral UAE using 500- to 710-µm polyvinyl alcohol (PVA) particles. Baseline measures of clinical symptoms, sonography, and MRI taken before the procedure were compared to those taken 1, 3, 6, and 12 months later. Complications and outcomes were analyzed for associations with fibroid size, location, and number. Reductions in mean fibroid volume were similar in patients with single (66.6 ± 21.5%) and multiple (67.4 ± 25.0%) fibroids (p = 0.83). Menstrual improvement occurred in patients with single (93.3%) and multiple (72.2%) fibroids (p = 0.18). Changes in submucosal and other fibroids were not significantly different between the two groups (p values > 0.56). Linear regression analysis between primary fibroid volume as independent variable and percentage reduction of fibroid volume after 1 year yielded an R² of 0.083 and the model coefficient was not statistically significant (p = 0.072). Multivariate regression models revealed no statistically or clinically significant coefficients or odds ratios for three independent variables (primary fibroid size, total number, and fibroid location) and all outcome variables (percent reduction of uterus and fibroid volumes in 1 year, improvement of clinical symptoms [menstrual, bulk related, and urinary] in 1 year, and complications after UAE). In conclusion, neither the success rate nor the probability of complications was affected by the primary fibroid size, location, or total number of fibroids.

  4. Bayesian network representing system dynamics in risk analysis of nuclear systems

    NASA Astrophysics Data System (ADS)

    Varuttamaseni, Athi

    2011-12-01

    A dynamic Bayesian network (DBN) model is used in conjunction with the alternating conditional expectation (ACE) regression method to analyze the risk associated with the loss-of-feedwater accident coupled with a subsequent initiation of the feed and bleed operation in the Zion-1 nuclear power plant. The use of the DBN allows the joint probability distribution to be factorized, enabling the analysis to be done on many simpler network structures rather than on one complicated structure. The construction of the DBN model assumes conditional independence relations among certain key reactor parameters. The choice of parameters to model is based on considerations of the macroscopic balance statements governing the behavior of the reactor under a quasi-static assumption. The DBN is used to relate the peak clad temperature to a set of independent variables that are known to be important in determining the success of the feed and bleed operation. A simple linear relationship is then used to relate the clad temperature to the core damage probability. To obtain a quantitative relationship among different nodes in the DBN, surrogates of the RELAP5 reactor transient analysis code are used. These surrogates are generated by applying the ACE algorithm to output data obtained from about 50 RELAP5 cases covering a wide range of the selected independent variables. These surrogates allow important safety parameters such as the fuel clad temperature to be expressed as a function of key reactor parameters such as the coolant temperature and pressure together with important independent variables such as the scram delay time. The time-dependent core damage probability is calculated by sampling the independent variables from their probability distributions and propagating the information up through the Bayesian network to give the clad temperature. With the knowledge of the clad temperature and the assumption that the core damage probability has a one-to-one relationship to it, we calculated the core damage probability as a function of transient time. The use of the DBN model in combination with ACE allows risk analysis to be performed with much less effort than if the analysis were done using the standard techniques.

  5. Which factor is most important for occurrence of cutout complications in patients treated with proximal femoral nail antirotation? Retrospective analysis of 298 patients.

    PubMed

    Turgut, Ali; Kalenderer, Önder; Karapınar, Levent; Kumbaracı, Mert; Akkan, Hasan Ali; Ağuş, Haluk

    2016-05-01

    Mechanical complications, such as cut-out of the head-neck fixation device, are the most common causes of morbidity after trochanteric femur fracture treatment. The causes of cut-out complications are well defined in patients who are treated with sliding hip screws and biaxial cephalomedullary nails, but there are few reports about patients who are treated with proximal femoral nail antirotation. The purpose of this study was to identify the most important factor in the occurrence of cut-out complications and to evaluate the risk associated with each combination of possible factors. Overall, 298 patients were enrolled in the study. Medical records were reviewed for patients' age, fracture type, gender, anesthesia type, and occurrence of cut-out complications. Postoperatively taken radiographs were reviewed for tip-apex distance, obtained collo-diaphyseal angle, the quadrant of the helical blade, and Ikuta reduction subgroup. The most important factor(s) and the predicted probability of cut-out complication were calculated for each combination of factors. Cut-out complication was observed in 14 patients (4.7%). The most important factor in the occurrence of cut-out complications was varus reduction (p = 0.01); the second most important factor was implantation of the helical blade in an improper quadrant (p = 0.02). Tip-apex distance was the third most important factor (p = 0.10). The predicted probability of cut-out complication was calculated as 45.6% when all four surgeon-dependent factors were improperly obtained. Although obtaining a proper tip-apex distance is important to prevent cut-out complications in these fractures, if the fracture is not reduced in a varus position and the helical blade is inserted in the proper quadrant, the possibility of a cut-out complication is very low even in patients with a high tip-apex distance.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, M; Choi, E; Chuong, M

    Purpose: To evaluate whether current radiobiological models can predict the normal liver complications of radioactive yttrium-90 (⁹⁰Y) selective internal radiation treatment (SIRT) for metastatic liver lesions based on post-infusion ⁹⁰Y PET images. Methods: A total of 20 patients with metastatic liver tumors treated with SIRT who received a post-infusion ⁹⁰Y PET/CT scan were analyzed in this work. The 3D activity distribution of the PET images was converted into a 3D dose distribution via a kernel convolution process. The physical dose distribution was converted into the equivalent dose delivered at 2 Gy per fraction (EQ2) based on the linear-quadratic (LQ) model, considering the dose rate effect. The biological endpoint of this work was radiation-induced liver disease (RILD). The NTCPs were calculated with four different liver repair half-times (T1/2 = 0, 0.5, 1.0, 2.0 hr), and three published NTCP models (Lyman external-RT, Lyman ⁹⁰Y-HCC-SIRT, and the parallel model) were compared to the incidence of RILD in the recruited patients to evaluate their ability to predict outcome. Results: The mean normal liver physical dose (avg. 51.9 Gy, range 31.9–69.8 Gy) is higher than the suggested liver dose constraint for external beam treatment (∼30 Gy). However, none of the patients in our study developed RILD after the SIRT. The estimated probability of 'no patient developing RILD' obtained from the two Lyman models is 46.3% to 48.3% (repair half-time = 0 hr) and <1% for all other repair times. For the parallel model, the estimated probability is 97.3% (0 hr), 51.7% (0.5 hr), 2.0% (1.0 hr), and <1% (2.0 hr). Conclusion: Molecular images providing the distribution of ⁹⁰Y enable dose-volume-based dose/outcome analysis for SIRT. Current NTCP models fail to predict RILD complications in our patient population unless a very short repair time for the liver is assumed. The discrepancy between the outcomes predicted by the Lyman ⁹⁰Y-HCC-SIRT model and the clinically observed outcomes further demonstrates the need for an NTCP model specific to metastatic liver SIRT.
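
    The activity-to-dose step described above is, at its core, a 3D convolution of the PET-derived (cumulated) activity map with a dose-point kernel. The sketch below shows only the mechanics, using a crude made-up isotropic kernel and arbitrary units; real ⁹⁰Y dosimetry uses published voxel S-values or Monte Carlo kernels and proper unit handling.

      import numpy as np
      from scipy.ndimage import convolve

      def toy_dose_kernel(size=5, voxel_mm=4.0):
          """Crude isotropic kernel falling off with distance from the source voxel.
          Placeholder only -- real 90Y dosimetry uses published voxel S-values."""
          c = size // 2
          z, y, x = np.indices((size, size, size))
          r = np.sqrt((x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2) * voxel_mm
          kernel = 1.0 / (1.0 + r ** 2)             # arbitrary short-range fall-off
          return kernel / kernel.sum()              # normalize to unit deposited energy

      def activity_to_dose(activity_map, kernel, gy_per_unit=1.0):
          """Voxel dose map = (cumulated activity map) convolved with the kernel,
          scaled by an energy-to-dose factor (here an arbitrary constant)."""
          return gy_per_unit * convolve(activity_map, kernel, mode="constant")

      # Illustrative 20^3 activity map with a hot spherical 'lesion' in the centre.
      act = np.zeros((20, 20, 20))
      zz, yy, xx = np.indices(act.shape)
      act[(xx - 10) ** 2 + (yy - 10) ** 2 + (zz - 10) ** 2 < 16] = 100.0
      dose = activity_to_dose(act, toy_dose_kernel())
      print("max voxel dose (arbitrary units):", round(dose.max(), 1))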

  7. Uveitis and Systemic Inflammatory Markers in Convalescent Phase of Ebola Virus Disease.

    PubMed

    Chancellor, John R; Padmanabhan, Sriranjani P; Greenough, Thomas C; Sacra, Richard; Ellison, Richard T; Madoff, Lawrence C; Droms, Rebecca J; Hinkle, David M; Asdourian, George K; Finberg, Robert W; Stroher, Ute; Uyeki, Timothy M; Cerón, Olga M

    2016-02-01

    We report a case of probable Zaire Ebola virus-related ophthalmologic complications in a physician from the United States who contracted Ebola virus disease in Liberia. Uveitis, immune activation, and nonspecific increase in antibody titers developed during convalescence. This case highlights immune phenomena that could complicate management of Ebola virus disease-related uveitis during convalescence.

  8. CT findings of acute cholecystitis and its complications.

    PubMed

    Shakespear, Jonathan S; Shaaban, Akram M; Rezvani, Maryam

    2010-06-01

    The purpose of this article is to describe and illustrate the CT findings of acute cholecystitis and its complications. CT findings suggesting acute cholecystitis should be interpreted with caution and should probably serve as justification for further investigation with abdominal ultrasound. CT has a relatively high negative predictive value, and acute cholecystitis is unlikely in the setting of a negative CT. Complications of acute cholecystitis have a characteristic CT appearance and include necrosis, perforation, abscess formation, intraluminal hemorrhage, and wall emphysema.

  9. On the use of published radiobiological parameters and the evaluation of NTCP models regarding lung pneumonitis in clinical breast radiotherapy.

    PubMed

    Svolos, Patricia; Tsougos, Ioannis; Kyrgias, Georgios; Kappas, Constantine; Theodorou, Kiki

    2011-04-01

    In this study we sought to evaluate and emphasize the importance of radiobiological parameter selection and implementation in the normal tissue complication probability (NTCP) models. The relative seriality (RS) and the Lyman-Kutcher-Burman (LKB) models were studied. For each model, minimum and maximum radiobiological parameter sets were selected from the sets published in the literature, and a theoretical mean parameter set was computed. In order to investigate potential model weaknesses in NTCP estimation and to point out the correct use of model parameters, these sets were used as input to the RS and the LKB models, estimating radiation-induced complications for a group of 36 breast cancer patients treated with radiotherapy. The clinical endpoint examined was radiation pneumonitis. Each model was represented by a certain dose-response range when the selected parameter sets were applied. Comparing the models with their ranges, a large area of coincidence was revealed. If the parameter uncertainties (standard deviations) are included in the models, their area of coincidence might be enlarged, further constraining their predictive ability. The selection of the proper radiobiological parameter set for a given clinical endpoint is crucial. Published parameter values are not definitive but should be accompanied by uncertainties, and one should be very careful when applying them to the NTCP models. Correct selection and proper implementation of published parameters provide a reasonably accurate fit of the NTCP models to the considered endpoint.
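
    To make the parameter-sensitivity argument concrete, the sketch below evaluates the LKB model for a single dose-volume histogram under three different parameter sets. It is a minimal illustration only; the DVH bins and the (TD50, m, n) triplets are made-up values standing in for the "minimum", "mean" and "maximum" published sets discussed above.

    ```python
    import numpy as np
    from scipy.stats import norm

    def geud(doses_gy, volumes, n):
        """Generalized EUD of a differential DVH; n is the LKB volume parameter."""
        v = np.asarray(volumes, dtype=float)
        v = v / v.sum()                               # fractional volumes
        return float((v * np.asarray(doses_gy) ** (1.0 / n)).sum() ** n)

    def ntcp_lkb(doses_gy, volumes, td50, m, n):
        """Lyman-Kutcher-Burman NTCP: probit function of the normalized gEUD."""
        t = (geud(doses_gy, volumes, n) - td50) / (m * td50)
        return float(norm.cdf(t))

    # Made-up differential DVH (bin doses in Gy and relative volumes)
    doses = [5.0, 15.0, 25.0, 35.0, 45.0]
    vols = [30.0, 25.0, 20.0, 15.0, 10.0]

    # Three illustrative (TD50, m, n) sets standing in for published
    # "minimum", "mean" and "maximum" parameter sets for pneumonitis.
    param_sets = {"min": (24.5, 0.18, 0.87), "mean": (29.9, 0.41, 0.99),
                  "max": (34.0, 0.45, 1.10)}
    for label, (td50, m, n) in param_sets.items():
        print(label, round(ntcp_lkb(doses, vols, td50, m, n), 3))
    ```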

  10. Supervised variational model with statistical inference and its application in medical image segmentation.

    PubMed

    Li, Changyang; Wang, Xiuying; Eberl, Stefan; Fulham, Michael; Yin, Yong; Dagan Feng, David

    2015-01-01

    Automated and general medical image segmentation can be challenging because the foreground and the background may have complicated and overlapping density distributions in medical imaging. Conventional region-based level set algorithms often assume piecewise constant or piecewise smooth for segments, which are implausible for general medical image segmentation. Furthermore, low contrast and noise make identification of the boundaries between foreground and background difficult for edge-based level set algorithms. Thus, to address these problems, we suggest a supervised variational level set segmentation model to harness the statistical region energy functional with a weighted probability approximation. Our approach models the region density distributions by using the mixture-of-mixtures Gaussian model to better approximate real intensity distributions and distinguish statistical intensity differences between foreground and background. The region-based statistical model in our algorithm can intuitively provide better performance on noisy images. We constructed a weighted probability map on graphs to incorporate spatial indications from user input with a contextual constraint based on the minimization of contextual graphs energy functional. We measured the performance of our approach on ten noisy synthetic images and 58 medical datasets with heterogeneous intensities and ill-defined boundaries and compared our technique to the Chan-Vese region-based level set model, the geodesic active contour model with distance regularization, and the random walker model. Our method consistently achieved the highest Dice similarity coefficient when compared to the other methods.

  11. Effectiveness of a medical education intervention to treat hypertension in primary care.

    PubMed

    Martínez-Valverde, Silvia; Castro-Ríos, Angélica; Pérez-Cuevas, Ricardo; Klunder-Klunder, Miguel; Salinas-Escudero, Guillermo; Reyes-Morales, Hortensia

    2012-04-01

    In Mexico, hypertension is among the top five causes of visits to primary care clinics; its complications are among the main causes of emergency and hospital care. The present study reports the effectiveness of a continuing medical education (CME) intervention to improve appropriate care for hypertension on the blood pressure control of hypertensive patients in primary care clinics. A secondary data analysis was carried out using data from hypertensive patients treated by family doctors who participated in the CME intervention. The evaluation was designed as a pre-/post-intervention study with a control group in six primary care clinics. The effect of the CME intervention was analysed using multiple logistic regression modelling in which the dependent variable was uncontrolled blood pressure at the post-intervention patient measurement. After the CME intervention, the net reduction of uncontrolled blood pressure between stages in the intervention group was 10.3%. The model results showed that being treated by a family doctor who participated in the CME intervention reduced the probability of uncontrolled blood pressure by 53%, and receiving dietary recommendations reduced it by 57%. Having uncontrolled blood pressure at the baseline stage increased the probability of lack of control by 166%, and each unit increase in body mass index increased it by 7%. The CME intervention improved the medical decision-making process for managing hypertension, thus increasing the probability that hypertensive patients have their blood pressure under control. © 2010 Blackwell Publishing Ltd.

  12. Is cardiac toxicity a relevant issue in the radiation treatment of esophageal cancer?

    PubMed

    Beukema, Jannet C; van Luijk, Peter; Widder, Joachim; Langendijk, Johannes A; Muijs, Christina T

    2015-01-01

    In recent years several papers have been published on radiation-induced cardiac toxicity, especially in breast cancer patients. However, in esophageal cancer patients the radiation dose to the heart is usually markedly higher. To determine whether radiation-induced cardiac toxicity is also a relevant issue for this group, we conducted a review of the current literature. A literature search was performed in Medline for papers concerning cardiac toxicity in esophageal cancer patients treated with radiotherapy with or without chemotherapy. The overall crude incidence of symptomatic cardiac toxicity was as high as 10.8%. Toxicities corresponded with several dose-volume parameters of the heart. The most frequently reported complications were pericardial effusion, ischemic heart disease and heart failure. Cardiac toxicity is a relevant issue in the treatment of esophageal cancer. However, valid Normal Tissue Complication Probability models for esophageal cancer are not available at present. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Influence of different dose calculation algorithms on the estimate of NTCP for lung complications

    PubMed Central

    Bäck, Anna

    2013-01-01

    Due to limitations and uncertainties in dose calculation algorithms, different algorithms can predict different dose distributions and dose‐volume histograms for the same treatment. This can be a problem when estimating the normal tissue complication probability (NTCP) for patient‐specific dose distributions. Published NTCP model parameters are often derived for a different dose calculation algorithm than the one used to calculate the actual dose distribution. The use of algorithm‐specific NTCP model parameters can prevent errors caused by differences in dose calculation algorithms. The objective of this work was to determine how to change the NTCP model parameters for lung complications derived for a simple correction‐based pencil beam dose calculation algorithm, in order to make them valid for three other common dose calculation algorithms. NTCP was calculated with the relative seriality (RS) and Lyman‐Kutcher‐Burman (LKB) models. The four dose calculation algorithms used were the pencil beam (PB) and collapsed cone (CC) algorithms employed by Oncentra, and the pencil beam convolution (PBC) and anisotropic analytical algorithm (AAA) employed by Eclipse. Original model parameters for lung complications were taken from four published studies on different grades of pneumonitis, and new algorithm‐specific NTCP model parameters were determined. The difference between original and new model parameters was presented in relation to the reported model parameter uncertainties. Three different types of treatments were considered in the study: tangential and locoregional breast cancer treatment and lung cancer treatment. Changing the algorithm without the derivation of new model parameters caused changes in the NTCP value of up to 10 percentage points for the cases studied. Furthermore, the error introduced could be of the same magnitude as the confidence intervals of the calculated NTCP values. The new NTCP model parameters were tabulated as the algorithm was varied from PB to PBC, AAA, or CC. Moving from the PB to the PBC algorithm did not require new model parameters; however, moving from PB to AAA or CC did require a change in the NTCP model parameters, with CC requiring the largest change. It was shown that the new model parameters for a given algorithm are different for the different treatment types. PACS numbers: 87.53.‐j, 87.53.Kn, 87.55.‐x, 87.55.dh, 87.55.kd PMID:24036865
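
    For reference, the relative seriality model named above can be sketched in a few lines using the Källman-type Poisson dose-response. This is a minimal illustration only; the DVH bins and the (D50, γ, s) values are assumptions, not any of the algorithm-specific parameter sets derived in the study.

    ```python
    import numpy as np

    def p_poisson(d, d50, gamma):
        """Kallman-type Poisson dose-response for uniform irradiation."""
        return 2.0 ** (-np.exp(np.e * gamma * (1.0 - d / d50)))

    def ntcp_relative_seriality(doses_gy, volumes, d50, gamma, s):
        """Relative seriality NTCP evaluated over a differential DVH."""
        v = np.asarray(volumes, dtype=float)
        v = v / v.sum()                                   # fractional volumes
        p = p_poisson(np.asarray(doses_gy, dtype=float), d50, gamma)
        return float((1.0 - np.prod((1.0 - p ** s) ** v)) ** (1.0 / s))

    # Made-up differential DVH and illustrative pneumonitis-like parameters
    doses = [5.0, 15.0, 25.0, 35.0, 45.0]
    vols = [30.0, 25.0, 20.0, 15.0, 10.0]
    print(round(ntcp_relative_seriality(doses, vols, d50=30.0, gamma=1.0, s=0.01), 3))
    ```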

  14. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakobi, Annika, E-mail: Annika.Jakobi@OncoRay.de; Bandurska-Luque, Anna; Department of Radiation Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Dresden

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.

  15. Seeking parsimony in hydrology and water resources technology

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, D.

    2009-04-01

    The principle of parsimony, also known as the principle of simplicity, the principle of economy and Ockham's razor, advises scientists to prefer the simplest theory among those that fit the data equally well. In this, it is an epistemic principle but reflects an ontological characterization that the universe is ultimately parsimonious. Is this principle useful and can it really be reconciled with, and implemented in, our modelling approaches to complex hydrological systems, whose elements and events are extraordinarily numerous, different and unique? The answer underlying the mainstream hydrological research of the last two decades seems to be negative. Hopes were invested in the power of computers that would enable faithful and detailed representation of the diverse system elements and the hydrological processes, based on merely "first principles" and resulting in "physically-based" models that tend to approach in complexity the real world systems. Today the account of such research endeavour does not seem positive, as it has not improved model predictive capacity or process comprehension. A return to parsimonious modelling again seems to be the promising route. The experience from recent research and from comparisons of parsimonious and complicated models indicates that the former can facilitate insight and comprehension, improve accuracy and predictive capacity, and increase efficiency. In addition - and despite the aspiration that "physically based" models will have lower data requirements and even ultimately become "data-free" - parsimonious models require fewer data to achieve the same accuracy as more complicated models. Naturally, the concepts that reconcile the simplicity of parsimonious models with the complexity of hydrological systems are probability theory and statistics. Probability theory provides the theoretical basis for moving from a microscopic to a macroscopic view of phenomena, by mapping sets of diverse elements and events of hydrological systems to single numbers (a probability or an expected value), and statistics provides the empirical basis of summarizing data, making inference from them, and supporting decision making in water resource management. Unfortunately, the current state of the art in probability, statistics and their union, often called stochastics, is not fully satisfactory for the needs of modelling of hydrological and water resource systems. A first problem is that stochastic modelling has traditionally relied on classical statistics, which is based on the independent "coin-tossing" prototype, rather than on the study of real-world systems whose behaviour is very different from the classical prototype. A second problem is that the stochastic models (particularly the multivariate ones) are often not parsimonious themselves. Therefore, substantial advancement of stochastics is necessary in a new paradigm of parsimonious hydrological modelling. 
These ideas are illustrated using several examples, namely: (a) hydrological modelling of a karst system in Bosnia and Herzegovina using three different approaches ranging from parsimonious to detailed "physically-based"; (b) parsimonious modelling of a peculiar modified catchment in Greece; (c) a stochastic approach that can replace parameter-excessive ARMA-type models with a generalized algorithm that produces any shape of autocorrelation function (consistent with the accuracy provided by the data) using a couple of parameters; (d) a multivariate stochastic approach which replaces a huge number of parameters estimated from data with coefficients estimated by the principle of maximum entropy; and (e) a parsimonious approach for decision making in multi-reservoir systems using a handful of parameters instead of thousands of decision variables.

  16. The estimation of lower refractivity uncertainty from radar sea clutter using the Bayesian—MCMC method

    NASA Astrophysics Data System (ADS)

    Sheng, Zheng

    2013-02-01

    The estimation of lower atmospheric refractivity from radar sea clutter (RFC) is a complicated nonlinear optimization problem. This paper deals with the RFC problem in a Bayesian framework. It uses the unbiased Markov Chain Monte Carlo (MCMC) sampling technique, which can provide accurate posterior probability distributions of the estimated refractivity parameters by using an electromagnetic split-step fast Fourier transform terrain parabolic equation propagation model within a Bayesian inversion framework. In contrast to a global optimization algorithm, the Bayesian—MCMC approach can obtain not only approximate solutions, but also the probability distributions of the solutions, that is, an uncertainty analysis of the solutions. The Bayesian—MCMC algorithm is applied to both simulated and real radar sea-clutter data. The reference data are the assumed simulation data and refractivity profiles obtained using a helicopter. The inversion algorithm is assessed (i) by comparing the estimated refractivity profiles with the assumed simulation and the helicopter sounding data, and (ii) by examining the one-dimensional (1D) and two-dimensional (2D) posterior probability distributions of the solutions.
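
    The MCMC machinery referred to above is generic; the sketch below shows a bare-bones random-walk Metropolis sampler producing posterior samples for a two-parameter problem. The forward model and Gaussian likelihood are toy stand-ins for the split-step Fourier parabolic-equation clutter model used in the paper, and all parameter values are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def forward_model(theta):
        """Toy stand-in for the parabolic-equation clutter model: maps two
        refractivity-like parameters to a predicted clutter curve."""
        x = np.linspace(0.0, 1.0, 50)
        return theta[0] + theta[1] * x ** 2

    def log_likelihood(theta, data, sigma=0.5):
        resid = data - forward_model(theta)
        return -0.5 * np.sum(resid ** 2) / sigma ** 2

    def metropolis(data, n_samples=5000, step=0.1):
        theta = np.zeros(2)                              # arbitrary starting point
        logp = log_likelihood(theta, data)
        chain = np.empty((n_samples, 2))
        for i in range(n_samples):
            proposal = theta + step * rng.standard_normal(2)
            logp_new = log_likelihood(proposal, data)
            if np.log(rng.uniform()) < logp_new - logp:  # accept/reject step
                theta, logp = proposal, logp_new
            chain[i] = theta
        return chain                                     # samples approximate the posterior

    data = forward_model([1.0, 3.0]) + 0.5 * rng.standard_normal(50)
    samples = metropolis(data)
    print(samples[1000:].mean(axis=0))                   # posterior means after burn-in
    ```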

  17. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.

  18. Predicting redox conditions in groundwater at a regional scale

    USGS Publications Warehouse

    Tesoriero, Anthony J.; Terziotti, Silvia; Abrams, Daniel B.

    2015-01-01

    Defining the oxic-suboxic interface is often critical for determining pathways for nitrate transport in groundwater and to streams at the local scale. Defining this interface on a regional scale is complicated by the spatial variability of reaction rates. The probability of oxic groundwater in the Chesapeake Bay watershed was predicted by relating dissolved O2 concentrations in groundwater samples to indicators of residence time and/or electron donor availability using logistic regression. Variables that describe surficial geology, position in the flow system, and soil drainage were important predictors of oxic water. The probability of encountering oxic groundwater at a 30 m depth and the depth to the bottom of the oxic layer were predicted for the Chesapeake Bay watershed. The influence of depth to the bottom of the oxic layer on stream nitrate concentrations and time lags (i.e., time period between land application of nitrogen and its effect on streams) are illustrated using model simulations for hypothetical basins. Regional maps of the probability of oxic groundwater should prove useful as indicators of groundwater susceptibility and stream susceptibility to contaminant sources derived from groundwater.

  19. An introduction of a new stochastic tropical cyclone model for Japan area

    NASA Astrophysics Data System (ADS)

    Suzuki, K.; Nakano, S.; Ueno, G.; Mori, N.; Nakajo, S.

    2015-12-01

    Extreme events such as tropical cyclones (TCs), downpours and floods have had, and will continue to have, a huge influence on human life. In particular, the change in the risks they pose to human life under the future climate is a concern for governments and researchers. Our aim is to estimate the probabilities of the frequencies of TCs that could strike Japan under the future climate calculated by GCMs. Carrying out this task requires a suitable rare-event sampling method to find TCs that make landfall on large cities in Japan, as well as a sufficient number of reproduced TCs for the calculation of their probabilities. The model for TC reproduction is designed with three parts following the TC life cycle: formation, maturity and decay. The maturity stage is not treated with physical equations, because the maturity process is too complicated to express as a stochastic model; a TC intensity model takes the place of this physical part. Several stochastic TC models have been developed for different purposes and problems. Our model is developed for the establishment of a rare-event sampling method. Here, comparisons of the behavior of TC tracks among several stochastic TC models are discussed using Best Track data provided by the Japan Meteorological Agency and MRI-AGCM data for the present climate.

  20. Advanced reliability methods for structural evaluation

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Wu, Y.-T.

    1985-01-01

    Fast probability integration (FPI) methods, which can yield approximate solutions to such general structural reliability problems as the computation of the probabilities of complicated functions of random variables, are known to require one-tenth the computer time of Monte Carlo methods for a probability level of 0.001; lower probabilities yield even more dramatic differences. A strategy is presented in which a computer routine is run k times with selected perturbed values of the variables to obtain k solutions for a response variable Y. An approximating polynomial is fit to the k 'data' sets, and FPI methods are employed for this explicit form.
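
    A minimal sketch of the strategy described above follows: run the expensive routine k times at perturbed input values, fit an explicit polynomial surrogate to the k responses, and then evaluate an exceedance probability on the cheap explicit form. For brevity the last step here uses plain sampling on the surrogate, where the paper would apply FPI instead; the response function, input distributions and threshold are all made-up.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_response(x1, x2):
        """Stand-in for the costly structural-analysis routine."""
        return 3.0 + 1.5 * x1 - 0.8 * x2 + 0.3 * x1 * x2

    # k runs with perturbed values of the two random variables
    k = 9
    x1_pts = rng.normal(1.0, 0.2, k)
    x2_pts = rng.normal(2.0, 0.3, k)
    y_pts = expensive_response(x1_pts, x2_pts)

    # Fit an explicit surrogate  Y ~ c0 + c1*x1 + c2*x2 + c3*x1*x2
    A = np.column_stack([np.ones(k), x1_pts, x2_pts, x1_pts * x2_pts])
    coef, *_ = np.linalg.lstsq(A, y_pts, rcond=None)

    def surrogate(x1, x2):
        return coef[0] + coef[1] * x1 + coef[2] * x2 + coef[3] * x1 * x2

    # Exceedance probability evaluated on the cheap explicit form
    # (the paper would apply FPI here instead of sampling).
    x1_mc = rng.normal(1.0, 0.2, 200_000)
    x2_mc = rng.normal(2.0, 0.3, 200_000)
    print(np.mean(surrogate(x1_mc, x2_mc) > 4.5))
    ```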

  1. The effect of 6 and 15 MV on intensity-modulated radiation therapy prostate cancer treatment: plan evaluation, tumour control probability and normal tissue complication probability analysis, and the theoretical risk of secondary induced malignancies

    PubMed Central

    Hussein, M; Aldridge, S; Guerrero Urbano, T; Nisbet, A

    2012-01-01

    Objective The aim of this study was to investigate the effect of 6 and 15-MV photon energies on intensity-modulated radiation therapy (IMRT) prostate cancer treatment plan outcome and to compare the theoretical risks of secondary induced malignancies. Methods Separate prostate cancer IMRT plans were prepared for 6 and 15-MV beams. Organ-equivalent doses were obtained through thermoluminescent dosemeter measurements in an anthropomorphic Aldersen radiation therapy human phantom. The neutron dose contribution at 15 MV was measured using polyallyl-diglycol-carbonate neutron track etch detectors. Risk coefficients from the International Commission on Radiological Protection Report 103 were used to compare the risk of fatal secondary induced malignancies in out-of-field organs and tissues for 6 and 15 MV. For the bladder and the rectum, a comparative evaluation of the risk using three separate models was carried out. Dose–volume parameters for the rectum, bladder and prostate planning target volume were evaluated, as well as normal tissue complication probability (NTCP) and tumour control probability calculations. Results There is a small increased theoretical risk of developing a fatal cancer from 6 MV compared with 15 MV, taking into account all the organs. Dose–volume parameters for the rectum and bladder show that 15 MV results in better volume sparing in the regions below 70 Gy, but the volume exposed increases slightly beyond this in comparison with 6 MV, resulting in a higher NTCP for the rectum of 3.6% vs 3.0% (p=0.166). Conclusion The choice to treat using IMRT at 15 MV should not be excluded, but should be based on risk vs benefit while considering the age and life expectancy of the patient together with the relative risk of radiation-induced cancer and NTCPs. PMID:22010028

  2. The role of IL-1 gene polymorphisms (IL1A, IL1B, and IL1RN) as a risk factor in unsuccessful implants retaining overdentures.

    PubMed

    Sampaio Fernandes, Margarida; Vaz, Paula; Braga, Ana Cristina; Sampaio Fernandes, João Carlos; Figueiral, Maria Helena

    2017-10-01

    Implant-supported overdentures are an alternative, predictable rehabilitation method that has a high impact on improving the patient's quality of life. However, some biological complications may interfere with the maintenance and survival of these overdenture implants. The goal of this article was to assess the factors that affect peri-implant success, through a hypothetical prediction model for biological complications of implant overdentures. A retrospective observational prevalence study was conducted in 58 edentulous Caucasian patients rehabilitated with implant overdentures. A total of 229 implants were included in the study. Anamnestic, clinical, and implant-related parameters were collected and recorded in a single database. "Patient" was chosen as the unit of analysis, and a complete screening protocol was established. The analytical study of the data included assessing the odds ratio, concerning the presence or absence of a particular risk factor, by using binary logistic regression modeling. Probability values (p values) less than 0.05 were considered statistically significant. The prediction model included the following variables: mean probing depth, metal exposure, IL1B_allele2, maxillary edentulousness, and Fusobacterium nucleatum. F. nucleatum showed a significant association with the outcome. Introducing a negative coefficient appeared to prevent complications or even boost the biological defense when associated with other factors. The prediction model developed in this study could serve as a basis for further improved models that would assist clinicians in the daily diagnosis and treatment planning practice of oral rehabilitation with implant overdentures. Copyright © 2017 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  3. Risk Matrix for Prediction of Disease Progression in a Referral Cohort of Patients with Crohn's Disease.

    PubMed

    Lakatos, Peter L; Sipeki, Nora; Kovacs, Gyorgy; Palyu, Eszter; Norman, Gary L; Shums, Zakera; Golovics, Petra A; Lovasz, Barbara D; Antal-Szalmas, Peter; Papp, Maria

    2015-10-01

    Early identification of patients with Crohn's disease (CD) at risk of subsequent complications is essential for adapting the treatment strategy. We aimed to develop a prediction model including clinical and serological markers for assessing the probability of developing advanced disease in a prospective referral CD cohort. Two hundred and seventy-one consecutive CD patients (42.4% males, median follow-up 108 months) were included and followed up prospectively. Anti-Saccharomyces cerevisiae antibodies (ASCA IgA/IgG) were determined by enzyme-linked immunosorbent assay. The final analysis was limited to patients with inflammatory disease behaviour at diagnosis. The final definition of advanced disease outcome was having intestinal resection or disease behaviour progression. Antibody (ASCA IgA and/or IgG) status, disease location and need for early azathioprine were included in a 3-, 5- and 7-year prediction matrix. The probability of advanced disease after 5 years varied from 6.2 to 55% depending on the combination of predictors. Similar findings were obtained in Kaplan-Meier analysis; the combination of ASCA, location and early use of azathioprine was associated with the probability of developing advanced disease (p < 0.001, log rank test). Our prediction models identified substantial differences in the probability of developing advanced disease in the early disease course of CD. Markers identified in this referral cohort were different from those previously published in a population-based cohort, suggesting that different prediction models should be used in the referral setting. Copyright © 2015 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. A Procedure for Deriving Formulas to Convert Transition Rates to Probabilities for Multistate Markov Models.

    PubMed

    Jones, Edmund; Epstein, David; García-Mochón, Leticia

    2017-10-01

    For health-economic analyses that use multistate Markov models, it is often necessary to convert from transition rates to transition probabilities, and for probabilistic sensitivity analysis and other purposes it is useful to have explicit algebraic formulas for these conversions, to avoid having to resort to numerical methods. However, if there are four or more states then the formulas can be extremely complicated. These calculations can be made using packages such as R, but many analysts and other stakeholders still prefer to use spreadsheets for these decision models. We describe a procedure for deriving formulas that use intermediate variables so that each individual formula is reasonably simple. Once the formulas have been derived, the calculations can be performed in Excel or similar software. The procedure is illustrated by several examples and we discuss how to use a computer algebra system to assist with it. The procedure works in a wide variety of scenarios but cannot be employed when there are several backward transitions and the characteristic equation has no algebraic solution, or when the eigenvalues of the transition rate matrix are very close to each other.
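
    As a numerical cross-check of such conversions, the rate-to-probability mapping for one cycle can be computed directly as the matrix exponential of the transition-rate matrix, P = exp(Qt). The sketch below uses a hypothetical three-state (well/sick/dead) rate matrix; it complements, rather than reproduces, the spreadsheet-friendly algebraic formulas derived in the paper.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical annual transition rates for a 3-state (Well/Sick/Dead) model;
    # each row of the rate matrix Q sums to zero and Dead is absorbing.
    q_ws, q_wd, q_sd = 0.20, 0.02, 0.30
    Q = np.array([
        [-(q_ws + q_wd), q_ws, q_wd],   # from Well
        [0.0, -q_sd, q_sd],             # from Sick
        [0.0, 0.0, 0.0],                # from Dead
    ])

    cycle_length_years = 1.0
    P = expm(Q * cycle_length_years)    # per-cycle transition-probability matrix
    print(np.round(P, 4))
    print(P.sum(axis=1))                # each row sums to 1
    ```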

  5. Elephantiasis Nostras Verrucosa (ENV): a complication of congestive heart failure and obesity.

    PubMed

    Baird, Drew; Bode, David; Akers, Troy; Deyoung, Zachariah

    2010-01-01

    Congestive heart failure (CHF) and obesity are common medical conditions that have many complications and an increasing incidence in the United States. Presented here is a case of a disfiguring skin condition that visually highlights the dermatologic consequences of poorly controlled CHF and obesity. This condition will probably become more common as CHF and obesity increase in the US.

  6. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houweling, Antonetta C., E-mail: A.Houweling@umcutrecht.n; Philippens, Marielle E.P.; Dijkema, Tim

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD{sub 50} in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.
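
    The goodness-of-fit and ranking criteria mentioned above (deviance and AIC) are straightforward to compute once each model outputs a complication probability per patient. Below is a minimal sketch; the binary outcomes and the predicted probabilities for the two hypothetical models are made-up numbers, not the study's data.

    ```python
    import numpy as np

    def deviance(y, p):
        """Binomial deviance (-2 log-likelihood) of binary outcomes y given
        model-predicted complication probabilities p."""
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        return -2.0 * np.sum(y * np.log(p) + (1 - y) * np.log(1.0 - p))

    def aic(y, p, n_params):
        """Akaike's information criterion: deviance penalized by parameter count."""
        return deviance(y, p) + 2.0 * n_params

    # Made-up binary outcomes and predictions from two hypothetical NTCP models
    y = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
    p_mean_dose = np.array([0.1, 0.2, 0.7, 0.2, 0.6, 0.8, 0.3, 0.1, 0.5, 0.2])
    p_lkb = np.array([0.2, 0.3, 0.6, 0.3, 0.5, 0.7, 0.4, 0.2, 0.5, 0.3])

    print("mean-dose model AIC:", round(aic(y, p_mean_dose, n_params=2), 2))
    print("LKB model AIC:", round(aic(y, p_lkb, n_params=3), 2))
    # As in the study, the model with the lowest AIC is ranked best.
    ```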

  7. [Obesity and heart].

    PubMed

    Svačina, Štěpán

    2014-12-01

    Cardiovascular complications are traditionally considered an important consequence of obesity. Obesity itself is probably not a direct cause of atherosclerosis or coronary heart disease; these may occur indirectly through the metabolic complications of obesity, especially diabetes and the metabolic syndrome. However, the thrombogenic potential of obesity contributes to embolism and the development of atherosclerosis. In cardiology, the phenomenon of the obesity paradox is well known, whereby obese patients have a better prognosis than thin ones; this is the case in heart failure and some other cardiovascular diseases. Recently, a new concept has emerged of myokines - hormones from muscle tissue that have extensive protective effects on the organism and probably on the heart. Whether the heart is a source of myokines is uncertain. Epicardial and pericardial fatty tissue, however, is of undoubted importance. The epicardial fatty tissue has mainly protective effects on the myocardium, although it may also produce factors of inflammation affecting the myocardium. The relationship between the amount of epicardial fatty tissue and coronary heart disease is rather pathogenic. Currently, it is certain that obesity brings more metabolic and cancer complications than cardiovascular ones, and the precise pathogenic or protective contribution of fatty tissue in cardiology requires further research. Nevertheless, the conclusion is that the adipose tissue of the organism and around the heart may in some circumstances be beneficial.

  8. A comparison of free autologous breast reconstruction with and without the use of laser-assisted indocyanine green angiography: a cost-effectiveness analysis.

    PubMed

    Chatterjee, Abhishek; Krishnan, Naveen M; Van Vliet, Michael M; Powell, Stephen G; Rosen, Joseph M; Ridgway, Emily B

    2013-05-01

    Laser-assisted indocyanine green angiography is a U.S. Food and Drug Administration-approved technology used to assess tissue viability and perfusion. Its use in plastic and reconstructive surgery to assess flap perfusion in autologous breast reconstruction is relatively new. There have been no previous studies evaluating the cost-effectiveness of this new technology compared with the current practice of clinical judgment in evaluating tissue perfusion and viability in free autologous breast reconstruction in patients who have undergone mastectomy. A comprehensive literature review was performed to identify the complication rate of the most common complications with and without laser-assisted indocyanine green angiography in free autologous breast reconstruction after mastectomy. These probabilities were combined with Medicare Current Procedural Terminology provider reimbursement codes (cost) and utility estimates for common complications from a survey of 10 plastic surgeons to fit into a decision model to evaluate the cost-effectiveness of laser-assisted indocyanine green angiography. The decision model revealed a baseline cost difference of $773.66 and a 0.22 difference in the quality-adjusted life-years, yielding an incremental cost-utility ratio of $3516.64 per quality-adjusted life year favoring laser-assisted indocyanine green angiography. Sensitivity analysis showed that using laser-assisted indocyanine green angiography was more cost-effective when the complication rate without using laser-assisted indocyanine green angiography (clinical judgment alone) was 4 percent or higher. The authors' study demonstrates that laser-assisted indocyanine green angiography is a cost-effective technology under the most stringent acceptable thresholds when used in immediate free autologous breast reconstruction.
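
    The incremental cost-utility ratio reported above follows directly from the stated differences, dividing the incremental cost by the incremental quality-adjusted life-years; a one-line check:

    ```python
    # Incremental cost-utility ratio: incremental cost divided by incremental QALYs
    delta_cost_usd = 773.66
    delta_qaly = 0.22
    print(round(delta_cost_usd / delta_qaly, 2))   # ~3516.64 USD per QALY
    ```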

  9. Esophageal Dose Tolerance in Patients Treated With Stereotactic Body Radiation Therapy.

    PubMed

    Nuyttens, Joost J; Moiseenko, Vitali; McLaughlin, Mark; Jain, Sheena; Herbert, Scott; Grimm, Jimm

    2016-04-01

    Mediastinal critical structures such as the trachea, bronchus, esophagus, and heart are among the dose-limiting factors for stereotactic body radiation therapy (SBRT) to central lung lesions. The purpose of this study was to characterize the risk of esophagitis for patients treated with SBRT and to develop a statistical dose-response model to assess the relationship between the equivalent uniform dose, D10%, D5cc, D1cc, and Dmax to the esophagus and the risk of toxicity. Toxicity outcomes from a dose-escalation study of 56 patients who received CyberKnife treatment of 45-60 Gy in 3-7 fractions at the Erasmus MC-Daniel den Hoed Cancer Center were utilized to create the dose-response model for the esophagus. A total of 5 grade 2 esophageal complications were reported (Common Terminology Criteria for Adverse Events version 3.0); 4 complications were early effects and 1 complication was a late effect. All analyses were performed in terms of 5-fraction equivalent dosing. According to our study, a D1cc of 32.9 Gy and a Dmax of 43.4 Gy corresponded to a complication probability of 50% for grade 2 toxicity. In this series of 58 CyberKnife mediastinal lung cases, no grade 3 or higher esophageal toxicity occurred. Our estimates of esophageal toxicity are compared with the data in the literature. Further research needs to be performed to establish more reliable dose limits as longer follow-up and toxicity outcomes are reported in patients treated with SBRT for central lung lesions. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. [Clinical evaluation of heavy-particle radiotherapy using dose volume histogram (DVH)].

    PubMed

    Terahara, A; Nakano, T; Tsujii, H

    1998-01-01

    Radiotherapy with heavy particles such as protons and heavy charged particles is a promising modality for the treatment of localized malignant tumors because of its good dose distribution. A dose calculation and radiotherapy planning system, which is essential for this kind of treatment, has been developed in recent years. It has the capability to compute the dose volume histogram (DVH), which contains dose-volume information for the target volume and other volumes of interest. DVH is now commonly used to evaluate and compare dose distributions in radiotherapy with both photons and heavy particles, and it shows that a superior dose distribution is obtained in heavy-particle radiotherapy. DVH is also utilized for the evaluation of dose distributions in relation to clinical outcomes. In addition, models such as normal tissue complication probability (NTCP) and tumor control probability (TCP), which can be calculated from the DVH, have been proposed by several authors; they are applied to evaluate dose distributions themselves and to evaluate them in relation to clinical results. DVH is now a useful and important tool, but further studies are needed to use DVH and these models practically for the clinical evaluation of heavy-particle radiotherapy.

  11. EDNA: Expert fault digraph analysis using CLIPS

    NASA Technical Reports Server (NTRS)

    Dixit, Vishweshwar V.

    1990-01-01

    Traditionally, fault models are represented by trees. Recently, digraph models have been proposed (Sack). Digraph models closely imitate the real system dependencies and hence are easy to develop, validate and maintain. However, they can also contain directed cycles, and analysis algorithms are hard to find; available algorithms tend to be complicated and slow. On the other hand, tree analysis (VGRH, Tayl) is well understood and rooted in a vast research effort and analytical techniques. The tree analysis algorithms are sophisticated and orders of magnitude faster. Transformation of a (cyclic) digraph into trees (CLP, LP) is a viable approach to blend the advantages of the two representations. Neither the digraphs nor the trees provide the ability to handle heuristic knowledge. An expert system, to capture the engineering knowledge, is essential. We propose an approach here, namely expert network analysis, which combines the digraph representation and tree algorithms. The models are augmented by probabilistic and heuristic knowledge. CLIPS, an expert system shell from NASA-JSC, will be used to develop a tool. The technique provides the ability to handle probabilities and heuristic knowledge. Mixed analysis, with probabilities on some nodes, is possible. The tool provides a graphics interface for input, query, and update. With the combined approach, it is expected to be a valuable tool in the design process as well as in the capture of final design knowledge.

  12. Improvements in implant dentistry over the last decade: comparison of survival and complication rates in older and newer publications.

    PubMed

    Pjetursson, Bjarni E; Asgeirsson, Asgeir G; Zwahlen, Marcel; Sailer, Irena

    2014-01-01

    The objective of this systematic review was to assess and compare the survival and complication rates of implant-supported prostheses reported in studies published in the year 2000 and before with those reported in studies published after the year 2000. Three electronic searches complemented by manual searching were conducted to identify 139 prospective and retrospective studies on implant-supported prostheses. The included studies were divided into two groups: a group of 31 older studies published in the year 2000 or before, and a group of 108 newer studies published after the year 2000. Survival and complication rates were calculated using Poisson regression models, and multivariable robust Poisson regression was used to formally compare the outcomes of older and newer studies. The 5-year survival rate of implant-supported prostheses was significantly increased in newer studies compared with older studies. The overall survival rate increased from 93.5% to 97.1%. The survival rate for cemented prostheses increased from 95.2% to 97.9%; for screw-retained reconstructions, from 77.6% to 96.8%; for implant-supported single crowns, from 92.6% to 97.2%; and for implant-supported fixed dental prostheses (FDPs), from 93.5% to 96.4%. The incidence of esthetic complications decreased in more recent studies compared with older ones, but the incidence of biologic complications was similar. The results for technical complications were inconsistent. There was a significant reduction in abutment or screw loosening for implant-supported FDPs. On the other hand, the total number of technical complications and the incidence of fracture of the veneering material were significantly increased in the newer studies. The increased complication rate may be explained in part by minor complications being reported in more detail in the newer publications. The results of the present systematic review demonstrated a positive learning curve in implant dentistry, represented in higher survival rates and lower complication rates reported in more recent clinical studies. The incidence of esthetic, biologic, and technical complications, however, is still high. Hence, it is important to identify these complications and their etiology to make implant treatment even more predictable in the future.

  13. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Boughalia, A; Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-06-01

    The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of the generalized equivalent uniform dose (gEUD) and the predictive models of tumour control probability (TCP) and normal tissue complication probability (NTCP). 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy-oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and organ-at-risk (OAR) coverage was assessed using calculation of dose-volume histograms, gEUD, TCP and NTCP. For this purpose, an in-house software was developed and used. The standard deviations (1SDs) of the systematic set-up and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: ∑ = 0.63 ± (0.42) mm and σ = 3.75 ± (0.79) mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin was used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors showed increased values for the tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of the OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. The quantified set-up errors have a dosimetric and biological impact on the tumour and on the OARs. The in-house software developed using the concepts of the gEUD, TCP and NTCP biological models was successfully used in this study. It can also be used to optimize the treatment plans established for our patients. The gEUD, TCP and NTCP may be more suitable tools to assess treatment plans before treating the patients.
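
    The gEUD that underpins these comparisons is a power-law average over the dose-volume histogram. A minimal sketch follows; the DVH bins and the volume-effect parameter a are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def geud(doses_gy, rel_volumes, a):
        """Generalized equivalent uniform dose of a differential DVH.
        a = 1 gives the mean dose; large positive a emphasizes hot spots
        (serial organs); large negative a emphasizes cold spots (tumours)."""
        v = np.asarray(rel_volumes, dtype=float)
        v = v / v.sum()
        d = np.asarray(doses_gy, dtype=float)
        return float(np.sum(v * d ** a) ** (1.0 / a))

    # Illustrative DVH bins (not the study's data)
    doses = [60.0, 65.0, 68.0, 70.0]
    vols = [5.0, 15.0, 40.0, 40.0]
    print(round(geud(doses, vols, a=-10), 2))   # tumour-like: pulled toward cold spots
    print(round(geud(doses, vols, a=1), 2))     # mean dose
    print(round(geud(doses, vols, a=8), 2))     # serial-organ-like: pulled toward hot spots
    ```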

  14. Smoking, Labor, & Delivery: It's Complicated

    Cancer.gov

    You probably have mixed feelings about going into labor. On one hand, bringing a new life into the world is really exciting. On the other, it can be really scary to have a baby, especially if this is your first child. Unfortunately, it can be even scarier if you smoke. Research shows that smoking during pregnancy can lead to serious complications for you and your baby during labor and delivery.

  15. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dean, Jamie A., E-mail: jamie.dean@icr.ac.uk; Wong, Kee H.; Gay, Hiram

    Purpose: Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue–sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. Methods and Materials: FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares–logistic regression [FPLS-LR] and functional principal component–logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate–response associations, assessed using bootstrapping. Results: The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/−0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/−0.96, 0.79/−0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. Conclusions: FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling.

  16. Functional Data Analysis Applied to Modeling of Severe Acute Mucositis and Dysphagia Resulting From Head and Neck Radiation Therapy.

    PubMed

    Dean, Jamie A; Wong, Kee H; Gay, Hiram; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Oh, Jung Hun; Apte, Aditya; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Deasy, Joseph O; Nutting, Christopher M; Gulliford, Sarah L

    2016-11-15

    Current normal tissue complication probability modeling using logistic regression suffers from bias and high uncertainty in the presence of highly correlated radiation therapy (RT) dose data. This hinders robust estimates of dose-response associations and, hence, optimal normal tissue-sparing strategies from being elucidated. Using functional data analysis (FDA) to reduce the dimensionality of the dose data could overcome this limitation. FDA was applied to modeling of severe acute mucositis and dysphagia resulting from head and neck RT. Functional partial least squares regression (FPLS) and functional principal component analysis were used for dimensionality reduction of the dose-volume histogram data. The reduced dose data were input into functional logistic regression models (functional partial least squares-logistic regression [FPLS-LR] and functional principal component-logistic regression [FPC-LR]) along with clinical data. This approach was compared with penalized logistic regression (PLR) in terms of predictive performance and the significance of treatment covariate-response associations, assessed using bootstrapping. The area under the receiver operating characteristic curve for the PLR, FPC-LR, and FPLS-LR models was 0.65, 0.69, and 0.67, respectively, for mucositis (internal validation) and 0.81, 0.83, and 0.83, respectively, for dysphagia (external validation). The calibration slopes/intercepts for the PLR, FPC-LR, and FPLS-LR models were 1.6/-0.67, 0.45/0.47, and 0.40/0.49, respectively, for mucositis (internal validation) and 2.5/-0.96, 0.79/-0.04, and 0.79/0.00, respectively, for dysphagia (external validation). The bootstrapped odds ratios indicated significant associations between RT dose and severe toxicity in the mucositis and dysphagia FDA models. Cisplatin was significantly associated with severe dysphagia in the FDA models. None of the covariates was significantly associated with severe toxicity in the PLR models. Dose levels greater than approximately 1.0 Gy/fraction were most strongly associated with severe acute mucositis and dysphagia in the FDA models. FPLS and functional principal component analysis marginally improved predictive performance compared with PLR and provided robust dose-response associations. FDA is recommended for use in normal tissue complication probability modeling. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
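
    The dimensionality-reduction idea can be approximated with ordinary principal component analysis on discretized DVH curves followed by logistic regression. The sketch below is a simplification of the functional PCA / FPLS approach described in the paper, using synthetic DVHs and outcomes; hypothetical clinical covariates could be appended to the component scores in the same way.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)

    # Synthetic cumulative DVHs: 100 patients x 60 dose bins (fraction of volume
    # receiving at least each dose level), with a binary toxicity outcome that
    # depends loosely on the mid-dose region of the curve.
    n_patients, n_bins = 100, 60
    dvh = np.sort(rng.uniform(0.0, 1.0, (n_patients, n_bins)), axis=1)[:, ::-1]
    risk = dvh[:, 20:40].mean(axis=1)
    toxicity = (rng.uniform(size=n_patients) < risk).astype(int)

    # Reduce the highly correlated DVH bins to a few principal-component scores,
    # then fit a logistic model on the scores.
    model = make_pipeline(PCA(n_components=3), LogisticRegression())
    model.fit(dvh, toxicity)
    print("training accuracy:", round(model.score(dvh, toxicity), 2))
    ```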

  17. Monte Carlo role in radiobiological modelling of radiotherapy outcomes

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Pater, Piotr; Seuntjens, Jan

    2012-06-01

    Radiobiological models are essential components of modern radiotherapy. They are increasingly applied to optimize and evaluate the quality of different treatment planning modalities. They are frequently used in designing new radiotherapy clinical trials by estimating the expected therapeutic ratio of new protocols. In radiobiology, the therapeutic ratio is estimated from the expected gain in tumour control probability (TCP) to the risk of normal tissue complication probability (NTCP). However, estimates of TCP/NTCP are currently based on the deterministic and simplistic linear-quadratic formalism with limited prediction power when applied prospectively. Given the complex and stochastic nature of the physical, chemical and biological interactions associated with spatial and temporal radiation induced effects in living tissues, it is conjectured that methods based on Monte Carlo (MC) analysis may provide better estimates of TCP/NTCP for radiotherapy treatment planning and trial design. Indeed, over the past few decades, methods based on MC have demonstrated superior performance for accurate simulation of radiation transport, tumour growth and particle track structures; however, successful application of modelling radiobiological response and outcomes in radiotherapy is still hampered with several challenges. In this review, we provide an overview of some of the main techniques used in radiobiological modelling for radiotherapy, with focus on the MC role as a promising computational vehicle. We highlight the current challenges, issues and future potentials of the MC approach towards a comprehensive systems-based framework in radiobiological modelling for radiotherapy.

  18. Radiobiological modeling of two stereotactic body radiotherapy schedules in patients with stage I peripheral non-small cell lung cancer.

    PubMed

    Huang, Bao-Tian; Lin, Zhu; Lin, Pei-Xian; Lu, Jia-Yang; Chen, Chuang-Zhen

    2016-06-28

    This study aims to compare the radiobiological response of two stereotactic body radiotherapy (SBRT) schedules for patients with stage I peripheral non-small cell lung cancer (NSCLC) using radiobiological modeling methods. Volumetric modulated arc therapy (VMAT)-based SBRT plans were designed using two dose schedules of 1 × 34 Gy (34 Gy in 1 fraction) and 4 × 12 Gy (48 Gy in 4 fractions) for 19 patients diagnosed with primary stage I NSCLC. Dose to the gross target volume (GTV), planning target volume (PTV), lung and chest wall (CW) were converted to biologically equivalent dose in 2 Gy fraction (EQD2) for comparison. Five different radiobiological models were employed to predict the tumor control probability (TCP) value. Three additional models were utilized to estimate the normal tissue complication probability (NTCP) value for the lung and the modified equivalent uniform dose (mEUD) value to the CW. Our result indicates that the 1 × 34 Gy dose schedule provided a higher EQD2 dose to the tumor, lung and CW. Radiobiological modeling revealed that the TCP value for the tumor, NTCP value for the lung and mEUD value for the CW were 7.4% (in absolute value), 7.2% (in absolute value) and 71.8% (in relative value) higher on average, respectively, using the 1 × 34 Gy dose schedule.
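
    The EQD2 comparison between the two schedules follows from the standard LQ conversion, EQD2 = nd(d + α/β)/(2 + α/β). The worked check below uses illustrative α/β values (10 Gy for tumour, 3 Gy for late-responding normal tissue), which are assumptions rather than the settings used in the paper.

    ```python
    def eqd2(n_fractions, dose_per_fraction_gy, alpha_beta_gy):
        """Equivalent dose in 2-Gy fractions from the linear-quadratic model."""
        total = n_fractions * dose_per_fraction_gy
        return total * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

    for label, (n, d) in {"1 x 34 Gy": (1, 34.0), "4 x 12 Gy": (4, 12.0)}.items():
        print(label,
              "tumour EQD2 =", round(eqd2(n, d, 10.0), 1), "Gy;",
              "normal-tissue EQD2 =", round(eqd2(n, d, 3.0), 1), "Gy")
    ```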

  19. Multivariable normal-tissue complication modeling of acute esophageal toxicity in advanced stage non-small cell lung cancer patients treated with intensity-modulated (chemo-)radiotherapy.

    PubMed

    Wijsman, Robin; Dankers, Frank; Troost, Esther G C; Hoffmann, Aswin L; van der Heijden, Erik H F M; de Geus-Oei, Lioe-Fee; Bussink, Johan

    2015-10-01

    The majority of normal-tissue complication probability (NTCP) models for acute esophageal toxicity (AET) in advanced stage non-small cell lung cancer (AS-NSCLC) patients treated with (chemo-)radiotherapy are based on three-dimensional conformal radiotherapy (3D-CRT). Due to distinct dosimetric characteristics of intensity-modulated radiation therapy (IMRT), 3D-CRT based models need revision. We established a multivariable NTCP model for AET in 149 AS-NSCLC patients undergoing IMRT. An established model selection procedure was used to develop an NTCP model for Grade ⩾2 AET (53 patients) including clinical and esophageal dose-volume histogram parameters. The NTCP model predicted an increased risk of Grade ⩾2 AET in case of: concurrent chemoradiotherapy (CCR) [adjusted odds ratio (OR) 14.08, 95% confidence interval (CI) 4.70-42.19; p<0.001], increasing mean esophageal dose [Dmean; OR 1.12 per Gy increase, 95% CI 1.06-1.19; p<0.001], female patients (OR 3.33, 95% CI 1.36-8.17; p=0.008), and ⩾cT3 (OR 2.7, 95% CI 1.12-6.50; p=0.026). The AUC was 0.82 and the model showed good calibration. A multivariable NTCP model including CCR, Dmean, clinical tumor stage and gender predicts Grade ⩾2 AET after IMRT for AS-NSCLC. Prior to clinical introduction, the model needs validation in an independent patient cohort. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
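
    The reported multivariable model is an ordinary logistic NTCP, so the adjusted odds ratios translate directly into log-odds coefficients. The sketch below illustrates this; the intercept is a hypothetical placeholder (the abstract does not report it), so the printed probability is illustrative only.

    ```python
    import numpy as np

    # Adjusted odds ratios reported in the abstract, expressed as log-odds coefficients
    beta = {
        "concurrent_chemo": np.log(14.08),
        "mean_esophageal_dose_per_gy": np.log(1.12),
        "female": np.log(3.33),
        "cT3_or_higher": np.log(2.7),
    }
    beta0 = -6.0   # hypothetical intercept; the abstract does not report it

    def ntcp_grade2_aet(ccr, dmean_gy, female, ct3plus):
        """Illustrative probability of Grade >=2 acute esophageal toxicity."""
        lp = (beta0
              + beta["concurrent_chemo"] * ccr
              + beta["mean_esophageal_dose_per_gy"] * dmean_gy
              + beta["female"] * female
              + beta["cT3_or_higher"] * ct3plus)
        return 1.0 / (1.0 + np.exp(-lp))

    # Example: female patient, concurrent chemoradiotherapy, >=cT3, Dmean = 25 Gy
    print(round(ntcp_grade2_aet(ccr=1, dmean_gy=25.0, female=1, ct3plus=1), 2))
    ```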

  20. Visceroptosis and the Ehlers-Danlos Syndrome.

    PubMed

    Kucera, Stephen; Sullivan, Stephen N

    2017-11-08

    The case of a patient with visceroptosis and Ehlers-Danlos syndrome hypermobility type (EDS-HT) is reported here. The literature on this unusual but probably under-recognized complication is reviewed.

  1. Tracking Expected Improvements of Decadal Prediction in Climate Services

    NASA Astrophysics Data System (ADS)

    Suckling, E.; Thompson, E.; Smith, L. A.

    2013-12-01

    Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth System across a range of situations, situations for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively. It can also quantify their ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill and (iii) of artificial skill complicate each modelling approach, and are discussed. "Physics free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models. In weather forecasting this role is filled by the climatological distribution, and can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models. It would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Skill similar to that of empirical models does, however, suggest a clear quantification of limits, as a function of lead time, on the spatial and temporal scales at which decisions based on such model output are expected to prove maladaptive. Failing to clearly state such weaknesses of a given generation of simulation models, while clearly stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.

  2. The comparative risk of developing postoperative complications in patients with distal radius fractures following different treatment modalities

    PubMed Central

    Qiu, Wen-Jun; Li, Yi-Fan; Ji, Yun-Han; Xu, Wei; Zhu, Xiao-Dong; Tang, Xian-Zhong; Zhao, Huan-Li; Wang, Gui-Bin; Jia, Yue-Qing; Zhu, Shi-Cai; Zhang, Feng-Fang; Liu, Hong-Mei

    2015-01-01

    In this study, we performed a network meta-analysis to compare the outcomes of the seven most common surgical procedures used to fix distal radius fractures (DRF), including bridging external fixation, non-bridging external fixation, K-wire fixation, plaster fixation, dorsal plating, volar plating, and dorsal and volar plating. Published studies were retrieved through the PubMed, Embase and Cochrane Library databases. The database search terms used were the following keywords and MeSH terms: DRF, bridging external fixation, non-bridging external fixation, K-wire fixation, plaster fixation, dorsal plating, volar plating, and dorsal and volar plating. A network meta-analysis was performed to rank the probabilities of postoperative complication risks for the seven surgical modalities in DRF patients. This network meta-analysis included data obtained from a total of 19 RCTs. Our results revealed that, compared to DRF patients treated with bridging external fixation, marked differences in pin-track infection (PTI) rate were found in patients treated with plaster fixation, volar plating, and dorsal and volar plating. Cluster analysis showed that plaster fixation is associated with the lowest probability of postoperative complications in DRF patients. Plaster fixation is associated with the lowest risk of postoperative complications in DRF patients when compared to the six other common DRF surgical methods examined. PMID:26549312

  3. [Influence of Training of Orthopaedic Surgeons on Clinical Outcome after Total Hip Arthroplasty in a High Volume Endoprosthetic Centre].

    PubMed

    Osmanski-Zenk, Katrin; Finze, Susanne; Lenz, Robert; Bader, Rainer; Mittelmeier, Wolfram

    2018-06-26

    The study aims to evaluate whether the postoperative outcome and the probability of complications in patients with total hip arthroplasty differ significantly when surgeons in training, assisted by a high volume surgeon, are in charge compared with a highly experienced orthopaedic surgeon, within the context of a high volume hospital certified according to EndoCert. 192 patients with a primary hip arthroplasty were included. To assess the outcome, the Harris Hip Score, WOMAC, SF-36 and EuroQol-5D were surveyed pre- and 12 months postoperatively. As complications, we considered the quality indicators defined by EndoCert. We found significant improvements in the postoperative score values regardless of the qualification of the surgeon in charge, i.e. whether a high volume surgeon or a surgeon in training was responsible. If a surgeon in training is assisted by a highly experienced surgeon, the risk of complications does not increase, although the operating time was significantly longer. Both the surgeon in training and the arthroplasty patient benefit from implementing the EndoCert system, because the postoperative outcome and the complication probability are independent of the qualification of the operating orthopaedic surgeon performing total hip arthroplasty when assisted by an experienced surgeon. Georg Thieme Verlag KG Stuttgart · New York.

  4. Biological effects and equivalent doses in radiotherapy: A software solution

    PubMed Central

    Voyant, Cyril; Julian, Daniel; Roustit, Rudy; Biffi, Katia; Lantieri, Céline

    2013-01-01

    Background The limits of TDF (time, dose, and fractionation) and linear quadratic models have been known for a long time. Medical physicists and physicians are required to provide fast and reliable interpretations regarding delivered doses or any future prescriptions relating to treatment changes. Aim We, therefore, propose a calculation interface under the GNU license to be used for equivalent doses, biological doses, and normal tissue complication probability (Lyman model). Materials and methods The methodology used draws from several sources: the linear-quadratic-linear model of Astrahan, the repopulation effects of Dale, and the prediction of multi-fractionated treatments of Thames. Results and conclusions The results are obtained from an algorithm that minimizes an ad-hoc cost function, and then compared to an equivalent dose computed using standard calculators in seven French radiotherapy centers. PMID:24936319
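
    One of the quantities such a tool must compute is the biologically effective dose (BED) with a repopulation correction of the kind attributed above to Dale. The sketch below implements the standard form BED = nd(1 + d/(α/β)) − ln2·(T − Tk)/(α·Tpot); the parameter values (α, kick-off time Tk, potential doubling time Tpot) are illustrative assumptions, not the defaults of the published software.

```python
import math

def bed_with_repopulation(n, d, alpha_beta, alpha=0.35,
                          t_overall=None, t_kickoff=28.0, t_pot=3.0):
    """Biologically effective dose with a Dale-type repopulation correction.

    BED = n*d*(1 + d/(alpha/beta)) - ln(2)*(T - Tk)/(alpha*Tpot),
    applied only when the overall treatment time T exceeds the kick-off time Tk.
    Parameter values here are illustrative assumptions, not the paper's defaults.
    """
    bed = n * d * (1.0 + d / alpha_beta)
    if t_overall is not None and t_overall > t_kickoff:
        bed -= math.log(2.0) * (t_overall - t_kickoff) / (alpha * t_pot)
    return bed

# 35 x 2 Gy delivered over 46 days, tumour alpha/beta = 10 Gy
print(round(bed_with_repopulation(35, 2.0, 10.0, t_overall=46.0), 1), "Gy")
```
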

  5. Convex reformulation of biologically-based multi-criteria intensity-modulated radiation therapy optimization including fractionation effects

    NASA Astrophysics Data System (ADS)

    Hoffmann, Aswin L.; den Hertog, Dick; Siem, Alex Y. D.; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2008-11-01

    Finding fluence maps for intensity-modulated radiation therapy (IMRT) can be formulated as a multi-criteria optimization problem for which Pareto optimal treatment plans exist. To account for the dose-per-fraction effect of fractionated IMRT, it is desirable to exploit radiobiological treatment plan evaluation criteria based on the linear-quadratic (LQ) cell survival model as a means to balance the radiation benefits and risks in terms of biologic response. Unfortunately, the LQ-model-based radiobiological criteria are nonconvex functions, which make the optimization problem hard to solve. We apply the framework proposed by Romeijn et al (2004 Phys. Med. Biol. 49 1991-2013) to find transformations of LQ-model-based radiobiological functions and establish conditions under which transformed functions result in equivalent convex criteria that do not change the set of Pareto optimal treatment plans. The functions analysed are: the LQ-Poisson-based model for tumour control probability (TCP) with and without inter-patient heterogeneity in radiation sensitivity, the LQ-Poisson-based relative seriality s-model for normal tissue complication probability (NTCP), the equivalent uniform dose (EUD) under the LQ-Poisson model and the fractionation-corrected Probit-based model for NTCP according to Lyman, Kutcher and Burman. These functions differ from those analysed before in that they cannot be decomposed into elementary EUD or generalized-EUD functions. In addition, we show that applying increasing and concave transformations to the convexified functions is beneficial for the piecewise approximation of the Pareto efficient frontier.

  6. Follow-up of the original cohort with the Ahmed glaucoma valve implant.

    PubMed

    Topouzis, F; Coleman, A L; Choplin, N; Bethlem, M M; Hill, R; Yu, F; Panek, W C; Wilson, M R

    1999-08-01

    To study the long-term results of the Ahmed glaucoma valve implant in patients with complicated glaucoma in whom short-term results have been reported. In this multicenter study, we analyzed the long-term outcome of a cohort of 60 eyes from 60 patients in whom the Ahmed glaucoma valve was implanted. Failure was characterized by at least one of the following: intraocular pressure greater than 21 mm Hg at both of the last two visits, intraocular pressure less than 6 mm Hg at both of the last two visits, loss of light perception, additional glaucoma surgery, devastating complications, and removal or replacement of the Ahmed glaucoma valve implant. Devastating complications included chronic hypotony, retinal detachment, malignant glaucoma, endophthalmitis, and phthisis bulbi; we also report results that add corneal complications (corneal decompensation or edema, corneal graft failure) as defining a devastating complication. The mean follow-up time for the 60 eyes was 30.5 months (range, 2.1 to 63.5). When corneal complications were included in the definition of failure, 26 eyes (43%) were considered failures. Cumulative probabilities of success at 1, 2, 3, and 4 years were 76%, 68%, 54%, and 45%, respectively. When corneal complications were excluded from the definition of failure, 13 eyes (21.5%) were considered failures. Cumulative probabilities of success at 1, 2, 3, and 4 years were 87%, 82%, 76%, and 76%, respectively. Most of the failures after 12 months of postoperative follow-up were because of corneal complications. The long-term performance of the Ahmed glaucoma valve implant is comparable to that of other drainage devices. More than 12 months after the implantation of the Ahmed glaucoma valve implant, the most frequent adverse outcome was corneal decompensation or corneal graft failure. These corneal problems may be secondary to the type of eyes that receive drainage devices or to the drainage device itself. Further investigation is needed to identify the reasons that corneal problems follow drainage device implantation.

  7. Theoretical Aspects of the Patterns Recognition Statistical Theory Used for Developing the Diagnosis Algorithms for Complicated Technical Systems

    NASA Astrophysics Data System (ADS)

    Obozov, A. A.; Serpik, I. N.; Mihalchenko, G. S.; Fedyaeva, G. A.

    2017-01-01

    In this article, the application of pattern recognition (a relatively young area of engineering cybernetics) to the analysis of complicated technical systems is examined. It is shown that a statistical approach can be the most effective for situations that are hard to distinguish. The different recognition algorithms are based on the Bayes approach, which estimates the posterior probability of a given event and the associated error. The statistical approach to pattern recognition can be applied to the technical diagnosis of complicated systems, particularly high-powered marine diesel engines.
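
    The core of such a Bayesian diagnosis algorithm is the posterior computation itself. The sketch below shows the calculation for a hypothetical engine-fault example; the states, priors and likelihoods are invented placeholders, not values from the article.

```python
# Hypothetical prior fault probabilities for a diesel-engine diagnosis problem
priors = {"healthy": 0.90, "injector_fault": 0.07, "bearing_wear": 0.03}

# Hypothetical likelihoods p(observed symptom pattern | state)
likelihoods = {"healthy": 0.02, "injector_fault": 0.60, "bearing_wear": 0.25}

# Bayes rule: posterior(state) = prior * likelihood / evidence
evidence = sum(priors[s] * likelihoods[s] for s in priors)
posterior = {s: priors[s] * likelihoods[s] / evidence for s in priors}

decision = max(posterior, key=posterior.get)   # maximum a posteriori decision
print(posterior, "->", decision)
```
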

  8. Impact of signal scattering and parametric uncertainties on receiver operating characteristics

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Breton, Daniel J.; Hart, Carl R.; Pettit, Chris L.

    2017-05-01

    The receiver operating characteristic (ROC curve), which is a plot of the probability of detection as a function of the probability of false alarm, plays a key role in the classical analysis of detector performance. However, meaningful characterization of the ROC curve is challenging when practically important complications such as variations in source emissions, environmental impacts on the signal propagation, uncertainties in the sensor response, and multiple sources of interference are considered. In this paper, a relatively simple but realistic model for scattered signals is employed to explore how parametric uncertainties impact the ROC curve. In particular, we show that parametric uncertainties in the mean signal and noise power substantially raise the tails of the distributions; since receiver operation with a very low probability of false alarm and a high probability of detection is normally desired, these tails lead to severely degraded performance. Because full a priori knowledge of such parametric uncertainties is rarely available in practice, analyses must typically be based on a finite sample of environmental states, which only partially characterize the range of parameter variations. We show how this effect can lead to misleading assessments of system performance. For the cases considered, approximately 64 or more statistically independent samples of the uncertain parameters are needed to accurately predict the probabilities of detection and false alarm. A connection is also described between selection of suitable distributions for the uncertain parameters, and Bayesian adaptive methods for inferring the parameters.
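
    To make the effect of parametric uncertainty concrete, the sketch below computes Pd and Pfa for a simple energy detector in which both the scattered signal and the noise have exponentially distributed power (Rayleigh fading), while the mean powers themselves vary lognormally from realization to realization. This is a generic stand-in for the paper's scattering model, and the numbers (10 dB mean SNR, 6 dB spread) are chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def roc(mean_snr_db, sigma_db=0.0, n_samples=10000, thresholds=None):
    """Pd and Pfa for an energy detector with exponential signal and noise power,
    averaged over lognormal uncertainty (sigma_db) in the mean powers.
    A sketch under stated assumptions, not the paper's exact model."""
    if thresholds is None:
        thresholds = np.logspace(-1, 3, 200)
    noise = 10 ** (rng.normal(0.0, sigma_db, n_samples) / 10)                    # mean noise power
    signal = 10 ** ((mean_snr_db + rng.normal(0.0, sigma_db, n_samples)) / 10)   # mean signal power
    pfa = np.mean(np.exp(-thresholds[:, None] / noise), axis=1)
    pd = np.mean(np.exp(-thresholds[:, None] / (noise + signal)), axis=1)
    return pfa, pd

for sigma in (0.0, 6.0):   # no uncertainty vs. a 6 dB spread in the mean powers
    pfa, pd = roc(mean_snr_db=10.0, sigma_db=sigma)
    i = np.argmin(np.abs(pfa - 1e-3))        # operating point with Pfa near 1e-3
    print(f"sigma = {sigma} dB: Pd at Pfa ~ 1e-3 is {pd[i]:.3f}")
```

    Averaging over the uncertain parameters raises the tails of both distributions, which is exactly the low-Pfa regime where the degradation discussed above is most severe.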

  9. An assessment of PTV margin based on actual accumulated dose for prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Wen, Ning; Kumarasiri, Akila; Nurushev, Teamour; Burmeister, Jay; Xing, Lei; Liu, Dezhi; Glide-Hurst, Carri; Kim, Jinkoo; Zhong, Hualiang; Movsas, Benjamin; Chetty, Indrin J.

    2013-11-01

    The purpose of this work is to present the results of a margin reduction study involving dosimetric and radiobiologic assessment of cumulative dose distributions, computed using an image guided adaptive radiotherapy based framework. Eight prostate cancer patients, treated with 7-9, 6 MV, intensity modulated radiation therapy (IMRT) fields, were included in this study. The workflow consists of cone beam CT (CBCT) based localization, deformable image registration of the CBCT to simulation CT image datasets (SIM-CT), dose reconstruction and dose accumulation on the SIM-CT, and plan evaluation using radiobiological models. For each patient, three IMRT plans were generated with different margins applied to the CTV. The PTV margin for the original plan was 10 mm and 6 mm at the prostate/anterior rectal wall interface (10/6 mm) and was reduced to: (a) 5/3 mm, and (b) 3 mm uniformly. The average percent reductions in predicted tumor control probability (TCP) in the accumulated (actual) plans in comparison to the original plans over eight patients were 0.4%, 0.7% and 11.0% with the 10/6 mm, 5/3 mm and 3 mm uniform margins, respectively. The mean increase in predicted normal tissue complication probability (NTCP) for grades 2/3 rectal bleeding for the actual plans in comparison to the static plans with margins of 10/6, 5/3 and 3 mm was 3.5%, 2.8% and 2.4%, respectively. For the actual dose distributions, predicted NTCP for late rectal bleeding was reduced by 3.6% on average when the margin was reduced from 10/6 mm to 5/3 mm, and further reduced by 1.0% on average when the margin was reduced to 3 mm. The average reduction in complication-free tumor control probability (P+) in the actual plans in comparison to the original plans with margins of 10/6, 5/3 and 3 mm was 3.7%, 2.4% and 13.6%, respectively. The significant reduction of TCP and P+ in the actual plan with the 3 mm margin came from one outlier, suggesting that individualizing patient treatment plans through margin adaptation based on biological models might yield higher quality treatments.
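
    The complication-free tumour control probability quoted above is commonly defined as P+ = TCP·(1 − NTCP), assuming the tumour-control and complication endpoints are independent. The plan-level numbers in the sketch below are hypothetical and are used only to show the calculation; they are not taken from the study.

```python
def p_plus(tcp, ntcp):
    """Complication-free tumour control, assuming independence of the two endpoints."""
    return tcp * (1.0 - ntcp)

# Hypothetical plan-level values for illustration only
for margin, tcp, ntcp in [("10/6 mm", 0.90, 0.10),
                          ("5/3 mm", 0.89, 0.07),
                          ("3 mm", 0.80, 0.06)]:
    print(margin, f"P+ = {p_plus(tcp, ntcp):.3f}")
```
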

  10. Treatment strategies for pelvic organ prolapse: a cost-effectiveness analysis.

    PubMed

    Hullfish, Kathie L; Trowbridge, Elisa R; Stukenborg, George J

    2011-05-01

    To compare the relative cost effectiveness of treatment decision alternatives for post-hysterectomy pelvic organ prolapse (POP). A Markov decision analysis model was used to assess and compare the relative cost effectiveness of expectant management, use of a pessary, and surgery for obtaining months of quality-adjusted life over 1 year. Sensitivity analysis was conducted to determine whether the results depended on specific estimates of patient utilities for pessary use, probabilities for complications and other events, and estimated costs. Only two treatment alternatives were found to be efficient choices: initial pessary use and vaginal reconstructive surgery (VRS). Pessary use (including patients that eventually transitioned to surgery) achieved 10.4 quality-adjusted months, at a cost of $10,000 per patient, while VRS obtained 11.4 quality-adjusted months, at $15,000 per patient. Sensitivity analysis demonstrated that these baseline results depended on several key estimates in the model. This analysis indicates that pessary use and VRS are the most cost-effective treatment alternatives for treating post-hysterectomy vaginal prolapse. Additional research is needed to standardize POP outcomes and complications, so that healthcare providers can best utilize cost information in balancing the risks and benefits of their treatment decisions.
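
    A Markov cohort simulation of the kind described above tracks the distribution of patients across health states month by month, accruing quality-adjusted time and costs. The sketch below is a deliberately simplified three-state version with made-up transition probabilities, utilities and costs; it illustrates the mechanics only and does not reproduce the study's model or its published estimates.

```python
import numpy as np

# States: 0 = pessary, 1 = post-surgery (VRS), 2 = untreated/failed prolapse
# Hypothetical monthly transition matrix, utilities and costs (placeholders).
P = np.array([[0.92, 0.05, 0.03],
              [0.00, 0.98, 0.02],
              [0.00, 0.30, 0.70]])
utility = np.array([0.85, 0.93, 0.60])   # quality weight per month in each state
cost = np.array([60.0, 80.0, 40.0])      # recurring monthly cost per state
surgery_cost = 12000.0                   # one-off cost on entering the surgery state

cohort = np.array([1.0, 0.0, 0.0])       # everyone starts with a pessary
qalm, total_cost = 0.0, 0.0

for month in range(12):
    qalm += cohort @ utility             # quality-adjusted life months accrued
    total_cost += cohort @ cost
    new_surgeries = cohort[0] * P[0, 1] + cohort[2] * P[2, 1]
    total_cost += new_surgeries * surgery_cost
    cohort = cohort @ P                  # advance the cohort one month

print(f"QALMs: {qalm:.2f}, cost per patient: ${total_cost:,.0f}")
```
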

  11. Variation of normal tissue complication probability (NTCP) estimates of radiation-induced hypothyroidism in relation to changes in delineation of the thyroid gland.

    PubMed

    Rønjom, Marianne F; Brink, Carsten; Lorenzen, Ebbe L; Hegedüs, Laszlo; Johansen, Jørgen

    2015-01-01

    To examine the variations of risk-estimates of radiation-induced hypothyroidism (HT) from our previously developed normal tissue complication probability (NTCP) model in patients with head and neck squamous cell carcinoma (HNSCC) in relation to variability of delineation of the thyroid gland. In a previous study for development of an NTCP model for HT, the thyroid gland was delineated in 246 treatment plans of patients with HNSCC. Fifty of these plans were randomly chosen for re-delineation for a study of the intra- and inter-observer variability of thyroid volume, Dmean and estimated risk of HT. Bland-Altman plots were used for assessment of the systematic (mean) and random [standard deviation (SD)] variability of the three parameters, and a method for displaying the spatial variation in delineation differences was developed. Intra-observer variability resulted in a mean difference in thyroid volume and Dmean of 0.4 cm(3) (SD ± 1.6) and -0.5 Gy (SD ± 1.0), respectively, and 0.3 cm(3) (SD ± 1.8) and 0.0 Gy (SD ± 1.3) for inter-observer variability. The corresponding mean differences of NTCP values for radiation-induced HT due to intra- and inter-observer variations were insignificantly small, -0.4% (SD ± 6.0) and -0.7% (SD ± 4.8), respectively, but as the SDs show, for some patients the difference in estimated NTCP was large. For the entire study population, the variation in predicted risk of radiation-induced HT in head and neck cancer was small and our NTCP model was robust against observer variations in delineation of the thyroid gland. However, for the individual patient, there may be large differences in estimated risk which calls for precise delineation of the thyroid gland to obtain correct dose and NTCP estimates for optimized treatment planning in the individual patient.

  12. The discrete Laplace exponential family and estimation of Y-STR haplotype frequencies.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2013-07-21

    Estimating haplotype frequencies is important in e.g. forensic genetics, where the frequencies are needed to calculate the likelihood ratio for the evidential weight of a DNA profile found at a crime scene. Estimation is naturally based on a population model, motivating the investigation of the Fisher-Wright model of evolution for haploid lineage DNA markers. An exponential family (a class of probability distributions that is well understood in probability theory such that inference is easily made by using existing software) called the 'discrete Laplace distribution' is described. We illustrate how well the discrete Laplace distribution approximates a more complicated distribution that arises by investigating the well-known population genetic Fisher-Wright model of evolution by a single-step mutation process. It was shown how the discrete Laplace distribution can be used to estimate haplotype frequencies for haploid lineage DNA markers (such as Y-chromosomal short tandem repeats), which in turn can be used to assess the evidential weight of a DNA profile found at a crime scene. This was done by making inference in a mixture of multivariate, marginally independent, discrete Laplace distributions using the EM algorithm to estimate the probabilities of membership of a set of unobserved subpopulations. The discrete Laplace distribution can be used to estimate haplotype frequencies with lower prediction error than other existing estimators. Furthermore, the calculations could be performed on a normal computer. This method was implemented in the freely available open source software R that is supported on Linux, MacOS and MS Windows. Copyright © 2013 Elsevier Ltd. All rights reserved.
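
    The discrete Laplace distribution referred to above has the simple probability mass function P(Y = y) = ((1 − p)/(1 + p))·p^|y| on the integers. The sketch below evaluates a haplotype frequency as a mixture of marginally independent discrete Laplace distributions across loci; the central haplotypes, dispersion parameters and subpopulation weights are hypothetical stand-ins for quantities the EM algorithm would estimate from data.

```python
import math

def dlaplace_pmf(y, p):
    """Discrete Laplace pmf on the integers: P(Y = y) = (1-p)/(1+p) * p**abs(y)."""
    return (1.0 - p) / (1.0 + p) * p ** abs(y)

def haplotype_frequency(haplotype, centers, ps, weights):
    """Mixture of marginally independent discrete Laplace distributions.

    `centers`, `ps` and `weights` describe hypothetical subpopulations; each locus
    contributes dlaplace_pmf(allele - center, p) independently within a subpopulation."""
    freq = 0.0
    for center, p, w in zip(centers, ps, weights):
        freq += w * math.prod(dlaplace_pmf(a - c, pj)
                              for a, c, pj in zip(haplotype, center, p))
    return freq

# Two hypothetical subpopulations, three Y-STR loci (repeat numbers)
centers = [(14, 30, 12), (15, 31, 13)]
ps = [(0.20, 0.30, 0.25), (0.15, 0.35, 0.20)]
weights = [0.6, 0.4]
print(haplotype_frequency((14, 31, 12), centers, ps, weights))
```
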

  13. Idealized models of the joint probability distribution of wind speeds

    NASA Astrophysics Data System (ADS)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
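
    The constructions described above can be made concrete with a short simulation: speeds at two sites are generated as the magnitudes of correlated, isotropic, mean-zero Gaussian velocity components (the Rayleigh special case of the Rice construction), and a power transform then yields Weibull marginals. The correlation and shape parameter below are arbitrary illustrative choices, not fitted values from the data sets used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

def correlated_speeds(n, rho, shape=2.4):
    """Wind speeds at two sites from correlated, isotropic, mean-zero Gaussian
    velocity components; a power transform gives Weibull marginals (a sketch of
    the construction underlying the bivariate Weibull model)."""
    cov = [[1.0, rho], [rho, 1.0]]
    u = rng.multivariate_normal([0, 0], cov, size=n)   # zonal components at sites 1, 2
    v = rng.multivariate_normal([0, 0], cov, size=n)   # meridional components
    rayleigh_speed = np.hypot(u, v)                    # Rayleigh (Weibull shape 2) speeds
    return rayleigh_speed ** (2.0 / shape)             # Weibull speeds with the chosen shape

w = correlated_speeds(200_000, rho=0.7)
print("speed-speed correlation:", round(np.corrcoef(w[:, 0], w[:, 1])[0, 1], 3))
```

    As the output shows, correlation between the underlying velocity components translates into a weaker, but still positive, dependence between the speeds, which is the kind of dependence structure the bivariate Weibull and Rice models are designed to capture.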

  14. Neuromuscular complications of thyrotoxicosis.

    PubMed

    Kung, Annie W C

    2007-11-01

    Thyroid hormones exert multiple effects on the neuromuscular system and the brain, with the most important being their role in stimulating the development and differentiation of the neuromuscular system and brain in foetal and neonatal life. In the presence of hyperthyroidism, muscular and neurological symptoms may be the presenting clinical features of the disease. The frequency and severity of neuromuscular complications vary considerably and are probably related to the degree of hyperthyroidism, although in some patients the neuromuscular dysfunction is caused by associated disorders rather than by hyperthyroidism per se. This update focuses on the most common neurological and muscular disorders that occur in patients with thyrotoxicosis. It is beyond the scope of this paper to discuss thyroid eye disease and cardiac complications, in themselves separate complications of specific myocytes.

  15. Continuous-Time Classical and Quantum Random Walk on Direct Product of Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Salimi, S.; Jafarizadeh, M. A.

    2009-06-01

    In this paper we define the direct product of graphs and give a recipe for obtaining the probability of observing the particle on the vertices in continuous-time classical and quantum random walks. In the recipe, the probability of observing the particle on the direct product graph is obtained by multiplying the probabilities on the corresponding sub-graphs; this method is useful for determining the probability of walks on complicated graphs. Using this method, we calculate the probabilities of continuous-time classical and quantum random walks on many finite direct product Cayley graphs (complete cycle, complete Kn, charter and n-cube). We also show that in the classical case the stationary uniform distribution is reached as t → ∞, whereas for the quantum walk this is not always the case.
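
    For the classical walk, the factorization can be checked numerically when the product graph's generator is the Kronecker sum of the factor Laplacians (the Cayley graph of the product group with the union of the factor generating sets, which is assumed here to be the intended construction). The sketch below verifies that the transition probabilities on Z_4 × Z_5 are the products of the probabilities on the two factor cycles.

```python
import numpy as np
from scipy.linalg import expm

def cycle_laplacian(n):
    """Graph Laplacian of the n-cycle (Cayley graph of Z_n with generators +1, -1)."""
    a = np.zeros((n, n))
    for i in range(n):
        a[i, (i + 1) % n] = a[i, (i - 1) % n] = 1.0
    return np.diag(a.sum(axis=1)) - a

def ctrw_probabilities(lap, t, start=0):
    """Continuous-time classical random walk: P(t) = exp(-t L), row of the start vertex."""
    return expm(-t * lap)[start]

L1, L2, t = cycle_laplacian(4), cycle_laplacian(5), 0.8
# Product graph with Kronecker-sum generator (Cayley graph of Z_4 x Z_5)
L12 = np.kron(L1, np.eye(5)) + np.kron(np.eye(4), L2)
p_product = ctrw_probabilities(L12, t).reshape(4, 5)
p_factorized = np.outer(ctrw_probabilities(L1, t), ctrw_probabilities(L2, t))
print("max difference:", np.abs(p_product - p_factorized).max())   # ~1e-16
```
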

  16. Present developments in reaching an international consensus for a model-based approach to particle beam therapy.

    PubMed

    Prayongrat, Anussara; Umegaki, Kikuo; van der Schaaf, Arjen; Koong, Albert C; Lin, Steven H; Whitaker, Thomas; McNutt, Todd; Matsufuji, Naruhiro; Graves, Edward; Mizuta, Masahiko; Ogawa, Kazuhiko; Date, Hiroyuki; Moriwaki, Kensuke; Ito, Yoichi M; Kobashi, Keiji; Dekura, Yasuhiro; Shimizu, Shinichi; Shirato, Hiroki

    2018-03-01

    Particle beam therapy (PBT), including proton and carbon ion therapy, is an emerging innovative treatment for cancer patients. Due to the high cost of and limited access to treatment, meticulous selection of patients who would benefit most from PBT, when compared with standard X-ray therapy (XRT), is necessary. Due to the cost and labor involved in randomized controlled trials, the model-based approach (MBA) is used as an alternative means of establishing scientific evidence in medicine, and it can be improved continuously. Good databases and reasonable models are crucial for the reliability of this approach. The tumor control probability and normal tissue complication probability models are good illustrations of the advantages of PBT, but pre-existing NTCP models have been derived from historical patient treatments from the XRT era. This highlights the necessity of prospectively analyzing specific treatment-related toxicities in order to develop PBT-compatible models. An international consensus has been reached at the Global Institution for Collaborative Research and Education (GI-CoRE) joint symposium, concluding that a systematically developed model is required for model accuracy and performance. Six important steps that need to be observed in these considerations include patient selection, treatment planning, beam delivery, dose verification, response assessment, and data analysis. Advanced technologies in radiotherapy and computer science can be integrated to improve the efficacy of a treatment. Model validation and appropriately defined thresholds in a cost-effectiveness centered manner, together with quality assurance in the treatment planning, have to be achieved prior to clinical implementation.

  17. Bulk plasma fragmentation in a C4F8 inductively coupled plasma: A hybrid modeling study

    NASA Astrophysics Data System (ADS)

    Zhao, Shu-Xia; Zhang, Yu-Ru; Gao, Fei; Wang, You-Nian; Bogaerts, Annemie

    2015-06-01

    A hybrid model is used to investigate the fragmentation of C4F8 inductive discharges. Indeed, the resulting reactive species are crucial for the optimization of the Si-based etching process, since they determine the mechanisms of fluorination, polymerization, and sputtering. In this paper, we present the dissociation degree, the density ratio of F vs. CxFy (i.e., fluorocarbon (fc) neutrals), the neutral vs. positive ion density ratio, details on the neutral and ion components, and fractions of various fc neutrals (or ions) in the total fc neutral (or ion) density in a C4F8 inductively coupled plasma source, as well as the effect of pressure and power on these results. To analyze the fragmentation behavior, the electron density and temperature and electron energy probability function (EEPF) are investigated. Moreover, the main electron-impact generation sources for all considered neutrals and ions are determined from the complicated C4F8 reaction set used in the model. The C4F8 plasma fragmentation is explained, taking into account many factors, such as the EEPF characteristics, the dominance of primary and secondary processes, and the thresholds of dissociation and ionization. The simulation results are compared with experiments from literature, and reasonable agreement is obtained. Some discrepancies are observed, which can probably be attributed to the simplified polymer surface kinetics assumed in the model.

  18. Nonaggressive obstetric management. An option for some fetal anomalies during the third trimester.

    PubMed

    Chervenak, F A; McCullough, L B

    1989-06-16

    Nonaggressive obstetric management was used in 13 cases of anomalous fetuses during the third trimester. Criteria that define these anomalies are (1) a very high probability of a correct diagnosis and (2) either (a) a very high probability of death as an outcome of the anomaly diagnosed or (b) a very high probability of severe and irreversible deficit of cognitive developmental capacity as a result of the anomaly diagnosed. On the basis of two approaches to obstetric ethics, we defend the legitimacy of nonaggressive management of third-trimester pregnancies complicated by fetal anomalies that meet these criteria.

  19. 76 FR 21747 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-18

    ... disease burden and have a high probability of developing diabetes-related complications. However, research... thirteen (the ``Youth Books''). CDC plans to conduct a descriptive evaluation of the Eagle Books and the...

  20. Branch retinal vein thrombosis and visual loss probably associated with pegylated interferon therapy of chronic hepatitis C

    PubMed Central

    Gonçalves, Luciana Lofego; Farias, Alberto Queiroz; Gonçalves, Patrícia Lofego; D’Amico, Elbio Antonio; Carrilho, Flair José

    2006-01-01

    Ophthalmological complications with interferon therapy are usually mild and reversible, not requiring the withdrawal of the treatment. We report a case of a patient who had visual loss probably associated with interferon therapy. Chronic hepatitis C virus infection (genotype 1a) was diagnosed in a 33-year old asymptomatic man. His past medical history was unremarkable and previous routine ophthalmologic check-up was normal. Pegylated interferon alpha and ribavirin were started. Three weeks later he reported painless reduction of vision. Ophthalmologic examination showed extensive intraretinal hemorrhages and cotton-wool spots, associated with inferior branch retinal vein thrombosis. Antiviral therapy was immediately discontinued, but one year later he persists with severely decreased visual acuity. This case illustrates the possibility of unpredictable and severe complications during pegylated interferon therapy. PMID:16874884

  1. The tensor distribution function.

    PubMed

    Leow, A D; Zhu, S; Zhan, L; McMahon, K; de Zubicaray, G I; Meredith, M; Wright, M J; Toga, A W; Thompson, P M

    2009-01-01

    Diffusion weighted magnetic resonance imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of six directions, second-order tensors (represented by three-by-three positive definite matrices) can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve more complicated white matter configurations, e.g., crossing fiber tracts. Recently, a number of high-angular resolution schemes with more than six gradient directions have been employed to address this issue. In this article, we introduce the tensor distribution function (TDF), a probability function defined on the space of symmetric positive definite matrices. Using the calculus of variations, we solve the TDF that optimally describes the observed data. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once this optimal TDF is determined, the orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function. Moreover, a tensor orientation distribution function (TOD) may also be derived from the TDF, allowing for the estimation of principal fiber directions and their corresponding eigenvalues.
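
    The modelling assumption above, that fiber crossing is an ensemble of Gaussian diffusion processes weighted by the TDF, corresponds to the forward signal model S(g) = S0 Σ_i w_i exp(−b gᵀ D_i g). The sketch below evaluates it for a hypothetical two-tensor (90° crossing) configuration; estimating the continuous TDF itself via the calculus of variations, as the authors do, is beyond this sketch.

```python
import numpy as np

def tensor(evals, theta):
    """Diagonal tensor with eigenvalues `evals`, principal axis rotated by `theta` in-plane."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    return R @ np.diag(evals) @ R.T

def signal(gradients, b, tensors, weights, s0=1.0):
    """Forward model: S(g) = S0 * sum_i w_i * exp(-b g^T D_i g)."""
    return s0 * sum(w * np.exp(-b * np.einsum('ij,jk,ik->i', gradients, D, gradients))
                    for w, D in zip(weights, tensors))

# Hypothetical two-fiber crossing at 90 degrees, equal weights
evals = (1.7e-3, 0.3e-3, 0.3e-3)                       # diffusivities in mm^2/s
tensors = [tensor(evals, 0.0), tensor(evals, np.pi / 2)]
g = np.array([[1, 0, 0], [0, 1, 0], [np.sqrt(0.5), np.sqrt(0.5), 0]])
print(signal(g, b=1000.0, tensors=tensors, weights=[0.5, 0.5]))
```
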

  2. Do quantitative decadal forecasts from GCMs provide decision relevant skill?

    NASA Astrophysics Data System (ADS)

    Suckling, E. B.; Smith, L. A.

    2012-04-01

    It is widely held that only physics-based simulation models can capture the dynamics required to provide decision-relevant probabilistic climate predictions. This fact in itself provides no evidence that predictions from today's GCMs are fit for purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales, where it is argued that these 'physics free' forecasts provide a quantitative 'zero skill' target for the evaluation of forecasts based on more complicated models. It is demonstrated that these zero-skill models are competitive with GCMs on decadal scales for probability forecasts evaluated over the last 50 years. Complications of statistical interpretation due to the 'hindcast' nature of this experiment, and the likely relevance of arguments that the lack of hindcast skill is irrelevant as the signal will soon 'come out of the noise', are discussed. A lack of decision-relevant quantitative skill does not bring the science-based insights of anthropogenic warming into doubt, but it does call for a clear quantification of limits, as a function of lead time, for the spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to do so may risk the credibility of science in support of policy in the long term. The performance amongst a collection of simulation models is evaluated, having transformed ensembles of point forecasts into probability distributions through the kernel dressing procedure [1], according to a selection of proper skill scores [2], and contrasted with purely data-based empirical models. Data-based models are unlikely to yield realistic forecasts for future climate change if the Earth system moves away from the conditions observed in the past, upon which the models are constructed; in this sense the empirical model defines zero skill. When should a decision-relevant simulation model be expected to significantly outperform such empirical models? Probability forecasts up to ten years ahead (decadal forecasts) are considered, both on global and regional spatial scales for surface air temperature. Such decadal forecasts are not only important in terms of providing information on the impacts of near-term climate change, but also from the perspective of climate model validation, as hindcast experiments and a sufficient database of historical observations allow standard forecast verification methods to be used. Simulation models from the ENSEMBLES hindcast experiment [3] are evaluated and contrasted with static forecasts of the observed climatology, with persistence forecasts, and with simple statistical models, called dynamic climatology (DC). It is argued that DC is a more appropriate benchmark in the case of a non-stationary climate. It is found that the ENSEMBLES models do not demonstrate a significant increase in skill relative to the empirical models, even at global scales, over any lead time up to a decade ahead. It is suggested that the construction of such data-based models and their co-evaluation alongside simulation models become a regular component of the reporting of large simulation model forecasts. The methodology presented may easily be adapted to other forecasting experiments and is expected to influence the design of future experiments. The inclusion of comparisons with dynamic climatology and other data-based approaches provides important information to both scientists and decision makers on which aspects of state-of-the-art simulation forecasts are likely to be fit for purpose. [1] J. Bröcker and L. A. Smith. From ensemble forecasts to predictive distributions, Tellus A, 60(4), 663-678 (2007). [2] J. Bröcker and L. A. Smith. Scoring probabilistic forecasts: The importance of being proper, Weather and Forecasting, 22, 382-388 (2006). [3] F. J. Doblas-Reyes, A. Weisheimer, T. N. Palmer, J. M. Murphy and D. Smith. Forecast quality assessment of the ENSEMBLES seasonal-to-decadal stream 2 hindcasts, ECMWF Technical Memorandum, 621 (2010).

  3. Newsvendor problem under complete uncertainty: a case of innovative products.

    PubMed

    Gaspars-Wieloch, Helena

    2017-01-01

    The paper presents a new scenario-based decision rule for the classical version of the newsvendor problem (NP) under complete uncertainty (i.e. uncertainty with unknown probabilities). So far, NP has been analyzed under uncertainty with known probabilities or under uncertainty with partial information (probabilities known incompletely). The novel approach is designed for the sale of new, innovative products, where it is quite complicated to define probabilities or even probability-like quantities, because there are no data available for forecasting the upcoming demand via statistical analysis. The new procedure described in the contribution is based on a hybrid of the Hurwicz and Bayes decision rules. It takes into account the decision maker's attitude towards risk (measured by coefficients of optimism and pessimism) and the dispersion (asymmetry, range, frequency of extreme values) of payoffs connected with particular order quantities. It does not require any information about the probability distribution.
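
    To show the mechanics of a Hurwicz-Bayes style ranking over a newsvendor payoff matrix, the sketch below blends an optimism-weighted best/worst criterion with the equal-weight (Laplace/Bayes) average across demand scenarios. The weighting scheme, prices and demand levels are generic placeholders, not the author's exact hybrid rule.

```python
import numpy as np

price, cost = 10.0, 4.0
demand_scenarios = np.array([20, 50, 80, 120])   # hypothetical demand levels, no probabilities
orders = np.array([20, 50, 80, 120])             # candidate order quantities

# Payoff matrix: rows = order quantity, columns = demand scenario
sold = np.minimum(orders[:, None], demand_scenarios[None, :])
payoff = price * sold - cost * orders[:, None]

def hurwicz_bayes(payoff, optimism=0.4, bayes_weight=0.5):
    """Generic optimism-weighted criterion blended with the equal-weight average.
    (A placeholder for the paper's hybrid rule, which is more elaborate.)"""
    hurwicz = optimism * payoff.max(axis=1) + (1 - optimism) * payoff.min(axis=1)
    bayes = payoff.mean(axis=1)                  # equal scenario weights
    return bayes_weight * bayes + (1 - bayes_weight) * hurwicz

scores = hurwicz_bayes(payoff)
print("best order quantity:", orders[np.argmax(scores)], "scores:", scores.round(1))
```
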

  4. Movement patterns and study area boundaries: Influences on survival estimation in capture-mark-recapture studies

    USGS Publications Warehouse

    Horton, G.E.; Letcher, B.H.

    2008-01-01

    The inability to account for the availability of individuals in the study area during capture-mark-recapture (CMR) studies and the resultant confounding of parameter estimates can make correct interpretation of CMR model parameter estimates difficult. Although important advances based on the Cormack-Jolly-Seber (CJS) model have resulted in estimators of true survival that work by unconfounding either death or recapture probability from availability for capture in the study area, these methods rely on the researcher's ability to select a method that is correctly matched to emigration patterns in the population. If incorrect assumptions regarding site fidelity (non-movement) are made, it may be difficult or impossible as well as costly to change the study design once the incorrect assumption is discovered. Subtleties in characteristics of movement (e.g. life history-dependent emigration, nomads vs territory holders) can lead to mixtures in the probability of being available for capture among members of the same population. The result of these mixtures may be only a partial unconfounding of emigration from other CMR model parameters. Biologically-based differences in individual movement can combine with constraints on study design to further complicate the problem. Because of the intricacies of movement and its interaction with other parameters in CMR models, quantification of and solutions to these problems are needed. Based on our work with stream-dwelling populations of Atlantic salmon Salmo salar, we used a simulation approach to evaluate existing CMR models under various mixtures of movement probabilities. The Barker joint data model provided unbiased estimates of true survival under all conditions tested. The CJS and robust design models provided similarly unbiased estimates of true survival but only when emigration information could be incorporated directly into individual encounter histories. For the robust design model, Markovian emigration (future availability for capture depends on an individual's current location) was a difficult emigration pattern to detect unless survival and especially recapture probability were high. Additionally, when local movement was high relative to study area boundaries and movement became more diffuse (e.g. a random walk), local movement and permanent emigration were difficult to distinguish and had consequences for correctly interpreting the survival parameter being estimated (apparent survival vs true survival). © 2008 The Authors.

  5. On the Estimation of Disease Prevalence by Latent Class Models for Screening Studies Using Two Screening Tests with Categorical Disease Status Verified in Test Positives Only

    PubMed Central

    Chu, Haitao; Zhou, Yijie; Cole, Stephen R.; Ibrahim, Joseph G.

    2010-01-01

    Summary To evaluate the probabilities of a disease state, ideally all subjects in a study should be diagnosed by a definitive diagnostic or gold standard test. However, since definitive diagnostic tests are often invasive and expensive, it is generally unethical to apply them to subjects whose screening tests are negative. In this article, we consider latent class models for screening studies with two imperfect binary diagnostic tests and a definitive categorical disease status measured only for those with at least one positive screening test. Specifically, we discuss a conditionally independent model and three homogeneous conditionally dependent latent class models and assess the impact of misspecification of the dependence structure on the estimation of disease category probabilities using frequentist and Bayesian approaches. Interestingly, the three homogeneous dependent models can provide identical goodness-of-fit but substantively different estimates for a given study. However, the parametric form of the assumed dependence structure itself is not “testable” from the data, and thus the dependence structure modeling considered here can only be viewed as a sensitivity analysis concerning a more complicated non-identifiable model potentially involving heterogeneous dependence structure. Furthermore, we discuss Bayesian model averaging together with its limitations as an alternative way to partially address this particularly challenging problem. The methods are applied to two cancer screening studies, and simulations are conducted to evaluate the performance of these methods. In summary, further research is needed to reduce the impact of model misspecification on the estimation of disease prevalence in such settings. PMID:20191614

  6. The cost effectiveness of acellular dermal matrix in expander-implant immediate breast reconstruction.

    PubMed

    Krishnan, Naveen M; Chatterjee, Abhishek; Rosenkranz, Kari M; Powell, Stephen G; Nigriny, John F; Vidal, Dale C

    2014-04-01

    Expander-implant breast reconstruction is often supplemented with acellular dermal matrix (ADM). The use of acellular dermal matrix has allowed for faster, less painful expansions and improved aesthetics, but with increased cost. Our goal was to provide the first cost utility analysis of using acellular dermal matrix in two-stage, expander-implant immediate breast reconstruction following mastectomy. A comprehensive literature review was conducted to identify complication rates for two-stage, expander-implant immediate breast reconstruction with and without acellular dermal matrix. The probabilities of the most common complications were combined with Medicare Current Procedural Terminology reimbursement codes and expert utility estimates to fit into a decision model. The decision model evaluated the cost effectiveness of acellular dermal matrix relative to reconstructions without it. Retail costs for ADM were derived from the LifeCell 2012 company catalogue for Alloderm. The overall complication rates were 30% and 34.5% with and without ADM. The decision model revealed a baseline cost increase of $361.96 when acellular dermal matrix is used. The increase in Quality-Adjusted Life Years (QALYs) is 1.37 in the population with acellular dermal matrix. This yields a cost effective incremental cost-utility ratio (ICUR) of $264.20/QALY. Univariate sensitivity analysis confirmed that using acellular dermal matrix is cost effective even when using retail costs for unilateral and bilateral reconstructions. Our study shows that, despite an increased cost, acellular dermal matrix is a cost effective technology for patients undergoing two-stage, expander-implant immediate breast reconstruction due to its increased utility in successful procedures. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
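
    The incremental cost-utility ratio quoted above is simply the incremental cost divided by the incremental QALYs; the short check below confirms that the reported figures are mutually consistent.

```python
delta_cost = 361.96     # added cost with acellular dermal matrix (USD)
delta_qaly = 1.37       # added quality-adjusted life years
icur = delta_cost / delta_qaly
print(f"ICUR = ${icur:.2f}/QALY")   # ~ $264.20/QALY, matching the reported value
```
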

  7. Complications of acromegaly: thyroid and colon.

    PubMed

    Tirosh, Amit; Shimon, Ilan

    2017-02-01

    In acromegaly, the long-term exposure to high growth hormone (GH) and insulin-like growth factor-1 (IGF-1) levels may result in specific complications in different human organs, including the thyroid gland and the colon. We review here the evidence available regarding the characteristic thyroid and colon complications in acromegaly. This review summarizes the published data on noncancerous structural abnormalities (thyroid nodules, colonic polyps) and on thyroid and colon cancer in patients diagnosed with acromegaly. Thyroid micro-carcinomas are probably over-diagnosed among acromegalic patients. In regard to colon cancer, there are insufficient data to suggest that colon cancer risk is higher in acromegaly than in the general population.

  8. Influence of early neurological complications on clinical outcome following lung transplant.

    PubMed

    Gamez, Josep; Salvado, Maria; Martinez-de La Ossa, Alejandro; Deu, Maria; Romero, Laura; Roman, Antonio; Sacanell, Judith; Laborda, Cesar; Rochera, Isabel; Nadal, Miriam; Carmona, Francesc; Santamarina, Estevo; Raguer, Nuria; Canela, Merce; Solé, Joan

    2017-01-01

    Neurological complications after lung transplantation are common. The full spectrum of neurological complications and their impact on clinical outcomes has not been extensively studied. We investigated the neurological incidence of complications, categorized according to whether they affected the central, peripheral or autonomic nervous systems, in a series of 109 patients undergoing lung transplantation at our center between January 1 2013 and December 31 2014. Fifty-one patients (46.8%) presented at least one neurological complication. Critical illness polyneuropathy-myopathy (31 cases) and phrenic nerve injury (26 cases) were the two most prevalent complications. These two neuromuscular complications lengthened hospital stays by a median period of 35.5 and 32.5 days respectively. However, neurological complications did not affect patients' survival. The real incidence of neurological complications among lung transplant recipients is probably underestimated. They usually appear in the first two months after surgery. Despite not affecting mortality, they do affect the mean length of hospital stay, and especially the time spent in the Intensive Care Unit. We found no risk factor for neurological complications except for long operating times, ischemic time and need for transfusion. It is necessary to develop programs for the prevention and early recognition of these complications, and the prevention of their precipitant and risk factors.

  9. Influence of early neurological complications on clinical outcome following lung transplant

    PubMed Central

    Salvado, Maria; Martinez-de La Ossa, Alejandro; Deu, Maria; Romero, Laura; Roman, Antonio; Sacanell, Judith; Laborda, Cesar; Rochera, Isabel; Nadal, Miriam; Carmona, Francesc; Santamarina, Estevo; Raguer, Nuria; Canela, Merce; Solé, Joan

    2017-01-01

    Background Neurological complications after lung transplantation are common. The full spectrum of neurological complications and their impact on clinical outcomes has not been extensively studied. Methods We investigated the neurological incidence of complications, categorized according to whether they affected the central, peripheral or autonomic nervous systems, in a series of 109 patients undergoing lung transplantation at our center between January 1 2013 and December 31 2014. Results Fifty-one patients (46.8%) presented at least one neurological complication. Critical illness polyneuropathy-myopathy (31 cases) and phrenic nerve injury (26 cases) were the two most prevalent complications. These two neuromuscular complications lengthened hospital stays by a median period of 35.5 and 32.5 days respectively. However, neurological complications did not affect patients’ survival. Conclusions The real incidence of neurological complications among lung transplant recipients is probably underestimated. They usually appear in the first two months after surgery. Despite not affecting mortality, they do affect the mean length of hospital stay, and especially the time spent in the Intensive Care Unit. We found no risk factor for neurological complications except for long operating times, ischemic time and need for transfusion. It is necessary to develop programs for the prevention and early recognition of these complications, and the prevention of their precipitant and risk factors. PMID:28301586

  10. Complications associated with brachytherapy alone or with laser in lung cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khanavkar, B.; Stern, P.; Alberti, W.

    1991-05-01

    Relatively little has been reported about destruction through brachytherapy of mucosa-perforating and extraluminal tumors with probable large vessel involvement causing major hemorrhagic or fistular complications. We report 12 patients subjected to laser and brachytherapy for centrally occluding lung cancer, whom we periodically followed up from June 1986 until they died. Although all laser procedures were free from complications, necrotic cavitation occurred in five cases, two of which were accompanied by large bronchoesophageal fistulas, and massive fatal hemoptysis occurred in six. Minor complications included radiation mucositis (two), noncritical mucosal scarring (two), and cough (four). Characteristics that will identify patients at risk of developing fatal hemoptysis and fistulas should be better defined by imaging and endoscopic techniques. In such cases, modifying the protocol or using alternative procedures should be considered. Minor complications, such as cough, can be avoided by using topical steroid therapy (eg, beclomethasone dipropionate).

  11. Zidovudine for the prevention of vertical HIV transmission: a decision analytic approach.

    PubMed

    Rouse, D J; Owen, J; Goldenberg, R L; Vermund, S H

    1995-08-01

    The purpose of this study was to quantify the benefits of maternal-neonatal zidovudine (ZDV) administration for the prevention of vertical human immunodeficiency virus (HIV) transmission against the potential risks of drug-induced complications in uninfected children. A decision analysis model was created with use of a Markov cohort simulation, for evaluating both survival and quality of life for two hypothetical cohorts of HIV-exposed neonates: one with in utero and neonatal exposure to preventive ZDV therapy and the other not exposed. The model included the probability of congenital HIV infection with and without ZDV treatment (estimates derived from AIDS Clinical Trials Group study 076), the yearly probability of death with and without congenital HIV infection, a range of probabilities of adverse effects from ZDV use, and a range of ages in life when any adverse effect would manifest. In a series of scenarios, the impact of different estimates for the quality-of-life decrement from any adverse ZDV effect in HIV-uninfected children was assessed, and threshold values for this estimate were established, i.e., critical values below which withholding ZDV would be the preferred choice. Across a wide range of estimates for multiple contingencies, ZDV use was associated with a greater number of quality-adjusted life years than was non-use. Only in implausible, pessimistic scenarios (i.e., a high incidence of profound adverse effects beginning early in life) would withholding ZDV be the rational choice for an asymptomatic HIV-infected pregnant woman.(ABSTRACT TRUNCATED AT 250 WORDS)

  12. Tensor distribution function

    NASA Astrophysics Data System (ADS)

    Leow, Alex D.; Zhu, Siwei

    2008-03-01

    Diffusion weighted MR imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitizing gradients along a minimum of 6 directions, second-order tensors (represented by 3-by-3 positive definite matrices) can be computed to model dominant diffusion processes. However, it has been shown that conventional DTI is not sufficient to resolve more complicated white matter configurations, e.g. crossing fiber tracts. More recently, High Angular Resolution Diffusion Imaging (HARDI) seeks to address this issue by employing more than 6 gradient directions. To account for fiber crossing when analyzing HARDI data, several methodologies have been introduced. For example, q-ball imaging was proposed to approximate the orientation distribution function (ODF). Similarly, the PAS method seeks to resolve the angular structure of displacement probability functions using the maximum entropy principle. Alternatively, deconvolution methods extract multiple fiber tracts by computing fiber orientations using a pre-specified single fiber response function. In this study, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric and positive definite matrices. Using the calculus of variations, we solve for the TDF that optimally describes the observed data. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once this optimal TDF is determined, the ODF can easily be computed by analytic integration of the resulting displacement probability function. Moreover, principal fiber directions can also be directly derived from the TDF.

  13. Safety profile and complications of autologous limbal conjunctival transplantation for primary pterygium

    PubMed Central

    Prabhakar, Srinivasapuram Krishnachary

    2014-01-01

    Purpose Primary pterygium is a fibrovascular proliferation over the nasal cornea, probably resulting from limbal stem cell deficiency. Intraoperative mitomycin-C application seems to be associated with reduced recurrences, but produces ocular surface problems and vision-threatening complications. The present clinical study investigated the safety profile of autologous limbal conjunctival transplantation in terms of recurrence rate, as the main outcome measure, and complications as the secondary outcome. Methods The present study was a randomised, interventional and prospective clinical study conducted at a tertiary hospital. Pterygium excision was performed with a limbal conjunctival autograft harvested from the affected eye. Secondary pterygia resulting from inflammation, trauma and other diseases were excluded. Patients were followed up for 18 months for recurrence and other complications. Microsoft Office Excel 2007 was used for statistical analysis. Results A total of 71 eyes of sixty-eight patients with primary pterygia were included between November 2007 and October 2010. The study recruited 35 (51%) males and 33 (49%) females with a mean age of 36.9 ± 12.82 years (mean ± SD), ranging from 19 to 75 years. Ages were grouped by range intervals into six categories. Pterygium was diagnosed in 32 (45%) right eyes and 39 (55%) left eyes. There were 65 (91.55%) nasal and 4 (5.63%) temporal pterygia, and no case of double-head pterygium was found. The average horizontal extension of the pterygium was 1.67 mm (±4.23) from the apex to the corneal limbus. Graft oedema in 1 (0.71%) patient, graft bleed in 2 (1.42%) cases and 1 (0.72%) case of granuloma were observed. No recurrences were encountered during the 18-month follow-up. Conclusions Pterygium occurred predominantly in the younger population group (mean age 36.9 ± 12.82 years), probably due to increased outdoor activity with high exposure to sunlight and a dusty atmosphere. The absence of recurrences was probably attributable to the smaller pterygium size of 1.67 mm (±4.23), the use of the autologous limbal conjunctival graft, and the successful management of intra- and postoperative complications. PMID:25473341

  14. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126.

    PubMed

    Moore, Kevin L; Schmidt, Rachel; Moiseenko, Vitali; Olsen, Lindsey A; Tan, Jun; Xiao, Ying; Galvin, James; Pugh, Stephanie; Seider, Michael J; Dicker, Adam P; Bosch, Walter; Michalski, Jeff; Mutic, Sasa

    2015-06-01

    The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 22 of 219 plans (10%) with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model of the presumed best practices in the protocol (pDVH0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed "high-quality," "low-quality," and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans, 6 of 73 versus 10 of 73 respectively, although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications. Copyright © 2015 Elsevier Inc. All rights reserved.
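
    For readers unfamiliar with the LKB formalism used in this record, the sketch below computes an NTCP from a differential rectal DVH via the generalised EUD and a probit dose-response. The parameter values (n, m, TD50) are illustrative numbers of the kind quoted for late rectal toxicity, not the parameters fitted in this secondary analysis.

      import numpy as np
      from math import erf, sqrt

      def lkb_ntcp(bin_doses_gy, bin_volumes, n=0.09, m=0.13, td50=76.9):
          # Lyman-Kutcher-Burman NTCP from a differential DVH:
          #   EUD  = (sum_i v_i * D_i^(1/n))^n
          #   NTCP = Phi((EUD - TD50) / (m * TD50))
          v = np.asarray(bin_volumes, dtype=float)
          v = v / v.sum()                                   # normalise to unit organ volume
          eud = np.sum(v * np.asarray(bin_doses_gy) ** (1.0 / n)) ** n
          t = (eud - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))           # standard normal CDF

      # Toy rectal DVH: 20% of the organ near prescription dose, the rest lower.
      doses = [20.0, 40.0, 60.0, 75.0]
      volumes = [0.40, 0.25, 0.15, 0.20]
      print(f"NTCP = {lkb_ntcp(doses, volumes):.3f}")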

  15. Computation of turbulent reacting flow in a solid-propellant ducted rocket

    NASA Astrophysics Data System (ADS)

    Chao, Yei-Chin; Chou, Wen-Fuh; Liu, Sheng-Shyang

    1995-05-01

    A mathematical model for computation of turbulent reacting flows is developed under general curvilinear coordinate systems. An adaptive, streamline grid system is generated to deal with the complex flow structures in a multiple-inlet solid-propellant ducted rocket (SDR) combustor. General tensor representations of the k-epsilon and algebraic stress (ASM) turbulence models are derived in terms of contravariant velocity components, and modification caused by the effects of compressible turbulence is also included in the modeling. The clipped Gaussian probability density function is incorporated in the combustion model to account for fluctuations of properties. Validation of the above modeling is first examined by studying mixing and reacting characteristics in a confined coaxial-jet problem. This is followed by a study of nonreacting and reacting SDR combustor flows. The results show that Gibson and Launder's ASM incorporated with Sarkar's modification for compressible turbulence effects based on the general curvilinear coordinate systems yields the most satisfactory prediction for this complicated SDR flowfield.

  16. Computation of turbulent reacting flow in a solid-propellant ducted rocket

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Y.; Chou, W.; Liu, S.

    1995-05-01

    A mathematical model for computation of turbulent reacting flows is developed under general curvilinear coordinate systems. An adaptive, streamline grid system is generated to deal with the complex flow structures in a multiple-inlet solid-propellant ducted rocket (SDR) combustor. General tensor representations of the k-epsilon and algebraic stress (ASM) turbulence models are derived in terms of contravariant velocity components, and modification caused by the effects of compressible turbulence is also included in the modeling. The clipped Gaussian probability density function is incorporated in the combustion model to account for fluctuations of properties. Validation of the above modeling is first examined by studying mixing and reacting characteristics in a confined coaxial-jet problem. This is followed by a study of nonreacting and reacting SDR combustor flows. The results show that Gibson and Launder's ASM incorporated with Sarkar's modification for compressible turbulence effects based on the general curvilinear coordinate systems yields the most satisfactory prediction for this complicated SDR flowfield. 36 refs.

  17. Clinical and sonographic risk factors and complications of shoulder dystocia - a case-control study with parity and gestational age matched controls.

    PubMed

    Parantainen, Jukka; Palomäki, Outi; Talola, Nina; Uotila, Jukka

    2014-06-01

    To examine the clinical risk factors and complications of shoulder dystocia today and to evaluate ultrasound methods for predicting it. Retrospective, matched case-control study at a University Hospital with 5000 annual deliveries. The study population consisted of 152 deliveries complicated by shoulder dystocia over a period of 8.5 years (January 2004-June 2012) and 152 controls matched for gestational age and parity. The data were collected from the medical records of mothers and children and analyzed by conditional logistic regression. Incidences and odds ratios were calculated for risk factors and complications. Antenatal ultrasound data, when available, were analyzed by conditional logistic regression to test for significant differences between study groups. Birthweight (OR 12.1 for ≥4000 g; 95% CI 4.18-35.0) and vacuum extraction (OR 3.98; 95% CI 1.25-12.7) remained the most significant clinical risk factors. Only a trend towards an association with pregestational or gestational diabetes was noted (OR 1.87; 95% CI 0.997-3.495, probability of type II error 51%). Among the complications of shoulder dystocia, the incidence of brachial plexus palsies was high (40%). An antenatal ultrasound method based on the difference between abdominal and biparietal diameters showed a significant difference between cases and controls. The impact of diabetes as a risk factor has diminished, which may reflect improved screening and treatment. Antenatal ultrasound methods are showing some promise, but the predictive value of ultrasound alone is probably low. Copyright © 2014. Published by Elsevier Ireland Ltd.

  18. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP)

    PubMed Central

    Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-01-01

    Objective: The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, such as the tumour control probability (TCP) and the normal tissue complication probability (NTCP). Methods: 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy–oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on the target volume and the organs at risk (OARs) coverage was assessed using calculation of dose–volume histograms, gEUD, TCP and NTCP. For this purpose, in-house software was developed and used. Results: The standard deviations (1SDs) of the systematic set-up and random set-up errors were calculated for the lateral and subclavicular fields and gave the following results: Σ = 0.63 (±0.42) mm and σ = 3.75 (±0.79) mm, respectively. Thus a planning organ at risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin was used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors have shown increased values for the tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% in the case of the brainstem and the optic chiasm, respectively. The corresponding ΔNTCP varies from 0.15% to 0.53%, respectively. Conclusion: The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The in-house software developed using the gEUD, TCP and NTCP biological models has been successfully used in this study. It can also be used to optimize the treatment plan established for our patients. Advances in knowledge: The gEUD, TCP and NTCP may be more suitable tools to assess the treatment plans before treating the patients. PMID:25882689
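
    The gEUD quoted above reduces a DVH to a single dose value via a power-law average. A short sketch, with made-up DVH bins and a volume-effect parameter chosen for a target volume (not the study's data), showing how a set-up-induced cold spot lowers the gEUD:

      import numpy as np

      def geud(bin_doses_gy, bin_volumes, a):
          # Generalised EUD: (sum_i v_i * D_i^a)^(1/a).
          # a < 0 emphasises cold spots (targets); a >> 1 emphasises hot spots
          # (serial organs); a = 1 gives the mean dose.
          v = np.asarray(bin_volumes, dtype=float)
          v = v / v.sum()
          d = np.asarray(bin_doses_gy, dtype=float)
          return np.sum(v * d ** a) ** (1.0 / a)

      vols = [0.2, 0.6, 0.2]
      planned = [68.0, 70.0, 72.0]
      shifted = [64.0, 70.0, 72.0]   # cold spot introduced by a simulated set-up error

      for label, d in [("planned", planned), ("with set-up error", shifted)]:
          print(f"{label}: gEUD = {geud(d, vols, a=-10):.2f} Gy")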

  19. Improving Conceptual Models Using AEM Data and Probability Distributions

    NASA Astrophysics Data System (ADS)

    Davis, A. C.; Munday, T. J.; Christensen, N. B.

    2012-12-01

    With emphasis being placed on uncertainty in groundwater modelling and prediction, coupled with questions concerning the value of geophysical methods in hydrogeology, it is important to ask meaningful questions of hydrogeophysical data and inversion results. For example, to characterise aquifers using electromagnetic (EM) data, we ask questions such as "Given that the electrical conductivity of aquifer 'A' is less than x, where is that aquifer elsewhere in the survey area?" The answer may be given by examining inversion models, selecting locations and layers that satisfy the condition 'conductivity <= x', and labelling them as aquifer 'A'. One difficulty with this approach is that the inversion model result is often considered to be the only model for the data. In reality it is just one image of the subsurface that, given the method and the regularisation imposed in the inversion, agrees with measured data within a given error bound. We have no idea whether the final model realised by the inversion satisfies the global minimum error, or whether it is simply in a local minimum. There is a distribution of inversion models that satisfy the error tolerance condition: the final model is not the only one, nor is it necessarily the correct one. AEM inversions are often linearised in the calculation of the parameter sensitivity: we rely on the second derivatives in the Taylor expansion; thus the minimum model has all layer parameters distributed about their mean value with well-defined variance. We investigate the validity of the minimum model, and its uncertainty, by examining the full posterior covariance matrix. We ask questions of the minimum model, and answer them probabilistically. The simplest question we can pose is "What is the probability that all layer resistivity values are <= a cut-off value?" This can be calculated using the erf or erfc functions. The covariance values of the inversion become marginalised in the integration: only the main diagonal is used. Complications arise when we ask more specific questions, such as "What is the probability that the resistivity of layer 2 <= x, given that layer 1 <= y?" The probability then becomes conditional, the calculation includes covariance terms, the integration is taken over many dimensions, and the cross-correlation of parameters becomes important. To illustrate, we examine the inversion results of a Tempest AEM survey over the Uley Basin aquifers in the Eyre Peninsula, South Australia. Key aquifers include the unconfined Bridgewater Formation that overlies the Uley and Wanilla Formations, which contain Tertiary clays and Tertiary sandstone. These Formations overlie weathered basement, which defines the lower bound of the Uley Basin aquifer systems. By correlating the conductivity of the sub-surface Formation types, we pose questions such as: "What is the probability-depth of the Bridgewater Formation in the Uley South Basin?", "What is the thickness of the Uley Formation?" and "What is the most probable depth to basement?" We use these questions to generate improved conceptual hydrogeological models of the Uley Basin in order to develop better estimates of aquifer extent and the available groundwater resource.
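
    The simplest query described above, the probability that all layer values fall below a cut-off using only the diagonal of the posterior covariance, can be sketched as below. The layer means, standard deviations and cut-off are hypothetical log10-resistivities; the conditional questions mentioned in the abstract would instead require integrating the full multivariate posterior.

      import numpy as np
      from math import erf, sqrt

      def prob_all_below(means, std_devs, cutoff):
          # Marginal (diagonal-only) treatment: each layer parameter is taken
          # as an independent Gaussian, so the joint probability is the product
          # of one-dimensional normal CDFs evaluated with erf.
          z = (cutoff - np.asarray(means)) / np.asarray(std_devs)
          marginals = 0.5 * (1.0 + np.array([erf(zi / sqrt(2.0)) for zi in z]))
          return marginals.prod()

      # Hypothetical three-layer inversion result (log10 ohm.m) with 1-sigma uncertainties.
      p = prob_all_below(means=[1.2, 0.8, 1.5], std_devs=[0.20, 0.15, 0.30], cutoff=1.6)
      print(f"P(all layers <= cutoff) = {p:.3f}")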

  20. The development and validation of the AMPREDICT model for predicting mobility outcome after dysvascular lower extremity amputation.

    PubMed

    Czerniecki, Joseph M; Turner, Aaron P; Williams, Rhonda M; Thompson, Mary Lou; Landry, Greg; Hakimi, Kevin; Speckman, Rebecca; Norvell, Daniel C

    2017-01-01

    The objective of this study was the development of AMPREDICT-Mobility, a tool to predict the probability of independence in either basic or advanced (iBASIC or iADVANCED) mobility 1 year after dysvascular major lower extremity amputation. Two prospective cohort studies during consecutive 4-year periods (2005-2009 and 2010-2014) were conducted at seven medical centers. Multiple demographic and biopsychosocial predictors were collected in the periamputation period among individuals undergoing their first major amputation because of complications of peripheral arterial disease or diabetes. The primary outcomes were iBASIC and iADVANCED mobility, as measured by the Locomotor Capabilities Index. Combined data from both studies were used for model development and internal validation. Backwards stepwise logistic regression was used to develop the final prediction models. The discrimination and calibration of each model were assessed. Internal validity of each model was assessed with bootstrap sampling. Twelve-month follow-up was reached by 157 of 200 (79%) participants. Among these, 54 (34%) did not achieve iBASIC mobility, 103 (66%) achieved at least iBASIC mobility, and 51 (32%) also achieved iADVANCED mobility. Predictive factors associated with reduced odds of achieving iBASIC mobility were increasing age, chronic obstructive pulmonary disease, dialysis, diabetes, prior history of treatment for depression or anxiety, and very poor to fair self-rated health. Those who were white, were married, and had at least a high-school degree had a higher probability of achieving iBASIC mobility. The odds of achieving iBASIC mobility increased with increasing body mass index up to 30 kg/m² and decreased with increasing body mass index thereafter. The prediction model of iADVANCED mobility included the same predictors with the exception of diabetes, chronic obstructive pulmonary disease, and education level. Both models showed strong discrimination with C statistics of 0.85 and 0.82, respectively. The mean difference in predicted probabilities for those who did and did not achieve iBASIC and iADVANCED mobility was 33% and 29%, respectively. Tests for calibration and observed vs predicted plots suggested good fit for both models; however, the precision of the estimates of the predicted probabilities was modest. Internal validation through bootstrapping demonstrated some overoptimism of the original model development, with the optimism-adjusted C statistic for iBASIC and iADVANCED mobility being 0.74 and 0.71, respectively, and the discrimination slope 19% and 16%, respectively. AMPREDICT-Mobility is a user-friendly prediction tool that can inform the patient undergoing a dysvascular amputation and the patient's provider about the probability of independence in either basic or advanced mobility at each major lower extremity amputation level. Copyright © 2016 Society for Vascular Surgery. All rights reserved.
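
    To show how a logistic prediction tool of this kind scores an individual patient, including the piecewise BMI effect described above (odds rising up to 30 kg/m² and falling beyond it), here is a sketch with entirely hypothetical coefficients and predictors; it does not reproduce the published AMPREDICT-Mobility model.

      import numpy as np

      def predicted_probability(features, coefs, intercept):
          # Logistic-regression scoring: p = 1 / (1 + exp(-(b0 + b . x))).
          linear_predictor = intercept + np.dot(coefs, features)
          return 1.0 / (1.0 + np.exp(-linear_predictor))

      def bmi_terms(bmi):
          # Piecewise-linear encoding of the BMI effect: one slope below
          # 30 kg/m^2 and another above it.
          return [min(bmi, 30.0), max(bmi - 30.0, 0.0)]

      # Hypothetical coefficients for: age (years), dialysis (0/1), COPD (0/1),
      # BMI below 30, BMI above 30.
      coefs = np.array([-0.04, -0.90, -0.70, 0.08, -0.10])
      intercept = 2.5

      age, dialysis, copd, bmi = 68, 0, 1, 27
      x = [age, dialysis, copd] + bmi_terms(bmi)
      print(f"P(independent basic mobility) ~ {predicted_probability(x, coefs, intercept):.2f}")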

  1. Photon iso-effective dose for cancer treatment with mixed field radiation based on dose-response assessment from human and an animal model: clinical application to boron neutron capture therapy for head and neck cancer

    NASA Astrophysics Data System (ADS)

    González, S. J.; Pozzi, E. C. C.; Monti Hughes, A.; Provenzano, L.; Koivunoro, H.; Carando, D. G.; Thorp, S. I.; Casal, M. R.; Bortolussi, S.; Trivillin, V. A.; Garabalino, M. A.; Curotto, P.; Heber, E. M.; Santa Cruz, G. A.; Kankaanranta, L.; Joensuu, H.; Schwint, A. E.

    2017-10-01

    Boron neutron capture therapy (BNCT) is a treatment modality that combines different radiation qualities. Since the severity of biological damage following irradiation depends on the radiation type, a quantity different from absorbed dose is required to explain the effects observed in clinical BNCT in terms of outcome compared with conventional photon radiation therapy. A new approach for calculating photon iso-effective doses in BNCT was introduced previously. The present work extends this model to include information from dose-response assessments in animal models and humans. Parameters of the model were determined for tumour and precancerous tissue using dose-response curves obtained from BNCT and photon studies performed in the hamster cheek pouch in vivo models of oral cancer and/or pre-cancer, and from head and neck cancer radiotherapy data with photons. To this end, suitable expressions of the dose-limiting Normal Tissue Complication and Tumour Control Probabilities for the reference radiation and for the mixed field BNCT radiation were developed. Pearson’s correlation coefficients and p-values showed that TCP and NTCP models agreed with experimental data (with r > 0.87 and p-values > 0.57). The photon iso-effective dose model was applied retrospectively to evaluate the dosimetry in tumours and mucosa for head and neck cancer patients treated with BNCT in Finland. Photon iso-effective doses in tumour were lower than those obtained with the standard RBE-weighted model (by between 10% and 45%). The results also suggested that the probabilities of tumour control derived from photon iso-effective doses are more adequate to explain the clinical responses than those obtained with the RBE-weighted values. The dosimetry in the mucosa revealed that the photon iso-effective doses were about 30% to 50% higher than the corresponding RBE-weighted values. While the RBE-weighted doses are unable to predict mucosa toxicity, predictions based on the proposed model are compatible with the observed clinical outcome. The extension of the photon iso-effective dose model has allowed, for the first time, the determination of the photon iso-effective dose for unacceptable complications in the dose-limiting normal tissue. Finally, the formalism developed in this work to compute photon-equivalent doses can be applied to other therapies that combine mixed radiation fields, such as hadron therapy.

  2. Photon iso-effective dose for cancer treatment with mixed field radiation based on dose-response assessment from human and an animal model: clinical application to boron neutron capture therapy for head and neck cancer.

    PubMed

    González, S J; Pozzi, E C C; Monti Hughes, A; Provenzano, L; Koivunoro, H; Carando, D G; Thorp, S I; Casal, M R; Bortolussi, S; Trivillin, V A; Garabalino, M A; Curotto, P; Heber, E M; Santa Cruz, G A; Kankaanranta, L; Joensuu, H; Schwint, A E

    2017-10-03

    Boron neutron capture therapy (BNCT) is a treatment modality that combines different radiation qualities. Since the severity of biological damage following irradiation depends on the radiation type, a quantity different from absorbed dose is required to explain the effects observed in clinical BNCT in terms of outcome compared with conventional photon radiation therapy. A new approach for calculating photon iso-effective doses in BNCT was introduced previously. The present work extends this model to include information from dose-response assessments in animal models and humans. Parameters of the model were determined for tumour and precancerous tissue using dose-response curves obtained from BNCT and photon studies performed in the hamster cheek pouch in vivo models of oral cancer and/or pre-cancer, and from head and neck cancer radiotherapy data with photons. To this end, suitable expressions of the dose-limiting Normal Tissue Complication and Tumour Control Probabilities for the reference radiation and for the mixed field BNCT radiation were developed. Pearson's correlation coefficients and p-values showed that TCP and NTCP models agreed with experimental data (with r > 0.87 and p-values > 0.57). The photon iso-effective dose model was applied retrospectively to evaluate the dosimetry in tumours and mucosa for head and neck cancer patients treated with BNCT in Finland. Photon iso-effective doses in tumour were lower than those obtained with the standard RBE-weighted model (by between 10% and 45%). The results also suggested that the probabilities of tumour control derived from photon iso-effective doses are more adequate to explain the clinical responses than those obtained with the RBE-weighted values. The dosimetry in the mucosa revealed that the photon iso-effective doses were about 30% to 50% higher than the corresponding RBE-weighted values. While the RBE-weighted doses are unable to predict mucosa toxicity, predictions based on the proposed model are compatible with the observed clinical outcome. The extension of the photon iso-effective dose model has allowed, for the first time, the determination of the photon iso-effective dose for unacceptable complications in the dose-limiting normal tissue. Finally, the formalism developed in this work to compute photon-equivalent doses can be applied to other therapies that combine mixed radiation fields, such as hadron therapy.

  3. Hydronephrosis Resulting from Bilateral Ureteral Stenosis: A Late Complication of Polyoma BK Virus Cystitis?

    PubMed

    Basara, N; Rasche, F-M; Schwalenberg, T; Wickenhauser, C; Maier, M; Ivovic, J; Niederwieser, D; Lindner, T H

    2010-01-01

    We report here a case of acute lymphoblastic leukemia in remission presenting with late-onset bilateral hydronephrosis, probably due to polyoma BK virus-induced proliferation of the bladder endothelium at both ostia. The diagnosis was made virologically by BK virus polymerase chain reaction (PCR) detection in the absence of any other bladder disease. Awareness of this late complication is necessary not only in patients after renal transplantation but also in patients after hematopoietic stem cell transplantation from a matched unrelated donor.

  4. Hydronephrosis Resulting from Bilateral Ureteral Stenosis: A Late Complication of Polyoma BK Virus Cystitis?

    PubMed Central

    Basara, N.; Rasche, F.-M.; Schwalenberg, T.; Wickenhauser, C.; Maier, M.; Ivovic, J.; Niederwieser, D.; Lindner, T. H.

    2010-01-01

    We report here a case of acute lymphoblastic leukemia in remission presenting with late-onset bilateral hydronephrosis, probably due to polyoma BK virus-induced proliferation of the bladder endothelium at both ostia. The diagnosis was made virologically by BK virus polymerase chain reaction (PCR) detection in the absence of any other bladder disease. Awareness of this late complication is necessary not only in patients after renal transplantation but also in patients after hematopoietic stem cell transplantation from a matched unrelated donor. PMID:20936157

  5. Pyroclastic density current hazard maps at Campi Flegrei caldera (Italy): the effects of event scale, vent location and time forecasts.

    NASA Astrophysics Data System (ADS)

    Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina

    2016-04-01

    Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping of background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty on the eruptive record, affecting the timing of past events, the location of vents and the estimates of PDC areal extent. First probability maps of PDC invasion were produced combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced by using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation tested with 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis allowed us to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. By quantifying some sources of uncertainty in relation to the system, we were also able to provide mean and percentile maps of PDC hazard levels.
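
    The Monte Carlo assembly of an invasion-probability map of this kind can be sketched as follows. The grid, the vent-opening weights and the lognormal runout distribution are toy stand-ins for the study's vent-opening map and box-model inundation calculation, and a flat circular footprint replaces flow over real topography.

      import numpy as np

      rng = np.random.default_rng(42)

      # Toy 50 x 50 grid over the caldera (coordinates in km, purely illustrative).
      nx = ny = 50
      xs, ys = np.meshgrid(np.linspace(0.0, 15.0, nx), np.linspace(0.0, 15.0, ny))

      # Hypothetical vent-opening probability map, weighted towards one sector.
      vent_weights = np.exp(-((xs - 10.0) ** 2 + (ys - 7.0) ** 2) / 20.0)
      vent_weights /= vent_weights.sum()

      def sample_runout_km():
          # A lognormal runout distance stands in for the scale-dependent
          # box-model runout used in the actual hazard assessment.
          return rng.lognormal(mean=np.log(3.0), sigma=0.5)

      n_sims = 5000
      hits = np.zeros((ny, nx))
      cell_index = np.arange(nx * ny)

      for _ in range(n_sims):
          k = rng.choice(cell_index, p=vent_weights.ravel())     # sample a vent cell
          vx, vy = xs.ravel()[k], ys.ravel()[k]
          runout = sample_runout_km()
          invaded = (xs - vx) ** 2 + (ys - vy) ** 2 <= runout ** 2
          hits += invaded                                        # circular invasion footprint

      invasion_probability = hits / n_sims   # per-cell probability of PDC invasion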

  6. The Anatomy of American Football: Evidence from 7 Years of NFL Game Data

    PubMed Central

    Papalexakis, Evangelos

    2016-01-01

    How much does a fumble affect the probability of winning an American football game? How balanced should your offense be in order to increase the probability of winning by 10%? These are questions for which the coaching staff of National Football League teams have a clear qualitative answer. Turnovers are costly; turn the ball over several times and you will certainly lose. Nevertheless, what does “several” mean? How “certain” is certainly? In this study, we collected play-by-play data from the past 7 NFL seasons, i.e., 2009–2015, and we built a descriptive model for the probability of winning a game. Despite the fact that our model incorporates simple box score statistics, such as total offensive yards, number of turnovers etc., its overall cross-validation accuracy is 84%. Furthermore, we combined this descriptive model with a statistical bootstrap module to build FPM (short for Football Prediction Matchup) for predicting future match-ups. The contribution of FPM lies in its simplicity and transparency, which however do not sacrifice the system’s performance. In particular, our evaluations indicate that our prediction engine performs on par with the current state-of-the-art systems (e.g., ESPN’s FPI and Microsoft’s Cortana). The latter are typically proprietary, but based on their publicly described components they are significantly more complicated than FPM. Moreover, their proprietary nature does not allow for a head-to-head comparison in terms of the core elements of the systems, but it should be evident that the features incorporated in FPM are able to capture a large percentage of the observed variance in NFL games. PMID:28005971
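
    A toy version of such a descriptive win-probability model, fitted with scikit-learn on a handful of invented box-score lines; the features, data and effect sizes are placeholders and do not reproduce FPM, which is trained on seven seasons of play-by-play-derived statistics.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Invented per-game features: [total offensive yards, turnovers, pass share];
      # labels are 1 for a win, 0 for a loss.
      X = np.array([[420, 0, 0.55], [310, 3, 0.70], [380, 1, 0.50],
                    [260, 2, 0.65], [450, 1, 0.60], [300, 4, 0.45]])
      y = np.array([1, 0, 1, 0, 1, 0])

      model = LogisticRegression(max_iter=1000).fit(X, y)

      # Marginal effect of one extra turnover on an otherwise average stat line.
      baseline = X.mean(axis=0)
      with_turnover = baseline.copy()
      with_turnover[1] += 1
      p0, p1 = model.predict_proba(np.vstack([baseline, with_turnover]))[:, 1]
      print(f"Win probability: {p0:.2f} -> {p1:.2f} with one extra turnover")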

  7. Effectiveness of early detection on breast cancer mortality reduction in Catalonia (Spain)

    PubMed Central

    2009-01-01

    Background At present, it is complicated to use screening trials to determine the optimal age intervals and periodicities of breast cancer early detection. Mathematical models are an alternative that has been widely used. The aim of this study was to estimate the effect of different breast cancer early detection strategies in Catalonia (Spain), in terms of breast cancer mortality reduction (MR) and years of life gained (YLG), using the stochastic models developed by Lee and Zelen (LZ). Methods We used the LZ model to estimate the cumulative probability of death for a cohort exposed to different screening strategies after T years of follow-up. We also obtained the cumulative probability of death for a cohort with no screening. These probabilities were used to estimate the possible breast cancer MR and YLG by age, period and cohort of birth. The inputs of the model were: incidence of, mortality from and survival after breast cancer, mortality from other causes, distribution of breast cancer stages at diagnosis and sensitivity of mammography. The outputs were relative breast cancer MR and YLG. Results Relative breast cancer MR varied from 20% for biennial exams in the 50 to 69 age interval to 30% for annual exams in the 40 to 74 age interval. When strategies differ in periodicity but not in the age interval of exams, biennial screening achieved almost 80% of the annual screening MR. In contrast to MR, the effect on YLG of extending screening from 69 to 74 years of age was smaller than the effect of extending the screening from 50 to 45 or 40 years. Conclusion In this study we have obtained a measure of the effect of breast cancer screening in terms of mortality and years of life gained. The Lee and Zelen mathematical models have been very useful for assessing the impact of different modalities of early detection on MR and YLG in Catalonia (Spain). PMID:19754959

  8. MO-G-304-01: FEATURED PRESENTATION: Expanding the Knowledge Base for Data-Driven Treatment Planning: Incorporating Patient Outcome Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, SP; Quon, H; Cheng, Z

    2015-06-15

    Purpose: To extend the capabilities of knowledge-based treatment planning beyond simple dose queries by incorporating validated patient outcome models. Methods: From an analytic, relational database of 684 head and neck cancer patients, 372 patients were identified having dose data for both left and right parotid glands as well as baseline and follow-up xerostomia assessments. For each existing patient, knowledge-based treatment planning was simulated by querying the dose-volume histograms and geometric shape relationships (overlap volume histograms) for all other patients. Dose predictions were captured at normalized volume thresholds (NVT) of 0%, 10%, 20%, 30%, 40%, 50%, and 85% and were compared with the actual achieved doses using the Wilcoxon signed-rank test. Next, a logistic regression model was used to predict the maximum severity of xerostomia up to three months following radiotherapy. Baseline xerostomia scores were subtracted from follow-up assessments and were also included in the model. The relative risks from predicted doses and actual doses were computed and compared. Results: The predicted doses for both parotid glands were significantly less than the achieved doses (p < 0.0001), with differences ranging from 830 cGy ± 1270 cGy (0% NVT) to 1673 cGy ± 1197 cGy (30% NVT). The modelled risk of xerostomia ranged from 54% to 64% for achieved doses and from 33% to 51% for the dose predictions. Relative risks varied from 1.24 to 1.87, with maximum relative risk occurring at 85% NVT. Conclusions: Data-driven generation of treatment planning objectives without consideration of the underlying normal tissue complication probability may result in inferior plans, even if quality metrics indicate otherwise. Inclusion of complication models in knowledge-based treatment planning is necessary in order to close the feedback loop between radiotherapy treatments and patient outcomes. Future work includes advancing and validating complication models in the context of knowledge-based treatment planning. This work is supported by Philips Radiation Oncology Systems.

  9. Network anomaly detection system with optimized DS evidence theory.

    PubMed

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    With the rapid development of computer networks, network anomaly detection has attracted increasing attention. Some researchers have utilized fusion methods and DS evidence theory for network anomaly detection but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy. RBPA employs the sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly.
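
    A sketch of the core mechanics, written on a two-element frame {normal, anomaly} and assuming the per-sensor weights act as classical reliability discounting applied before Dempster's rule; the paper's exact weighting and RBPA construction may differ, and all masses and weights below are illustrative.

      from itertools import product

      FRAME = frozenset({"normal", "anomaly"})
      N, A = frozenset({"normal"}), frozenset({"anomaly"})

      def discount(bpa, weight):
          # Reliability discounting: scale every focal element by the sensor's
          # weight and move the remaining mass to total ignorance (the frame).
          out = {fs: weight * m for fs, m in bpa.items() if fs != FRAME}
          out[FRAME] = 1.0 - weight + weight * bpa.get(FRAME, 0.0)
          return out

      def combine(m1, m2):
          # Dempster's rule of combination for two basic probability assignments.
          combined, conflict = {}, 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb
          return {fs: m / (1.0 - conflict) for fs, m in combined.items()}

      # Two sensors' BPAs; weights reflect past prediction accuracy (illustrative).
      sensor1 = discount({A: 0.7, N: 0.2, FRAME: 0.1}, weight=0.9)
      sensor2 = discount({A: 0.4, N: 0.4, FRAME: 0.2}, weight=0.6)

      fused = combine(sensor1, sensor2)
      print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})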

  10. Network Anomaly Detection System with Optimized DS Evidence Theory

    PubMed Central

    Liu, Yuan; Wang, Xiaofeng; Liu, Kaiyu

    2014-01-01

    With the rapid development of computer networks, network anomaly detection has attracted increasing attention. Some researchers have utilized fusion methods and DS evidence theory for network anomaly detection but with low performance, and they did not consider the complicated and varied features of networks. To achieve a high detection rate, we present a novel network anomaly detection system with optimized Dempster-Shafer evidence theory (ODS) and a regression basic probability assignment (RBPA) function. In this model, we add weights for each sensor to optimize DS evidence theory according to its previous prediction accuracy. RBPA employs the sensor's regression ability to address complex networks. Through four kinds of experiments, we find that our novel network anomaly detection model has a better detection rate, and that the RBPA and ODS optimization methods can improve system performance significantly. PMID:25254258

  11. Large-cell Monte Carlo renormalization of irreversible growth processes

    NASA Technical Reports Server (NTRS)

    Nakanishi, H.; Family, F.

    1985-01-01

    Monte Carlo sampling is applied to a recently formulated direct-cell renormalization method for irreversible, disorderly growth processes. Large-cell Monte Carlo renormalization is carried out for various nonequilibrium problems based on the formulation dealing with relative probabilities. Specifically, the method is demonstrated by application to the 'true' self-avoiding walk and the Eden model of growing animals for d = 2, 3, and 4 and to the invasion percolation problem for d = 2 and 3. The results are asymptotically in agreement with expectations; however, unexpected complications arise, suggesting the possibility of crossovers, and in any case demonstrating the danger of using small cells alone, because of the very slow convergence as the cell size b is extrapolated to infinity. The difficulty of applying the present method to the diffusion-limited-aggregation model is commented on.

  12. Complications of vessel architecture and the reason that cylindrical electrodes are generally not effective

    NASA Astrophysics Data System (ADS)

    Pearce, John; Thomsen, Sharon

    2017-02-01

    Large vessels can be reliably sealed with radio frequency current. High apposition pressures are necessary to ensure a high probability of a successful seal. However, the complex architecture of the vessels, particularly arteries, means that results can vary substantially even with similar thermal histories. The relative volume fractions and spatial distributions of collagen, elastin, and smooth muscle dominate the vessel function in vivo and can even vary from proximal to distal locations in the same vessel. We begin by reviewing the architectural features characteristic of porcine and canine large vessels and conclude with an experimental and numerical modeling demonstration of the reasons why cylindrical electrodes are a sub-optimal choice.

  13. Esophageal Perforation After Transesophageal Echocardiogram.

    PubMed

    Shapira, Michael Y.; Hirshberg, Boaz; Agid, Ronit; Zuckerman, Elena; Caraco, Yoseph

    1999-02-01

    Esophageal rupture after transesophageal echocardiogram (TEE) is a rare but life-threatening complication. Risk factors for perforation include spasm or hypertrophy of the cricopharyngeal sphincter, cervical arthritis, forward and left lateral bending of the distal esophagus, and esophageal disease such as inflammation or neoplasm. We present the case of an 80-year-old woman who developed perforation of her esophagus after TEE. Prior irradiation to the chest for the treatment of breast cancer and subsequent fibrosis probably contributed to this complication. Physicians referring patients for a TEE and physicians performing this procedure should be aware of the risk of perforation. The identification of risk factors and gentle maneuvering of the probe may prevent this severe, life-threatening complication.

  14. A geometric model for evaluating the effects of inter-fraction rectal motion during prostate radiotherapy

    NASA Astrophysics Data System (ADS)

    Pavel-Mititean, Luciana M.; Rowbottom, Carl G.; Hector, Charlotte L.; Partridge, Mike; Bortfeld, Thomas; Schlegel, Wolfgang

    2004-06-01

    A geometric model is presented which allows calculation of the dosimetric consequences of rectal motion in prostate radiotherapy. Variations in the position of the rectum are measured by repeat CT scanning during the courses of treatment of five patients. Dose distributions are calculated by applying the same conformal treatment plan to each imaged fraction and rectal dose-surface histograms produced. The 2D model allows isotropic expansion and contraction in the plane of each CT slice. By summing the dose to specific volume elements tracked by the model, composite dose distributions are produced that explicitly include measured inter-fraction motion for each patient. These are then used to estimate effective dose-surface histograms (DSHs) for the entire treatment. Results are presented showing the magnitudes of the measured target and rectal motion and showing the effects of this motion on the integral dose to the rectum. The possibility of using such information to calculate normal tissue complication probabilities (NTCP) is demonstrated and discussed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Shu-Xia; Zhang, Yu-Ru; Research Group PLASMANT, Department of Chemistry, University of Antwerp, Universiteitsplein 1, B-2610 Antwerp

    A hybrid model is used to investigate the fragmentation of C4F8 inductive discharges. Indeed, the resulting reactive species are crucial for the optimization of the Si-based etching process, since they determine the mechanisms of fluorination, polymerization, and sputtering. In this paper, we present the dissociation degree, the density ratio of F vs. CxFy (i.e., fluorocarbon (fc) neutrals), the neutral vs. positive ion density ratio, details on the neutral and ion components, and fractions of various fc neutrals (or ions) in the total fc neutral (or ion) density in a C4F8 inductively coupled plasma source, as well as the effect of pressure and power on these results. To analyze the fragmentation behavior, the electron density and temperature and electron energy probability function (EEPF) are investigated. Moreover, the main electron-impact generation sources for all considered neutrals and ions are determined from the complicated C4F8 reaction set used in the model. The C4F8 plasma fragmentation is explained, taking into account many factors, such as the EEPF characteristics, the dominance of primary and secondary processes, and the thresholds of dissociation and ionization. The simulation results are compared with experiments from literature, and reasonable agreement is obtained. Some discrepancies are observed, which can probably be attributed to the simplified polymer surface kinetics assumed in the model.

  16. A Representation for Gaining Insight into Clinical Decision Models

    PubMed Central

    Jimison, Holly B.

    1988-01-01

    For many medical domains uncertainty and patient preferences are important components of decision making. Decision theory is useful as a representation for such medical models in computer decision aids, but the methodology has typically had poor performance in the areas of explanation and user interface. The additional representation of probabilities and utilities as random variables serves to provide a framework for graphical and text insight into complicated decision models. The approach allows for efficient customization of a generic model that describes the general patient population of interest to a patient-specific model. Monte Carlo simulation is used to calculate the expected value of information and sensitivity for each model variable, thus providing a metric for deciding what to emphasize in the graphics and text summary. The computer-generated explanation includes variables that are sensitive with respect to the decision or that deviate significantly from what is typically observed. These techniques serve to keep the assessment and explanation of the patient's decision model concise, allowing the user to focus on the most important aspects for that patient.
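
    A minimal sketch of the representation idea: probabilities and utilities are treated as random variables, Monte Carlo draws estimate how often one option is preferred, and a crude per-variable sensitivity metric indicates what an explanation should emphasise. The decision tree, distributions and metric are hypothetical simplifications, not the paper's system.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 20_000

      # Model inputs as random variables rather than point estimates (illustrative).
      p_complication = rng.beta(4, 16, n)           # probability of a treatment complication
      u_cured        = rng.normal(0.95, 0.02, n)    # utility of uncomplicated cure
      u_complication = rng.normal(0.60, 0.10, n)    # utility after a complication
      u_no_treat     = rng.normal(0.75, 0.05, n)    # utility of conservative management

      # Expected utility of treating vs. not treating, per Monte Carlo draw.
      eu_treat = (1.0 - p_complication) * u_cured + p_complication * u_complication
      delta = eu_treat - u_no_treat

      print(f"P(treatment preferred) = {(delta > 0).mean():.3f}")

      # Inputs most correlated with the decision margin are the ones to highlight.
      inputs = {"p_complication": p_complication, "u_cured": u_cured,
                "u_complication": u_complication, "u_no_treat": u_no_treat}
      for name, values in inputs.items():
          print(f"{name:15s} corr with margin: {np.corrcoef(values, delta)[0, 1]:+.2f}")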

  17. Postoperative complications of contemporary open and robot-assisted laparoscopic radical prostatectomy using standardized reporting systems.

    PubMed

    Pompe, Raisa S; Beyer, Burkhard; Haese, Alexander; Preisser, Felix; Michl, Uwe; Steuber, Thomas; Graefen, Markus; Huland, Hartwig; Karakiewicz, Pierre I; Tilki, Derya

    2018-05-04

    To analyze time trends and contemporary rates of postoperative complications after RP and to compare the complication profile of ORP and RALP using standardized reporting systems. Retrospective analysis of 13,924 RP patients in a single institution (2005 to 2015). Complications were collected during the hospital stay and via a standardized questionnaire 3 months after surgery, and grouped into eight schemes. Since 2013, the revised Clavien-Dindo classification was used (n = 4,379). Annual incidence rates of different complications were graphically displayed. Multivariable logistic regression analyses compared complications between ORP and RALP after inverse probability of treatment weighting (IPTW). After the introduction of standardized classification systems, complication rates have increased, with a contemporary rate of 20.6% (2013-2015). While minor Clavien-Dindo grades represented the majority (I: 10.6%; II: 7.9%), severe complications (grades IV-V) were rare (<1%). In logistic regression analyses after IPTW, RALP was associated with less blood loss, shorter catheterization time and lower risk for Clavien-Dindo grade II and III complications. Our results emphasize the importance of standardized reporting systems for quality control and comparison across approaches or institutions. Contemporary complication rates in a high volume center remain low and are most frequently minor Clavien-Dindo grades. RALP had a slightly better complication profile compared to ORP. This article is protected by copyright. All rights reserved.

  18. [Spontaneous hemothorax: a rare complication of neurofibromatosis type 1].

    PubMed

    Fdil, Soumia; Bouchikhi, Saad; Bourkadi, Jamal-Eddine

    2017-01-01

    Neurofibromatosis type 1 (NF1), also known as Von Recklinghausen's disease, is an autosomal dominant genetic disorder. It is the most common of the phacomatoses. Pulmonary complications have rarely been described in the literature. Vascular complications have been reported in 3.6% of patients. We here report the case of a 38-year-old female patient, followed up for neurofibromatosis type 1, admitted to the Emergency Department with hemorrhagic shock. Clinical examination showed several café-au-lait spots, many plexiform neurofibromas, and a left-sided pleural effusion syndrome. Pleural puncture revealed coagulable haemorrhagic fluid. The patient received transfusion and emergency chest drainage. The patient's assessment was completed by CT angiography, which showed no pulmonary embolism or other associated lesions. Spontaneous hemothorax is a rare and severe complication of neurofibromatosis. It is probably due to vascular injury caused by this disease.

  19. [Gastroplasty: complications and their prevention].

    PubMed

    Schlienger, J L; Meyer, L; Rohr, S; Pradignac, A; Perrin, A E; Meyer, C; Simon, C

    2003-02-01

    Bariatric surgery is now frequently proposed for the treatment of morbid or complicated obesity since the introduction of minimally invasive laparoscopic anti-obesity operations such as adjustable silicone gastric banding gastroplasty. However, this reversible procedure is not always as safe as presumed, and the results in terms of weight loss may sometimes be disappointing. Side effects are common, and early or late complications occurred in more than 20% of patients. They are favoured by postoperative eating disorders. Nutritional consequences are probably underestimated and are not limited to uncomfortable digestive symptoms. Some micronutrient deficiencies have been described. Worsening of previous eating disorders or psychosocial abnormalities is not uncommon. Gastroplasty is not a harmless procedure. Careful patient selection, regular follow-up, nutritional advice and psychosocial management by a multidisciplinary team are required to reduce complications after gastroplasty.

  20. Radiation-induced complications in prostate cancer patients treated with radiotherapy

    NASA Astrophysics Data System (ADS)

    Azuddin, A. Yusof; Rahman, I. Abdul; Siah, N. J.; Mohamed, F.; Saadc, M.; Ismail, F.

    2014-09-01

    The purpose of the study was to determine the relationship between radiation-induced complications and dosimetric and radiobiological parameters for prostate cancer patients who underwent conformal radiotherapy treatment. 17 prostate cancer patients treated with conformal radiotherapy were retrospectively analysed. The dosimetric data were retrieved in the form of dose-volume histograms (DVH) from the radiotherapy treatment planning system. The DVH was used to derive the Normal Tissue Complication Probability (NTCP) as the radiobiological parameter. Follow-up data from medical records were used to grade the occurrence of acute gastrointestinal (GI) and genitourinary (GU) complications using the Radiation Therapy Oncology Group (RTOG) scoring system. The chi-square test was used to determine the relationship between radiation-induced complications and the dosimetric and radiobiological parameters. 8 (47%) and 7 (41%) patients had acute GI and GU complications, respectively. Acute GI complications were associated with V60rectum, rectal mean dose and NTCPrectum, with p-values of 0.016, 0.038 and 0.049, respectively. There were no significant relationships between acute GU complications and the dosimetric and radiobiological variables. Further study can be done by increasing the sample size and follow-up duration for a deeper understanding of the factors affecting GU and GI complications in prostate cancer radiotherapy.
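
    The chi-square association described above can be reproduced in outline as follows; the 2x2 counts (complication vs. a V60 cut-off) are invented for illustration and are not the study data.

      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: V60(rectum) above / below a chosen cut-off.
      # Columns: acute GI complication yes / no (hypothetical counts).
      table = np.array([[6, 2],
                        [2, 7]])

      chi2, p_value, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")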

  1. Open EFTs, IR effects & late-time resummations: systematic corrections in stochastic inflation

    DOE PAGES

    Burgess, C. P.; Holman, R.; Tasinato, G.

    2016-01-26

    Though simple inflationary models describe the CMB well, their corrections are often plagued by infrared effects that obstruct a reliable calculation of late-time behaviour. Here we adapt to cosmology tools designed to address similar issues in other physical systems with the goal of making reliable late-time inflationary predictions. The main such tool is Open EFTs which reduce in the inflationary case to Stochastic Inflation plus calculable corrections. We apply this to a simple inflationary model that is complicated enough to have dangerous IR behaviour yet simple enough to allow the inference of late-time behaviour. We find corrections to standard Stochastic Inflationary predictions for the noise and drift, and we find these corrections ensure the IR finiteness of both these quantities. The late-time probability distribution, P(Φ), for super-Hubble field fluctuations is obtained as a function of the noise and drift, and so it too is IR finite. We compare our results to other methods (such as large-N models) and find they agree when these models are reliable. In all cases we can explore in detail, we find IR secular effects describe the slow accumulation of small perturbations to give a big effect: a significant distortion of the late-time probability distribution for the field. But the energy density associated with this is only of order H^4 at late times and so does not generate a dramatic gravitational back-reaction.

  2. Open EFTs, IR effects & late-time resummations: systematic corrections in stochastic inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgess, C. P.; Holman, R.; Tasinato, G.

    Though simple inflationary models describe the CMB well, their corrections are often plagued by infrared effects that obstruct a reliable calculation of late-time behaviour. Here we adapt to cosmology tools designed to address similar issues in other physical systems with the goal of making reliable late-time inflationary predictions. The main such tool is Open EFTs which reduce in the inflationary case to Stochastic Inflation plus calculable corrections. We apply this to a simple inflationary model that is complicated enough to have dangerous IR behaviour yet simple enough to allow the inference of late-time behaviour. We find corrections to standard Stochastic Inflationary predictions for the noise and drift, and we find these corrections ensure the IR finiteness of both these quantities. The late-time probability distribution, P(Φ), for super-Hubble field fluctuations is obtained as a function of the noise and drift, and so it too is IR finite. We compare our results to other methods (such as large-N models) and find they agree when these models are reliable. In all cases we can explore in detail, we find IR secular effects describe the slow accumulation of small perturbations to give a big effect: a significant distortion of the late-time probability distribution for the field. But the energy density associated with this is only of order H^4 at late times and so does not generate a dramatic gravitational back-reaction.

  3. Breeding for polled dairy cows versus dehorning: Preliminary cost assessments & discussion

    USDA-ARS?s Scientific Manuscript database

    Dairy producers today face costs associated with dehorning heifers, including labor, equipment, and medications. Further, costs associated with decreased feed intake immediately following dehorning and possible complications requiring a veterinarian or antibiotics occur with some probability. The ob...

  4. Complications of stent placement in patients with esophageal cancer: A systematic review and network meta-analysis

    PubMed Central

    Doosti-Irani, Amin; Mansournia, Mohammad Ali; Rahimi-Foroushani, Abbas; Haddad, Peiman

    2017-01-01

    Background Palliative treatments and stents are necessary for relieving dysphagia in patients with esophageal cancer. The aim of this study was to simultaneously compare available treatments in terms of complications. Methods Web of Science, Medline, Scopus, Cochrane Library and Embase were searched. Statistical heterogeneity was assessed using the chi-squared test and was quantified by I². The results of this study were summarized in terms of Risk Ratio (RR). The random effects model was used to report the results. The rank probability for each treatment was calculated using the p-score. Results Out of 17855 references, 24 RCTs reported complications including treatment related death (TRD), bleeding, stent migration, aspiration, severe pain and fistula formation. In the ranking of treatments, thermal ablative therapy (p-score = 0.82), covered Evolution® stent (p-score = 0.70), brachytherapy (p-score = 0.72) and antireflux stent (p-score = 0.74) were better treatments in the network of TRD. Thermal ablative therapy (p-score = 0.86), the conventional stent (p-score = 0.62), covered Evolution® stent (p-score = 0.96) and brachytherapy (p-score = 0.82) were better treatments in the network of bleeding complications. Covered Evolution® (p-score = 0.78), uncovered (p-score = 0.88) and irradiation stents (p-score = 0.65) were better treatments in the network of stent migration complications. In the network of severe pain, conventional self-expandable nitinol alloy covered stents (p-score = 0.73), Polyflex (p-score = 0.79), latex prostheses (p-score = 0.96) and brachytherapy (p-score = 0.65) were better treatments. Conclusion According to our results, thermal ablative therapy, covered Evolution® stents, brachytherapy, and antireflux stents are associated with a lower risk of TRD. Moreover, thermal ablative therapy, conventional, covered Evolution® and brachytherapy had lower risks of bleeding. Overall, fewer complications were associated with covered Evolution® stent and brachytherapy. PMID:28968416

  5. Association between severity of untreated sleep apnoea and postoperative complications following major cardiac surgery: a prospective observational cohort study.

    PubMed

    Mason, Martina; Hernández Sánchez, Jules; Vuylsteke, Alain; Smith, Ian

    2017-09-01

    To examine whether untreated sleep apnoea is associated with prolonged Intensive Care Unit (ICU) stay and increased frequency of postoperative ICU complications in patients undergoing major cardiac surgery. Adult patients undergoing elective coronary artery bypass grafting with or without cardiac valve surgery between March 2013 and July 2014 were considered. We excluded patients participating in other interventional studies, those who had a tracheostomy before surgery, required emergency surgery or were due to be admitted on the day of surgery. Patients underwent inpatient overnight oximetry on the night prior to their surgery to assess for the presence of sleep apnoea. Since oximetry alone cannot differentiate obstructive from central apnoea, the results are reported as sleep apnoea, which was diagnosed in patients with an arterial oxygen desaturation index (ODI) ≥ 5/h. The primary outcome measure was length of stay (LoS) in ICU in days. The secondary outcome was a composite measure of postoperative complications in ICU. Multivariate models were developed to assess associations between ODI and the primary and secondary outcome measures, adjusting for preselected predictor variables. There was no significant association between ODI and ICU LoS, HR 1.0, 95% CI 0.99-1.02; p = 0.12. However, we did find a significant association between ODI and postoperative complications in the ICU, OR = 1.1; 95% CI 1.02-1.17; p = 0.014. The probability of developing complications rose with higher ODI, reflecting sleep apnoea severity. Acknowledging the limitations of this prospective study, untreated sleep apnoea did not predict an increased length of stay in ICU but we do report an association with postoperative complications in patients undergoing major cardiac surgery. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Economic impact of angioplasty salvage techniques, with an emphasis on coronary stents: a method incorporating costs, revenues, clinical effectiveness and payer mix.

    PubMed

    Vaitkus, P T; Witmer, W T; Brandenburg, R G; Wells, S K; Zehnacker, J B

    1997-10-01

    We sought to broaden assessment of the economic impact of percutaneous transluminal coronary angioplasty (PTCA) revascularization salvage strategies by taking into account costs, revenues, the off-setting effects of prevented clinical complications and the effects of payer mix. Previous economic analyses of PTCA have focused on the direct costs of treatment but have not accounted either for associated revenues or for the ability of costly salvage techniques such as coronary stenting to reduce even costlier complications. Procedural costs, revenues and contribution margins (i.e., "profit") were measured for 765 consecutive PTCA cases to assess the economic impact of salvage techniques (prolonged heparin administration, thrombolysis, intracoronary stenting or use of perfusion balloon catheters) and clinical complications (myocardial infarction, coronary artery bypass graft surgery [CABG] or acute vessel closure with repeat PTCA). To assess the economic impact of various salvage techniques for failed PTCA, we used actual 1995 financial data as well as models of various mixes of fee-for-service, diagnosis-related group (DRG) and capitated payers. Under fee-for-service arrangements, most salvage techniques were profitable for the hospital. Stents were profitable at almost any level of clinical effectiveness. Under DRG-based systems, most salvage techniques such as stenting produced a financial loss to the hospital because one complication (CABG) remained profitable. Under capitated arrangements, stenting and other salvage modalities were profitable only if they were clinically effective in preventing complications in > 50% of cases in which they were used. The economic impact of PTCA salvage techniques depends on their clinical effectiveness, costs and revenues. In reimbursement systems dominated by DRG payers, salvage techniques are not rewarded, whereas complications are. Under capitated systems, the level of clinical effectiveness needed to achieve cost savings is probably not achievable in current practice. Further studies are needed to define equitable reimbursement schedules that will promote clinically effective practice.

  7. Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?

    PubMed

    Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian

    2016-09-01

    The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainties. TCP and NTCP were computed based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% led to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80 Gy EQD2. TCP is robust to dosimetric uncertainties when the dose prescription is in the flatter region of the dose-response curve at doses >75 Gy. For OARs, improved clinical outcome is expected from a reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
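
    The simulation idea can be sketched as follows: evaluate a sigmoid dose-response at doses perturbed by systematic (per-patient) and random (per-fraction) errors, then average over a large simulated cohort. The dose-response form and the D50/slope values in the sketch are illustrative assumptions, not the models fitted in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def response(dose, d50, gamma50):
        # Logistic dose-response; gamma50 is the normalised slope at D50.
        return 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - dose / d50)))

    def mean_response(planned_dose, n_fractions, sys_sd, rand_sd,
                      d50=75.0, gamma50=2.0, n_patients=100_000):
        sys_err = rng.normal(0.0, sys_sd, n_patients)                       # one offset per patient
        rand_err = rng.normal(0.0, rand_sd, (n_patients, n_fractions)).mean(axis=1)
        delivered = planned_dose * (1.0 + sys_err + rand_err)
        return response(delivered, d50, gamma50).mean()

    # e.g. 5% systematic and 10% per-fraction random uncertainty (placeholder D50, slope)
    print(mean_response(85.0, 4, sys_sd=0.05, rand_sd=0.10))
    ```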

  8. Determining the Best Treatment for Coronal Angular Deformity of the Knee Joint in Growing Children: A Decision Analysis

    PubMed Central

    Sung, Ki Hyuk; Chung, Chin Youb; Lee, Kyoung Min; Lee, Seung Yeol; Choi, In Ho; Cho, Tae-Joon; Yoo, Won Joon; Park, Moon Seok

    2014-01-01

    This study aimed to determine the best treatment modality for coronal angular deformity of the knee joint in growing children using decision analysis. A decision tree was created to evaluate 3 treatment modalities for coronal angular deformity in growing children: temporary hemiepiphysiodesis using staples, percutaneous screws, or a tension band plate. A decision analysis model was constructed containing the final outcome score, probability of metal failure, and incomplete correction of deformity. The final outcome was defined as health-related quality of life and was used as a utility in the decision tree. The probabilities associated with each case were obtained by literature review, and health-related quality of life was evaluated by a questionnaire completed by 25 pediatric orthopedic experts. Our decision analysis model favored temporary hemiepiphysiodesis using a tension band plate over temporary hemiepiphysiodesis using percutaneous screws or stapling, with utilities of 0.969, 0.957, and 0.962, respectively. One-way sensitivity analysis showed that hemiepiphysiodesis using a tension band plate was better than temporary hemiepiphysiodesis using percutaneous screws, when the overall complication rate of hemiepiphysiodesis using a tension band plate was lower than 15.7%. Two-way sensitivity analysis showed that hemiepiphysiodesis using a tension band plate was more beneficial than temporary hemiepiphysiodesis using percutaneous screws. PMID:25276801
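
    The core calculation of such a decision tree is an expected utility per treatment arm. A minimal sketch, with made-up complication probabilities and utilities rather than the values elicited in the study:

    ```python
    # Each treatment arm is summarised by a complication probability and by
    # utilities for the uncomplicated and complicated outcomes; all numbers
    # below are placeholders, not the study's elicited values.
    def expected_utility(p_complication, u_success, u_complication):
        return (1.0 - p_complication) * u_success + p_complication * u_complication

    arms = {
        "tension band plate":  expected_utility(0.10, 0.98, 0.85),
        "percutaneous screws": expected_utility(0.15, 0.97, 0.85),
        "staples":             expected_utility(0.13, 0.97, 0.85),
    }
    best = max(arms, key=arms.get)
    print(arms, "-> favoured:", best)
    ```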

  9. Therapy operating characteristic curves: tools for precision chemotherapy

    PubMed Central

    Barrett, Harrison H.; Alberts, David S.; Woolfenden, James M.; Caucci, Luca; Hoppin, John W.

    2016-01-01

    The therapy operating characteristic (TOC) curve, developed in the context of radiation therapy, is a plot of the probability of tumor control versus the probability of normal-tissue complications as the overall radiation dose level is varied, e.g., by varying the beam current in external-beam radiotherapy or the total injected activity in radionuclide therapy. This paper shows how TOC can be applied to chemotherapy with the administered drug dosage as the variable. The area under a TOC curve (AUTOC) can be used as a figure of merit for therapeutic efficacy, analogous to the area under an ROC curve (AUROC), which is a figure of merit for diagnostic efficacy. In radiation therapy, AUTOC can be computed for a single patient by using image data along with radiobiological models for tumor response and adverse side effects. The mathematical analogy between response of observers to images and the response of tumors to distributions of a chemotherapy drug is exploited to obtain linear discriminant functions from which AUTOC can be calculated. Methods for using mathematical models of drug delivery and tumor response with imaging data to estimate patient-specific parameters that are needed for calculation of AUTOC are outlined. The implications of this viewpoint for clinical trials are discussed. PMID:27175376
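
    A minimal sketch of how a TOC curve and AUTOC can be computed, assuming generic sigmoid models for tumour control and normal-tissue complication as functions of the overall dose level (the parameters are placeholders):

    ```python
    import numpy as np

    def sigmoid(dose, d50, k):
        return 1.0 / (1.0 + np.exp(-k * (dose - d50)))

    dose_levels = np.linspace(0.0, 120.0, 400)
    p_control = sigmoid(dose_levels, d50=60.0, k=0.12)       # tumour control (assumed model)
    p_complication = sigmoid(dose_levels, d50=85.0, k=0.10)  # normal-tissue complication (assumed model)

    # TOC plots p_control against p_complication; AUTOC integrates p_control
    # with respect to p_complication, analogous to AUROC.
    order = np.argsort(p_complication)
    autoc = np.trapz(p_control[order], p_complication[order])
    print(f"AUTOC = {autoc:.3f}")
    ```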

  10. The value of fine needle aspiration and cytologic examination of impalpable complicated breast cysts.

    PubMed

    Tez, Selda; Dener, Cenap; Köktener, Aslı; Caydere, Muzaffer; Tez, Mesut

    2008-01-01

    The purpose of the study was to evaluate the utility of fine needle aspiration (FNA) and cytologic analysis of impalpable complicated breast cysts. We retrospectively reviewed the imaging findings, aspiration, cytology and biopsy results, and follow-up imaging findings of 246 complicated cysts in 166 women. FNA was performed in 169 of the 246 complicated cysts. Thirty-one lesions were followed up with US. Surgical biopsy was performed on five lesions. Cytological examination of the aspirates showed no malignant cells in 137 cysts, insufficient cellular material in 17 cysts, and atypical cells in 4 cysts. None of these lesions were found to represent malignancy at the time of surgical excision or during follow-up. Impalpable complicated breast cysts may be classified as probably benign and can be managed with follow-up imaging studies instead of intervention. Routine cytologic examination is unnecessary if the aspirated fluid is not bloody.

  11. WE-AB-207B-01: Dose Tolerance for SBRT/SABR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimm, J

    Purpose: Stereotactic body radiation therapy (SBRT) / stereotactic ablative body radiotherapy (SABR) is gaining popularity, but quantitative dose tolerance has still been lacking. To improve this, the April 2016 issue of Seminars in Radiation Oncology will have normal tissue complication probability (NTCP) models for 10 critical structures: optic pathway, cochlea, oral mucosa, esophagus, chestwall, aorta, bronchi, duodenum, small bowel, and spinal cord. Methods: The project included more than 1500 treatments in 1–5 fractions using CyberKnife, Gamma Knife, or LINAC, with 60 authors from 15 institutions. NTCP models were constructed from the 97 grade 2–3 complications, predominantly scored using the common terminology criteria for adverse events (CTCAE v4). Dose volume histogram (DVH) data from each institutional dataset was loaded into the DVH Evaluator software (DiversiLabs, LLC, Huntingdon Valley, Pa) for modeling. The current state of the literature for the critical structures was depicted using DVH Risk Maps: comparative graphs of dose tolerance limits that can include estimated risk levels, reported complications, DVH data for study patients, as well as high- and low-risk dose tolerance limits. Results: For relatively acceptable toxicity like grade 1–3 rib fractures and chestwall pain, the high-risk limits have 50% risk and the low-risk limits have 5% risk. Emami et al (IJROBP 1991 May 15;21(1):109–22) used 50% and 5% risk levels for all structures, whereas this effort used clinically acceptable ranges for each: in structures like aorta or spinal cord where complications must be avoided, the high- and low-risk limits have about 3% and 1% risk, respectively, in this issue of Seminars. These statistically based guidelines can help ensure plan quality for each patient. Conclusion: NTCP for SBRT is now becoming available. Hypofractionated dose tolerance can be dramatically different from extrapolations of conventional fractionation, so NTCP analysis of the SBRT/SABR data is important to ensure safe clinical practice. Dr. Grimm designed and holds intellectual property rights to the DVH Evaluator software tool, which is an FDA-cleared product in commercial use and was used to analyze the data.

  12. Cost-effectiveness analysis of dose-dense versus standard intravenous chemotherapy for ovarian cancer: An economic analysis of results from the Gynecologic Oncology Group protocol 262 randomized controlled trial.

    PubMed

    Seagle, Brandon-Luke L; Shahabi, Shohreh

    2017-04-01

    To determine the cost-effectiveness of dose-dense versus standard intravenous adjuvant chemotherapy for ovarian cancer using results from the no-bevacizumab cohort of the Gynecologic Oncology Group protocol 262 (GOG-262) randomized controlled trial, which reported a smaller absolute progression-free survival (PFS) benefit than the prior Japanese trial. A three-state Markov decision model from a healthcare system perspective with a 21-day cycle length and 28-month time horizon was used to calculate incremental cost-effectiveness ratio (ICER) values per progression-free life-year saved (PFLYS) using results from GOG-262. Costs of chemotherapy, complications, and surveillance were from Medicare or institutional data. PFS, discontinuation, and complication rates were from GOG-262. Time-dependent transition probabilities and within-cycle corrections were used. One-way and probabilistic sensitivity analyses were performed. The model produces standard and dose-dense cohorts with 84.3% and 68.3% progression event proportions at 28 months, matching GOG-262 rates at the trial's median follow-up. With a median PFS of 10.3 months after standard chemotherapy and a hazard ratio for progression of 0.62 after dose-dense therapy, the ICER for dose-dense chemotherapy is $8074.25 (95% confidence interval: $7615.97-$10,207.16) per PFLYS. ICER estimates are sensitive only to the hazard ratio estimate but do not exceed $100,000 per PFLYS. 99.8% of ICER estimates met a more stringent willingness-to-pay threshold of $50,000 per PFLYS. The willingness-to-pay value at which there is a 90% probability of dose-dense treatment being cost-effective is $12,000 per PFLYS. Dose-dense adjuvant chemotherapy is robustly cost-effective for advanced ovarian cancer from a healthcare system perspective based on results from GOG-262. Copyright © 2017 Elsevier Inc. All rights reserved.
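
    A stripped-down sketch of the cohort calculation, using the reported cycle length, time horizon, median PFS and hazard ratio but omitting the death state, time-dependent transitions and within-cycle corrections of the published model; the per-cycle costs are placeholders:

    ```python
    import math

    CYCLE_MONTHS = 21.0 / 30.44                     # 21-day cycle, in months
    N_CYCLES = int(28.0 / CYCLE_MONTHS)             # 28-month horizon

    def progression_free_years_and_cost(median_pfs_months, hazard_ratio, cost_per_cycle):
        hazard = math.log(2.0) / median_pfs_months * hazard_ratio   # per month
        p_progress = 1.0 - math.exp(-hazard * CYCLE_MONTHS)         # per cycle
        alive_pf, pf_months, cost = 1.0, 0.0, 0.0
        for _ in range(N_CYCLES):
            pf_months += alive_pf * CYCLE_MONTHS
            cost += alive_pf * cost_per_cycle
            alive_pf *= (1.0 - p_progress)
        return pf_months / 12.0, cost

    # placeholder per-cycle costs; PFS and hazard ratio are the reported trial values
    pfly_std, cost_std = progression_free_years_and_cost(10.3, 1.00, 3000.0)
    pfly_dd,  cost_dd  = progression_free_years_and_cost(10.3, 0.62, 3500.0)
    icer = (cost_dd - cost_std) / (pfly_dd - pfly_std)
    print(f"ICER ~ ${icer:,.0f} per progression-free life-year saved")
    ```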

  13. Lung Size and the Risk of Radiation Pneumonitis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briere, Tina Marie, E-mail: tmbriere@mdanderson.org; Krafft, Shane; Liao, Zhongxing

    2016-02-01

    Purpose: The purpose of this study was to identify patient populations treated for non-small cell lung cancer (NSCLC) who may be more at risk of radiation pneumonitis. Methods and Materials: A total of 579 patients receiving fractionated 3D conformal or intensity modulated radiation therapy (IMRT) for NSCLC were included in the study. Statistical analysis was performed to search for cohorts of patients with higher incidences of radiation pneumonitis. In addition to conventional risk factors, total and spared lung volumes were analyzed. The Lyman-Kutcher-Burman (LKB) and cure models were then used to fit the incidence of radiation pneumonitis as a function of lung dose and other factors. Results: Total lung volumes with a sparing of less than 1854 cc at 40 Gy were associated with a significantly higher incidence of radiation pneumonitis at 6 months (38% vs 12% for patients with larger volumes, P<.001). This patient cohort was overwhelmingly female and represented 22% of the total female population of patients and nearly 30% of the cases of radiation pneumonitis. An LKB fit to normal tissue complication probability (NTCP) that included volume as a dose-modifying factor yielded a dose producing a 50% complication probability for the smaller spared-volume cohort that was 9 Gy lower than the fit to all mean lung dose data, and it improved the ability to predict radiation pneumonitis (P<.001). Using an effective dose parameter of n=0.42 instead of mean lung dose further improved the LKB fit. Fits to the data using the cure model produced similar results. Conclusions: Spared lung volume should be considered when treating NSCLC patients. Separate dose constraints based on smaller spared lung volume should be considered. Patients with smaller spared lung volumes should be followed closely for signs of radiation pneumonitis.
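
    For reference, a generic Lyman-Kutcher-Burman calculation looks like the sketch below: a differential DVH is reduced to an effective dose with volume parameter n and mapped to NTCP through a probit function. Only n = 0.42 is taken from the abstract; TD50, m and the toy DVH are placeholders.

    ```python
    import numpy as np
    from math import erf, sqrt

    def effective_dose(doses_gy, frac_volumes, n):
        # generalized EUD: (sum v_i * d_i^(1/n))^n
        return np.sum(frac_volumes * doses_gy ** (1.0 / n)) ** n

    def lkb_ntcp(doses_gy, frac_volumes, td50=30.0, m=0.35, n=0.42):
        d_eff = effective_dose(np.asarray(doses_gy), np.asarray(frac_volumes), n)
        t = (d_eff - td50) / (m * td50)
        return 0.5 * (1.0 + erf(t / sqrt(2.0)))      # standard normal CDF

    # toy differential DVH: dose bins (Gy) and the fraction of lung in each bin
    doses = [5.0, 15.0, 25.0, 40.0]
    vols = [0.40, 0.30, 0.20, 0.10]
    print(f"NTCP ~ {lkb_ntcp(doses, vols):.2f}")
    ```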

  14. Theoretical Benefits of Dynamic Collimation in Pencil Beam Scanning Proton Therapy for Brain Tumors: Dosimetric and Radiobiological Metrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moignier, Alexandra, E-mail: alexandra-moignier@uiowa.edu; Gelover, Edgar; Wang, Dongxu

    Purpose: To quantify the dosimetric benefit of using a dynamic collimation system (DCS) for penumbra reduction during the treatment of brain tumors by pencil beam scanning proton therapy (PBS PT). Methods and Materials: Collimated and uncollimated brain treatment plans were created for 5 patients previously treated with PBS PT and retrospectively enrolled in an institutional review board–approved study. The in-house treatment planning system, RDX, was used to generate the plans because it is capable of modeling both collimated and uncollimated beamlets. The clinically delivered plans were reproduced with uncollimated plans in terms of target coverage and organ at risk (OAR) sparing to ensure a clinically relevant starting point, and collimated plans were generated to improve the OAR sparing while maintaining target coverage. Physical and biological comparison metrics, such as dose distribution conformity, mean and maximum doses, normal tissue complication probability, and risk of secondary brain cancer, were used to evaluate the plans. Results: The DCS systematically improved the dose distribution conformity while preserving the target coverage. The average reduction of the mean dose to the 10-mm ring surrounding the target and the healthy brain were 13.7% (95% confidence interval [CI] 11.6%-15.7%; P<.0001) and 25.1% (95% CI 16.8%-33.4%; P<.001), respectively. This yielded an average reduction of 24.8% (95% CI 0.8%-48.8%; P<.05) for the brain necrosis normal tissue complication probability using the Flickinger model, and 25.1% (95% CI 16.8%-33.4%; P<.001) for the risk of secondary brain cancer. A general improvement of the OAR sparing was also observed. Conclusion: The lateral penumbra reduction afforded by the DCS increases the normal tissue sparing capabilities of PBS PT for brain cancer treatment while preserving target coverage.

  15. Introduction to symposium on unmeasured heterogeneity in school transition models

    PubMed Central

    Mare, Robert D.

    2014-01-01

    Researchers have used models of school transitions for over 30 years to describe inequality of educational opportunity and have contributed a number of important refinements and extensions. School transition models have the complication that the estimated effects of family background on the probability of continuing in school are affected by differential attrition on unobserved factors at earlier stages of schooling. The articles in this symposium present a variety of useful approaches to unobserved heterogeneity in school transition models. Investigators who use these approaches should attend to several issues: (1) models for school transitions may be used both descriptively (and are not therefore subject to any well-defined “bias”) and as tools for causal inference. (2) The concept of bias presupposes an underlying experiment, structural model, or population model that would, in principle, define the corresponding unbiased parameters – yet these underlying models are difficult to specify for school transition models. (3) Unobserved determinants of whether individuals make school transitions may be both exogenous and endogenous with respect to the observed regressors in the model. Without a model of how unobserved heterogeneity arises, attempted “corrections” for unmeasured heterogeneity may yield misleading estimates of the effects of measured determinants of school continuation. PMID:24882915

  16. Introduction to symposium on unmeasured heterogeneity in school transition models.

    PubMed

    Mare, Robert D

    2011-09-01

    Researchers have used models of school transitions for over 30 years to describe inequality of educational opportunity and have contributed a number of important refinements and extensions. School transition models have the complication that the estimated effects of family background on the probability of continuing in school are affected by differential attrition on unobserved factors at earlier stages of schooling. The articles in this symposium present a variety of useful approaches to unobserved heterogeneity in school transition models. Investigators who use these approaches should attend to several issues: (1) models for school transitions may be used both descriptively (and are not therefore subject to any well-defined "bias") and as tools for causal inference. (2) The concept of bias presupposes an underlying experiment, structural model, or population model that would, in principle, define the corresponding unbiased parameters - yet these underlying models are difficult to specify for school transition models. (3) Unobserved determinants of whether individuals make school transitions may be both exogenous and endogenous with respect to the observed regressors in the model. Without a model of how unobserved heterogeneity arises, attempted "corrections" for unmeasured heterogeneity may yield misleading estimates of the effects of measured determinants of school continuation.

  17. Understanding the relationship between the Centers for Medicare and Medicaid Services' Hospital Compare star rating, surgical case volume, and short-term outcomes after major cancer surgery.

    PubMed

    Kaye, Deborah R; Norton, Edward C; Ellimoottil, Chad; Ye, Zaojun; Dupree, James M; Herrel, Lindsey A; Miller, David C

    2017-11-01

    Both the Centers for Medicare and Medicaid Services' (CMS) Hospital Compare star rating and surgical case volume have been publicized as metrics that can help patients to identify high-quality hospitals for complex care such as cancer surgery. The current study evaluates the relationship between the CMS' star rating, surgical volume, and short-term outcomes after major cancer surgery. National Medicare data were used to evaluate the relationship between hospital star ratings and cancer surgery volume quintiles. Then, multilevel logistic regression models were fit to examine the association between cancer surgery outcomes and both star rankings and surgical volumes. Lastly, a graphical approach was used to compare how well star ratings and surgical volume predicted cancer surgery outcomes. This study identified 365,752 patients undergoing major cancer surgery for 1 of 9 cancer types at 2,550 hospitals. Star rating was not associated with surgical volume (P < .001). However, both the star rating and surgical volume were correlated with 4 short-term cancer surgery outcomes (mortality, complication rate, readmissions, and prolonged length of stay). The adjusted predicted probabilities for 5- and 1-star hospitals were 2.3% and 4.5% for mortality, 39% and 48% for complications, 10% and 15% for readmissions, and 8% and 16% for a prolonged length of stay, respectively. The adjusted predicted probabilities for hospitals with the highest and lowest quintile cancer surgery volumes were 2.7% and 5.8% for mortality, 41% and 55% for complications, 12.2% and 11.6% for readmissions, and 9.4% and 13% for a prolonged length of stay, respectively. Furthermore, surgical volume and the star rating were similarly associated with mortality and complications, whereas the star rating was more highly associated with readmissions and prolonged length of stay. In the absence of other information, these findings suggest that the star rating may be useful to patients when they are selecting a hospital for major cancer surgery. However, more research is needed before these ratings can supplant surgical volume as a measure of surgical quality. Cancer 2017;123:4259-4267. © 2017 American Cancer Society.

  18. Estimating black bear density using DNA data from hair snares

    USGS Publications Warehouse

    Gardner, B.; Royle, J. Andrew; Wegan, M.T.; Rainbolt, R.E.; Curtis, P.D.

    2010-01-01

    DNA-based mark-recapture has become a methodological cornerstone of research focused on bear species. The objective of such studies is often to estimate population size; however, doing so is frequently complicated by movement of individual bears. Movement affects the probability of detection and the assumption of closure of the population required in most models. To mitigate the bias caused by movement of individuals, population size and density estimates are often adjusted using ad hoc methods, including buffering the minimum polygon of the trapping array. We used a hierarchical, spatial capture-recapture model that contains explicit components for the spatial point process that governs the distribution of individuals and their exposure to (via movement), and detection by, traps. We modeled detection probability as a function of each individual's distance to the trap and an indicator variable for previous capture to account for possible behavioral responses. We applied our model to a 2006 hair-snare study of a black bear (Ursus americanus) population in northern New York, USA. Based on the microsatellite marker analysis of collected hair samples, 47 individuals were identified. We estimated mean density at 0.20 bears/km2. A positive estimate of the indicator variable suggests that bears are attracted to baited sites; therefore, including a trap-dependence covariate is important when using bait to attract individuals. Bayesian analysis of the model was implemented in WinBUGS, and we provide the model specification. The model can be applied to any spatially organized trapping array (hair snares, camera traps, mist nets, etc.) to estimate density and can also account for heterogeneity and covariate information at the trap or individual level. © The Wildlife Society.
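
    The detection model described above can be sketched as a half-normal distance kernel combined with a behavioural-response term on the logit scale; the numerical values below are illustrative, not the posterior estimates from the study.

    ```python
    import numpy as np

    def detection_prob(distance_km, previously_captured,
                       p0=0.3, sigma_km=2.0, behaviour_effect=0.8):
        # baseline detection shifted on the logit scale by a previous-capture indicator,
        # then attenuated by a half-normal function of distance to the trap
        logit_p0 = np.log(p0 / (1.0 - p0)) + behaviour_effect * previously_captured
        p_base = 1.0 / (1.0 + np.exp(-logit_p0))
        return p_base * np.exp(-distance_km ** 2 / (2.0 * sigma_km ** 2))

    print(detection_prob(0.5, previously_captured=0))
    print(detection_prob(0.5, previously_captured=1))   # higher: attraction to baited sites
    ```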

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberije, Cary, E-mail: cary.oberije@maastro.nl; De Ruysscher, Dirk; Universitaire Ziekenhuizen Leuven, KU Leuven

    Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
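
    The c statistic used here is Harrell's concordance index; a simple (quadratic-time) sketch with toy data is shown below.

    ```python
    import numpy as np

    def c_statistic(time, event, risk_score):
        # Among usable pairs (one patient has an observed event before the other's
        # follow-up ends), count how often the higher predicted risk belongs to the
        # patient who fails earlier; ties in risk score as 0.5.
        time, event, risk = map(np.asarray, (time, event, risk_score))
        concordant, usable = 0.0, 0
        n = len(time)
        for i in range(n):
            for j in range(n):
                if event[i] == 1 and time[i] < time[j]:
                    usable += 1
                    if risk[i] > risk[j]:
                        concordant += 1.0
                    elif risk[i] == risk[j]:
                        concordant += 0.5
        return concordant / usable

    # toy data: follow-up times (months), event indicators, model risk scores
    t = [5, 8, 12, 20, 24]; e = [1, 1, 0, 1, 0]; r = [2.1, 0.9, 1.4, 1.0, 0.3]
    print(f"c = {c_statistic(t, e, r):.2f}")
    ```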

  20. Comparison of the deep inferior epigastric perforator flap and free transverse rectus abdominis myocutaneous flap in postmastectomy reconstruction: a cost-effectiveness analysis.

    PubMed

    Thoma, Achilleas; Veltri, Karen; Khuthaila, Dana; Rockwell, Gloria; Duku, Eric

    2004-05-01

    This study compared the deep inferior epigastric perforator (DIEP) flap and the free transverse rectus abdominis myocutaneous (TRAM) flap in postmastectomy reconstruction using a cost-effectiveness analysis. A decision analytic model was used. Medical costs associated with the two techniques were estimated from the Ontario Ministry of Health Schedule of Benefits for 2002. Hospital costs were obtained from St. Joseph's Healthcare, a university teaching hospital in Hamilton, Ontario, Canada. The utilities of clinically important health states related to breast reconstruction were obtained from 32 "experts" across Canada and converted into quality-adjusted life years. The probabilities of these various clinically important health states being associated with the DIEP and free TRAM flaps were obtained after a thorough review of the literature. The DIEP flap was more costly than the free TRAM flap ($7026.47 versus $6508.29), but it provided more quality-adjusted life years than the free TRAM flap (28.88 years versus 28.53 years). The baseline incremental cost-utility ratio was $1464.30 per quality-adjusted life year, favoring adoption of the DIEP flap. Sensitivity analyses were performed by assuming that the probabilities of occurrence of hernia, abdominal bulging, total flap loss, operating room time, and hospital stay were identical with the DIEP and free TRAM techniques. By assuming that the probability of postoperative hernia for the DIEP flap increased from 0.008 to 0.054 (same as for TRAM flap), the incremental cost-utility ratio changed to $1435.00 per quality-adjusted life year. A sensitivity analysis was performed for the complication of hernia because the DIEP flap allegedly diminishes this complication. Increasing the probability of abdominal bulge from 0.041 to 0.103 for the DIEP flap changed the ratio to $2731.78 per quality-adjusted life year. When the probability of total flap failure was increased from 0.014 to 0.016, the ratio changed to $1384.01 per quality-adjusted life year. When the time in the operating room was assumed to be the same for both flaps, the ratio changed to $4026.57 per quality-adjusted life year. If the hospital stay was assumed to be the same for both flaps, the ratio changed to $1944.30 per quality-adjusted life year. On the basis of the baseline calculation and sensitivity analyses, the DIEP flap remained a cost-effective procedure. Thus, adoption of this new technique for postmastectomy reconstruction is warranted in the Canadian health care system.
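
    The baseline ratio can be checked directly from the quoted costs and quality-adjusted life years; the small difference from the published $1,464.30 presumably reflects rounding of the inputs as reported in the abstract.

    ```python
    # Arithmetic check of the baseline incremental cost-utility ratio using the
    # figures quoted above.
    cost_diep, cost_tram = 7026.47, 6508.29
    qaly_diep, qaly_tram = 28.88, 28.53

    icur = (cost_diep - cost_tram) / (qaly_diep - qaly_tram)
    print(f"ICUR ~ ${icur:,.2f} per QALY gained")   # ~$1,480 with the rounded inputs
    ```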

  1. INCLUDING TRANSITION PROBABILITIES IN NEST SURVIVAL ESTIMATION: A MAYFIELD MARKOV CHAIN

    EPA Science Inventory

    This manuscript is primarily an exploration of the statistical properties of nest-survival estimates for terrestrial songbirds. The Mayfield formulation described herein should allow researchers to test for complicated effects of stressors on daily survival and overall success, i...

  2. The computer simulation of automobile use patterns for defining battery requirements for electric cars

    NASA Technical Reports Server (NTRS)

    Schwartz, H.-J.

    1976-01-01

    The modeling process of a complex system, based on the calculation and optimization of the system parameters, is complicated in that some parameters can be expressed only as probability distributions. In the present paper, a Monte Carlo technique was used to determine the daily range requirements of an electric road vehicle in the United States from probability distributions of trip lengths, frequencies, and average annual mileage data. The analysis shows that a daily range of 82 miles meets 95% of car-owner requirements at all times, with the exception of long vacation trips. Further, it is shown that the requirement of a daily range of 82 miles can be met by an intermediate-level battery technology characterized by an energy density of 30 to 50 Watt-hours per pound. Candidate batteries in this class are nickel-zinc, nickel-iron, and iron-air. These results imply that long-term research goals for battery systems should be focused on lower cost and longer service life, rather than on higher energy densities.
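
    A hedged re-creation of the Monte Carlo idea: draw daily trip counts and trip lengths from assumed distributions and read off the daily mileage that covers 95% of days. The Poisson/lognormal choices and their parameters are placeholders, not the survey-based distributions used in the original study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    n_days = 100_000
    trips_per_day = rng.poisson(3.0, n_days)                      # assumed trip frequency
    daily_miles = np.array([
        rng.lognormal(mean=1.8, sigma=0.7, size=k).sum() if k else 0.0   # assumed trip lengths
        for k in trips_per_day
    ])
    print(f"95th-percentile daily range requirement: {np.percentile(daily_miles, 95):.0f} miles")
    ```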

  3. Analysis of Electronic Densities and Integrated Doses in Multiform Glioblastomas Stereotactic Radiotherapy

    NASA Astrophysics Data System (ADS)

    Barón-Aznar, C.; Moreno-Jiménez, S.; Celis, M. A.; Lárraga-Gutiérrez, J. M.; Ballesteros-Zebadúa, P.

    2008-08-01

    Integrated dose is the total energy delivered in a radiotherapy target. This physical parameter could be a predictor for complications such as brain edema and radionecrosis after stereotactic radiotherapy treatments for brain tumors. Integrated dose depends on the tissue density and volume. Using CT images of patients from the National Institute of Neurology and Neurosurgery and the BrainScan software, this work presents the mean density of 21 multiform glioblastomas, comparative results for normal tissue, and the estimated integrated dose for each case. The relationship between integrated dose and the probability of complications is discussed.

  4. Application of thoracic endovascular aortic repair (TEVAR) in treating dwarfism with Stanford B aortic dissection: A case report.

    PubMed

    Qiu, Jian; Cai, Wenwu; Shu, Chang; Li, Ming; Xiong, Qinggen; Li, Quanming; Li, Xin

    2018-04-01

    To apply thoracic endovascular aortic repair (TEVAR) to treat dwarfism complicated with Stanford B aortic dissection. In this report, we present a 63-year-old male patient with dwarfism complicated by Stanford B aortic dissection who was successfully treated with TEVAR. After conservative treatment, the patient underwent TEVAR 1 week after hospitalization. After the operation, he presented with numbness and weakness of his bilateral lower extremities, and these symptoms were significantly mitigated after effective treatment. At 1 and 3 weeks after TEVAR, the aorta remained stable and restored. The patient obtained a favorable clinical prognosis and was smoothly discharged. During subsequent follow-up, he remained physically stable. TEVAR is probably an option for treating dwarfism complicated with Stanford B aortic dissection, which remains to be validated by subsequent studies with a larger sample size.

  5. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    PubMed

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making processes are widely used to identify which option (system, product, service, etc.) has smaller environmental footprints and to provide recommendations that help stakeholders take future decisions. However, the uncertainty problem complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another option. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders and also reduces the computational time.
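
    The decision confidence probability itself is straightforward to estimate by Monte Carlo, as in the sketch below with assumed impact distributions; the point of the FORM method discussed above is to approximate the same probability with far fewer model evaluations.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Illustrative lognormal impact distributions for two options (assumed, not
    # taken from the paper); the decision confidence probability is the share of
    # simulations in which option A has the smaller environmental impact.
    n = 100_000
    impact_a = rng.lognormal(mean=1.00, sigma=0.30, size=n)
    impact_b = rng.lognormal(mean=1.15, sigma=0.35, size=n)
    confidence = np.mean(impact_a < impact_b)
    print(f"P(A has the smaller impact) = {confidence:.3f}")
    ```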

  6. [Graft-versus-host disease, a rare complication of lung transplantation].

    PubMed

    Morisse-Pradier, H; Nove-Josserand, R; Philit, F; Senechal, A; Berger, F; Callet-Bauchu, E; Traverse-Glehen, A; Maury, J-M; Grima, R; Tronc, F; Mornex, J-F

    2016-02-01

    Graft-versus-host disease (GVHD) is a classic and frequent multisystemic complication of bone marrow allografts. It has also been reported after the transplantation of solid organs such as the liver or gut. Recent cases of GVHD have been reported after lung and heart-lung transplantation. Skin, liver, gastrointestinal tract and bone marrow are the organs preferentially affected by GVHD. Corticosteroids are the first-line treatment of GVHD. The prognosis reported in solid organ transplants is poor, with infectious complications favoured by immunosuppressive therapy. In this article, we report the case of a patient with cystic fibrosis who presented with probable GVHD 18 months after a lung transplant, together with a literature review of similar cases. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  7. Serotonin 5-HT1A agonist improves motor complications in rodent and primate parkinsonian models.

    PubMed

    Bibbiani, F; Oh, J D; Chase, T N

    2001-11-27

    Serotoninergic transmission in the basal ganglia is known to influence dopaminergic mechanisms and motor function. To evaluate the possibility that serotoninergic 5-HT1A autoreceptors (by regulating the release of serotonin as well as of dopamine formed from exogenous levodopa) affect the response alterations complicating levodopa treatment of PD, the 5-HT1A receptor agonist sarizotan (EMD128130) was systemically administered alone and together with levodopa to parkinsonian rats and nonhuman primates. In 6-hydroxydopamine-lesioned rats, sarizotan (2.5 mg/kg PO) had no effect on the acute rotational response to levodopa but did attenuate the shortening in motor response duration induced by chronic levodopa treatment. In 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine-lesioned monkeys, sarizotan (2 mg/kg PO) alone had no effect on parkinsonian severity or on the antiparkinsonian response to levodopa. In contrast, the same dose of sarizotan reduced levodopa-induced choreiform dyskinesias by 91 +/- 5.9%. In both species, the motoric effects of sarizotan were blocked by the selective 5-HT1A antagonist WAY100635 (0.1 mg/kg SC), indicating that the observed sarizotan responses were probably mediated at the 5-HT1A autoreceptor. Pharmaceuticals acting to stimulate 5-HT1A receptors could prove useful in the treatment of motor response complications in parkinsonian patients.

  8. Cost-effectiveness v patient preference in the choice of treatment for distal ureteral calculi: a literature-based decision analysis.

    PubMed

    Wolf, J S; Carroll, P R; Stoller, M L

    1995-06-01

    Ureteroscopy (URS) and extracorporeal shockwave lithotripsy (SWL) battle for supremacy in the management of distal ureteral calculi. In order to clarify issues surrounding this controversy, we created a decision tree modeling URS or SWL with literature-based probabilities and used both cost and patient preferences as endpoints. Ureteroscopy was more successful than single-session or multiple-session SWL (92.1% v 74.3% or 84.5%) and had a lower retreatment/complication rate. Although initial SWL was only slightly more expensive than URS ($4,420 v $4,337), the difference increased when the additional costs of complications and retreatment were calculated ($6,745 v $5,555). Using values for an "average" patient, SWL was preferred to URS in terms of patient satisfaction. The most important factors distinguishing between URS and SWL were the success of treatment, the cost of initial therapy, and patient attitudes toward unplanned ancillary procedures and retreatment. Although no alteration of success rates and cost figures within reasonable ranges made URS less cost-effective than SWL, individual differences in patients' aversion to complications allowed URS to be preferred to SWL in some situations. Therefore, SWL is less cost-effective than URS and is not necessarily preferred by patients. The physician should be aware of the principal determinants of the choice between URS and SWL treatment of distal ureteral calculi.

  9. Metachronous Lung Cancer: Clinical Characteristics and Effects of Surgical Treatment.

    PubMed

    Rzechonek, Adam; Błasiak, Piotr; Muszczyńska-Bernhard, Beata; Pawełczyk, Konrad; Pniewski, Grzegorz; Ornat, Maciej; Grzegrzółka, Jędrzej; Brzecka, Anna

    2018-01-01

    The occurrence of a second lung tumor after surgical removal of lung cancer usually indicates a lung cancer metastasis, but sometimes a new lesion proves to be a new primary lung cancer, i.e., metachronous lung cancer. The goal of the present study was to conduct a clinical evaluation of patients with metachronous lung cancer and lung cancer metastasis, and to compare the early and distant outcomes of surgical treatment in both cancer types. There were 26 age-matched patients with lung cancer metastases and 23 patients with metachronous lung cancers who underwent a second lung cancer resection. We evaluated the histological type of the resected cancer, the extent of thoracic surgery, the frequency of early postoperative complications, and the probability of 5-year survival after the second operation. The findings were that metachronous lung cancer was adenocarcinoma in 52% of patients, with a different histopathological pattern from that of the primary lung cancer in 74% of patients. In both cancer groups, mechanical resections were the most common surgery type (76% of all cases), with anatomical resections such as segmentectomy, lobectomy, or pneumonectomy being performed much less often. The incidence of early postoperative complications in metachronous lung cancer and lung cancer metastasis (30% vs. 31%, respectively) and the probability of 5-year survival after resection of either cancer (60.7% vs. 50.9%, respectively) were comparable. In conclusion, patients undergoing primary lung cancer surgery require long-term follow-up due to the risk of metastatic or metachronous lung cancer. The likelihood of metachronous lung cancer and pulmonary lung cancer metastases, the incidence of postoperative complications, and the probability of 5-year survival after resection of metachronous lung cancer or lung cancer metastasis are similar.

  10. Quantifying Unnecessary Normal Tissue Complication Risks due to Suboptimal Planning: A Secondary Study of RTOG 0126

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Schmidt, Rachel; Moiseenko, Vitali

    Purpose: The purpose of this study was to quantify the frequency and clinical severity of quality deficiencies in intensity modulated radiation therapy (IMRT) planning in the Radiation Therapy Oncology Group 0126 protocol. Methods and Materials: A total of 219 IMRT patients from the high-dose arm (79.2 Gy) of RTOG 0126 were analyzed. To quantify plan quality, we used established knowledge-based methods for patient-specific dose-volume histogram (DVH) prediction of organs at risk and a Lyman-Kutcher-Burman (LKB) model for grade ≥2 rectal complications to convert DVHs into normal tissue complication probabilities (NTCPs). The LKB model was validated by fitting dose-response parameters relative to observed toxicities. The 90th percentile (22 of 219) of plans with the lowest excess risk (difference between clinical and model-predicted NTCP) were used to create a model for the presumed best practices in the protocol (pDVH_0126,top10%). Applying the resultant model to the entire sample enabled comparisons between DVHs that patients could have received and DVHs they actually received. Excess risk quantified the clinical impact of suboptimal planning. Accuracy of pDVH predictions was validated by replanning 30 of 219 patients (13.7%), including equal numbers of presumed “high-quality,” “low-quality,” and randomly sampled plans. NTCP-predicted toxicities were compared to adverse events on protocol. Results: Existing models showed that bladder-sparing variations were less prevalent than rectum quality variations and that increased rectal sparing was not correlated with target metrics (dose received by 98% and 2% of the PTV, respectively). Observed toxicities were consistent with current LKB parameters. Converting DVH and pDVH_0126,top10% to rectal NTCPs, we observed 94 of 219 patients (42.9%) with ≥5% excess risk, 20 of 219 patients (9.1%) with ≥10% excess risk, and 2 of 219 patients (0.9%) with ≥15% excess risk. Replanning demonstrated the predicted NTCP reductions while maintaining the volume of the PTV receiving the prescription dose. An equivalent sample of high-quality plans showed fewer toxicities than low-quality plans (6 of 73 versus 10 of 73, respectively), although these differences were not significant (P=.21) due to insufficient statistical power in this retrospective study. Conclusions: Plan quality deficiencies in RTOG 0126 exposed patients to substantial excess risk for rectal complications.
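
    The excess-risk bookkeeping can be sketched as follows: for each patient, subtract the NTCP of the knowledge-based predicted-achievable DVH from the NTCP of the delivered DVH and bin patients by the thresholds quoted above. The NTCP values in the sketch are random placeholders, not RTOG 0126 data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Placeholder NTCP values standing in for the LKB model applied to the
    # delivered and predicted-achievable rectal DVHs of 219 patients.
    ntcp_delivered = rng.uniform(0.05, 0.25, size=219)
    ntcp_achievable = ntcp_delivered - rng.uniform(0.0, 0.12, size=219)

    excess = ntcp_delivered - ntcp_achievable
    for threshold in (0.05, 0.10, 0.15):
        count = int(np.sum(excess >= threshold))
        print(f">= {threshold:.0%} excess risk: {count}/219 patients")
    ```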

  11. Simulation of stochastic diffusion via first exit times

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lötstedt, Per, E-mail: perl@it.uu.se; Meinecke, Lina, E-mail: lina.meinecke@it.uu.se

    2015-11-01

    In molecular biology it is of interest to simulate diffusion stochastically. In the mesoscopic model we partition a biological cell into unstructured subvolumes. In each subvolume the number of molecules is recorded at each time step and molecules can jump between neighboring subvolumes to model diffusion. The jump rates can be computed by discretizing the diffusion equation on that unstructured mesh. If the mesh is of poor quality, due to a complicated cell geometry, standard discretization methods can generate negative jump coefficients, which no longer allows the interpretation as the probability to jump between the subvolumes. We propose a method based on the mean first exit time of a molecule from a subvolume, which guarantees positive jump coefficients. Two approaches to exit times, a global and a local one, are presented and tested in simulations on meshes of different quality in two and three dimensions.

  12. An intelligent knowledge mining model for kidney cancer using rough set theory.

    PubMed

    Durai, M A Saleem; Acharjya, D P; Kannan, A; Iyengar, N Ch Sriman Narayana

    2012-01-01

    Medical diagnosis processes vary in the degree to which they attempt to deal with different complicating aspects of diagnosis, such as the relative importance of symptoms, varied symptom patterns and the relations between diseases themselves. The rough set approach has two major advantages over other methods. First, it can handle different types of data, such as categorical and numerical data. Secondly, it does not make any assumption such as a probability distribution function in stochastic modeling or a membership grade function in fuzzy set theory. It involves pattern recognition through logical computational rules rather than approximating them through smooth mathematical functional forms. In this paper we use rough set theory as a data mining tool to derive useful patterns and rules for kidney cancer faulty diagnosis. In particular, historical data from twenty-five research hospitals and medical colleges is used for validation, and the results show the practical viability of the proposed approach.

  13. Simulation of stochastic diffusion via first exit times

    PubMed Central

    Lötstedt, Per; Meinecke, Lina

    2015-01-01

    In molecular biology it is of interest to simulate diffusion stochastically. In the mesoscopic model we partition a biological cell into unstructured subvolumes. In each subvolume the number of molecules is recorded at each time step and molecules can jump between neighboring subvolumes to model diffusion. The jump rates can be computed by discretizing the diffusion equation on that unstructured mesh. If the mesh is of poor quality, due to a complicated cell geometry, standard discretization methods can generate negative jump coefficients, which no longer allows the interpretation as the probability to jump between the subvolumes. We propose a method based on the mean first exit time of a molecule from a subvolume, which guarantees positive jump coefficients. Two approaches to exit times, a global and a local one, are presented and tested in simulations on meshes of different quality in two and three dimensions. PMID:26600600

  14. Volcanic ash melting under conditions relevant to ash turbine interactions.

    PubMed

    Song, Wenjia; Lavallée, Yan; Hess, Kai-Uwe; Kueppers, Ulrich; Cimarelli, Corrado; Dingwell, Donald B

    2016-03-02

    The ingestion of volcanic ash by jet engines is widely recognized as a potentially fatal hazard for aircraft operation. The high temperatures (1,200-2,000 °C) typical of jet engines exacerbate the impact of ash by provoking its melting and sticking to turbine parts. Estimation of this potential hazard is complicated by the fact that chemical composition, which affects the temperature at which volcanic ash becomes liquid, can vary widely amongst volcanoes. Here, based on experiments, we parameterize ash behaviour and develop a model to predict melting and sticking conditions for its global compositional range. The results of our experiments confirm that the common use of sand or dust proxy is wholly inadequate for the prediction of the behaviour of volcanic ash, leading to overestimates of sticking temperature and thus severe underestimates of the thermal hazard. Our model can be used to assess the deposition probability of volcanic ash in jet engines.

  15. Spatial and temporal variability in the R-5 infiltration data set: Déjà vu and rainfall-runoff simulations

    NASA Astrophysics Data System (ADS)

    Loague, Keith; Kyriakidis, Phaedon C.

    1997-12-01

    This paper is a continuation of the event-based rainfall-runoff model evaluation study reported by Loague and Freeze [1985]. Here we reevaluate the performance of a quasi-physically based rainfall-runoff model for three large events from the well-known R-5 catchment. Five different statistical criteria are used to quantitatively judge model performance. Temporal variability in the large R-5 infiltration data set [Loague and Gander, 1990] is filtered by working in terms of permeability. The transformed data set is reanalyzed via geostatistical methods to model the spatial distribution of permeability across the R-5 catchment. We present new estimates of the spatial distribution of infiltration that are in turn used in our rainfall-runoff simulations with the Horton rainfall-runoff model. The new rainfall-runoff simulations, complicated by reinfiltration impacts at the smaller scales of characterization, indicate that the near-surface hydrologic response of the R-5 catchment is most probably dominated by a combination of the Horton and Dunne overland flow mechanisms.

  16. Evaluation of financial burden following complications after major surgery in France: Potential return after perioperative goal-directed therapy.

    PubMed

    Landais, Alain; Morel, Morgane; Goldstein, Jacques; Loriau, Jerôme; Fresnel, Annie; Chevalier, Corinne; Rejasse, Gilles; Alfonsi, Pascal; Ecoffey, Claude

    2017-06-01

    Perioperative goal-directed therapy (PGDT) has been demonstrated to improve postoperative outcomes and reduce the length of hospital stays. The objective of our analysis was to evaluate the cost of complications, derived from French hospital payments, and to calculate the potential cost savings and length of hospital stay reductions. The billing data of 2388 patients who underwent scheduled high-risk surgery (i.e. major abdominal, gynaecologic, urological, vascular, and orthopaedic interventions) over three years were retrospectively collected from three French hospitals (one public teaching, one public, and one private hospital). The relationship between mortality, length of hospital stay, cost per patient, and severity scores, based mainly on postoperative complications but also on preoperative clinical status, was analysed. Statistical analysis was performed using Student's t-tests or Wilcoxon tests. Our analyses determined that a severity score of 3 or 4 was associated with complications in 90% of cases; this group represented 36% of patients and, compared with those with a score of 1 or 2, was associated with significantly increased costs (€ 8205±3335 to € 22,081±16,090; P<0.001, a delta of € 13,876) and a prolonged length of hospital stay (mean of 10 to 27 days; P<0.001, a delta of 17 days). According to estimates for complications avoided by PGDT, there was a projected reduction in average healthcare costs of between € 854 and € 1458 per patient and a reduction in total hospital bed-days of between 1755 and 4423 over three years. Based on French national data (47,000 high-risk surgeries per year), the potential financial savings ranged from € 40M to € 68M, not including the costs of PGDT and its implementation. Our analysis demonstrates that patients with complications are significantly more expensive to care for than those without complications. In our model, it was projected that implementing PGDT during high-risk surgery may significantly reduce healthcare costs and the length of hospital stays in France, while probably improving patient access to care and reducing waiting times for procedures. Copyright © 2017. Published by Elsevier Masson SAS.
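
    The projected national savings follow directly from the per-patient range and the quoted procedure volume:

    ```python
    # Arithmetic behind the projected national savings quoted above: per-patient
    # savings of EUR 854-1458 scaled to the ~47,000 high-risk procedures per year.
    low_per_patient, high_per_patient = 854, 1458
    procedures_per_year = 47_000

    low_total = low_per_patient * procedures_per_year / 1e6
    high_total = high_per_patient * procedures_per_year / 1e6
    print(f"Projected savings: EUR {low_total:.1f}M to EUR {high_total:.1f}M")
    ```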

  17. Early endoscopic ultrasonography in acute biliary pancreatitis: A prospective pilot study

    PubMed Central

    Anderloni, Andrea; Galeazzi, Marianna; Ballarè, Marco; Pagliarulo, Michela; Orsello, Marco; Del Piano, Mario; Repici, Alessandro

    2015-01-01

    AIM: To investigate the clinical usefulness of early endoscopic ultrasonography (EUS) in the management of acute biliary pancreatitis (ABP). METHODS: All consecutive patients entering the emergency department between January 2010 and December 2012 due to acute abdominal pain and showing biochemical and/or radiological findings consistent with possible ABP were prospectively enrolled. Patients were classified as having a low, moderate, or high probability of common bile duct (CBD) stones, according to the established risk stratification. Exclusion criteria were: gastrectomy or patients in whom the cause of biliary obstruction was already identified by ultrasonography. All enrolled patients underwent EUS within 48 h of their admission. Endoscopic retrograde cholangiopancreatography was performed immediately after EUS only in those cases with proven CBD stones or sludge. The following parameters were investigated: (1) clinical: age, sex, fever; (2) radiological: dilated CBD; and (3) biochemical: bilirubin, AST, ALT, gGT, ALP, amylase, lipase, PCR. The association between the presence of CBD stones at EUS and the individual predictors was assessed by univariate logistic regression. Predictors significantly associated with CBD stones (P < 0.05) were entered in a multivariate logistic regression model. RESULTS: A total of 181 patients with pancreatitis were admitted to the emergency department between January 2010 and December 2012. After exclusion criteria a total of 71 patients (38 females, 53.5%, mean age 58 ± 20.12 years, range 27-89 years; 33 males, 46.5%, mean age 65 ± 11.86 years, range 41-91 years) were included in the present study. The probability of CBD stones was considered low in 21 cases (29%), moderate in 26 (37%), and high in the remaining 24 (34%). The 71 patients included in the study underwent EUS, which allowed for a complete evaluation of the target sites in all the cases. The procedure was completed in a mean time of 14.7 min (range 9-34 min), without any notable complications. The overall CBD stone frequency was 44% (31 of 71), with a significant increase from the group at low pretest probability to that at moderate (OR = 5.79, P = 0.01) and high (OR = 4.25, P = 0.03) pretest probability. CONCLUSION: Early EUS in ABP allows, if appropriate, immediate endoscopic treatment and significant sparing of unnecessary operative procedures, thus reducing possible related complications. PMID:26420969

  18. Diagnosis of Exclusion: A Case Report of Probable Glatiramer Acetate-Induced Eosinophilic Myocarditis

    PubMed Central

    Michaud, Christopher J.; Bockheim, Heather M.; Daum, Timothy E.

    2014-01-01

    Importance. Medication-induced eosinophilia is an acknowledged, often self-limiting occurrence. Glatiramer acetate, a biologic injection used in the management of relapsing-remitting multiple sclerosis, is widely regarded as a safe and effective medication and lists eosinophilia as an infrequent side effect in its package insert. Contrary to reports of transient, benign drug-induced eosinophilia, we describe a case of probable glatiramer acetate-induced eosinophilia that ultimately culminated in respiratory distress, shock, and eosinophilic myocarditis. Observations. A 59-year-old female was admitted to the hospital after routine outpatient labs revealed leukocytosis (43,000 cells/mm3) with pronounced hypereosinophilia (63%). This patient had been using glatiramer acetate without complication for over 10 years prior to admission. Leukocytosis and hypereosinophilia persisted as a myriad of diagnostic evaluations returned negative, ultimately leading to respiratory depression, shock, and myocarditis. Glatiramer acetate was held for the first time on day 6 of the hospital stay with subsequent resolution of leukocytosis, hypereosinophilia, respiratory distress, and shock. Conclusions and Relevance. Glatiramer acetate was probably the cause of this observed hypereosinophilia and the resulting complications. Reports of glatiramer-induced eosinophilia are rare, and few case reports regarding medication-induced hypereosinophilia describe the severe systemic manifestations seen in this patient. PMID:25105037

  19. Diagnosis of exclusion: a case report of probable glatiramer acetate-induced eosinophilic myocarditis.

    PubMed

    Michaud, Christopher J; Bockheim, Heather M; Nabeel, Muhammad; Daum, Timothy E

    2014-01-01

    Importance. Medication-induced eosinophilia is an acknowledged, often self-limiting occurrence. Glatiramer acetate, a biologic injection used in the management of relapsing-remitting multiple sclerosis, is widely regarded as a safe and effective medication and lists eosinophilia as an infrequent side effect in its package insert. Contrary to reports of transient, benign drug-induced eosinophilia, we describe a case of probable glatiramer acetate-induced eosinophilia that ultimately culminated in respiratory distress, shock, and eosinophilic myocarditis. Observations. A 59-year-old female was admitted to the hospital after routine outpatient labs revealed leukocytosis (43,000 cells/mm3) with pronounced hypereosinophilia (63%). This patient had been using glatiramer acetate without complication for over 10 years prior to admission. Leukocytosis and hypereosinophilia persisted as a myriad of diagnostic evaluations returned negative, ultimately leading to respiratory depression, shock, and myocarditis. Glatiramer acetate was held for the first time on day 6 of the hospital stay with subsequent resolution of leukocytosis, hypereosinophilia, respiratory distress, and shock. Conclusions and Relevance. Glatiramer acetate was probably the cause of this observed hypereosinophilia and the resulting complications. Reports of glatiramer-induced eosinophilia are rare, and few case reports regarding medication-induced hypereosinophilia describe the severe systemic manifestations seen in this patient.

  20. Management issues for women with epilepsy-Focus on pregnancy (an evidence-based review): I. Obstetrical complications and change in seizure frequency: Report of the Quality Standards Subcommittee and Therapeutics and Technology Assessment Subcommittee of the American Academy of Neurology and the American Epilepsy Society.

    PubMed

    Harden, Cynthia L; Hopp, Jennifer; Ting, Tricia Y; Pennell, Page B; French, Jacqueline A; Allen Hauser, W; Wiebe, Samuel; Gronseth, Gary S; Thurman, David; Meador, Kimford J; Koppel, Barbara S; Kaplan, Peter W; Robinson, Julian N; Gidal, Barry; Hovinga, Collin A; Wilner, Andrew N; Vazquez, Blanca; Holmes, Lewis; Krumholz, Allan; Finnell, Richard; Le Guen, Claire

    2009-05-01

    A committee assembled by the American Academy of Neurology (AAN) reassessed the evidence related to the care of women with epilepsy (WWE) during pregnancy, including the risk of pregnancy complications or other medical problems during pregnancy, change in seizure frequency, the risk of status epilepticus, and the rate of remaining seizure-free during pregnancy. The committee evaluated the available evidence according to a structured literature review and classification of relevant articles. For WWE who are taking antiepileptic drugs (AEDs), there is probably no substantially increased risk (>2 times expected) of cesarean delivery or late pregnancy bleeding, and probably no moderately increased risk (>1.5 times expected) of premature contractions or premature labor and delivery. There is possibly a substantially increased risk of premature contractions and premature labor and delivery during pregnancy for WWE who smoke. WWE should be counseled that seizure freedom for at least 9 months prior to pregnancy is probably associated with a high likelihood (84-92%) of remaining seizure-free during pregnancy. WWE who smoke should be counseled that they possibly have a substantially increased risk of premature contractions and premature labor and delivery.

  1. Primary urethral reconstruction: the cost minimized approach to the bulbous urethral stricture.

    PubMed

    Rourke, Keith F; Jordan, Gerald H

    2005-04-01

    Treatment for urethral stricture disease often requires a choice between readily available direct vision internal urethrotomy (DVIU) and highly efficacious but more technically complex open urethral reconstruction. Using the short segment bulbous urethral stricture as a model, we determined which strategy is less costly. The costs of DVIU and open urethral reconstruction with stricture excision and primary anastomosis for a 2 cm bulbous urethral stricture were compared using a cost minimization decision analysis model. Clinical probability estimates for the DVIU treatment arm were the risk of bleeding, urinary tract infection and the risk of stricture recurrence. Estimates for the primary urethral reconstruction strategy were the risk of wound complications, complications of exaggerated lithotomy and the risk of treatment failure. Direct third party payer costs were determined in 2002 United States dollars. The model predicted that treatment with DVIU was more costly (17,747 dollars per patient) than immediate open urethral reconstruction (16,444 dollars per patient). This yielded an incremental cost savings of $1,304 per patient, favoring urethral reconstruction. Sensitivity analysis revealed that primary treatment with urethroplasty was economically advantageous within the range of clinically relevant events. Treatment with DVIU became more favorable when the long-term risk of stricture recurrence after DVIU was less than 60%. Treatment for short segment bulbous urethral strictures with primary reconstruction is less costly than treatment with DVIU. From a fiscal standpoint urethral reconstruction should be considered over DVIU in the majority of clinical circumstances.
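
    The cost-minimization logic described above reduces to comparing expected costs across strategies. The sketch below illustrates the calculation with invented probabilities and costs; the study's actual inputs and decision-tree structure are not reproduced here.

```python
# Sketch only: expected-cost comparison of two treatment strategies.

def expected_cost(base_cost, complications):
    """complications: list of (probability, added_cost) pairs."""
    return base_cost + sum(p * c for p, c in complications)

dviu = expected_cost(
    base_cost=3000.0,
    complications=[(0.05, 500.0),     # bleeding (illustrative)
                   (0.04, 300.0),     # urinary tract infection (illustrative)
                   (0.60, 12000.0)],  # stricture recurrence -> further treatment
)
urethroplasty = expected_cost(
    base_cost=9000.0,
    complications=[(0.05, 800.0),     # wound complication (illustrative)
                   (0.02, 1200.0),    # lithotomy-related complication
                   (0.10, 6000.0)],   # treatment failure
)
print(f"DVIU: ${dviu:,.0f}  urethroplasty: ${urethroplasty:,.0f}")
print("Cost-minimizing strategy:", "urethroplasty" if urethroplasty < dviu else "DVIU")
```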

  2. Statistical methods for clinical verification of dose response parameters related to esophageal stricture and AVM obliteration from radiotherapy

    NASA Astrophysics Data System (ADS)

    Mavroidis, Panayiotis; Lind, Bengt K.; Theodorou, Kyriaki; Laurell, Göran; Fernberg, Jan-Olof; Lefkopoulos, Dimitrios; Kappas, Constantin; Brahme, Anders

    2004-08-01

    The purpose of this work is to provide some statistical methods for evaluating the predictive strength of radiobiological models and the validity of dose-response parameters for tumour control and normal tissue complications. This is accomplished by associating the expected complication rates, which are calculated using different models, with the clinical follow-up records. These methods are applied to 77 patients who received radiation treatment for head and neck cancer and 85 patients who were treated for arteriovenous malformation (AVM). The three-dimensional dose distribution delivered to esophagus and AVM nidus and the clinical follow-up results were available for each patient. Dose-response parameters derived by a maximum likelihood fitting were used as a reference to evaluate their compatibility with the examined treatment methodologies. The impact of the parameter uncertainties on the dose-response curves is demonstrated. The clinical utilization of the radiobiological parameters is illustrated. The radiobiological models (relative seriality and linear Poisson) and the reference parameters are validated to prove their suitability in reproducing the treatment outcome pattern of the patient material studied (through the probability of finding a worse fit, area under the ROC curve and χ² test). The analysis was carried out for the upper 5 cm of the esophagus (proximal esophagus) where all the strictures are formed, and the total volume of AVM. The estimated confidence intervals of the dose-response curves appear to have a significant supporting role on their clinical implementation and use.
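
    Two of the validation statistics named in this abstract, the area under the ROC curve and a Pearson-type chi-square comparison of observed versus model-predicted complication counts, can be sketched as follows; the patient data are synthetic, and the binning and variance term are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch only: ROC AUC and a chi-square check of predicted complication probabilities.
import numpy as np
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
ntcp = rng.uniform(0.05, 0.6, size=85)    # model-predicted complication probabilities (synthetic)
outcome = rng.binomial(1, ntcp)           # 1 = complication observed (synthetic)

print("AUC:", roc_auc_score(outcome, ntcp))

# Group patients into bins of predicted risk and compare observed vs expected counts
bins = np.digitize(ntcp, [0.1, 0.2, 0.3, 0.4, 0.5])
stat = 0.0
for b in np.unique(bins):
    obs = outcome[bins == b].sum()
    exp = ntcp[bins == b].sum()
    n = (bins == b).sum()
    stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
p_value = chi2.sf(stat, df=len(np.unique(bins)) - 1)
print("chi-square:", stat, "p =", p_value)
```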

  3. Emergency Physician Risk Estimates and Admission Decisions for Chest Pain: A Web-Based Scenario Study.

    PubMed

    Schriger, David L; Menchine, Michael; Wiechmann, Warren; Carmelli, Guy

    2018-04-20

    We conducted this study to better understand how emergency physicians estimate risk and make admission decisions for patients with low-risk chest pain. We created a Web-based survey consisting of 5 chest pain scenarios that included history, physical examination, ECG findings, and basic laboratory studies, including a negative initial troponin-level result. We administered the scenarios in random order to emergency medicine residents and faculty at 11 US emergency medicine residency programs. We randomized respondents to receive questions about 1 of 2 endpoints, acute coronary syndrome or serious complication (death, dysrhythmia, or congestive heart failure within 30 days). For each scenario, the respondent provided a quantitative estimate of the probability of the endpoint, a qualitative estimate of the risk of the endpoint (very low, low, moderate, high, or very high), and an admission decision. Respondents also provided demographic information and completed a 3-item Fear of Malpractice scale. Two hundred eight (65%) of 320 eligible physicians completed the survey, 73% of whom were residents. Ninety-five percent of respondents were wholly consistent (no admitted patient was assigned a lower probability than a discharged patient). For individual scenarios, probability estimates covered at least 4 orders of magnitude; admission rates for scenarios varied from 16% to 99%. The majority of respondents (>72%) had admission thresholds at or below a 1% probability of acute coronary syndrome. Respondents did not fully differentiate the probability of acute coronary syndrome and serious outcome; for each scenario, estimates for the two were quite similar despite a serious outcome being far less likely. Raters used the terms "very low risk" and "low risk" only when their probability estimates were less than 1%. The majority of respondents considered any probability greater than 1% for acute coronary syndrome or serious outcome to be at least moderate risk and warranting admission. Physicians used qualitative terms in ways fundamentally different from how they are used in ordinary conversation, which may lead to miscommunication during shared decisionmaking processes. These data suggest that probability or utility models are inadequate to describe physician decisionmaking for patients with chest pain. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  4. A Computational Model for Biomechanical Effects of Arterial Compliance Mismatch

    PubMed Central

    He, Fan; Hua, Lu; Gao, Li-jian

    2015-01-01

    Background. Compliance mismatch is a negative factor and it needs to be considered in arterial bypass grafting. Objective. A computational model was employed to investigate the effects of arterial compliance mismatch on blood flow, wall stress, and deformation. Methods. The unsteady blood flow was assumed to be laminar, Newtonian, viscous, and incompressible. The vessel wall was assumed to be linear elastic, isotropic, and incompressible. The fluid-wall interaction scheme was constructed using the finite element method. Results. The results show that there are identical wall shear stress waveforms, wall stress, and strain waveforms at different locations. The comparison of the results demonstrates that wall shear stresses and wall strains are higher while wall stresses are lower at the more compliant section. The differences promote the probability of intimal thickening at some locations. Conclusions. The model is effective and gives satisfactory results. It could be extended to all kinds of arteries with complicated geometrical and material factors. PMID:27019580

  5. Comments on the present state and future directions of PDF methods

    NASA Technical Reports Server (NTRS)

    Obrien, E. E.

    1992-01-01

    The one point probability density function (PDF) method is examined in light of its use in actual engineering problems. The PDF method, although relatively complicated, appears to be the only format available to handle the nonlinear stochastic difficulties caused by typical reaction kinetics. Turbulence modeling, if it is to play a central role in combustion modeling, has to be integrated with the chemistry in a way which produces accurate numerical solutions to combustion problems. It is questionable whether the development of turbulent models in isolation from the peculiar statistics of reactant concentrations is a fruitful line of development as far as propulsion is concerned. There are three issues for which additional viewgraphs are prepared: the one point pdf method; the amplitude mapping closure; and a hybrid strategy for replacing a full two point pdf treatment of reacting flows by a single point pdf and correlation functions. An appeal is made for the establishment of an adequate data base for compressible flow with reactions for Mach numbers of unity or higher.

  6. Paraplegia after contrast media application: a transient or devastating rare complication? Case report.

    PubMed

    Mielke, Dorothee; Kallenberg, Kai; Hartmann, Marius; Rohde, Veit

    2016-05-01

    The authors report the case of a 76-year-old man with a spinal dural arteriovenous fistula. The patient suffered from sudden repeated reversible paraplegia after spinal digital subtraction angiography as well as CT angiography. Neurotoxicity of contrast media (CM) is the most probable cause for this repeated short-lasting paraplegia. Intolerance to toxicity of CM to the vulnerable spinal cord is rare, and probably depends on the individual patient. This phenomenon is transient and can occur after both intraarterial and intravenous CM application.

  7. [The vanadium compounds: chemistry, synthesis, insulinomimetic properties].

    PubMed

    Fedorova, E V; Buriakina, A V; Vorob'eva, N M; Baranova, N I

    2014-01-01

    The review considers the biological role of vanadium, its participation in various processes in humans and other mammals, and the anti-diabetic effect of its compounds. Vanadium salts have persistent hypoglycemic and antihyperlipidemic effects and reduce the probability of secondary complications in animals with experimental diabetes. The review contains a detailed description of all major synthesized vanadium complexes having antidiabetic activity. Currently, vanadium complexes with organic ligands are more effective and safer than the inorganic salts. Despite the proven efficacy of these compounds as anti-diabetic agents in animal models, only one organic complex of vanadium is currently in the second phase of clinical trials. All of the considered data suggest that vanadium compounds are a promising new class of drugs in modern pharmacotherapy of diabetes.

  8. Simulating the room-temperature dynamic motion of a ferromagnetic vortex in a bistable potential

    NASA Astrophysics Data System (ADS)

    Haber, E.; Badea, R.; Berezovsky, J.

    2018-05-01

    The ability to precisely and reliably control the dynamics of ferromagnetic (FM) vortices could lead to novel nonvolatile memory devices and logic gates. Intrinsic and fabricated defects in the FM material can pin vortices and complicate the dynamics. Here, we simulated switching a vortex between bistable pinning sites using magnetic field pulses. The dynamic motion was modeled with the Thiele equation for a massless, rigid vortex subject to room-temperature thermal noise. The dynamics were explored both at zero temperature and at room temperature. The probability of switching for different pulses was calculated, and the major features are explained using the basins of attraction map of the two pinning sites.

  9. Fracture Probability of MEMS Optical Devices for Space Flight Applications

    NASA Technical Reports Server (NTRS)

    Fettig, Rainer K.; Kuhn, Jonathan L.; Moseley, S. Harvey; Kutyrev, Alexander S.; Orloff, Jon

    1999-01-01

    A bending fracture test specimen design is presented for thin elements used in optical devices for space flight applications. The specimen design is insensitive to load position, avoids end effect complications, and can be used to measure strength of membranes less than 2 microns thick. The theoretical equations predicting stress at failure are presented, and a detailed finite element model is developed to validate the equations for this application. An experimental procedure using a focused ion beam machine is outlined, and results from preliminary tests of 1.9 microns thick single crystal silicon are presented. These tests are placed in the context of a methodology for the design and evaluation of mission critical devices comprised of large arrays of cells.

  10. Development of a prognostic nomogram for cirrhotic patients with upper gastrointestinal bleeding.

    PubMed

    Zhou, Yu-Jie; Zheng, Ji-Na; Zhou, Yi-Fan; Han, Yi-Jing; Zou, Tian-Tian; Liu, Wen-Yue; Braddock, Martin; Shi, Ke-Qing; Wang, Xiao-Dong; Zheng, Ming-Hua

    2017-10-01

    Upper gastrointestinal bleeding (UGIB) is a complication with a high mortality rate in critically ill patients presenting with cirrhosis. Today, there exist few accurate scoring models specifically designed for mortality risk assessment in critically ill cirrhotic patients with upper gastrointestinal bleeding (CICGIB). Our aim was to develop and evaluate a novel nomogram-based model specific for CICGIB. Overall, 540 consecutive CICGIB patients were enrolled. On the basis of Cox regression analyses, the nomogram was constructed to estimate the probability of 30-day, 90-day, 270-day, and 1-year survival. An upper gastrointestinal bleeding-chronic liver failure-sequential organ failure assessment (UGIB-CLIF-SOFA) score was derived from the nomogram. Performance assessment and internal validation of the model were performed using Harrell's concordance index (C-index), calibration plot, and bootstrap sample procedures. UGIB-CLIF-SOFA was also compared with other prognostic models, such as CLIF-SOFA and model for end-stage liver disease, using C-indices. Eight independent factors derived from Cox analysis (including bilirubin, creatinine, international normalized ratio, sodium, albumin, mean arterial pressure, vasopressin use, and hematocrit decrease >10%) were assembled into the nomogram and the UGIB-CLIF-SOFA score. The calibration plots showed optimal agreement between nomogram prediction and actual observation. The C-index of the nomogram using bootstrap (0.729; 95% confidence interval: 0.689-0.766) was higher than that of the other models for predicting survival of CICGIB. We have developed and internally validated a novel nomogram and an easy-to-use scoring system that accurately predicts the mortality probability of CICGIB on the basis of eight easy-to-obtain parameters. External validation is now warranted in future clinical studies.
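
    A minimal sketch of fitting a Cox model and checking Harrell's C-index, in the spirit of the nomogram construction described above; it uses the lifelines package, and the file and column names are hypothetical rather than the published model.

```python
# Sketch only: Cox proportional-hazards fit plus discrimination check.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cicgib.csv")   # hypothetical: survival time, event flag, and predictors

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="death")
print(cph.summary)                         # hazard ratios for each predictor
print("C-index:", cph.concordance_index_)  # discrimination, cf. the reported 0.729

# Predicted survival probability at 90 days for the first patient
surv = cph.predict_survival_function(df.iloc[[0]], times=[90])
print(surv)
```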

  11. Robotic versus open pediatric ureteral reimplantation: Costs and complications from a nationwide sample.

    PubMed

    Kurtz, Michael P; Leow, Jeffrey J; Varda, Briony K; Logvinenko, Tanya; Yu, Richard N; Nelson, Caleb P; Chung, Benjamin I; Chang, Steven L

    2016-12-01

    We sought to compare complications and direct costs for open ureteral reimplantation (OUR) versus robot-assisted laparoscopic ureteral reimplantation (RALUR) in a sample of hospitals performing both procedures. Anecdotal reports suggest that use of RALUR is increasing, but little is known of the outcomes and costs nationwide. The aim was to determine the costs and 90-day complications (of any Clavien grade) in a nationwide cohort of pediatric patients undergoing OUR or RALUR. Using the Premier Hospital Database we identified pediatric patients (age < 21 years) who underwent ureteral reimplantation from 2003 to 2013. We compared 90-day complication rates and cost data for RALUR versus OUR using descriptive statistics and hierarchical models. We identified 17 hospitals in which both RALUR and OURs were performed, resulting in a cohort of 1494 OUR and 108 RALUR cases. The median operative time was 232 min for RALUR vs. 180 min for OUR (p = 0.0041). Incidence of any 90-day complications was higher in the RALUR group: 13.0% of RALUR vs. 4.5% of OUR (OR = 3.17, 95% CI: 1.46-6.91, p = 0.0037). The difference remained significant in a multivariate model accounting for clustering among hospitals and surgeons (OR, 3.14; 95% CI, 1.46-6.75; p = 0.0033) (Figure). The median hospital cost for OUR was $7273 versus $9128 for RALUR (p = 0.0499), and the difference persisted in multivariate analysis (p = 0.0043). Fifty-one percent (55/108) of the RALUR cases occurred in 2012-2013. We present the first nationwide sample comparing RALUR and OUR in the pediatric population. There is currently wide variation in the probability of complication reported in the literature. Some variability may be due to differential uptake and experience among centers as they integrate a new procedure into their practice, while some may be due to reporting bias. A strength of the current study is that cost and 90-day postoperative complication data are collected at participating hospitals irrespective of outcomes, providing some immunity from the reporting bias to which individual center surgical series' may be susceptible. Compared with OUR, RALUR was associated with a significantly higher rate of complications as well as higher direct costs even when adjusted for demographic and regional factors. These findings suggest that RALUR should be implemented with caution, particularly at sites with limited robotic experience, and that outcomes for these procedures should be carefully and systematically tracked. Copyright © 2016 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  12. EUCLID: an outcome analysis tool for high-dimensional clinical studies

    NASA Astrophysics Data System (ADS)

    Gayou, Olivier; Parda, David S.; Miften, Moyed

    2007-03-01

    Treatment management decisions in three-dimensional conformal radiation therapy (3DCRT) and intensity-modulated radiation therapy (IMRT) are usually made based on the dose distributions in the target and surrounding normal tissue. These decisions may include, for example, the choice of one treatment over another and the level of tumour dose escalation. Furthermore, biological predictors such as tumour control probability (TCP) and normal tissue complication probability (NTCP), whose parameters available in the literature are only population-based estimates, are often used to assess and compare plans. However, a number of other clinical, biological and physiological factors also affect the outcome of radiotherapy treatment and are often not considered in the treatment planning and evaluation process. A statistical outcome analysis tool, EUCLID, for direct use by radiation oncologists and medical physicists was developed. The tool builds a mathematical model to predict an outcome probability based on a large number of clinical, biological, physiological and dosimetric factors. EUCLID can first analyse a large set of patients, such as from a clinical trial, to derive regression correlation coefficients between these factors and a given outcome. It can then apply such a model to an individual patient at the time of treatment to derive the probability of that outcome, allowing the physician to individualize the treatment based on medical evidence that encompasses a wide range of factors. The software's flexibility allows the clinicians to explore several avenues to select the best predictors of a given outcome. Its link to record-and-verify systems and data spreadsheets allows for a rapid and practical data collection and manipulation. A wide range of statistical information about the study population, including demographics and correlations between different factors, is available. A large number of one- and two-dimensional plots, histograms and survival curves allow for an easy visual analysis of the population. Several visual and analytical methods are available to quantify the predictive power of the multivariate regression model. The EUCLID tool can be readily integrated with treatment planning and record-and-verify systems.

  13. EUCLID: an outcome analysis tool for high-dimensional clinical studies.

    PubMed

    Gayou, Olivier; Parda, David S; Miften, Moyed

    2007-03-21

    Treatment management decisions in three-dimensional conformal radiation therapy (3DCRT) and intensity-modulated radiation therapy (IMRT) are usually made based on the dose distributions in the target and surrounding normal tissue. These decisions may include, for example, the choice of one treatment over another and the level of tumour dose escalation. Furthermore, biological predictors such as tumour control probability (TCP) and normal tissue complication probability (NTCP), whose parameters available in the literature are only population-based estimates, are often used to assess and compare plans. However, a number of other clinical, biological and physiological factors also affect the outcome of radiotherapy treatment and are often not considered in the treatment planning and evaluation process. A statistical outcome analysis tool, EUCLID, for direct use by radiation oncologists and medical physicists was developed. The tool builds a mathematical model to predict an outcome probability based on a large number of clinical, biological, physiological and dosimetric factors. EUCLID can first analyse a large set of patients, such as from a clinical trial, to derive regression correlation coefficients between these factors and a given outcome. It can then apply such a model to an individual patient at the time of treatment to derive the probability of that outcome, allowing the physician to individualize the treatment based on medical evidence that encompasses a wide range of factors. The software's flexibility allows the clinicians to explore several avenues to select the best predictors of a given outcome. Its link to record-and-verify systems and data spreadsheets allows for a rapid and practical data collection and manipulation. A wide range of statistical information about the study population, including demographics and correlations between different factors, is available. A large number of one- and two-dimensional plots, histograms and survival curves allow for an easy visual analysis of the population. Several visual and analytical methods are available to quantify the predictive power of the multivariate regression model. The EUCLID tool can be readily integrated with treatment planning and record-and-verify systems.

  14. The Effect of Resident Involvement in Pelvic Prolapse Surgery: A Retrospective Study From a Nationwide Inpatient Sample.

    PubMed

    Caveney, Maxx; Matthews, Catherine; Mirzazadeh, Majid

    The primary aim of this study was to assess the effect of resident involvement on perioperative complication rates in pelvic organ prolapse surgery using the National Surgical Quality Improvement database. All pelvic organ prolapse operations from 2006 to 2012 were identified and dichotomized by resident participation. Preoperative characteristics and 30-day perioperative outcomes were compared using the χ² and Student t tests. To control for nonrandomization of cases, propensity scores representing the probability of resident involvement as a function of a case's comorbidities were calculated. They were then divided into quartiles, and because of equal probabilities for the first and second quartiles, 3 groups were created (Q1/2, Q3, and Q4), followed by substratification and analysis. As a control, complications of transurethral resection of prostate and nephrectomy were dichotomized by resident involvement. We identified 2637 cases. Resident involvement was associated with increased postoperative urinary tract infections, perioperative complications, and procedure length. After stratification by propensity scoring, the following unique findings occurred in each group: in the first group, resident involvement was associated with increased rates of readmission, pulmonary embolism, and sepsis; in the second and third groups, resident involvement was associated with increased rates of superficial surgical site infection. Resident involvement in nephrectomy was associated with increased perioperative complications and procedural length. In prostate resection, increased procedure lengths and decreased postoperative length of stay were observed. Resident involvement in pelvic organ prolapse surgery was associated with an increased risk of adverse outcomes. A similar effect was seen with nephrectomy but not with a simpler endoscopic urologic procedure.
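
    The propensity-score stratification described above can be sketched as follows: model the probability of resident involvement from comorbidities, then bin cases into score quantiles before comparing outcomes. The dataset and all column names are hypothetical placeholders.

```python
# Sketch only: propensity-score estimation and quantile stratification.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("prolapse_cases.csv")                 # hypothetical dataset
covars = ["age", "diabetes", "copd", "hypertension"]   # comorbidities (placeholders)

ps_model = sm.Logit(df["resident_involved"], sm.add_constant(df[covars])).fit(disp=0)
df["pscore"] = ps_model.predict(sm.add_constant(df[covars]))

# Quartiles of the propensity score; strata with indistinguishable scores are merged
df["stratum"] = pd.qcut(df["pscore"], q=4, labels=False, duplicates="drop")

# Compare complication rates by resident involvement within each stratum
print(df.groupby(["stratum", "resident_involved"])["complication"].mean())
```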

  15. Outcomes following total laryngectomy for squamous cell carcinoma: one centre experience.

    PubMed

    Leong, S C; Kartha, S-S; Kathan, C; Sharp, J; Mortimore, S

    2012-12-01

    To evaluate the clinical outcomes of total laryngectomy (TL), complications and factors affecting survival. Retrospective review of hospital electronic database for head and neck squamous cell carcinoma (SCCa). Large district general hospital in England, United Kingdom. Patients who had TL between January 1994 and January 2008. 5-year disease specific survival (DSS) and disease-free survival (DFS). Seventy-one patients were reviewed, of whom 38 (54%) had laryngeal SCCa and 33 (46%) hypopharyngeal SCCa. The overall mean survival period following TL was 42.4 months. The 5-year DSS and DFS were better for laryngeal SCCa compared to hypopharyngeal SCCa, although not statistically significant (P=0.090, P=0.54 respectively). Patients treated for laryngeal SCCa had a mean survival period of 47.5 months compared to 36.5 months for hypopharyngeal disease. Those who had laryngeal recurrence after primary radiotherapy (RT) demonstrated statistically better survival probability than those who had hypopharyngeal recurrence (P=0.011). Patients without cervical lymphadenopathy had statistically better survival (P=0.049). The most common early complication was related to the cardiorespiratory system. One fatal complication of erosion of the brachiocephalic artery due to the laryngectomy tube was noted. The most common late complication was neopharyngeal stenosis. The commonest cause of death was locoregional recurrence, followed by medical co-morbidities. Patients referred to a specialised head and neck clinic had a better survival probability than those referred to a general ENT clinic (P=0.37). While there is an increasing tendency towards laryngeal conservation, total laryngectomy remains a robust treatment option in selected patients. Copyright © 2012. Published by Elsevier Masson SAS.

  16. Intraoperative and early postoperative complications of manual sutureless cataract extraction.

    PubMed

    Iqbal, Yasir; Zia, Sohail; Baig Mirza, Aneeq Ullah

    2014-04-01

    To determine the intraoperative and early postoperative complications of manual sutureless cataract extraction. Case series. Redo Eye Hospital, Rawalpindi, Pakistan, from January 2009 to December 2010. Three hundred cataract patients were selected through purposive non-probability sampling. The patients underwent manual sutureless cataract surgery (MSCS) by a single experienced surgeon and intraoperative complications were documented. The surgical technique was modified to deal with any intraoperative complications accordingly. Patients were examined on the first postoperative day and on the first postoperative week for any postoperative complications. The data was entered in Statistical Package for Social Sciences (SPSS) version 13.0 and the results were calculated in frequencies. Among the 300 cases, 81.3% of surgeries were uneventful whereas 18.6% had some complication. The common intraoperative complications were superior button-hole formation in 5%; posterior capsular rent in 5% and premature entry with iris prolapse in 3% of cases. Postoperatively, the commonly encountered complications were striate keratopathy in 9.6% and hyphema in 9%. At the first-week follow-up, 4% had striate keratopathy and 0.6% had hyphema. Striate keratopathy resolved with topical medication on subsequent follow-up. A total of 9 cases (3%) underwent a second surgery: 2 cases for lens matter wash, 2 cases for hyphema and 5 cases needed suturing of wound for shallow anterior chamber due to wound leak. Superior button-hole formation, posterior capsular rent and premature entry were the common intraoperative complications of MSCS whereas the common early postoperative complications were striate keratopathy and hyphema.

  17. Evaluating impacts using a BACI design, ratios, and a Bayesian approach with a focus on restoration.

    PubMed

    Conner, Mary M; Saunders, W Carl; Bouwes, Nicolaas; Jordan, Chris

    2015-10-01

    Before-after-control-impact (BACI) designs are an effective method to evaluate natural and human-induced perturbations on ecological variables when treatment sites cannot be randomly chosen. While effect sizes of interest can be tested with frequentist methods, using Bayesian Markov chain Monte Carlo (MCMC) sampling methods, probabilities of effect sizes, such as a ≥20% increase in density after restoration, can be directly estimated. Although BACI and Bayesian methods are used widely for assessing natural and human-induced impacts for field experiments, the application of hierarchical Bayesian modeling with MCMC sampling to BACI designs is less common. Here, we combine these approaches and extend the typical presentation of results with an easy-to-interpret ratio, which provides an answer to the main study question: "How much impact did a management action or natural perturbation have?" As an example of this approach, we evaluate the impact of a restoration project, which implemented beaver dam analogs, on survival and density of juvenile steelhead. Results indicated the probabilities of a ≥30% increase were high for survival and density after the dams were installed, 0.88 and 0.99, respectively, while probabilities for a higher increase of ≥50% were variable, 0.17 and 0.82, respectively. This approach demonstrates a useful extension of Bayesian methods that can easily be generalized to other study designs from simple (e.g., single factor ANOVA, paired t test) to more complicated block designs (e.g., crossover, split-plot). This approach is valuable for estimating the probabilities of restoration impacts or other management actions.
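
    The ratio summary reported above can be illustrated with a short sketch: given posterior samples of a response before and after treatment (e.g. from MCMC), the probability of a given percentage increase is simply the fraction of draws in which the after/before ratio exceeds the threshold. The samples below are synthetic, not the study's posteriors.

```python
# Sketch only: posterior probability of a >= X% increase from paired posterior draws.
import numpy as np

rng = np.random.default_rng(42)
density_before = rng.lognormal(mean=0.0, sigma=0.15, size=10_000)   # synthetic posterior draws
density_after = rng.lognormal(mean=0.35, sigma=0.15, size=10_000)   # synthetic posterior draws

ratio = density_after / density_before
for thresh in (1.2, 1.3, 1.5):
    prob = np.mean(ratio >= thresh)
    print(f"P(increase >= {int((thresh - 1) * 100)}%) = {prob:.2f}")
```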

  18. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using the observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
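
    A minimal sketch of a bivariate joint return period built from fitted marginals and a Gumbel copula, in the spirit of the analysis above; the marginal distributions, dependence parameter, and mean interarrival time are illustrative assumptions, not the paper's fitted values.

```python
# Sketch only: joint ("AND") return period for wind speed and duration via a Gumbel copula.
import numpy as np
from scipy import stats

def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Assumed marginals for maximum wind speed (m/s) and duration (h)
wind = stats.gumbel_r(loc=18.0, scale=4.0)
duration = stats.gamma(a=2.0, scale=3.0)
theta = 1.8          # Gumbel copula dependence parameter (illustrative)
mu = 19.0 / 79.0     # mean interarrival time in years (79 events in 19 years)

x, y = 25.0, 12.0    # event of interest: wind >= 25 m/s AND duration >= 12 h
u, v = wind.cdf(x), duration.cdf(y)
p_joint_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
print("Joint ('AND') return period:", mu / p_joint_exceed, "years")
```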

  19. Transport of chromium and selenium in the suboxic zone of a shallow aquifer: Influence of redox and adsorption reactions

    USGS Publications Warehouse

    Kent, D.B.; Davis, J.A.; Anderson, L.C.D.; Rea, B.A.; Waite, T.D.

    1994-01-01

    Breakthrough of Cr(VI) (chromate), Se(VI) (selenate), and O2 (dissolved oxygen) was observed in tracer tests conducted in a shallow, sand and gravel aquifer with mildly reducing conditions. Loss of Cr, probably due to reduction of Cr(VI) to Cr(III) and irreversible sorption of Cr(III), occurred along with slight retardation of Cr(VI), owing to reversible sorption. Reduction of Se(VI) and O2 was thermodynamically feasible but did not occur, indicating conditions were unfavorable to microbial reduction. Cr(VI) reduction by constituents of aquifer sediments did not achieve local equilibrium during transport. The reduction rate was probably limited by incomplete contact between Cr(VI) transported along predominant flow paths and reductants located in regions within aquifer sediments of comparatively low permeability. Scatter in the amount of Cr reduction calculated from individual breakthrough curves at identical distances downgradient probably resulted from heterogeneities in the distribution of reductants in the sediments. Predictive modeling of the transport and fate of redox-sensitive solutes cannot be based strictly on thermodynamic considerations; knowledge of reaction rates is critical. Potentially important mass transfer rate limitations between solutes and reactants in sediments as well as heterogeneities in the distribution of redox properties in aquifers complicate determination of limiting rates for use in predictive simulations of the transport of redox-sensitive contaminants in groundwater.

  20. Screening for Chlamydia trachomatis: a systematic review of the economic evaluations and modelling

    PubMed Central

    Roberts, T E; Robinson, S; Barton, P; Bryan, S; Low, N

    2006-01-01

    Objective: To review, systematically and critically, the evidence used to derive estimates of costs and cost effectiveness of chlamydia screening. Methods: Systematic review. A search of 11 electronic bibliographic databases from the earliest date available to August 2004 using keywords including chlamydia, pelvic inflammatory disease, economic evaluation, and cost. We included studies of chlamydia screening in males and/or females over 14 years, including studies of diagnostic tests, contact tracing, and treatment as part of a screening programme. Outcomes included cases of chlamydia identified and major outcomes averted. We assessed methodological quality and the modelling approach used. Results: Of 713 identified papers, we included 57 formal economic evaluations and two cost studies. Most studies found chlamydia screening to be cost effective, partner notification to be an effective adjunct, and testing with nucleic acid amplification tests, and treatment with azithromycin to be cost effective. Methodological problems limited the validity of these findings: most studies used static models that are inappropriate for infectious diseases; restricted outcomes were used as a basis for policy recommendations; and high estimates of the probability of chlamydia-associated complications might have overestimated cost effectiveness. Two high quality dynamic modelling studies found opportunistic screening to be cost effective, but poor reporting or uncertainty about complication rates makes interpretation difficult. Conclusion: The inappropriate use of static models to study interventions to prevent a communicable disease means that uncertainty remains about whether chlamydia screening programmes are cost effective or not. The results of this review can be used by health service managers in the allocation of resources, and health economists and other researchers who are considering further research in this area. PMID:16731666

  1. Principal Score Methods: Assumptions, Extensions, and Practical Considerations

    ERIC Educational Resources Information Center

    Feller, Avi; Mealli, Fabrizia; Miratrix, Luke

    2017-01-01

    Researchers addressing posttreatment complications in randomized trials often turn to principal stratification to define relevant assumptions and quantities of interest. One approach for the subsequent estimation of causal effects in this framework is to use methods based on the "principal score," the conditional probability of belonging…

  2. Mechanical Parametric Oscillations and Waves

    ERIC Educational Resources Information Center

    Dittrich, William; Minkin, Leonid; Shapovalov, Alexander S.

    2013-01-01

    Usually parametric oscillations are not the topic of general physics courses. Probably it is because the mathematical theory of this phenomenon is relatively complicated, and until quite recently laboratory experiments for students were difficult to implement. However parametric oscillations are good illustrations of the laws of physics and can be…

  3. Economic considerations of breeding for polled dairy cows versus dehorning in the United States

    USDA-ARS?s Scientific Manuscript database

    Dairy producers today face labor, equipment, and medical costs associated with dehorning heifers. Further, complications requiring veterinary intervention occur with some probability. The objective of this work is to develop preliminary cost estimates of selecting for polled dairy heifers. Stochasti...

  4. Scientific assessment of accuracy, skill and reliability of ocean probabilistic forecast products.

    NASA Astrophysics Data System (ADS)

    Wei, M.; Rowley, C. D.; Barron, C. N.; Hogan, P. J.

    2016-02-01

    As ocean operational centers are increasingly adopting and generating probabilistic forecast products for their customers with valuable forecast uncertainties, how to assess and measure these complicated probabilistic forecast products objectively is challenging. The first challenge is how to deal with the huge amount of data from the ensemble forecasts. The second one is how to describe the scientific quality of probabilistic products. In fact, accuracy, skill, reliability, and resolution are different attributes of a probabilistic forecast system. We briefly introduce some of the fundamental metrics such as the Reliability Diagram, Reliability, Resolution, Brier Score (BS), Brier Skill Score (BSS), Ranked Probability Score (RPS), Ranked Probability Skill Score (RPSS), Continuous Ranked Probability Score (CRPS), and Continuous Ranked Probability Skill Score (CRPSS). The values and significance of these metrics are demonstrated for the forecasts from the US Navy's regional ensemble system with different ensemble members. The advantages and differences of these metrics are studied and clarified.
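
    Two of the verification metrics listed above, the Brier score and Brier skill score, are straightforward to compute for probabilistic forecasts of a binary event; the sketch below uses synthetic forecasts and a climatological reference.

```python
# Sketch only: Brier score and Brier skill score for binary-event probability forecasts.
import numpy as np

rng = np.random.default_rng(1)
p_forecast = rng.uniform(0.0, 1.0, size=500)      # ensemble-derived event probabilities (synthetic)
observed = rng.binomial(1, p_forecast)            # 1 = event occurred (synthetic)

brier = np.mean((p_forecast - observed) ** 2)

# Reference forecast: climatological event frequency
p_clim = observed.mean()
brier_ref = np.mean((p_clim - observed) ** 2)
bss = 1.0 - brier / brier_ref                     # > 0 means skill over climatology

print(f"BS = {brier:.3f}, BS_ref = {brier_ref:.3f}, BSS = {bss:.3f}")
```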

  5. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    NASA Astrophysics Data System (ADS)

    James, P.

    2011-12-01

    With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface parameters. Cavities are hence a limitation to certain redevelopment and their detection is an ever important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new robust investigation standard in the detection of cavities is sought and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been utilised successfully in the detection of cavities in various geologies, but still have an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are inclined to rely on well-known techniques rather than utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, together with the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near surface cavity shapes are modelled including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters. Techniques can be compared and the limits of detection distance assessed. The density of survey points required to achieve a required probability of detection can be calculated. The software aids discriminate choice of technique, improves survey design, and increases the likelihood of survey success, all factors sought in the engineering industry. As a simple example, the response from magnetometry, gravimetry, and gravity gradient techniques above an example 3m deep, 1m cube air cavity in limestone across a 15m grid was calculated. The maximum responses above the cavity are small (amplitudes of 0.018nT, 0.0013mGal, 8.3eotvos respectively), but at typical site noise levels the detection reliability is over 50% for the gradient gravity method on a single survey line. Increasing the number of survey points across the site increases the reliability of detection of the anomaly by the addition of probabilities. We can calculate the probability of detection at different profile spacings to assess the best possible survey design. At 1m spacing the overall probability of detection by the gradient gravity method is over 90%, and over 60% for magnetometry (at 3m spacing the probability drops to 32%). The use of modelling in near surface surveys is a useful tool to assess the feasibility of a range of techniques to detect subtle signals. Future work will integrate this work with borehole measured parameters.
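
    The detection-probability idea described above can be sketched by computing a gravity anomaly profile over a small cavity (approximated here as a point-mass deficit) and combining per-station detection probabilities under survey noise; the geometry, density contrast, noise level, and threshold rule are illustrative assumptions, not the software's actual implementation.

```python
# Sketch only: gravity anomaly of a small cavity and combined detection probability.
import numpy as np
from scipy.stats import norm

G = 6.674e-11                       # m^3 kg^-1 s^-2
depth = 3.0                         # m, depth to cavity centre (illustrative)
side = 1.0                          # m, cube side (illustrative)
d_rho = -2600.0                     # kg/m^3, air cavity in limestone (assumed contrast)
mass_deficit = d_rho * side ** 3

x = np.arange(-7.5, 7.6, 1.0)       # survey stations at 1 m spacing along a profile
gz = G * mass_deficit * depth / (x ** 2 + depth ** 2) ** 1.5   # vertical gravity, m/s^2
gz_mgal = np.abs(gz) * 1e5          # 1 mGal = 1e-5 m/s^2

noise_mgal = 0.005                  # assumed repeatability of a microgravity survey
# Per-station probability that signal plus noise exceeds a 2-sigma threshold
p_station = norm.sf(2.0 - gz_mgal / noise_mgal)
p_overall = 1.0 - np.prod(1.0 - p_station)
print(f"Peak anomaly: {gz_mgal.max():.4f} mGal, overall detection probability: {p_overall:.2f}")
```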

  6. Predicting medical complications after spine surgery: a validated model using a prospective surgical registry.

    PubMed

    Lee, Michael J; Cizik, Amy M; Hamilton, Deven; Chapman, Jens R

    2014-02-01

    The possibility and likelihood of a postoperative medical complication after spine surgery undoubtedly play a major role in the decision making of the surgeon and patient alike. Although prior study has determined relative risk and odds ratio values to quantify risk factors, these values may be difficult to translate to the patient during counseling of surgical options. Ideally, a model that predicts absolute risk of medical complication, rather than relative risk or odds ratio values, would greatly enhance the discussion of safety of spine surgery. To date, there is no risk stratification model that specifically predicts the risk of medical complication. The purpose of this study was to create and validate a predictive model for the risk of medical complication during and after spine surgery. Statistical analysis using a prospective surgical spine registry that recorded extensive demographic, surgical, and complication data. Outcomes examined are medical complications that were specifically defined a priori. This analysis is a continuation of statistical analysis of our previously published report. Using a prospectively collected surgical registry of more than 1,476 patients with extensive demographic, comorbidity, surgical, and complication detail recorded for 2 years after surgery, we previously identified several risk factors for medical complications. Using the beta coefficients from those log binomial regression analyses, we created a model to predict the occurrence of medical complication after spine surgery. We split our data into two subsets for internal and cross-validation of our model. We created two predictive models: one predicting the occurrence of any medical complication and the other predicting the occurrence of a major medical complication. The final predictive model for any medical complications had an area under the receiver operating characteristic curve of 0.76, considered to be a fair measure. The final predictive model for any major medical complications had an area under the receiver operating characteristic curve of 0.81, considered to be a good measure. The final model has been uploaded for use on SpineSage.com. We present a validated model for predicting medical complications after spine surgery. The value in this model is that it gives the user an absolute percent likelihood of complication after spine surgery based on the patient's comorbidity profile and invasiveness of surgery. Patients are far more likely to understand an absolute percentage, rather than relative risk and confidence interval values. A model such as this is of paramount importance in counseling patients and enhancing the safety of spine surgery. In addition, a tool such as this can be of great use particularly as health care trends toward pay-for-performance, quality metrics, and risk adjustment. To facilitate the use of this model, we have created a website (SpineSage.com) where users can enter patient data to determine likelihood of medical complications after spine surgery. Copyright © 2014 Elsevier Inc. All rights reserved.
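
    Turning regression coefficients into an absolute risk estimate, as the calculator described above does, can be sketched for a log-binomial model as follows; the coefficients and predictor names are made-up placeholders, not the published model.

```python
# Sketch only: absolute risk from log-binomial regression coefficients.
import math

beta = {
    "intercept": -3.2,
    "age_over_65": 0.45,
    "diabetes": 0.38,
    "invasiveness_index": 0.06,   # per unit of surgical invasiveness (illustrative)
}

def predicted_risk(patient):
    """Log-binomial model: log(risk) = Xb, so risk = exp(Xb)."""
    xb = beta["intercept"] + sum(beta[k] * v for k, v in patient.items())
    return min(math.exp(xb), 1.0)   # clipped at 1 for illustration

patient = {"age_over_65": 1, "diabetes": 1, "invasiveness_index": 12}
print(f"Predicted probability of any medical complication: {predicted_risk(patient):.1%}")
```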

  7. Current techniques and outcomes in hysteroscopic sterilization: current evidence, considerations, and complications with hysteroscopic sterilization micro inserts.

    PubMed

    Casey, James; Cedo-Cintron, Laura; Pearce, Jessica; Yunker, Amanda

    2017-08-01

    To describe the current data regarding effectiveness, complications, postoperative evaluation, and surgical interventions associated with Essure hysteroscopic sterilization. Hysteroscopic sterilization is a commonly performed procedure that is offered as a well tolerated, effective, outpatient method of permanent sterilization. Over the past several years, concerns have been raised regarding correct placement and postoperative complications. This has led to statements by both the Food and Drug Administration (FDA) in October 2016 and the American Association of Gynecologic Laparoscopists in February 2017, as a significant portion of women seek removal of these devices. A current black-box warning issued by the FDA in 2016 recommends discussion of 'the probabilities of rates or events' of adverse outcomes associated with Essure placement. Although hysteroscopic sterilization is usually a safe, effective option for permanent contraception, new evidence regarding complications has emphasized the need for proper education and counseling. Appropriate patient selection and knowledge of potential complications is paramount to ensuring patients and medical providers are well informed and have realistic expectations regarding potential placement and postoperative issues.

  8. Visual Prognosis and Ocular Complications in Herpetic versus HLA-B27- or Ankylosing Spondylitis-associated Anterior Uveitis.

    PubMed

    Hoeksema, Lisette; Los, Leonoor I

    2016-06-01

    To investigate the visual prognosis and ocular complications in patients with herpetic versus HLA-B27-associated anterior uveitis (AU). This was a retrospective, observational study conducted at the ophthalmology department of the University Medical Center of Groningen. Sixty-two herpetic and 113 HLA-B27-associated AU patients were included. The main outcome measures were visual acuity and ocular complications. Visual acuity over time was significantly lower in herpetic as compared to HLA-B27 AU, mainly due to corneal scarring. The incidence rate of any ocular complication was higher in herpetic AU compared to HLA-B27-associated AU (0.140/EY versus 0.076/EY, p < 0.001), which was mainly due to glaucoma (0.033/EY versus 0.004/EY, p < 0.001) and cataract (0.059/EY versus 0.023/EY, p < 0.001). The most prominent finding was a worse visual prognosis in herpetic AU, which is probably related to a higher prevalence of corneal scarring and glaucoma. In addition, herpetic AU patients have more ocular complications overall.

  9. [Polycystic ovary syndrome: an example of obesity-related cardiovascular complication affecting young women].

    PubMed

    Orio, Francesco; Cascella, Teresa; Giallauria, Francesco; Palomba, Stefano; De Lorenzo, Anna; Lucci, Rosa; Ambrosino, Elena; Lombardi, Gaetano; Colao, Annamaria; Vigorito, Carlo

    2006-03-01

    Polycystic ovary syndrome (PCOS) is a good example of an obesity-related cardiovascular complication affecting young women. PCOS is not only considered a reproductive problem but rather represents a complex endocrine, multifaceted syndrome with important health implications. Several lines of evidence suggest an increased risk of cardiovascular disease associated with this syndrome, characterized by an impairment of heart structure and function, endothelial dysfunction and lipid abnormalities. All these features, probably linked to insulin resistance, are often present in obese PCOS patients. Cardiovascular abnormalities represent important long-term sequelae of PCOS that need further investigation.

  10. Analysis of Electronic Densities and Integrated Doses in Multiform Glioblastomas Stereotactic Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baron-Aznar, C.; Moreno-Jimenez, S.; Celis, M. A.

    2008-08-11

    Integrated dose is the total energy delivered in a radiotherapy target. This physical parameter could be a predictor for complications such as brain edema and radionecrosis after stereotactic radiotherapy treatments for brain tumors. Integrated dose depends on the tissue density and volume. Using patient CT images from the National Institute of Neurology and Neurosurgery and BrainScan© software, this work presents the mean density of 21 multiform glioblastomas, comparative results for normal tissue and estimated integrated dose for each case. The relationship between integrated dose and the probability of complications is discussed.
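
    The integrated-dose definition used above (total energy imparted equals the sum over voxels of dose times voxel mass) can be written in a few lines; the dose grid, density value, and voxel size below are illustrative placeholders.

```python
# Sketch only: integrated dose (energy in joules) from a dose grid and a density grid.
import numpy as np

dose_gy = np.full((40, 40, 30), 2.0)          # dose grid in Gy (J/kg), synthetic
density = np.full((40, 40, 30), 1.04e3)       # tissue density in kg/m^3 (approx. brain)
voxel_volume = (2e-3) ** 3                    # 2 mm isotropic voxels, in m^3

integrated_dose_joules = np.sum(dose_gy * density * voxel_volume)
print(f"Integrated dose: {integrated_dose_joules:.3f} J")
```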

  11. Cost effectiveness of mirabegron compared with tolterodine extended release for the treatment of adults with overactive bladder in the United Kingdom.

    PubMed

    Aballéa, Samuel; Maman, Khaled; Thokagevistk, Katia; Nazir, Jameel; Odeyemi, Isaac A O; Hakimi, Zalmai; Garnham, Andy; Toumi, Mondher

    2015-02-01

    Overactive bladder (OAB) is highly prevalent and is associated with considerable morbidity and reduced health-related quality of life. β3-adrenergic receptor (β3-AR) stimulation is a novel alternative to antimuscarinic therapy for OAB. The objective of this analysis was to assess the cost effectiveness of the β3-AR agonist mirabegron relative to tolterodine extended release (ER) in patients with OAB from a UK National Health Service (NHS) perspective. A Markov model was developed to simulate the management, course of disease, and effect of complications in OAB patients over a period of 5 years. Transition probabilities for symptom severity levels and probabilities of adverse events were estimated from the results of the randomised, double-blind SCORPIO trial in 1,987 patients with OAB. Other model inputs were derived from the literature and on assumptions based on clinical experience. Total 5-year costs per patient were £1,645.62 for mirabegron 50 mg/day and £1,607.75 for tolterodine ER 4 mg/day. Mirabegron was associated with a gain of 0.009 quality-adjusted life-years (QALYs) with an additional cost of £37.88. The resulting incremental cost-effectiveness ratio (ICER) was £4,386/QALY gained. In deterministic sensitivity analyses in the general OAB population and several subgroups, ICERs remained below the generally accepted willingness-to-pay (WTP) threshold of £20,000/QALY gained. The probability of mirabegron 50 mg being cost effective relative to tolterodine ER 4 mg was 89.4 % at the same WTP threshold. Mirabegron 50 mg/day is likely to be cost effective compared with tolterodine ER 4 mg/day for adult patients with OAB from a UK NHS perspective.
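
    A minimal sketch of a Markov cohort comparison yielding an ICER, in the spirit of the model described above; the states, transition probabilities, per-cycle costs, and utilities are invented for illustration and do not reproduce the published model inputs.

```python
# Sketch only: Markov cohort simulation for two treatments and the resulting ICER.
import numpy as np

cycles = 60                                      # monthly cycles over 5 years

def run(transition, cost_per_cycle, utility_per_cycle):
    cohort = np.array([0.2, 0.5, 0.3])           # starting split over mild/moderate/severe states
    total_cost, total_qaly = 0.0, 0.0
    for _ in range(cycles):
        total_cost += cohort @ cost_per_cycle
        total_qaly += (cohort @ utility_per_cycle) / 12.0   # monthly QALY fraction
        cohort = cohort @ transition
    return total_cost, total_qaly

t_mirabegron = np.array([[0.85, 0.12, 0.03],
                         [0.20, 0.70, 0.10],
                         [0.05, 0.25, 0.70]])
t_tolterodine = np.array([[0.82, 0.14, 0.04],
                          [0.17, 0.71, 0.12],
                          [0.04, 0.24, 0.72]])

c_mir, q_mir = run(t_mirabegron, np.array([30.0, 34.0, 40.0]), np.array([0.85, 0.78, 0.70]))
c_tol, q_tol = run(t_tolterodine, np.array([28.0, 32.0, 38.0]), np.array([0.85, 0.78, 0.70]))
icer = (c_mir - c_tol) / (q_mir - q_tol)
print(f"Incremental cost £{c_mir - c_tol:.2f}, incremental QALYs {q_mir - q_tol:.4f}, "
      f"ICER £{icer:,.0f}/QALY")
```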

  12. A Guide to the Gynoecium

    ERIC Educational Resources Information Center

    Burrows, G. E.

    2010-01-01

    Understanding flower structure is important from many perspectives such as keying out plants, understanding fruit structure, investigating pollinator biology and in plant breeding. Probably the most complicated parts of a flower are the female components (the gynoecium). Unfortunately, in many parts of the world, it is not possible for much of the…

  13. Lighten the Load

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2008-01-01

    The green movement in school design encompasses many techniques to improve the environmental friendliness and energy efficiency of a facility. Some are more complicated than others--probably not many people can explain the intricacies of a geothermal heating system, or the specifics of how solar or wind energy is harnessed. Most people, however,…

  14. Current best management practices for harvesting and storing dry hay: a research review

    USDA-ARS?s Scientific Manuscript database

    The production of high-quality grass or legume hays in humid environments is complicated by slower drying rates, and increased probability of rainfall events compared to hay produced under arid climatic conditions. As a result, hay producers in humid environments often face the management dilemma of...

  15. Clinical outcome in patients from a single region who were dependent on parenteral nutrition for 28 days or more.

    PubMed

    Köglmeier, J; Day, C; Puntis, J W L

    2008-04-01

    The frequency and outcome of intestinal failure (IF) in children are not well defined in the UK. Long-term parenteral nutrition (PN) is an effective intervention, with intestine transplantation offering the possibility of survival should life-threatening complications arise in those with long-term dependency. The ideal model for service provision is a subject of debate. We aimed to identify all new cases of IF (defined as PN dependency > or =28 days) in West Yorkshire over a two-year period to determine the rate of serious complications, establish the outcome after two years and clarify the role of specialist referral. Pharmacists in all the West Yorkshire paediatric units were contacted to establish the number of children with IF during 2001-2002. Underlying diagnosis, complications and outcome at two years were obtained by case-note review for 93 of the 96 children identified. IF patients were exclusively managed in one or other of the three large teaching hospitals. At the two-year follow-up, six (6.4%) children had died (one while listed for a small bowel transplantation), but 85 (91%) had established full enteral feeding and were well. Two remained PN dependent and were assessed in the supra-regional intestinal transplantation unit (Birmingham); in neither case was small bowel transplantation thought to be appropriate. The most common complications were central venous catheter sepsis (69% of patients) and cholestasis (59%). This study shows that a favourable outcome for IF can be achieved in a regional centre with appropriate multidisciplinary support. A single UK supra-regional unit undertaking small bowel transplantation is probably adequate for assessment of the most complex patients, although this should remain under review.

  16. Use of sexually transmitted disease risk assessment algorithms for selection of intrauterine device candidates.

    PubMed

    Morrison, C S; Sekadde-Kigondu, C; Miller, W C; Weiner, D H; Sinei, S K

    1999-02-01

    Sexually transmitted diseases (STD) are an important contraindication for intrauterine device (IUD) insertion. Nevertheless, laboratory testing for STD is not possible in many settings. The objective of this study is to evaluate the use of risk assessment algorithms to predict STD and subsequent IUD-related complications among IUD candidates. Among 615 IUD users in Kenya, the following algorithms were evaluated: 1) an STD algorithm based on US Agency for International Development (USAID) Technical Working Group guidelines; 2) a Centers for Disease Control and Prevention (CDC) algorithm for management of chlamydia; and 3) a data-derived algorithm modeled from study data. Algorithms were evaluated for prediction of chlamydial and gonococcal infection at 1 month and complications (pelvic inflammatory disease [PID], IUD removals, and IUD expulsions) over 4 months. Women with STD were more likely to develop complications than women without STD (19% vs 6%; risk ratio = 2.9; 95% CI 1.3-6.5). For STD prediction, the USAID algorithm was 75% sensitive and 48% specific, with a positive likelihood ratio (LR+) of 1.4. The CDC algorithm was 44% sensitive and 72% specific, LR+ = 1.6. The data-derived algorithm was 91% sensitive and 56% specific, with LR+ = 2.0 and LR- = 0.2. Category-specific LR for this algorithm identified women with very low (< 1%) and very high (29%) infection probabilities. The data-derived algorithm was also the best predictor of IUD-related complications. These results suggest that use of STD algorithms may improve selection of IUD users. Women at high risk for STD could be counseled to avoid IUD, whereas women at moderate risk should be monitored closely and counseled to use condoms.
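
    The likelihood ratios quoted above follow directly from sensitivity and specificity, and a category-specific LR updates a pre-test probability through the odds form of Bayes' theorem; a small sketch (illustrative only, not the study's code; the 10% pre-test prevalence below is an assumed figure):

      # Likelihood ratios from sensitivity/specificity and post-test probability
      # via pre-test odds x LR. Sketch only; numbers taken from the abstract.
      def lr_positive(sens: float, spec: float) -> float:
          return sens / (1.0 - spec)

      def lr_negative(sens: float, spec: float) -> float:
          return (1.0 - sens) / spec

      def post_test_probability(pre_test_p: float, lr: float) -> float:
          pre_odds = pre_test_p / (1.0 - pre_test_p)
          post_odds = pre_odds * lr
          return post_odds / (1.0 + post_odds)

      sens, spec = 0.91, 0.56                            # data-derived algorithm
      print(round(lr_positive(sens, spec), 1))           # ~2.1 (reported as LR+ = 2.0)
      print(round(lr_negative(sens, spec), 1))           # 0.2, matching the reported LR-
      print(round(post_test_probability(0.10, 2.0), 2))  # 0.18, given an assumed 10% prevalence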

  17. Evaluation of the cost-effectiveness of electrical stimulation therapy for pressure ulcers in spinal cord injury.

    PubMed

    Mittmann, Nicole; Chan, Brian C; Craven, B Cathy; Isogai, Pierre K; Houghton, Pamela

    2011-06-01

    To evaluate the incremental cost-effectiveness of electrical stimulation (ES) plus standard wound care (SWC) as compared with SWC only in a spinal cord injury (SCI) population with grade III/IV pressure ulcers (PUs) from the public payer perspective. A decision analytic model was constructed for a 1-year time horizon to determine the incremental cost-effectiveness of ES plus SWC compared with SWC in a cohort of participants with SCI and grade III/IV PUs. Model inputs for clinical probabilities were based on published literature. Model inputs, namely clinical probabilities and direct health system and medical resources, were based on a randomized controlled trial of ES plus SWC versus SWC. Costs (Can $) included outpatient (clinic, home care, health professional) and inpatient management (surgery, complications). One-way and probabilistic sensitivity (1000 Monte Carlo iterations) analyses were conducted. The perspective of this analysis is that of a Canadian public health system payer; the model target population was an SCI cohort with grade III/IV PUs, and the main outcome measure was the incremental cost per PU healed. ES plus SWC were associated with better outcomes and lower costs. There was a 16.4% increase in the PUs healed and a cost savings of $224 at 1 year. ES plus SWC were thus considered a dominant economic comparator. Probabilistic sensitivity analysis resulted in economic dominance for ES plus SWC in 62%, with another 35% having incremental cost-effectiveness ratios of $50,000 or less per PU healed. The largest driver of the economic model was the percentage of PU healed with ES plus SWC. The addition of ES to SWC improved healing in grade III/IV PU and reduced costs in an SCI population. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Laparoscopic and robot-assisted vs open radical prostatectomy for the treatment of localized prostate cancer: a Cochrane systematic review.

    PubMed

    Ilic, Dragan; Evans, Sue M; Allan, Christie Ann; Jung, Jae Hung; Murphy, Declan; Frydenberg, Mark

    2018-06-01

    To determine the effects of laparoscopic radical prostatectomy (LRP), or robot-assisted radical prostatectomy (RARP) compared with open radical prostatectomy (ORP) in men with localized prostate cancer. We performed a comprehensive search using multiple databases (CENTRAL, MEDLINE, EMBASE) and abstract proceedings, with no restrictions on the language of publication or publication status, up until 9 June 2017. We included all randomized or pseudo-randomized controlled trials that directly compared LRP and RARP with ORP. Two review authors independently examined full-text reports, identified relevant studies, assessed the eligibility of studies for inclusion, extracted data and assessed risk of bias. We performed statistical analyses using a random-effects model and assessed the quality of the evidence according to Grading of Recommendations Assessment, Development and Evaluation (GRADE). The primary outcomes were prostate cancer-specific survival, urinary quality of life and sexual quality of life. Secondary outcomes were biochemical recurrence-free survival, overall survival, overall surgical complications, serious postoperative surgical complications, postoperative pain, hospital stay and blood transfusions. We included two unique studies in a total of 446 randomized participants with clinically localized prostate cancer. All available outcome data were short-term (up to 3 months). We found no study that addressed the outcome of prostate cancer-specific survival. Based on one trial, RARP probably results in little to no difference in urinary quality of life (mean difference [MD] -1.30, 95% confidence interval [CI] -4.65 to 2.05; moderate quality of evidence) and sexual quality of life (MD 3.90, 95% CI: -1.84 to 9.64; moderate quality of evidence). No study addressed the outcomes of biochemical recurrence-free survival or overall survival. Based on one trial, RARP may result in little to no difference in overall surgical complications (risk ratio [RR] 0.41, 95% CI: 0.16-1.04; low quality of evidence) or serious postoperative complications (RR 0.16, 95% CI: 0.02-1.32; low quality of evidence). Based on two studies, LRP or RARP may result in a small, possibly unimportant improvement in postoperative pain at 1 day (MD -1.05, 95% CI: -1.42 to -0.68; low quality of evidence) and up to 1 week (MD -0.78, 95% CI: -1.40 to -0.17; low quality of evidence). Based on one study, RARP probably results in little to no difference in postoperative pain at 12 weeks (MD 0.01, 95% CI: -0.32 to 0.34; moderate quality of evidence). Based on one study, RARP probably reduces the length of hospital stay (MD -1.72, 95% CI: -2.19 to -1.25; moderate quality of evidence). Based on two studies, LRP or RARP may reduce the frequency of blood transfusions (RR 0.24, 95% CI: 0.12-0.46; low quality of evidence). Assuming a baseline risk for a blood transfusion to be 8.9%, LRP or RARP would result in 68 fewer blood transfusions per 1,000 men (95% CI: 78-48 fewer). There is no evidence to inform the comparative effectiveness of LRP or RARP compared with ORP for oncological outcomes. Urinary and sexual quality of life appear similar. Overall and serious postoperative complication rates appear similar. The difference in postoperative pain may be minimal. Men undergoing LRP or RARP may have a shorter hospital stay and receive fewer blood transfusions. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
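
    The absolute effect quoted for blood transfusions (68 fewer per 1,000 men) is the standard conversion of a risk ratio applied to an assumed baseline risk; a short sketch of that arithmetic (illustrative only):

      # Risk ratio -> absolute effect per 1,000 patients, as for the transfusion outcome above.
      def events_per_1000(baseline_risk: float, risk_ratio: float) -> float:
          return 1000.0 * baseline_risk * risk_ratio

      baseline = 0.089                      # assumed baseline transfusion risk (8.9%)
      rr, rr_low, rr_high = 0.24, 0.12, 0.46
      control = 1000.0 * baseline           # ~89 transfusions per 1,000 with ORP

      print(round(control - events_per_1000(baseline, rr)))       # ~68 fewer (point estimate)
      print(round(control - events_per_1000(baseline, rr_high)))  # ~48 fewer (CI bound)
      print(round(control - events_per_1000(baseline, rr_low)))   # ~78 fewer (CI bound)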

  19. WE-F-304-04: Radiosurgery for Vestibular Schwannomas: Tumor Control Probability Analyses and Recommended Reporting Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soltys, S.

    Stereotactic Body Radiation Therapy (SBRT) was introduced clinically more than twenty years ago, and many subsequent publications have reported safety and efficacy data. The AAPM Working Group on Biological Effects of Hypofractionated Radiotherapy/SBRT (WGSBRT) extracted published treatment outcomes data from extensive literature searches to summarize and construct tumor control probability (TCP) and normal tissue complication probability (NTCP) models for six anatomical regions: Cranial, Head and Neck, Thoracic, Abdominal, Pelvic, and Spinal. In this session, we present the WGSBRT’s work for cranial sites, and recurrent head and neck cancer. From literature-based data and associated models, guidelines to aid with safe and effective hypofractionated radiotherapy treatment are being determined. Further, the ability of existing and proposed radiobiological models to fit these data is considered as to the ability to distinguish between the linear-quadratic and alternative radiobiological models such as secondary cell death from vascular damage, immunogenic, or bystander effects. Where appropriate, specific model parameters are estimated. As described in “The lessons of QUANTEC,” (1), lack of adequate reporting standards continues to limit the amount of useful quantitative information that can be extracted from peer-reviewed publications. Recommendations regarding reporting standards are considered, to enable such reviews to achieve more complete characterization of clinical outcomes. 1 Jackson A, Marks LB, Bentzen SM, Eisbruch A, Yorke ED, Ten Haken RK, Constine LS, Deasy JO. The lessons of QUANTEC: recommendations for reporting and gathering data on dose-volume dependencies of treatment outcome. Int J Radiat Oncol Biol Phys. 2010 Mar 1;76(3 Suppl):S155–60. Learning Objectives: Describe the techniques, types of cancer and dose schedules used in treating recurrent H&N cancers with SBRT. List the radiobiological models that compete with the linear-quadratic model in explaining the results of hypofractionated RT. Describe the dose/volume metrics that are considered safe in SBRT treatment of tumors near the optic structures. Discuss the efficacy of hypofractionation and dosing schedules used in treating vestibular schwannomas. Identify some difficulties in modeling TCP and NTCP for cranial tumors treated with hypofractionation. One moderator, Dr. Grimm, designed and holds intellectual property rights to the DVH Evaluator software tool which is an FDA-cleared product in commercial use, and can analyze some of this data. No others have relevant conflicts of interest.

  20. Deep Brain Stimulation for Parkinson's Disease with Early Motor Complications: A UK Cost-Effectiveness Analysis.

    PubMed

    Fundament, Tomasz; Eldridge, Paul R; Green, Alexander L; Whone, Alan L; Taylor, Rod S; Williams, Adrian C; Schuepbach, W M Michael

    2016-01-01

    Parkinson's disease (PD) is a debilitating illness associated with considerable impairment of quality of life and substantial costs to health care systems. Deep brain stimulation (DBS) is an established surgical treatment option for some patients with advanced PD. The EARLYSTIM trial has recently demonstrated its clinical benefit also in patients with early motor complications. We sought to evaluate the cost-effectiveness of DBS, compared to best medical therapy (BMT), among PD patients with early onset of motor complications, from a United Kingdom (UK) payer perspective. We developed a Markov model to represent the progression of PD as rated using the Unified Parkinson's Disease Rating Scale (UPDRS) over time in patients with early PD. Evidence sources were a systematic review of clinical evidence; data from the EARLYSTIM study; and a UK Clinical Practice Research Datalink (CPRD) dataset including DBS patients. A mapping algorithm was developed to generate utility values based on UPDRS data for each intervention. The cost-effectiveness was expressed as the incremental cost per quality-adjusted life-year (QALY). One-way and probabilistic sensitivity analyses were undertaken to explore the effect of parameter uncertainty. Over a 15-year time horizon, DBS was predicted to lead to additional mean cost per patient of £26,799 compared with BMT (£73,077/patient versus £46,278/patient) and an additional mean 1.35 QALYs (6.69 QALYs versus 5.35 QALYs), resulting in an incremental cost-effectiveness ratio of £19,887 per QALY gained with a 99% probability of DBS being cost-effective at a threshold of £30,000/QALY. One-way sensitivity analyses suggested that the results were not significantly impacted by plausible changes in the input parameter values. These results indicate that DBS is a cost-effective intervention in PD patients with early motor complications when compared with existing interventions, offering additional health benefits at acceptable incremental cost. This supports the extended use of DBS among patients with early onset of motor complications.

  1. Influence of sub-specialty surgical care on outcomes for pediatric emergency general surgery patients in a low-middle income country.

    PubMed

    Shah, Adil A; Shakoor, Amarah; Zogg, Cheryl K; Oyetunji, Tolulope; Ashfaq, Awais; Garvey, Erin M; Latif, Asad; Riviello, Robert; Qureshi, Faisal G; Mateen, Arif; Haider, Adil H; Zafar, Hasnain

    2016-05-01

    Whether adult general surgeons should handle pediatric emergencies is controversial. In many resource-limited settings, pediatric surgeons are not available. The study examined differences in surgical outcomes among children/adolescents managed by pediatric and adult general surgery teams for emergency general surgical (EGS) conditions at a university-hospital in South Asia. Pediatric patients (<18y) admitted with an EGS diagnosis (March 2009-April 2014) were included. Patients were dichotomized by adult vs. pediatric surgical management team. Outcome measures included: length of stay (LOS), mortality, and occurrence of ≥1 complication(s). Descriptive statistics and multivariable regression analyses with propensity scores to account for potential confounding were used to compare outcomes between the two groups. Quasi-experimental counterfactual models further examined hypothetical outcomes, assuming that all patients had been treated by pediatric surgeons. A total of 2323 patients were included. Average age was 7.1y (±5.5 SD); most patients were male (77.7%). 1958 (84.3%) were managed by pediatric surgery. The overall probability of developing a complication was 1.8%; 0.9% died (all adult general surgery). Patients managed by adult general surgery had higher risk-adjusted odds of developing complications (OR [95%CI]: 5.42 [2.10-14.00]) and longer average LOS (7.98 vs. 5.61 days, p < 0.01). 39.8% fewer complications and an 8.2% decrease in LOS would have been expected if all patients had been managed by pediatric surgery. Pediatric patients had better post-operative outcomes under pediatric surgical supervision, suggesting that, where possible in resource-constrained settings, resources should be allocated to promote development and staffing of pediatric surgical specialties parallel to adult general surgical teams. Copyright © 2016 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
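
    A minimal sketch of the propensity-score adjustment described above: model the probability of management by the adult general surgery team from baseline covariates, then include that score when modelling complications. Covariate and column names here are hypothetical placeholders, not the study's variables.

      # Propensity-score-adjusted outcome regression (sketch; placeholder columns).
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      def adjusted_complication_model(df: pd.DataFrame) -> LogisticRegression:
          covariates = ["age_years", "male", "emergency_admission"]   # numeric placeholders
          ps_model = LogisticRegression(max_iter=1000)
          ps_model.fit(df[covariates], df["adult_team"])
          df = df.assign(propensity=ps_model.predict_proba(df[covariates])[:, 1])

          outcome_model = LogisticRegression(max_iter=1000)
          outcome_model.fit(df[["adult_team", "propensity"]], df["complication"])
          return outcome_model   # coefficient on adult_team ~ adjusted log-odds of complication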

  2. Automated segmentation of ultrasonic breast lesions using statistical texture classification and active contour based on probability distance.

    PubMed

    Liu, Bo; Cheng, H D; Huang, Jianhua; Tian, Jiawei; Liu, Jiafeng; Tang, Xianglong

    2009-08-01

    Because of its complicated structure, low signal/noise ratio, low contrast and blurry boundaries, fully automated segmentation of a breast ultrasound (BUS) image is a difficult task. In this paper, a novel segmentation method for BUS images without human intervention is proposed. Unlike most published approaches, the proposed method handles the segmentation problem by using a two-step strategy: ROI generation and ROI segmentation. First, a well-trained texture classifier categorizes the tissues into different classes, and the background knowledge rules are used for selecting the regions of interest (ROIs) from them. Second, a novel probability distance-based active contour model is applied for segmenting the ROIs and finding the accurate positions of the breast tumors. The active contour model combines both global statistical information and local edge information, using a level set approach. The proposed segmentation method was performed on 103 BUS images (48 benign and 55 malignant). To validate the performance, the results were compared with the corresponding tumor regions marked by an experienced radiologist. Three error metrics, true-positive ratio (TP), false-negative ratio (FN) and false-positive ratio (FP) were used for measuring the performance of the proposed method. The final results (TP = 91.31%, FN = 8.69% and FP = 7.26%) demonstrate that the proposed method can segment BUS images efficiently, quickly and automatically.
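
    The three overlap-based error metrics named above can be computed from a binary segmentation and the radiologist's reference outline; a small sketch using one common set of definitions (the paper's exact formulas are not given in the abstract, so these normalisations are assumptions):

      # Region-overlap error metrics for a binary segmentation vs a reference mask.
      import numpy as np

      def segmentation_metrics(pred: np.ndarray, ref: np.ndarray):
          pred, ref = pred.astype(bool), ref.astype(bool)
          tp = np.logical_and(pred, ref).sum() / ref.sum()    # true-positive ratio
          fn = np.logical_and(~pred, ref).sum() / ref.sum()   # false-negative ratio
          fp = np.logical_and(pred, ~ref).sum() / ref.sum()   # false-positive ratio
          return tp, fn, fp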

  3. The significance of the choice of radiobiological (NTCP) models in treatment plan objective functions.

    PubMed

    Miller, J; Fuller, M; Vinod, S; Suchowerska, N; Holloway, L

    2009-06-01

    A clinician's discrimination between radiation therapy treatment plans is traditionally a subjective process, based on experience and existing protocols. A more objective and quantitative approach to distinguish between treatment plans is to use radiobiological or dosimetric objective functions, based on radiobiological or dosimetric models. The efficacy of models is not well understood, nor is the correlation of the rank of plans resulting from the use of models compared to the traditional subjective approach. One such radiobiological model is the Normal Tissue Complication Probability (NTCP). Dosimetric models or indicators are more accepted in clinical practice. In this study, three radiobiological models, Lyman NTCP, critical volume NTCP and relative seriality NTCP, and three dosimetric models, Mean Lung Dose (MLD) and the lung volumes irradiated at 10 Gy (V10) and 20 Gy (V20), were used to rank a series of treatment plans using harm to normal (lung) tissue as the objective criterion. None of the models considered in this study showed consistent correlation with the Radiation Oncologists' plan ranking. If radiobiological or dosimetric models are to be used in objective functions for lung treatments, based on this study it is recommended that the Lyman NTCP model be used because it will provide the most consistency with traditional clinician ranking.
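
    For reference, a compact sketch of the Lyman NTCP calculation used as one of the radiobiological objective functions above, written for an equivalent-uniform-dose (or effective-volume) input; the fitted parameter values (TD50, m, n) must come from published data and are not specified here:

      # Lyman NTCP: NTCP = Phi(t), t = (EUD - TD50(v)) / (m * TD50(v)),
      # with the volume-corrected tolerance dose TD50(v) = TD50 * v**(-n).
      from math import erf, sqrt

      def lyman_ntcp(eud: float, td50: float, m: float, n: float, v_eff: float = 1.0) -> float:
          td50_v = td50 * v_eff ** (-n)               # tolerance dose rises for partial volumes
          t = (eud - td50_v) / (m * td50_v)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))     # standard normal CDF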

  4. Bayesics

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2005-11-01

    This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. 1. Logic and probability 2. Probability and inference 3. Probability and model selection 4. Prior probabilities 5. Probability and frequency 6. Probability and quantum mechanics 7. Probability and fundamentalism 8. Probability and deception 9. Prediction and truth

  5. Application of thoracic endovascular aortic repair (TEVAR) in treating dwarfism with Stanford B aortic dissection

    PubMed Central

    Qiu, Jian; Cai, Wenwu; Shu, Chang; Li, Ming; Xiong, Qinggen; Li, Quanming; Li, Xin

    2018-01-01

    Rationale: To apply thoracic endovascular aortic repair (TEVAR) to treat dwarfism complicated with Stanford B aortic dissection. Patient concerns: In this report, we present a 63-year-old male patient with dwarfism complicated by Stanford B aortic dissection who was successfully treated with TEVAR. Diagnoses: He was diagnosed with dwarfism complicated by Stanford B aortic dissection. Interventions: After conservative treatment, the patient underwent TEVAR 1 week after hospitalization. After the operation, he presented with numbness and weakness of his bilateral lower extremities, and these symptoms were significantly mitigated after effective treatment. At 1 and 3 weeks after TEVAR, the aorta remained stable and restored. Outcomes: The patient had a favorable clinical outcome and was discharged uneventfully. During subsequent follow-up, he remained physically stable. Lessons: TEVAR is probably an option for treating dwarfism complicated by Stanford B aortic dissection, although this remains to be validated by subsequent studies with larger sample sizes. PMID:29703033

  6. Complications of subthalamic nucleus stimulation in Parkinson's disease.

    PubMed

    Umemura, Atsushi; Oka, Yuichi; Yamamoto, Kenichi; Okita, Kenji; Matsukawa, Noriyuki; Yamada, Kazuo

    2011-01-01

    Subthalamic nucleus deep brain stimulation (STN-DBS) is effective for medically refractory Parkinson's disease. We retrospectively analyzed complications in 180 consecutive patients who underwent bilateral STN-DBS. Surgery-related complications were symptomatic intracerebral hemorrhage in 2, chronic subdural hematoma in 1, and transient deterioration of medication-induced psychosis in 2 patients. Device-related complications involved device infection in 5, skin erosion in 5, and implantable pulse generator malfunction in 2 patients. All of these patients required surgical repair. Surgery- and device-related complications could be reduced with increased surgical experience and the introduction of new surgical equipment and technology. Treatment- or stimulation-related complications were intractable dyskinesia/dystonia in 11, problematic dysarthria in 7, apraxia of eyelid opening (ALO) in 11, back pain in 10, and restless leg syndrome in 6 patients. Neuropsychiatric complications were transient mood changes in some, impulse control disorder in 2, severe depression related to excessive reduction of dopaminergic medications in 2, rapid progression of dementia in 1, and suicide attempts in 2 patients. Most complications were mild and transient. Dysarthria and ALO were the most frequent permanent sequelae after STN-DBS. Treatment-related adverse events may be caused not only by the effect of stimulation but also by excessive reduction of dopaminergic medication or progression of the disease. In conclusion, STN-DBS seems to be a relatively safe procedure. Although serious complications with permanent sequelae are rare, significant incidences of adverse effects occur. Physicians engaged in this treatment should have a comprehensive understanding of the probable complications and how to avoid them.

  7. The use of spatial dose gradients and probability density function to evaluate the effect of internal organ motion for prostate IMRT treatment planning

    NASA Astrophysics Data System (ADS)

    Jiang, Runqing; Barnett, Rob B.; Chow, James C. L.; Chen, Jeff Z. Y.

    2007-03-01

    The aim of this study is to investigate the effects of internal organ motion on IMRT treatment planning of prostate patients using a spatial dose gradient and probability density function. Spatial dose distributions were generated from a Pinnacle3 planning system using a co-planar, five-field intensity modulated radiation therapy (IMRT) technique. Five plans were created for each patient using equally spaced beams but shifting the angular displacement of the beam by 15° increments. Dose profiles taken through the isocentre in anterior-posterior (A-P), right-left (R-L) and superior-inferior (S-I) directions for IMRT plans were analysed by exporting RTOG file data from Pinnacle. The convolution of the 'static' dose distribution D0(x, y, z) and probability density function (PDF), denoted as P(x, y, z), was used to analyse the combined effect of repositioning error and internal organ motion. Organ motion leads to an enlarged beam penumbra. The amount of percentage mean dose deviation (PMDD) depends on the dose gradient and organ motion probability density function. Organ motion dose sensitivity was defined by the rate of change in PMDD with standard deviation of motion PDF and was found to increase with the maximum dose gradient in anterior, posterior, left and right directions. Due to common inferior and superior field borders of the field segments, the sharpest dose gradient will occur in the inferior or both superior and inferior penumbrae. Thus, prostate motion in the S-I direction produces the highest dose difference. The PMDD is within 2.5% when standard deviation is less than 5 mm, but the PMDD is over 2.5% in the inferior direction when standard deviation is higher than 5 mm in the inferior direction. Verification of prostate organ motion in the inferior directions is essential. The margin of the planning target volume (PTV) significantly impacts on the confidence of tumour control probability (TCP) and level of normal tissue complication probability (NTCP). Smaller margins help to reduce the dose to normal tissues, but may compromise the dose coverage of the PTV. Lower rectal NTCP can be achieved by either a smaller margin or a steeper dose gradient between PTV and rectum. With the same DVH control points, the rectum has lower complication in the seven-beam technique used in this study because of the steeper dose gradient between the target volume and rectum. The relationship between dose gradient and rectal complication can be used to evaluate IMRT treatment planning. The dose gradient analysis is a powerful tool to improve IMRT treatment plans and can be used for QA checking of treatment plans for prostate patients.
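
    The convolution of the static dose distribution with the motion PDF described above can be illustrated in one dimension; a brief sketch assuming a Gaussian PDF and a placeholder dose profile (not the planning-system data used in the study):

      # Blur a 1D static dose profile D0(x) with a Gaussian motion/setup PDF: D = D0 * P.
      import numpy as np

      def blur_profile(dose: np.ndarray, dx_mm: float, sigma_mm: float) -> np.ndarray:
          half_width = int(np.ceil(4.0 * sigma_mm / dx_mm))
          x = np.arange(-half_width, half_width + 1) * dx_mm
          pdf = np.exp(-0.5 * (x / sigma_mm) ** 2)
          pdf /= pdf.sum()                            # discrete PDF sums to 1
          return np.convolve(dose, pdf, mode="same")

      # Example: idealized 60 Gy flat field with sharp penumbrae on a 1 mm grid.
      x = np.arange(-80, 81)                          # mm
      d0 = np.where(np.abs(x) <= 40, 60.0, 0.0)
      d_blurred = blur_profile(d0, dx_mm=1.0, sigma_mm=5.0)   # enlarged penumbra, as noted above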

  8. The use of spatial dose gradients and probability density function to evaluate the effect of internal organ motion for prostate IMRT treatment planning.

    PubMed

    Jiang, Runqing; Barnett, Rob B; Chow, James C L; Chen, Jeff Z Y

    2007-03-07

    The aim of this study is to investigate the effects of internal organ motion on IMRT treatment planning of prostate patients using a spatial dose gradient and probability density function. Spatial dose distributions were generated from a Pinnacle3 planning system using a co-planar, five-field intensity modulated radiation therapy (IMRT) technique. Five plans were created for each patient using equally spaced beams but shifting the angular displacement of the beam by 15 degree increments. Dose profiles taken through the isocentre in anterior-posterior (A-P), right-left (R-L) and superior-inferior (S-I) directions for IMRT plans were analysed by exporting RTOG file data from Pinnacle. The convolution of the 'static' dose distribution D0(x, y, z) and probability density function (PDF), denoted as P(x, y, z), was used to analyse the combined effect of repositioning error and internal organ motion. Organ motion leads to an enlarged beam penumbra. The amount of percentage mean dose deviation (PMDD) depends on the dose gradient and organ motion probability density function. Organ motion dose sensitivity was defined by the rate of change in PMDD with standard deviation of motion PDF and was found to increase with the maximum dose gradient in anterior, posterior, left and right directions. Due to common inferior and superior field borders of the field segments, the sharpest dose gradient will occur in the inferior or both superior and inferior penumbrae. Thus, prostate motion in the S-I direction produces the highest dose difference. The PMDD is within 2.5% when standard deviation is less than 5 mm, but the PMDD is over 2.5% in the inferior direction when standard deviation is higher than 5 mm in the inferior direction. Verification of prostate organ motion in the inferior directions is essential. The margin of the planning target volume (PTV) significantly impacts on the confidence of tumour control probability (TCP) and level of normal tissue complication probability (NTCP). Smaller margins help to reduce the dose to normal tissues, but may compromise the dose coverage of the PTV. Lower rectal NTCP can be achieved by either a smaller margin or a steeper dose gradient between PTV and rectum. With the same DVH control points, the rectum has lower complication in the seven-beam technique used in this study because of the steeper dose gradient between the target volume and rectum. The relationship between dose gradient and rectal complication can be used to evaluate IMRT treatment planning. The dose gradient analysis is a powerful tool to improve IMRT treatment plans and can be used for QA checking of treatment plans for prostate patients.

  9. Highly Efficient Training, Refinement, and Validation of a Knowledge-based Planning Quality-Control System for Radiation Therapy Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Nan; Carmona, Ruben; Sirak, Igor

    Purpose: To demonstrate an efficient method for training and validation of a knowledge-based planning (KBP) system as a radiation therapy clinical trial plan quality-control system. Methods and Materials: We analyzed 86 patients with stage IB through IVA cervical cancer treated with intensity modulated radiation therapy at 2 institutions according to the standards of the INTERTECC (International Evaluation of Radiotherapy Technology Effectiveness in Cervical Cancer, National Clinical Trials Network identifier: 01554397) protocol. The protocol used a planning target volume and 2 primary organs at risk: pelvic bone marrow (PBM) and bowel. Secondary organs at risk were rectum and bladder. Initial unfiltered dose-volume histogram (DVH) estimation models were trained using all 86 plans. Refined training sets were created by removing sub-optimal plans from the unfiltered sample, and DVH estimation models were constructed by identifying 30 of 86 plans emphasizing PBM sparing (comparing protocol-specified dosimetric cutpoints V10 (percentage volume of PBM receiving at least 10 Gy dose) and V20 (percentage volume of PBM receiving at least 20 Gy dose) with unfiltered predictions) and another 30 of 86 plans emphasizing bowel sparing (comparing V40 (absolute volume of bowel receiving at least 40 Gy dose) and V45 (absolute volume of bowel receiving at least 45 Gy dose), 9 in common with the PBM set). To obtain deliverable KBP plans, refined models must inform patient-specific optimization objectives and/or priorities (an auto-planning “routine”). Four candidate routines emphasizing different tradeoffs were composed, and a script was developed to automatically re-plan multiple patients with each routine. After selection of the routine that best met protocol objectives in the 51-patient training sample (KBP-FINAL), protocol-specific DVH metrics and normal tissue complication probability were compared for original versus KBP-FINAL plans across the 35-patient validation set. Paired t tests were used to test differences between planning sets. Results: KBP-FINAL plans outperformed manual planning across the validation set in all protocol-specific DVH cutpoints. The mean normal tissue complication probability for gastrointestinal toxicity was lower for KBP-FINAL versus validation-set plans (48.7% vs 53.8%, P<.001). Similarly, the estimated mean white blood cell count nadir was higher (2.77 vs 2.49 k/mL, P<.001) with KBP-FINAL plans, indicating lowered probability of hematologic toxicity. Conclusions: This work demonstrates that a KBP system can be efficiently trained and refined for use in radiation therapy clinical trials with minimal effort. This patient-specific plan quality control resulted in improvements on protocol-specific dosimetric endpoints.
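
    A brief sketch of how cutpoints such as V10 and V20 are read from the per-voxel doses of a contoured structure; the percentage-volume form below is an implementation assumption (for bowel the protocol cutpoints are absolute volumes, which would instead multiply the voxel count by the voxel volume):

      # Dose-volume cutpoints from a structure's voxel doses (sketch only).
      import numpy as np

      def v_at_dose_percent(structure_doses_gy: np.ndarray, threshold_gy: float) -> float:
          """Percentage of the structure volume receiving at least threshold_gy."""
          return 100.0 * float(np.mean(structure_doses_gy >= threshold_gy))

      # e.g. v10 = v_at_dose_percent(pbm_doses, 10.0); v20 = v_at_dose_percent(pbm_doses, 20.0)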

  10. NAFLD fibrosis score: a prognostic predictor for mortality and liver complications among NAFLD patients.

    PubMed

    Treeprasertsuk, Sombat; Björnsson, Einar; Enders, Felicity; Suwanwalaikorn, Sompongse; Lindor, Keith D

    2013-02-28

    To study whether the severity of liver fibrosis estimated by the nonalcoholic fatty liver disease (NAFLD) fibrosis score can predict all-cause mortality, cardiac complications, and/or liver complications of patients with NAFLD over long-term follow-up. A cohort of well-characterized patients with NAFLD diagnosed during the period of 1980-2000 was identified through the Rochester Epidemiology Project. The NAFLD fibrosis score (NFS) was used to separate NAFLD patients with and without advanced liver fibrosis. We used the NFS to classify the probability of fibrosis as < -1.5 for low probability, > -1.5 to < 0.67 for intermediate probability, and > 0.67 for high probability. Primary endpoints included all-cause death and cardiovascular- and/or liver-related mortality. From the 479 patients with NAFLD assessed, 302 patients (63%) older than 18 years were included. All patients were followed, and medical charts were reviewed until August 31, 2009 or the date when the first primary endpoint occurred. By using a standardized case record form, we recorded a detailed history and physical examination and the use of statins and metformin during the follow-up period. A total of 302/479 (63%) NAFLD patients (mean age: 47 ± 13 years) were included with a follow-up period of 12.0 ± 3.9 years. A low probability of advanced fibrosis (NFS < -1.5 at baseline) was found in 181 patients (60%), while an intermediate or high probability of advanced fibrosis (NFS > -1.5) was found in 121 patients (40%). At the end of the follow-up period, 55 patients (18%) developed primary endpoints. A total of 39 patients (13%) died during the follow-up. The leading causes of death were non-hepatic malignancy (n = 13/39; 33.3%), coronary heart disease (CHD) (n = 8/39; 20.5%), and liver-related mortality (n = 5/39; 12.8%). Thirty patients had new-onset CHD, and 8 of these 30 patients (27%) died from CHD-related causes during the follow-up. In a multivariate analysis, a higher NFS at baseline and the presence of new-onset CHD were significantly predictive of death (OR = 2.6 and 9.2, respectively; P < 0.0001). Our study showed a significant, graded relationship between the NFS, as classified into 3 subgroups (low, intermediate and high probability of liver fibrosis), and the occurrence of primary endpoints. The use of metformin or simvastatin for at least 3 months during the follow-up was associated with fewer deaths in patients with NAFLD (OR = 0.2 and 0.03, respectively; P < 0.05). Additionally, the rate of annual NFS change in patients with an intermediate or high probability of advanced liver fibrosis was significantly lower than that in patients with a low probability of advanced liver fibrosis (0.06 vs 0.09, P = 0.004). The annual NFS change in patients who died was significantly higher than that in patients who survived (0.14 vs 0.07, P = 0.03). At the end of the follow-up, we classified the patients into 3 subgroups according to the progression pattern of liver fibrosis by comparing the NFS at baseline to the NFS at the end of the follow-up period. Most patients were in the stable-fibrosis (60%) and progressive-fibrosis (37%) groups, whereas only 3% were in the regressive-fibrosis group. A higher NAFLD fibrosis score at baseline and a new onset of CHD were significantly predictive of death in patients with NAFLD.
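
    A small sketch of the fibrosis-probability classification used above, applying the cutoffs as quoted in the abstract (the NFS itself, a function of age, BMI, hyperglycaemia, AST/ALT ratio, platelet count and albumin, is assumed to be computed elsewhere):

      # Classify the probability of advanced fibrosis from a precomputed NAFLD fibrosis score.
      def nfs_category(nfs: float) -> str:
          # Cutoffs as rounded in the abstract; boundary handling is illustrative.
          if nfs < -1.5:
              return "low probability of advanced fibrosis"
          if nfs < 0.67:
              return "intermediate probability of advanced fibrosis"
          return "high probability of advanced fibrosis"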

  11. A cost-utility analysis of sacral anterior root stimulation (SARS) compared with medical treatment in patients with complete spinal cord injury with a neurogenic bladder.

    PubMed

    Morlière, Camille; Verpillot, Elise; Donon, Laurence; Salmi, Louis-Rachid; Joseph, Pierre-Alain; Vignes, Jean-Rodolphe; Bénard, Antoine

    2015-12-01

    Sacral anterior root stimulation (SARS) and posterior sacral rhizotomy restore the ability to urinate on demand with low residual volumes, which is key to preventing urinary complications that account for 10% of the causes of death in patients with spinal cord injury with a neurogenic bladder. Nevertheless, comparative cost-effectiveness results over a long time horizon are lacking to adequately inform reimbursement decisions. This study aimed to estimate the long-term cost-utility of SARS using the Finetech-Brindley device compared with medical treatment (anticholinergics+catheterization). The study design was a Markov model with a 10-year time horizon, comprising four irreversible states ((1) initial treatment, (2) year 1 of surgery for urinary complication, (3) year >1 of surgery for urinary complication, and (4) death) and two reversible states (urinary calculi and Finetech-Brindley device failures). The sample consisted of theoretical cohorts of patients with a complete spinal cord lesion of at least 1 year's duration and a neurogenic bladder. Effectiveness was expressed as quality-adjusted life-years (QALYs). Costs were valued in EUR 2013 from the perspective of the French health system. A systematic review and meta-analyses were performed to estimate transition probabilities and QALYs. Costs were estimated from the literature, and through simulations using the 2013 French prospective payment system classification. Probabilistic analyses were conducted to handle parameter uncertainty. In the base case analysis (2.5% discount rate), the cost-utility ratio was 12,710 EUR per QALY gained. At a threshold of 30,000 EUR per QALY the probability of SARS being cost-effective compared with medical treatment was 60%. If the French Healthcare System reimbursed SARS for 80 patients per year during 10 years (anticipated target population), the expected incremental net health benefit would be 174 QALYs, and the expected value of perfect information (EVPI) would be 4.735 million EUR. The highest partial EVPI is reached for utility values and costs (1.3-1.6 million EUR). Our model shows that SARS using the Finetech-Brindley device offers the greater benefit and should be considered cost-effective at a cost-effectiveness threshold of 30,000 EUR per QALY. Despite high uncertainty, EVPI and partial EVPI may indicate that further research would not be profitable to inform decision-making. Copyright © 2015 Elsevier Inc. All rights reserved.
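
    The expected value of perfect information reported above is obtained from the probabilistic analysis output; a compact sketch of the standard per-decision calculation on simulated net-benefit samples (array names are placeholders):

      # Per-decision EVPI from probabilistic sensitivity analysis samples:
      #   EVPI = E[max_d NB_d] - max_d E[NB_d],
      # where NB_d = lambda * QALYs_d - Costs_d at willingness-to-pay lambda.
      import numpy as np

      def evpi(nb_samples: np.ndarray) -> float:
          """nb_samples: shape (n_simulations, n_strategies) of net monetary benefit."""
          value_with_perfect_info = nb_samples.max(axis=1).mean()
          value_with_current_info = nb_samples.mean(axis=0).max()
          return value_with_perfect_info - value_with_current_info

      # The population EVPI then multiplies this by the (discounted) number of patients
      # affected by the decision, e.g. 80 patients per year over 10 years as above.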

  12. Treatment of benign esophageal stricture by Eder-Puestow or balloon dilators: a comparison between randomized and prospective nonrandomized trials.

    PubMed

    Yamamoto, H; Hughes, R W; Schroeder, K W; Viggiano, T R; DiMagno, E P

    1992-03-01

    To determine whether the natural history of strictures is affected by the type of dilator used to treat newly diagnosed peptic strictures, we designed a prospective randomized trial to compare the results after Eder-Puestow or Medi-Tech balloon dilation. We entered 31 patients into the trial. We also prospectively followed up all 92 nonrandomized patients who underwent their first dilation for a benign stricture during the same period as the prospective randomized trial. The nonrandomized patients also underwent dilation with either the Eder-Puestow or the balloon technique at the discretion of the gastroenterologist performing the endoscopy. We found no statistically significant differences in the immediate or long-term results of the two methods among the randomized, nonrandomized, and overall combined groups. All but 1 of the 123 patients had immediate relief of dysphagia. Within each group of patients, the probability of remaining free of dysphagia 1 year after the initial dilation was approximately 20%, and the probability of not requiring a second dilation was approximately 65% with either technique. Major (esophageal rupture) and minor (bleeding or chest pain) complications occurred in 1% and 5% of the patients and 0.4% and 3% of the total dilation procedures, respectively. The esophageal rupture and four of six minor complications occurred after repeated dilations. Five of the six minor complications occurred with use of the Eder-Puestow dilators. We conclude that Eder-Puestow and balloon dilations of benign esophageal strictures are associated with similar outcomes, but repeated dilations and the Eder-Puestow technique may be associated with an increased risk of complications.

  13. Peripheral nerve block in patients with Ehlers-Danlos syndrome, hypermobility type: a case series.

    PubMed

    Neice, Andrew E; Stubblefield, Eryn E; Woodworth, Glenn E; Aziz, Michael F

    2016-09-01

    Ehlers-Danlos syndrome (EDS) is an inherited disease characterized by defects in various collagens or their post-translational modification, with an incidence estimated at 1 in 5000. Performance of peripheral nerve block in patients with EDS is controversial, due to easy bruising and hematoma formation after injections as well as reports of reduced block efficacy. The objective of this study was to review the charts of EDS patients who had received peripheral nerve block for any evidence of complications or reduced efficacy. This was a case series based on chart review at an academic medical center. Patients with a confirmed or probable diagnosis of EDS who had received a peripheral nerve block in the last 3 years were identified by searching our institution's electronic medical record system. The patients were classified by their subtype of EDS. Patients with no diagnosed subtype were given a probable subtype based on a chart review of their symptoms. Patient charts were reviewed for any evidence of complications or reduced block efficacy. A total of 21 regional anesthetics in 16 unique patients were identified; 10 of these patients had an EDS subtype diagnosis. The majority of these patients had a diagnosis of hypermobility-type EDS. No block complications were noted in any patients. Two block failures requiring repeat block were noted, and four patients reported uncontrolled pain on postoperative day one despite successful placement of a peripheral nerve catheter. Additionally, blocks were performed without incident in patients with classical-type and vascular-type EDS, although the numbers were too small to draw conclusions about the relative safety of regional anesthesia in these groups. This series fails to show an increased risk of complications of peripheral nerve blockade in patients with hypermobility-type EDS. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Outcomes and Safety of the Combined Abdominoplasty-Hysterectomy: A Preliminary Study.

    PubMed

    Massenburg, Benjamin B; Sanati-Mehrizy, Paymon; Ingargiola, Michael J; Rosa, Jonatan Hernandez; Taub, Peter J

    2015-10-01

    Abdominoplasty (ABP) at the time of hysterectomy (HYS) has been described in the literature since 1986 and is being increasingly requested by patients. However, outcomes of the combined procedure have not been thoroughly explored. The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program database and identified each ABP, HYS, and combined ABP-HYS performed between 2005 and 2012. The incidence of complications in each of the three procedures was calculated, and a multiplicative-risk model was used to calculate the probability of a complication for a patient undergoing distinct HYS and ABP on different dates. One-sample binomial hypothesis tests were performed to determine statistical significance. There were 1325 ABP cases, 12,173 HYS cases, and 143 ABP-HYS cases identified. Surgical complications occurred in 7.7 % of patients undergoing an ABP-HYS, while the calculated risk of a surgical complication was 12.5 % (p = 0.0407) for patients undergoing separate ABP and HYS procedures. The mean operative time was significantly lower for an ABP-HYS at 238 vs. 270 min for separate ABP and HYS procedures (p < 0.0001), and the mean time under anesthesia was significantly lower at 295 vs. 364 min (p < 0.0001). A combined ABP-HYS has a lower incidence of surgical complications than separate ABP and HYS procedures performed on different dates. These data should not encourage all patients to elect a combined ABP-HYS, if only undergoing a HYS, as the combined procedure is associated with increased risks when compared to either isolated individual procedure. However, in patients who are planning on undergoing both procedures on separate dates, a combined ABP-HYS is a safe option that will result in fewer surgical complications, less operative time, less time under anesthesia, and a trend towards fewer days in the hospital. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .
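
    A short sketch of one common form of multiplicative-risk calculation for two procedures performed on separate dates, assuming independence of the two complication events (the paper's exact formulation is not given in the abstract):

      # Probability of at least one complication across two independent procedures.
      def combined_complication_risk(p_abp: float, p_hys: float) -> float:
          """1 - (1 - p1) * (1 - p2)."""
          return 1.0 - (1.0 - p_abp) * (1.0 - p_hys)

      # Plugging the NSQIP-derived per-procedure complication rates into an expression of
      # this form yields the 12.5% figure quoted above for staged ABP and HYS.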

  15. Targeted vs. systematic early antiviral treatment against A(H1N1)v influenza with neuraminidase inhibitors in patients with influenza-like symptoms: Clinical and economic impact

    PubMed Central

    Deuffic-Burban, Sylvie; Lenne, Xavier; Dervaux, Benoit; Julien Poissy; Lemaire, Xavier; Sloan, Caroline; Carrat, Fabrice; Desenclos, Jean-Claude; Delfraissy, Jean-Francois; Yazdanpanah, Yazdan

    2009-01-01

    Capitalizing on available data, we used a decision model to estimate the clinical and economic outcomes associated with early initiation of treatment with neuraminidase inhibitors in all patients with influenza-like illnesses (ILI) (systematic strategy) vs. only those at high risk of complications (targeted strategy). Systematic treatment of ILI during an A(H1N1)v influenza epidemic wave is both effective and cost-effective. Patients who present to care with ILI during an A(H1N1)v influenza epidemic wave should initiate treatment with neuraminidase inhibitors, regardless of risk status. Administering neuraminidase inhibitors between epidemic waves, when the probability of influenza is low, is less effective and cost-effective. PMID:20029659

  16. Chaotic dynamics of large-scale double-diffusive convection in a porous medium

    NASA Astrophysics Data System (ADS)

    Kondo, Shutaro; Gotoda, Hiroshi; Miyano, Takaya; Tokuda, Isao T.

    2018-02-01

    We have studied chaotic dynamics of large-scale double-diffusive convection of a viscoelastic fluid in a porous medium from the viewpoint of dynamical systems theory. A fifth-order nonlinear dynamical system modeling the double-diffusive convection is theoretically obtained by incorporating the Darcy-Brinkman equation into transport equations through a physical dimensionless parameter representing porosity. We clearly show that the chaotic convective motion becomes much more complicated with increasing porosity. The degree of dynamic instability during chaotic convective motion is quantified by two important measures: the network entropy of the degree distribution in the horizontal visibility graph and the Kaplan-Yorke dimension in terms of Lyapunov exponents. We also present an interesting on-off intermittent phenomenon in the probability distribution of time intervals exhibiting nearly complete synchronization.
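
    The Kaplan-Yorke dimension mentioned above is computed directly from the ordered Lyapunov spectrum; a brief sketch (the spectrum itself would come from the fifth-order model and is not reproduced here):

      # Kaplan-Yorke (Lyapunov) dimension: D_KY = k + (sum of first k exponents) / |lambda_{k+1}|,
      # where k is the largest index with a non-negative cumulative sum.
      import numpy as np

      def kaplan_yorke_dimension(lyapunov_exponents) -> float:
          lam = np.sort(np.asarray(lyapunov_exponents, dtype=float))[::-1]
          cumulative = np.cumsum(lam)
          nonneg = np.where(cumulative >= 0.0)[0]
          if nonneg.size == 0:
              return 0.0                       # all exponents negative: fixed point
          k = nonneg[-1]
          if k == lam.size - 1:
              return float(lam.size)           # cumulative sum never becomes negative
          return (k + 1) + cumulative[k] / abs(lam[k + 1])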

  17. Complications of modeling glycosylation reactions: can the anomeric conformation of a donor determine the glycopyranosyl oxacarbenium ring conformation?

    PubMed

    Whitfield, Dennis M

    2012-07-15

    That the ring conformation of glycopyranosyl oxacarbenium ions can influence the stereochemical outcome of glycosylation reactions has been postulated for some time. Some new ionization calculations show that the ultimate conformation (4)H(3) or (5)S(1) of D-glucopyranosyl oxacarbenium ions depends on the initial ϕ(H) (CH-1-C-1-S(+)-SCH(3)) conformation of anomeric thiosulfonium ions. Evidence is also presented that nucleophile:electrophile hydrogen bonded complexes, 1,6-anhydro-carbenium ions and electron rich carbon nucleophile:oxacarbenium ion complexes are all probably artifacts of neglecting counter ions or nucleophiles in the DFT calculation. All three cationic species are likely important for glycosylation reaction side reactions but not as productive species. Copyright © 2012. Published by Elsevier Ltd.

  18. Explosion safety in industrial electrostatics

    NASA Astrophysics Data System (ADS)

    Szabó, S. V.; Kiss, I.; Berta, I.

    2011-01-01

    Complicated industrial systems are often endangered by electrostatic hazards, both atmospheric (the lightning phenomenon, addressed by primary and secondary lightning protection) and industrial (technological problems caused by static charging, and fire and explosion hazards). According to the classical approach, protective methods have to be used in order to remove electrostatic charging and to avoid damage; however, no attempt is made to compute the risk before and after applying the protective method, relying instead on well-educated and practiced expertise. The Budapest School of Electrostatics - in close cooperation with industrial partners - develops new suitable solutions for probability-based decision support (Static Control Up-to-date Technology, SCOUT) using soft computing methods. This new approach can be used to assess and audit existing systems and - using the predictive power of the models - to design and plan activities in industrial electrostatics.

  19. Worst case encoder-decoder policies for a communication system in the presence of an unknown probabilistic jammer

    NASA Astrophysics Data System (ADS)

    Cascio, David M.

    1988-05-01

    States of nature or observed data are often stochastically modelled as Gaussian random variables. At times it is desirable to transmit this information from a source to a destination with minimal distortion. Complicating this objective is the possible presence of an adversary attempting to disrupt this communication. In this report, solutions are provided to a class of minimax and maximin decision problems, which involve the transmission of a Gaussian random variable over a communications channel corrupted by both additive Gaussian noise and probabilistic jamming noise. The jamming noise is termed probabilistic in the sense that with nonzero probability 1-P, the jamming noise is prevented from corrupting the channel. We shall seek to obtain optimal linear encoder-decoder policies which minimize given quadratic distortion measures.

  20. A Multidimensional Analysis of Prostate Surgery Costs in the United States: Robotic-Assisted versus Retropubic Radical Prostatectomy.

    PubMed

    Bijlani, Akash; Hebert, April E; Davitian, Mike; May, Holly; Speers, Mark; Leung, Robert; Mohamed, Nihal E; Sacks, Henry S; Tewari, Ashutosh

    2016-06-01

    The economic value of robotic-assisted laparoscopic prostatectomy (RALP) in the United States is still not well understood because of limited view analyses. The objective of this study was to examine the costs and benefits of RALP versus retropubic radical prostatectomy from an expanded view, including hospital, payer, and societal perspectives. We performed a model-based cost comparison using clinical outcomes obtained from a systematic review of the published literature. Equipment costs were obtained from the manufacturer of the robotic system; other economic model parameters were obtained from government agencies, online resources, commercially available databases, an advisory expert panel, and the literature. Clinical point estimates and care pathways based on National Comprehensive Cancer Network guidelines were used to model costs out to 3 years. Hospital costs and costs incurred for the patients' postdischarge complications, adjuvant and salvage radiation treatment, incontinence and potency treatment, and lost wages during recovery were considered. Robotic system costs were modeled in two ways: as hospital overhead (hospital overhead calculation: RALP-H) and as a function of robotic case volume (robotic amortization calculation: RALP-R). All costs were adjusted to year 2014 US dollars. Because of more favorable clinical outcomes over 3 years, RALP provided hospital ($1094 savings with RALP-H, $341 deficit with RALP-R), payer ($1451), and societal ($1202) economic benefits relative to retropubic radical prostatectomy. Monte-Carlo probabilistic sensitivity analysis demonstrated a 38% to 99% probability that RALP provides cost savings (depending on the perspective). Higher surgical consumable costs are offset by a decreased hospital stay, lower complication rate, and faster return to work. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
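
    A minimal sketch of the Monte-Carlo probabilistic sensitivity analysis referred to above: redraw the uncertain inputs, recompute the cost difference, and report the share of iterations in which RALP saves cost. The input distributions and the toy cost model below are placeholders, not the study's parameters.

      # Probabilistic sensitivity analysis sketch (placeholder inputs and cost model).
      import numpy as np

      def probability_cost_saving(n_iter: int = 10_000, seed: int = 0) -> float:
          rng = np.random.default_rng(seed)
          los_saving_days = rng.normal(1.0, 0.4, n_iter)           # shorter stay with RALP
          cost_per_day = rng.gamma(9.0, 200.0, n_iter)             # daily hospital cost
          consumable_premium = rng.normal(2000.0, 300.0, n_iter)   # extra robotic consumables
          savings = los_saving_days * cost_per_day - consumable_premium
          return float(np.mean(savings > 0.0))                     # fraction of iterations favouring RALP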

  1. An NMR relaxometry and gravimetric study of gelatin-free aqueous polyacrylamide dosimeters

    NASA Astrophysics Data System (ADS)

    Babic, Steven; Schreiner, L. John

    2006-09-01

    In conformal radiation therapy, a high dose of radiation is given to a target volume to increase the probability of cure, and care is taken to minimize the dose to surrounding healthy tissue. The techniques used to achieve this are very complicated and the precise verification of the resulting three-dimensional (3D) dose distribution is required. Polyacrylamide gelatin (PAG) dosimeters with magnetic resonance imaging and optical computed tomography scanning provide the required 3D dosimetry with high spatial resolution. Many basic studies have characterized these chemical dosimeters that polymerize under irradiation. However, the investigation of the fundamental properties of the radiation-induced polymerization in PAG dosimeters is complicated by the presence of the background gelatin matrix. In this work, a gelatin-free model system for the study of the basic radiation-induced polymerization in PAG dosimeters has been developed. Experiments were performed on gelatin-free dosimeters, named aqueous polyacrylamide (APA) dosimeters, containing equal amounts of acrylamide and N,N'-methylene-bisacrylamide. The APA dosimeters were prepared with four different total monomer concentrations (2, 4, 6 and 8% by weight). Nuclear magnetic resonance (NMR) spin-spin and spin-lattice proton relaxation measurements at 20 MHz, and gravimetric analyses performed on all four dosimeters, show a continuous degree of polymerization over the dose range of 0-25 Gy. The developed NMR model explains the relationship observed between the relaxation data and the amount of crosslinked polymer formed at each dose. This model can be extended with gelatin relaxation data to provide a fundamental understanding of radiation-induced polymerization in the conventional PAG dosimeters.

  2. An NMR relaxometry and gravimetric study of gelatin-free aqueous polyacrylamide dosimeters.

    PubMed

    Babic, Steven; Schreiner, L John

    2006-09-07

    In conformal radiation therapy, a high dose of radiation is given to a target volume to increase the probability of cure, and care is taken to minimize the dose to surrounding healthy tissue. The techniques used to achieve this are very complicated and the precise verification of the resulting three-dimensional (3D) dose distribution is required. Polyacrylamide gelatin (PAG) dosimeters with magnetic resonance imaging and optical computed tomography scanning provide the required 3D dosimetry with high spatial resolution. Many basic studies have characterized these chemical dosimeters that polymerize under irradiation. However, the investigation of the fundamental properties of the radiation-induced polymerization in PAG dosimeters is complicated by the presence of the background gelatin matrix. In this work, a gelatin-free model system for the study of the basic radiation-induced polymerization in PAG dosimeters has been developed. Experiments were performed on gelatin-free dosimeters, named aqueous polyacrylamide (APA) dosimeters, containing equal amounts of acrylamide and N,N'-methylene-bisacrylamide. The APA dosimeters were prepared with four different total monomer concentrations (2, 4, 6 and 8% by weight). Nuclear magnetic resonance (NMR) spin-spin and spin-lattice proton relaxation measurements at 20 MHz, and gravimetric analyses performed on all four dosimeters, show a continuous degree of polymerization over the dose range of 0-25 Gy. The developed NMR model explains the relationship observed between the relaxation data and the amount of crosslinked polymer formed at each dose. This model can be extended with gelatin relaxation data to provide a fundamental understanding of radiation-induced polymerization in the conventional PAG dosimeters.

  3. Acute soft head syndrome in children with sickle cell anaemia in lagos, Nigeria.

    PubMed

    Akodu, Samuel Olufemi; Njokanma, Olisamedua Fidelis; Diaku-Akinwumi, Ijeoma Nnenna; Ubuane, Peter Odion; Adediji, Uchechukwu Okwudili

    2014-09-01

    Acute soft head syndrome is a rare complication seen in children with sickle cell anaemia. We report a child with sickle cell anaemia who developed acute soft head syndrome. A 12-year-old known sickle cell anaemia patient presented with acute, rapidly progressive skull pain and swelling, manifestations indicative of the rare complication of SCD called acute soft head syndrome. Conservative treatment with intravenous fluids and analgesics and empirical use of broad-spectrum antibiotics resulted in recovery. Acute soft head syndrome is a rare complication in children with sickle cell anaemia, probably related to skull infarction. This case further draws attention to the importance of acute soft head syndrome as a differential diagnosis for head pain and skull swelling in children with sickle cell anaemia.

  4. Thrombolytic Therapy by Tissue Plasminogen Activator for Pulmonary Embolism.

    PubMed

    Islam, Md Shahidul

    2017-01-01

    Clinicians need to make decisions about the use of thrombolytic (fibrinolytic) therapy for pulmonary embolism (PE) after carefully considering the risks of major complications from bleeding, and the benefits of treatment, for each individual patient. They should probably not use systemic thrombolysis for PE patients with normal blood pressure. Treatment by human recombinant tissue plasminogen activator (rt-PA), alteplase, saves the lives of high-risk PE patients, that is, those with hypotension or in shock. Even in the absence of strong evidence, clinicians need to choose the most appropriate regimen for administering alteplase for individual patients, based on assessment of the urgency of the situation, risks for major complications from bleeding, and patient's body weight. In addition, invasive strategies should be considered when absolute contraindications for thrombolytic therapy exist, serious complications arise, or thrombolytic therapy fails.

  5. A Validated Prediction Model for Overall Survival From Stage III Non-Small Cell Lung Cancer: Toward Survival Prediction for Individual Patients.

    PubMed

    Oberije, Cary; De Ruysscher, Dirk; Houben, Ruud; van de Heuvel, Michel; Uyterlinde, Wilma; Deasy, Joseph O; Belderbos, Jose; Dingemans, Anne-Marie C; Rimner, Andreas; Din, Shaun; Lambin, Philippe

    2015-07-15

    Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
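
    A minimal sketch of the workflow described above (stratified Cox regression, concordance, and per-patient survival prediction), using the lifelines library; the rows, column names, and penalizer setting are placeholders, not the study's data or model.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Placeholder survival data: time (months), event indicator, and covariates.
    df = pd.DataFrame({
        "time":      [12, 30, 7, 22, 18, 40, 5, 26, 15, 33, 9, 28],
        "event":     [1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0],
        "age":       [61, 70, 58, 66, 72, 59, 75, 64, 68, 57, 71, 63],
        "gtv_cc":    [90, 40, 150, 60, 80, 35, 200, 70, 120, 45, 160, 55],
        "treatment": [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],  # stratification variable
    })

    # Cox model stratified by treatment; a small penalizer keeps the toy fit stable.
    cph = CoxPHFitter(penalizer=0.1)
    cph.fit(df, duration_col="time", event_col="event", strata=["treatment"])
    print(cph.summary[["coef", "exp(coef)"]])
    print("Harrell c statistic:", round(cph.concordance_index_, 3))

    # Predicted survival curve for a new (hypothetical) patient.
    new_patient = pd.DataFrame({"age": [65], "gtv_cc": [75], "treatment": [1]})
    print(cph.predict_survival_function(new_patient).head())
    ```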

  6. Comparison of different fractionation schedules toward a single fraction in high-dose-rate brachytherapy as monotherapy for low-risk prostate cancer using 3-dimensional radiobiological models.

    PubMed

    Mavroidis, Panayiotis; Milickovic, Natasa; Cruz, Wilbert F; Tselis, Nikolaos; Karabis, Andreas; Stathakis, Sotirios; Papanikolaou, Nikos; Zamboglou, Nikolaos; Baltas, Dimos

    2014-01-01

    The aim of the present study was the investigation of different fractionation schemes to estimate their clinical impact. For this purpose, widely applied radiobiological models and dosimetric measures were used to associate their results with clinical findings. The dose distributions of 12 clinical high-dose-rate brachytherapy implants for prostate were evaluated in relation to different fractionation schemes. The fractionation schemes compared were: (1) 1 fraction of 20 Gy; (2) 2 fractions of 14 Gy; (3) 3 fractions of 11 Gy; and (4) 4 fractions of 9.5 Gy. The clinical effectiveness of the different fractionation schemes was estimated through the complication-free tumor control probability (P+), the biologically effective uniform dose, and the generalized equivalent uniform dose index. For the different fractionation schemes, the tumor control probabilities were 98.5% in 1×20 Gy, 98.6% in 2×14 Gy, 97.5% in 3×11 Gy, and 97.8% in 4×9.5 Gy. The corresponding P+ values were 88.8% in 1×20 Gy, 83.9% in 2×14 Gy, 86.0% in 3×11 Gy, and 82.3% in 4×9.5 Gy. With use of the fractionation scheme 4×9.5 Gy as reference, the isoeffective schemes regarding tumor control for 1, 2, and 3 fractions were 1×19.68 Gy, 2×13.75 Gy, and 3×11.05 Gy. The optimum fractionation schemes for 1, 2, 3, and 4 fractions were 1×19.16 Gy with a P+ of 91.8%, 2×13.2 Gy with a P+ of 89.6%, 3×10.6 Gy with a P+ of 88.4%, and 4×9.02 Gy with a P+ of 86.9%. Among the fractionation schemes 1×20 Gy, 2×14 Gy, 3×11 Gy, and 4×9.5 Gy, the first scheme was more effective in terms of P+. After performance of a radiobiological optimization, it was shown that a single fraction of 19.2 to 19.7 Gy (average 19.5 Gy) should produce at least the same benefit as that given by the 4×9.5 Gy scheme, and it should reduce the expected total complication probability by approximately 40% to 55%. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Comparison of Different Fractionation Schedules Toward a Single Fraction in High-Dose-Rate Brachytherapy as Monotherapy for Low-Risk Prostate Cancer Using 3-Dimensional Radiobiological Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mavroidis, Panayiotis, E-mail: mavroidis@uthscsa.edu; Department of Medical Radiation Physics, Karolinska Institutet and Stockholm University, Stockholm; Milickovic, Natasa

    2014-01-01

    Purpose: The aim of the present study was the investigation of different fractionation schemes to estimate their clinical impact. For this purpose, widely applied radiobiological models and dosimetric measures were used to associate their results with clinical findings. Methods and Materials: The dose distributions of 12 clinical high-dose-rate brachytherapy implants for prostate were evaluated in relation to different fractionation schemes. The fractionation schemes compared were: (1) 1 fraction of 20 Gy; (2) 2 fractions of 14 Gy; (3) 3 fractions of 11 Gy; and (4) 4 fractions of 9.5 Gy. The clinical effectiveness of the different fractionation schemes was estimated through the complication-free tumor control probability (P+), the biologically effective uniform dose, and the generalized equivalent uniform dose index. Results: For the different fractionation schemes, the tumor control probabilities were 98.5% in 1 × 20 Gy, 98.6% in 2 × 14 Gy, 97.5% in 3 × 11 Gy, and 97.8% in 4 × 9.5 Gy. The corresponding P+ values were 88.8% in 1 × 20 Gy, 83.9% in 2 × 14 Gy, 86.0% in 3 × 11 Gy, and 82.3% in 4 × 9.5 Gy. With use of the fractionation scheme 4 × 9.5 Gy as reference, the isoeffective schemes regarding tumor control for 1, 2, and 3 fractions were 1 × 19.68 Gy, 2 × 13.75 Gy, and 3 × 11.05 Gy. The optimum fractionation schemes for 1, 2, 3, and 4 fractions were 1 × 19.16 Gy with a P+ of 91.8%, 2 × 13.2 Gy with a P+ of 89.6%, 3 × 10.6 Gy with a P+ of 88.4%, and 4 × 9.02 Gy with a P+ of 86.9%. Conclusions: Among the fractionation schemes 1 × 20 Gy, 2 × 14 Gy, 3 × 11 Gy, and 4 × 9.5 Gy, the first scheme was more effective in terms of P+. After performance of a radiobiological optimization, it was shown that a single fraction of 19.2 to 19.7 Gy (average 19.5 Gy) should produce at least the same benefit as that given by the 4 × 9.5 Gy scheme, and it should reduce the expected total complication probability by approximately 40% to 55%.
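
    The isoeffective schemes quoted above can be approximately reproduced with the standard linear-quadratic biologically effective dose (BED). The α/β value below is an assumption (a value of about 1.5 Gy is often quoted for prostate tumours), not a parameter reported by the study.

    ```python
    import math

    ALPHA_BETA = 1.5  # Gy, assumed alpha/beta ratio for prostate tumour

    def bed(n_fractions, dose_per_fraction, alpha_beta=ALPHA_BETA):
        """Linear-quadratic biologically effective dose."""
        return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

    def isoeffective_dose(n_fractions, target_bed, alpha_beta=ALPHA_BETA):
        """Dose per fraction in an n-fraction scheme matching target_bed
        (positive root of d^2/ab + d - BED/n = 0)."""
        return 0.5 * alpha_beta * (math.sqrt(1 + 4 * target_bed / (n_fractions * alpha_beta)) - 1)

    reference = bed(4, 9.5)  # 4 x 9.5 Gy reference scheme
    for n in (1, 2, 3):
        d = isoeffective_dose(n, reference)
        print(f"{n} x {d:.2f} Gy matches the BED of 4 x 9.5 Gy")
    # With alpha/beta = 1.5 Gy this yields roughly 1 x 19.7, 2 x 13.7 and 3 x 11.1 Gy,
    # close to the isoeffective schemes reported above.
    ```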

  8. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...

  9. Arrhythmogenic right ventricular cardiomyopathy in monozygotic twin sisters, and persistent left superior vena cava in one complicating implantation of ICD.

    PubMed

    Astarcıoğlu, Mehmet Ali; Yaymacı, Mehmet; Şen, Taner; Kilit, Celal; Amasyalı, Basri

    2015-10-01

    Arrhythmogenic right ventricular cardiomyopathy (ARVC) is an inherited cardiomyopathy characterized histologically by fibro-fatty replacement of heart muscle, and clinically by ventricular arrhythmias and right ventricular dysfunction. This report presents monozygotic twins with ARVC, suggesting a genetic abnormality as the most probable cause.

  10. Stacking Oxygen-Separation Cells

    NASA Technical Reports Server (NTRS)

    Schroeder, James E.

    1991-01-01

    Simplified configuration and procedure developed for assembly of stacks of solid-electrolyte cells separating oxygen from air electrochemically. Reduces number of components and thus reduces probability of such failures as gas leaks, breakdown of sensitive parts, and electrical open or short circuits. Previous, more complicated version of cell described in "Improved Zirconia Oxygen-Separation Cell" (NPO-16161).

  11. Irradiation of the inguinal lymph nodes in patients of differing body habitus: a comparison of techniques and resulting normal tissue complication probabilities.

    PubMed

    Brown, Paul D; Kline, Robert W; Petersen, Ivy A; Haddock, Michael G

    2004-01-01

    The treatment of the inguinal lymph nodes with radiotherapy is strongly influenced by the body habitus of the patient. The effect of 7 radiotherapy techniques on femoral head doses was studied. Three female patients of differing body habitus (ectomorph, mesomorph, endomorph) were selected. Radiation fields included the pelvis and contiguous inguinal regions and were representative of fields used in the treatment of cancers of the lower pelvis. Seven treatment techniques were compared. In the ectomorph and mesomorph, normal tissue complication probability (NTCP) for the femoral heads was lowest with use of anteroposterior (AP) and modified posteroanterior (PA) field with inguinal electron field supplements (technique 1). In the endomorph, NTCP was lowest with use of AP and modified PA field without electron field supplements (technique 2) or a 4-field approach (technique 6). Technique 1 for ectomorphs and mesomorphs and techniques 2 and 6 for endomorphs were optimal techniques for providing relatively homogeneous dose distributions within the target area while minimizing the dose to the femoral heads.
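
    As context for the NTCP values discussed above, a minimal sketch of one widely used formulation, a Lyman-type NTCP evaluated on a gEUD-reduced dose-volume histogram; the toy DVH and the parameters n, m, and TD50 are assumptions for illustration, not the femoral-head values used in the study.

    ```python
    import numpy as np
    from scipy.stats import norm

    def gEUD(doses_gy, volumes, n):
        """Generalized EUD of a DVH (volumes normalized to sum to 1); n is the volume parameter."""
        v = np.asarray(volumes) / np.sum(volumes)
        return np.sum(v * np.asarray(doses_gy) ** (1.0 / n)) ** n

    def lyman_ntcp(eud, td50, m):
        """Lyman NTCP: probit function of the normalized distance from TD50."""
        t = (eud - td50) / (m * td50)
        return norm.cdf(t)

    # Toy differential DVH for a femoral head (dose bins in Gy, fractional volumes).
    doses   = [10, 20, 30, 40, 50]
    volumes = [0.30, 0.25, 0.20, 0.15, 0.10]

    # Assumed illustrative parameters (not taken from the cited study).
    n, m, td50 = 0.25, 0.12, 65.0
    eud = gEUD(doses, volumes, n)
    print(f"gEUD = {eud:.1f} Gy, NTCP = {lyman_ntcp(eud, td50, m):.3%}")
    ```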

  12. [Survival in patients with liver cirrhosis at the Durango, IMSS Regional General Hospital].

    PubMed

    Rodríguez-Hernández, Heriberto; Jacobo-Karam, Janett S; Castañón-Santillán, María del Carmen; Arámbula-Chávez, Mayela; Martínez-Aguilar, Gerardo

    2002-01-01

    In Mexico, hepatic cirrhosis mortality exhibits important regional differences. To analyze global survival of cirrhotic patients, according to etiology and functional status. Between March 1990 and August 1998, newly diagnosed patients with hepatic cirrhosis were included in a follow-up study. Subjects were analyzed monthly. Information on clinical evolution, complications, and dates of events (death) and complications was registered. Survival was estimated using the Kaplan-Meier method. Ninety-nine subjects were included in the survival analysis, 66 with alcoholic and 33 with viral cirrhosis (HCV and HBV in 24 and nine patients, respectively). Ninety-seven percent of patients were decompensated at diagnosis, and 81% had ascites. Probabilities for survival in the entire series were 69.7, 37.6 and 23.6% at 24, 48, and 60 months, respectively. There were no significant differences in the survival of patients grouped according to etiology. When survival was analyzed by Child-Pugh score, it was slightly higher in the alcoholic cirrhosis group. In this study the survival probability of patients with viral cirrhosis was lower than that of patients with alcoholic cirrhosis.
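
    A small self-contained sketch of the Kaplan-Meier product-limit estimator behind the survival probabilities reported above; the follow-up times and event indicators are invented.

    ```python
    # Kaplan-Meier product-limit estimator, written out explicitly.
    def kaplan_meier(times, events):
        """times: follow-up in months; events: 1 = death, 0 = censored.
        Returns a list of (time, survival probability) steps."""
        data = sorted(zip(times, events))
        at_risk = len(data)
        surv = 1.0
        curve = []
        i = 0
        while i < len(data):
            t = data[i][0]
            tied = sum(1 for tt, _ in data if tt == t)
            deaths = sum(1 for tt, e in data if tt == t and e == 1)
            if deaths:
                surv *= 1.0 - deaths / at_risk
                curve.append((t, surv))
            at_risk -= tied
            i += tied
        return curve

    # Invented follow-up data (months, event indicator).
    times  = [6, 12, 18, 24, 24, 30, 36, 48, 48, 60, 60, 60]
    events = [1,  1,  0,  1,  1,  0,  1,  1,  0,  1,  0,  0]
    for t, s in kaplan_meier(times, events):
        print(f"S({t} mo) = {s:.2f}")
    ```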

  13. Probabilities of Dilating Vesicoureteral Reflux in Children with First Time Simple Febrile Urinary Tract Infection, and Normal Renal and Bladder Ultrasound.

    PubMed

    Rianthavorn, Pornpimol; Tangngamsakul, Onjira

    2016-11-01

    We evaluated risk factors and assessed predicted probabilities for grade III or higher vesicoureteral reflux (dilating reflux) in children with a first simple febrile urinary tract infection and normal renal and bladder ultrasound. Data for 167 children 2 to 72 months old with a first febrile urinary tract infection and normal ultrasound were compared between those who had dilating vesicoureteral reflux (12 patients, 7.2%) and those who did not. Exclusion criteria consisted of history of prenatal hydronephrosis or familial reflux and complicated urinary tract infection. The logistic regression model was used to identify independent variables associated with dilating reflux. Predicted probabilities for dilating reflux were assessed. Patient age and prevalence of non-Escherichia coli bacteria were greater in children who had dilating reflux compared to those who did not (p = 0.02 and p = 0.004, respectively). Gender distribution was similar between the 2 groups (p = 0.08). In multivariate analysis older age and non-E. coli bacteria independently predicted dilating reflux, with odds ratios of 1.04 (95% CI 1.01-1.07, p = 0.02) and 3.76 (95% CI 1.05-13.39, p = 0.04), respectively. The impact of non-E. coli bacteria on predicted probabilities of dilating reflux increased with patient age. We support the concept of selective voiding cystourethrogram in children with a first simple febrile urinary tract infection and normal ultrasound. Voiding cystourethrogram should be considered in children with late onset urinary tract infection due to non-E. coli bacteria since they are at risk for dilating reflux even if the ultrasound is normal. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
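
    A hedged sketch of the type of logistic model described (age and non-E. coli organism as predictors of dilating reflux), using statsmodels; the data are synthetic, so the odds ratios and predicted probability it prints are not those of the study.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    age_months = rng.uniform(2, 72, n)
    non_ecoli = rng.integers(0, 2, n)

    # Synthetic outcome generated from an assumed underlying model.
    logit = -4.0 + 0.03 * age_months + 1.3 * non_ecoli
    dilating_reflux = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(pd.DataFrame({"age_months": age_months, "non_ecoli": non_ecoli}))
    fit = sm.Logit(dilating_reflux, X).fit(disp=False)

    print(np.exp(fit.params))  # odds ratios per unit change
    # Predicted probability for a hypothetical 48-month-old with a non-E. coli UTI.
    new = pd.DataFrame({"const": [1.0], "age_months": [48], "non_ecoli": [1]})
    print("Predicted probability:", float(fit.predict(new)[0]))
    ```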

  14. A probability score for preoperative prediction of type 2 diabetes remission following RYGB surgery

    PubMed Central

    Still, Christopher D.; Wood, G. Craig; Benotti, Peter; Petrick, Anthony T.; Gabrielsen, Jon; Strodel, William E.; Ibele, Anna; Seiler, Jamie; Irving, Brian A.; Celaya, Melisa P.; Blackstone, Robin; Gerhard, Glenn S.; Argyropoulos, George

    2014-01-01

    BACKGROUND Type 2 diabetes (T2D) is a metabolic disease with significant medical complications. Roux-en-Y gastric bypass (RYGB) surgery is one of the few interventions that remit T2D in ~60% of patients. However, there is no accurate method for predicting preoperatively the probability for T2D remission. METHODS A retrospective cohort of 2,300 RYGB patients at Geisinger Clinic was used to identify 690 patients with T2D and complete electronic data. Two additional T2D cohorts (N=276, and N=113) were used for replication at 14 months following RYGB. Kaplan-Meier analysis was used in the primary cohort to create survival curves until remission. A Cox proportional hazards model was used to estimate the hazard ratios on T2D remission. FINDINGS Using 259 preoperative clinical variables, four (use of insulin, age, HbA1c, and type of antidiabetic medication) were sufficient to develop an algorithm that produces a type 2 diabetes remission (DiaRem) score over five years. The DiaRem score spans from 0 to 22 and was divided into five groups corresponding to five probability-ranges for T2D remission: 0–2 (88%–99%), 3–7 (64%–88%), 8–12 (23%–49%), 13–17 (11%–33%), 18–22 (2%–16%). The DiaRem scores in the replication cohorts, as well as under various definitions of diabetes remission, conformed to the DiaRem score of the primary cohort. INTERPRETATION The DiaRem score is a novel preoperative method for predicting the probability (from 2% to 99%) for T2D remission following RYGB surgery. FUNDING This research was supported by the Geisinger Health System and the National Institutes of Health. PMID:24579062
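
    The reported score-band-to-probability mapping can be encoded directly; the helper below simply restates the bands given above and is not the scoring algorithm itself.

    ```python
    # DiaRem score bands and the corresponding remission probability ranges
    # reported in the abstract (probabilities in percent).
    DIAREM_BANDS = [
        (0, 2, (88, 99)),
        (3, 7, (64, 88)),
        (8, 12, (23, 49)),
        (13, 17, (11, 33)),
        (18, 22, (2, 16)),
    ]

    def remission_probability_range(score):
        """Return the (low, high) percent probability band for a DiaRem score 0-22."""
        for low, high, prob_range in DIAREM_BANDS:
            if low <= score <= high:
                return prob_range
        raise ValueError("DiaRem score must be between 0 and 22")

    print(remission_probability_range(5))  # -> (64, 88)
    ```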

  15. Cost effectiveness of alternative imaging strategies for the diagnosis of small-bowel Crohn's disease.

    PubMed

    Levesque, Barrett G; Cipriano, Lauren E; Chang, Steven L; Lee, Keane K; Owens, Douglas K; Garber, Alan M

    2010-03-01

    The cost effectiveness of alternative approaches to the diagnosis of small-bowel Crohn's disease is unknown. This study evaluates whether computed tomographic enterography (CTE) is a cost-effective alternative to small-bowel follow-through (SBFT) and whether capsule endoscopy is a cost-effective third test in patients in whom a high suspicion of disease remains after 2 previous negative tests. A decision-analytic model was developed to compare the lifetime costs and benefits of each diagnostic strategy. Patients were considered with low (20%) and high (75%) pretest probability of small-bowel Crohn's disease. Effectiveness was measured in quality-adjusted life-years (QALYs) gained. Parameter assumptions were tested with sensitivity analyses. With a moderate to high pretest probability of small-bowel Crohn's disease, and a higher likelihood of isolated jejunal disease, follow-up evaluation with CTE has an incremental cost-effectiveness ratio of less than $54,000/QALY-gained compared with SBFT. The addition of capsule endoscopy after ileocolonoscopy and negative CTE or SBFT costs greater than $500,000 per QALY-gained in all scenarios. Results were not sensitive to costs of tests or complications but were sensitive to test accuracies. The cost effectiveness of strategies depends critically on the pretest probability of Crohn's disease and on whether the terminal ileum is examined at ileocolonoscopy. CTE is a cost-effective alternative to SBFT in patients with moderate to high suspicion of small-bowel Crohn's disease. The addition of capsule endoscopy as a third test is not cost-effective, even in patients with high pretest probability of disease. Copyright 2010 AGA Institute. Published by Elsevier Inc. All rights reserved.
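
    The comparisons above hinge on the incremental cost-effectiveness ratio (ICER); a minimal sketch with made-up cost and QALY figures, not the study's estimates.

    ```python
    def icer(cost_new, qaly_new, cost_old, qaly_old):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # Illustrative numbers only (not taken from the study).
    cte_vs_sbft = icer(cost_new=4200.0, qaly_new=21.050, cost_old=3800.0, qaly_old=21.040)
    print(f"CTE vs SBFT: ${cte_vs_sbft:,.0f} per QALY gained")
    print("Adopt CTE at a $100,000/QALY threshold?", cte_vs_sbft < 100_000)
    ```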

  16. Do People Taking Flu Vaccines Need Them the Most?

    PubMed Central

    Gu, Qian; Sood, Neeraj

    2011-01-01

    Background A well targeted flu vaccine strategy can ensure that vaccines go to those who are at the highest risk of getting infected if unvaccinated. However, prior research has not explicitly examined the association between the risk of flu infection and vaccination rates. Purpose This study examines the relationship between the risk of flu infection and the probability of getting vaccinated. Methods Nationally representative data from the US and multivariate regression models were used to estimate what individual characteristics are associated with (1) the risk of flu infection when unvaccinated and (2) flu vaccination rates. These results were used to estimate the correlation between the probability of infection and the probability of getting vaccinated. Separate analyses were performed for the general population and the high priority population that is at increased risk of flu related complications. Results We find that the high priority population was more likely to get vaccinated compared to the general population. However, within both the high priority and general populations the risk of flu infection when unvaccinated was negatively correlated with vaccination rates (r = −0.067, p<0.01). This negative association between the risk of infection when unvaccinated and the probability of vaccination was stronger for the high priority population (r = −0.361, p<0.01). Conclusions There is a poor match between those who get flu vaccines and those who have a high risk of flu infection within both the high priority and general populations. Targeting vaccination to people with low socioeconomic status, people who are engaged in unhealthy behaviors, working people, and families with kids will likely improve effectiveness of flu vaccine policy. PMID:22164202

  17. Complications of Microwave Ablation for Liver Tumors: Results of a Multicenter Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livraghi, Tito, E-mail: lalivra@tin.it; Meloni, Franca, E-mail: meloni@yahoo.it; Solbiati, Luigi, E-mail: lusolbia@tin.it

    2012-08-15

    Purpose: New technologies for microwave ablation (MWA) have been conceived, designed to achieve larger areas of necrosis compared with radiofrequency ablation (RFA). The purpose of this study was to report complications by using this technique in patients with focal liver cancer. Methods: Members of 14 Italian centers used a 2.45-GHz generator delivering energy through a cooled miniature-choke MW antenna and a standardized protocol for follow-up. They completed a questionnaire regarding number and type of deaths, major and minor complications and side effects, and likelihood of their relationship to the procedure. Enrollment included 736 patients with 1,037 lesions: 522 had hepatocellular carcinoma with cirrhosis, 187 had metastases predominantly from colorectal cancer, and 27 had cholangiocellular carcinoma. Tumor size ranged from 0.5 to 10 cm. In 13 centers, the approach used was percutaneous, in 4 videolaparoscopic, and in 3 laparotomic. Results: No deaths were reported. Major complications occurred in 22 cases (2.9%), and minor complications in 54 patients (7.3%). Complications of MWA do not differ from those of RFA, both being based on heat damage. Conclusion: Results of this multicenter study confirmed those of single-center experiences, indicating that MWA is a safe procedure, with no mortality and a low rate of major complications. The low rate of complications was probably due to precautions adopted, knowing in advance possible risk conditions, on the basis of prior RFA experience.

  18. A stochastic model for early placental development†

    PubMed Central

    Cotter, Simon L.; Klika, Václav; Kimpton, Laura; Collins, Sally; Heazell, Alexander E. P.

    2014-01-01

    In the human, placental structure is closely related to placental function and consequent pregnancy outcome. Studies have noted abnormal placental shape in small-for-gestational-age infants which extends to increased lifetime risk of cardiovascular disease. The origins and determinants of placental shape are incompletely understood and are difficult to study in vivo. In this paper, we model the early development of the human placenta, based on the hypothesis that this is driven by a chemoattractant effect emanating from proximal spiral arteries in the decidua. We derive and explore a two-dimensional stochastic model, and investigate the effects of loss of spiral arteries in regions near to the cord insertion on the shape of the placenta. This model demonstrates that disruption of spiral arteries can exert profound effects on placental shape, particularly if this is close to the cord insertion. Thus, placental shape reflects the underlying maternal vascular bed. Abnormal placental shape may reflect an abnormal uterine environment, predisposing to pregnancy complications. Through statistical analysis of model placentas, we are able to characterize the probability that a given placenta grew in a disrupted environment, and even able to distinguish between different disruptions. PMID:24850904

  19. Hierarchical models and Bayesian analysis of bird survey information

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.; Royle, J. Andrew; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline when trend is poorly estimated. Aggregation of population information among regions is also complicated by varying quality of estimates among regions. Hierarchical models provide a reasonable means of accommodating concerns about aggregation and ranking of quantities of varying precision. In these models the need to consider multiple scales is accommodated by placing distributional assumptions on collections of parameters. For collections of species trends, this allows probability statements to be made about the collections of species-specific parameters, rather than about the estimates. We define and illustrate hierarchical models for two commonly encountered situations in bird conservation: (1) Estimating attributes of collections of species estimates, including ranking of trends, estimating number of species with increasing populations, and assessing population stability with regard to predefined trend magnitudes; and (2) estimation of regional population change, aggregating information from bird surveys over strata. User-friendly computer software makes hierarchical models readily accessible to scientists.
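
    A compact sketch of the partial-pooling idea for collections of species trends, written with PyMC; the trend estimates, standard errors, and priors are all invented for illustration and are not tied to any survey.

    ```python
    import numpy as np
    import pymc as pm

    # Invented observed trend estimates (% change per year) and their standard errors
    # for a handful of species; a hierarchical model pools them toward a common mean.
    trend_hat = np.array([-1.8, 0.4, -3.1, 2.2, -0.6])
    se = np.array([0.9, 1.2, 1.5, 1.0, 0.8])

    with pm.Model():
        mu = pm.Normal("mu", 0.0, 5.0)             # mean trend across species
        tau = pm.HalfNormal("tau", 3.0)            # between-species spread
        theta = pm.Normal("theta", mu, tau, shape=len(trend_hat))  # true trends
        pm.Normal("obs", theta, se, observed=trend_hat)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    # Posterior probability that each species' true trend is declining.
    theta_draws = idata.posterior["theta"].values.reshape(-1, len(trend_hat))
    print((theta_draws < 0).mean(axis=0))
    ```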

  20. Practice parameter update: management issues for women with epilepsy--focus on pregnancy (an evidence-based review): obstetrical complications and change in seizure frequency: report of the Quality Standards Subcommittee and Therapeutics and Technology Assessment Subcommittee of the American Academy of Neurology and American Epilepsy Society.

    PubMed

    Harden, C L; Hopp, J; Ting, T Y; Pennell, P B; French, J A; Hauser, W A; Wiebe, S; Gronseth, G S; Thurman, D; Meador, K J; Koppel, B S; Kaplan, P W; Robinson, J N; Gidal, B; Hovinga, C A; Wilner, A N; Vazquez, B; Holmes, L; Krumholz, A; Finnell, R; Le Guen, C

    2009-07-14

    To reassess the evidence for management issues related to the care of women with epilepsy (WWE) during pregnancy, including the risk of pregnancy complications or other medical problems during pregnancy in WWE compared to other women, change in seizure frequency, the risk of status epilepticus, and the rate of remaining seizure-free during pregnancy. A 20-member committee including general neurologists, epileptologists, and doctors in pharmacy evaluated the available evidence based on a structured literature review and classification of relevant articles published between 1985 and February 2008. For WWE taking antiepileptic drugs, there is probably no substantially increased risk (greater than two times expected) of cesarean delivery or late pregnancy bleeding, and probably no moderately increased risk (greater than 1.5 times expected) of premature contractions or premature labor and delivery. There is possibly a substantially increased risk of premature contractions and premature labor and delivery during pregnancy for WWE who smoke. Seizure freedom for at least 9 months prior to pregnancy is probably associated with a high likelihood (84%-92%) of remaining seizure-free during pregnancy. Women with epilepsy (WWE) should be counseled that seizure freedom for at least 9 months prior to pregnancy is probably associated with a high rate (84%-92%) of remaining seizure-free during pregnancy (Level B). However, WWE who smoke should be counseled that they possibly have a substantially increased risk of premature contractions and premature labor and delivery during pregnancy (Level C).

  1. A New Insight into the Earthquake Recurrence Studies from the Three-parameter Generalized Exponential Distributions

    NASA Astrophysics Data System (ADS)

    Pasari, S.; Kundu, D.; Dikshit, O.

    2012-12-01

    Earthquake recurrence interval is one of the important ingredients towards probabilistic seismic hazard assessment (PSHA) for any location. Exponential, gamma, Weibull and lognormal distributions are quite established probability models in this recurrence interval estimation. However, they have certain shortcomings too. Thus, it is imperative to search for some alternative sophisticated distributions. In this paper, we introduce a three-parameter (location, scale and shape) exponentiated exponential distribution and investigate the scope of this distribution as an alternative to the aforementioned distributions in earthquake recurrence studies. This distribution is a particular member of the exponentiated Weibull distribution. Despite its complicated form, it is widely accepted in medical and biological applications. Furthermore, it shares many physical properties with the gamma and Weibull families. Unlike the gamma distribution, the hazard function of the generalized exponential distribution can be easily computed even if the shape parameter is not an integer. To contemplate the plausibility of this model, a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20-32 deg N and 87-100 deg E) has been used. The model parameters are estimated using the maximum likelihood estimator (MLE) and the method of moments estimator (MOME). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability reaches quite high values after about a decade for an elapsed time of 17 years (i.e., 2012). Moreover, this study shows that the generalized exponential distribution fits the above data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
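
    A hedged sketch of fitting the three-parameter exponentiated (generalized) exponential distribution by maximum likelihood; the density below follows the standard shape/rate/location form, the synthetic interevent times stand in for a real catalogue, and the optimizer settings are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(params, x):
        """Negative log-likelihood of the exponentiated exponential distribution
        f(x) = a*lam*exp(-lam*(x-mu)) * (1 - exp(-lam*(x-mu)))**(a-1), for x > mu."""
        a, lam, mu = params
        if a <= 0 or lam <= 0 or np.any(x <= mu):
            return np.inf
        z = x - mu
        return -np.sum(np.log(a) + np.log(lam) - lam * z
                       + (a - 1) * np.log1p(-np.exp(-lam * z)))

    # Synthetic interevent times (years) drawn by inverse-CDF sampling (illustrative only).
    rng = np.random.default_rng(1)
    a_true, lam_true, mu_true = 2.0, 0.15, 1.0
    u = rng.random(20)
    sample = mu_true - np.log(1.0 - u ** (1.0 / a_true)) / lam_true

    start = np.array([1.0, 1.0 / sample.mean(), 0.9 * sample.min()])
    fit = minimize(neg_log_lik, start, args=(sample,), method="Nelder-Mead")
    print("MLE (shape, rate, location):", np.round(fit.x, 3))
    ```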

  2. Tracing compartment exchange by NMR diffusometry: Water in lithium-exchanged low-silica X zeolites

    NASA Astrophysics Data System (ADS)

    Lauerer, A.; Kurzhals, R.; Toufar, H.; Freude, D.; Kärger, J.

    2018-04-01

    The two-region model for analyzing signal attenuation in pulsed field gradient (PFG) NMR diffusion studies with molecules in compartmented media implies that, on their trajectory, molecules get from one region (one type of compartment) into the other one with a constant (i.e. a time-invariant) probability. This pattern has proved to serve as a good approach for considering guest diffusion in beds of nanoporous host materials, with the two regions ("compartments") identified as the intra- and intercrystalline pore spaces. It is obvious, however, that the requirements of the application of the two-region model are not strictly fulfilled given the correlation between the covered diffusion path lengths in the intracrystalline pore space and the probability of molecular "escape" from the individual crystallites. On considering water diffusion in lithium-exchanged low-silica X zeolite, we are now assuming a different position since this type of material is known to offer "traps" in the trajectories of the water molecules. Now, on attributing the water molecules in the traps and outside of the traps to these two types of regions, we perfectly comply with the requirements of the two-region model. We do, moreover, benefit from the option of high-resolution measurements owing to the combination of magic angle spinning (MAS) with PFG NMR. Data analysis via the two-region model under inclusion of the influence of nuclear magnetic relaxation yields satisfactory agreement between experimental evidence and theoretical estimates. Limitations in accuracy are shown to result from the fact that mass transfer outside of the traps is too complicated to be adequately reflected by simple Fick's laws with a single diffusivity.

  3. Clinicians' perceptions of the value of ventilation-perfusion scans.

    PubMed

    Siegel, Alan; Holtzman, Stephen R; Bettmann, Michael A; Black, William C

    2004-07-01

    The goal of this investigation was to understand clinicians' perceptions of the probability of pulmonary embolism as a function of V/Q scan results of normal, low, intermediate, and high probability. A questionnaire was developed and distributed to 429 clinicians at a single academic medical center. The response rate was 44% (188 of 429). The questions included level of training, specialty, probability of PE given 1 of the 4 V/Q scan results, and estimations of the charges for V/Q scanning and pulmonary angiography, and estimations of the risks of pulmonary angiography. The medians and ranges for the probability of pulmonary embolism given a normal, low, intermediate, and high probability V/Q scan result were 2.5% (0-30), 12.5% (0.5-52.5), 41.25% (5-75), and 85% (5-100), respectively. Eleven percent (21 of 188) of the respondents listed the probability of PE in patients with a low probability V/Q scan as being 5% or less, and 33% (62 of 188) listed the probability of PE given an intermediate probability scan as 50% or greater. The majority correctly identified the rate of serious complications of pulmonary arteriography, but many respondents underestimated the charge for V/Q scans and pulmonary arteriography. A substantial minority of clinicians do not understand the probability of pulmonary embolism in patients with low and intermediate probability ventilation-perfusion scans. More quantitative reporting of results is recommended. This could be particularly important because VQ scans are used less frequently but are still needed in certain clinical situations.

  4. On the use of biomathematical models in patient-specific IMRT dose QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhen Heming; Nelms, Benjamin E.; Tome, Wolfgang A.

    2013-07-15

    Purpose: To investigate the use of biomathematical models such as tumor control probability (TCP) and normal tissue complication probability (NTCP) as new quality assurance (QA) metrics. Methods: Five different types of error (MLC transmission, MLC penumbra, MLC tongue and groove, machine output, and MLC position) were intentionally induced to 40 clinical intensity modulated radiation therapy (IMRT) patient plans (20 H and N cases and 20 prostate cases) to simulate both treatment planning system errors and machine delivery errors in the IMRT QA process. The changes in TCP and NTCP for eight different anatomic structures (H and N: CTV, GTV, both parotids, spinal cord, larynx; prostate: CTV, rectal wall) were calculated as the new QA metrics to quantify the clinical impact on patients. The correlation between the change in TCP/NTCP and the change in selected DVH values was also evaluated. The relation between TCP/NTCP change and the characteristics of the TCP/NTCP curves is discussed. Results: ΔTCP and ΔNTCP were summarized for each type of induced error and each structure. The changes/degradations in TCP and NTCP caused by the errors vary widely depending on dose patterns unique to each plan, and are good indicators of each plan's 'robustness' to that type of error. Conclusions: In this in silico QA study the authors have demonstrated the possibility of using biomathematical models not only as patient-specific QA metrics but also as objective indicators that quantify, pretreatment, a plan's robustness with respect to possible error types.

  5. A new routing enhancement scheme based on node blocking state advertisement in wavelength-routed WDM networks

    NASA Astrophysics Data System (ADS)

    Hu, Peigang; Jin, Yaohui; Zhang, Chunlei; He, Hao; Hu, WeiSheng

    2005-02-01

    Increasing switching capacity brings considerable complexity to the optical node. Due to limitations in cost and technology, an optical node is often designed with partial switching capability and partial resource sharing. This means that the node is blocking to some extent; examples include the multi-granularity switching node, which in fact is a structure that uses pass wavelengths to reduce the dimension of the OXC, and the OXC with partially shared wavelength converters (WCs). It is conceivable that these blocking nodes will have great effects on the problem of routing and wavelength assignment. Some previous works studied the blocking case of the partial-WC OXC using complicated wavelength assignment algorithms, but the complexity of these schemes makes them impractical in real networks. In this paper, we propose a new scheme based on node blocking state advertisement to reduce the retry or rerouting probability and improve the efficiency of routing in networks with blocking nodes. In this scheme, node blocking states are advertised to the other nodes in the network and used in subsequent route calculations to find a path with the lowest blocking probability. The performance of the scheme is evaluated using a discrete event model on the 14-node NSFNET, all nodes of which employ a partially shared WC OXC structure. In the simulation, a simple First-Fit wavelength assignment algorithm is used. The simulation results demonstrate that the new scheme considerably reduces the retry or rerouting probability in the routing process.

  6. Four-dimensional symmetry from a broad viewpoint. II Invariant distribution of quantized field oscillators and questions on infinities

    NASA Technical Reports Server (NTRS)

    Hsu, J. P.

    1983-01-01

    The foundation of the quantum field theory is changed by introducing a new universal probability principle into field operators: one single inherent and invariant probability distribution P(/k/) is postulated for boson and fermion field oscillators. This can be accomplished only when one treats the four-dimensional symmetry from a broad viewpoint. Special relativity is too restrictive to allow such a universal probability principle. A radical length, R, appears in physics through the probability distribution P(/k/). The force between two point particles vanishes when their relative distance tends to zero. This appears to be a general property for all forces and resembles the property of asymptotic freedom. The usual infinities in vacuum fluctuations and in local interactions, however complicated they may be, are all removed from quantum field theories. In appendix A a simple finite and unitary theory of unified electroweak interactions is discussed without assuming Higgs scalar bosons.

  7. Incorporating big data into treatment plan evaluation: Development of statistical DVH metrics and visualization dashboards.

    PubMed

    Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten

    2017-01-01

    To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
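
    The abstract characterizes the generalized evaluation metric only as a priority-weighted sum of normalized incomplete gamma functions; the sketch below is one plausible reading of that phrase with entirely assumed metric values, limits, priorities, and gamma parameters, and is not the published GEM formula.

    ```python
    import numpy as np
    from scipy.special import gammainc  # regularized lower incomplete gamma P(k, x)

    def gem_like_score(values, limits, priorities, k=2.0, theta=0.5):
        """Illustrative priority-weighted sum of normalized incomplete gamma terms.
        Each DVH metric value is scaled by its clinical limit before evaluation."""
        values, limits, priorities = map(np.asarray, (values, limits, priorities))
        x = values / limits              # normalized metric (1.0 = at the limit)
        terms = gammainc(k, x / theta)   # smooth 0..1 penalty per metric
        return float(np.sum(priorities * terms) / np.sum(priorities))

    # Assumed example: three DVH metrics (Gy), their limits, and priority weights.
    print(gem_like_score(values=[24.0, 45.0, 18.0],
                         limits=[26.0, 50.0, 30.0],
                         priorities=[3.0, 2.0, 1.0]))
    ```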

  8. Chemo-IMRT of Oropharyngeal Cancer Aiming to Reduce Dysphagia: Swallowing Organs Late Complication Probabilities and Dosimetric Correlates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisbruch, Avraham, E-mail: eisbruch@umich.edu; Kim, Hyungjin M.; Feng, Felix Y.

    2011-11-01

    Purpose: Assess dosimetric correlates of long-term dysphagia after chemo-intensity-modulated radiotherapy (IMRT) of oropharyngeal cancer (OPC) sparing parts of the swallowing organs. Patients and Methods: Prospective longitudinal study: weekly chemotherapy concurrent with IMRT for Stages III/IV OPC, aiming to reduce dysphagia by sparing noninvolved parts of swallowing-related organs: pharyngeal constrictors (PC), glottic and supraglottic larynx (GSL), and esophagus, as well as oral cavity and major salivary glands. Dysphagia outcomes included patient-reported Swallowing and Eating Domain scores, observer-based (CTCAE v.2) dysphagia, and videofluoroscopy (VF), before and periodically after therapy through 2 years. Relationships between dosimetric factors and worsening (from baseline) of dysphagia through 2 years were assessed by linear mixed-effects model. Results: Seventy-three patients participated. Observer-based dysphagia was not modeled because at >6 months there were only four Grade ≥2 cases (one of whom was feeding-tube dependent). PC, GSL, and esophagus mean doses, as well as their partial volume doses (VDs), were each significantly correlated with all dysphagia outcomes. However, the VDs for each organ intercorrelated and also highly correlated with the mean doses, leaving only mean doses significant. Mean doses to each of the parts of the PCs (superior, middle, and inferior) were also significantly correlated with all dysphagia measures, with the superior PC demonstrating the highest correlations. For VF-based strictures, the most significant predictor was the esophageal mean dose (48 ± 17 Gy in patients with strictures vs 27 ± 12 Gy in patients without, p = 0.004). Normal tissue complication probabilities (NTCPs) increased moderately with mean doses without any threshold. For increased VF-based aspirations or worsened VF summary scores, toxic doses TD50 and TD25 were 63 Gy and 56 Gy for PC, and 56 Gy and 39 Gy for GSL, respectively. For both PC and GSL, patient-reported swallowing TDs were substantially higher than VF-based TDs. Conclusions: Swallowing organ mean doses correlated significantly with long-term worsening of swallowing. Different methods of assessing dysphagia resulted in different NTCPs, and none demonstrated a threshold.

  9. Direct evaluation of radiobiological parameters from clinical data in the case of ion beam therapy: an alternative approach to the relative biological effectiveness.

    PubMed

    Cometto, A; Russo, G; Bourhaleb, F; Milian, F M; Giordanengo, S; Marchetto, F; Cirio, R; Attili, A

    2014-12-07

    The relative biological effectiveness (RBE) concept is commonly used in treatment planning for ion beam therapy. Whether models based on in vitro/in vivo RBE data can be used to predict human response to treatments is an open issue. In this work an alternative method, based on an effective radiobiological parameterization directly derived from clinical data, is presented. The method has been applied to the analysis of prostate cancer trials with protons and carbon ions. Prostate cancer trials with proton and carbon ion beams reporting 5-year local control (LC5) and grade 2 (G2) or higher genitourinary toxicity rates (TOX) were selected from the literature to test the method. Treatment simulations were performed on a representative subset of patients to produce dose and linear energy transfer distributions, which were used as explicative physical variables for the radiobiological modelling. Two models were taken into consideration: the microdosimetric kinetic model (MKM) and a linear model (LM). The radiobiological parameters of the LM and MKM were obtained by coupling them with the tumor control probability and normal tissue complication probability models to fit the LC5 and TOX data through likelihood maximization. The model ranking was based on the Akaike information criterion. Results showed large confidence intervals due to the limited variety of available treatment schedules. RBE values, such as RBE = 1.1 for protons in the treated volume, were derived as a by-product of the method, showing a consistency with current approaches. Carbon ion RBE values were also derived, showing lower values than those assumed for the original treatment planning in the target region, whereas higher values were found in the bladder. Most importantly, this work shows the possibility of inferring the radiobiological parameterization for proton and carbon ion treatment directly from clinical data.
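
    Model ranking by the Akaike information criterion, as mentioned above, takes only a few lines; the log-likelihoods and parameter counts here are placeholders rather than fitted values.

    ```python
    import math

    def aic(log_likelihood, n_params):
        return 2 * n_params - 2 * log_likelihood

    # Placeholder fits: (model name, maximized log-likelihood, number of parameters).
    fits = [("MKM", -152.3, 4), ("linear model", -154.9, 2)]
    scores = {name: aic(ll, k) for name, ll, k in fits}
    best = min(scores.values())

    for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
        delta = score - best
        weight = math.exp(-0.5 * delta)  # relative (unnormalized) Akaike weight
        print(f"{name:12s} AIC={score:7.1f}  dAIC={delta:5.1f}  rel. weight={weight:.2f}")
    ```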

  10. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this great deal of information. Especially the usage of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study multiple streamflow forecast information will be aggregated based on several different predictive distributions, and quantile forecasts. For this combination the Bayesian model averaging (BMA) approach, the non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistic (EMOS) techniques, and a novel method called Beta-transformed linear pooling (BLP) will be applied. By the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland with about 5 years of forecast data will be compared and the differences between the raw and optimally combined forecasts will be highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
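
    A small sketch of the continuous ranked probability score for an ensemble forecast, using the standard energy-form estimator CRPS = E|X - y| - 0.5 E|X - X'|; the ensemble members and observation are invented, and the "combined" ensemble is just a sharper toy example.

    ```python
    import numpy as np

    def crps_ensemble(members, obs):
        """CRPS of an ensemble forecast via the energy form
        E|X - y| - 0.5 * E|X - X'| (lower is better)."""
        members = np.asarray(members, dtype=float)
        term1 = np.mean(np.abs(members - obs))
        term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
        return term1 - term2

    # Invented streamflow ensembles (m^3/s) and the later-observed value.
    raw_ensemble      = [52, 61, 58, 70, 66, 49, 75, 63]
    combined_ensemble = [55, 58, 57, 62, 60, 54, 64, 59]
    obs = 57.0
    print("raw CRPS:     ", round(crps_ensemble(raw_ensemble, obs), 2))
    print("combined CRPS:", round(crps_ensemble(combined_ensemble, obs), 2))
    ```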

  11. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.

  12. Estimating relative risks for common outcome using PROC NLP.

    PubMed

    Yu, Binbing; Wang, Zhuoqiao

    2008-05-01

    In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.
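
    A hedged Python counterpart of the log-binomial fit for the relative risk (the SAS workflow itself is not reproduced): statsmodels' GLM with a binomial family and log link, applied to synthetic data. As the abstract notes, such fits can still run into convergence problems; the starting values below are one common workaround.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 500
    exposed = rng.integers(0, 2, n)
    age = rng.uniform(30, 70, n)

    # Synthetic common outcome (~30% baseline prevalence) with a true RR of about 1.5.
    p = np.clip(0.30 * 1.5 ** exposed * (1 + 0.002 * (age - 50)), 0.01, 0.95)
    y = (rng.random(n) < p).astype(int)

    X = sm.add_constant(pd.DataFrame({"exposed": exposed, "age": age}))
    log_binomial = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Log()))
    fit = log_binomial.fit(start_params=[np.log(y.mean()), 0.0, 0.0])
    print(np.exp(fit.params))  # exp(coef) of "exposed" estimates the relative risk
    ```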

  13. Volcanic ash melting under conditions relevant to ash turbine interactions

    PubMed Central

    Song, Wenjia; Lavallée, Yan; Hess, Kai-Uwe; Kueppers, Ulrich; Cimarelli, Corrado; Dingwell, Donald B.

    2016-01-01

    The ingestion of volcanic ash by jet engines is widely recognized as a potentially fatal hazard for aircraft operation. The high temperatures (1,200–2,000 °C) typical of jet engines exacerbate the impact of ash by provoking its melting and sticking to turbine parts. Estimation of this potential hazard is complicated by the fact that chemical composition, which affects the temperature at which volcanic ash becomes liquid, can vary widely amongst volcanoes. Here, based on experiments, we parameterize ash behaviour and develop a model to predict melting and sticking conditions for its global compositional range. The results of our experiments confirm that the common use of sand or dust proxy is wholly inadequate for the prediction of the behaviour of volcanic ash, leading to overestimates of sticking temperature and thus severe underestimates of the thermal hazard. Our model can be used to assess the deposition probability of volcanic ash in jet engines. PMID:26931824

  14. Management and postoperative outcome in primary lung cancer and heart disease co-morbidity: a systematic review and meta-analysis

    PubMed Central

    Analitis, Antonis; Michaelides, Stylianos A.; Charalabopoulos, Konstantinos A.; Tzonou, Anastasia

    2016-01-01

    Background Co-morbidity of primary lung cancer (LC) and heart disease (HD), both requiring surgical therapy, characterizes a high risk group of patients necessitating prompt diagnosis and treatment. The aim of this study is the review of available evidence guiding the management of these patients. Methods Postoperative outcome of patients operated for primary LC (first meta-analysis) and for both primary LC and HD co-morbidity (second meta-analysis) was studied. Parameters examined in both meta-analyses were thirty-day postoperative mortality, postoperative complications, and three- and five-year survival probabilities. The last 36 years were reviewed using the PubMed database. Thirty-seven studies were qualified for both meta-analyses. Results The pooled 30-day mortality percentages (%) were 4.16% [95% confidence interval (CI): 2.68–5.95] (first meta-analysis) and 5.26% (95% CI: 3.47–7.62) (second meta-analysis). Higher percentages of squamous histology and lobectomy were significantly associated with increased (P=0.001) and decreased (P<0.001) thirty-day postoperative mortality, respectively (first meta-analysis). The pooled percentages for postoperative complications were 34.32% (95% CI: 24.59–44.75) (first meta-analysis) and 45.59% (95% CI: 35.62–55.74) (second meta-analysis). Higher percentages of squamous histology (P=0.001), lobectomy (P=0.002) and p-T1 or p-T2 (P=0.034) were associated with higher proportions of postoperative complications (second meta-analysis). The pooled three- and five-year survival probabilities were 68.25% (95% CI: 45.93–86.86) and 52.03% (95% CI: 34.71–69.11), respectively. Higher mean age (P=0.046) and percentage lobectomy (P=0.009) significantly reduced the five-year survival probability. Conclusions Lobectomy and age were both accompanied by a reduced five-year survival rate. Also, combined aorto-coronary bypass grafting (CABG) with lobectomy for squamous pT1 or pT2 LC displayed a higher risk of postoperative complications. Moreover, the decision between combined and staged surgery is suggested to be individualized based on adequacy of coronary arterial perfusion, age, patient's preoperative performance status (taking into account possible co-morbidities per patient), tumor staging and extent of lung resection. PMID:27386487

  15. The development and validation of a novel model for predicting surgical complications in colorectal cancer of elderly patients: Results from 1008 cases.

    PubMed

    Shen, Zhanlong; Lin, Yuanpei; Ye, Yingjiang; Jiang, Kewei; Xie, Qiwei; Gao, Zhidong; Wang, Shan

    2018-04-01

    To establish prediction models of surgical complications in elderly colorectal cancer patients. Surgical complications are usually critical and lethal in elderly patients. However, none of the current models are specifically designed to predict surgical complications in elderly colorectal cancer patients. Details of 1008 cases of elderly colorectal cancer patients (age ≥ 65) were collected retrospectively from January 1998 to December 2013. Seventy-six clinicopathological variables which might affect postoperative complications in elderly patients were recorded. Multivariate stepwise logistic regression analysis was used to develop the risk model equations. The performance of the developed model was evaluated by measures of calibration (Hosmer-Lemeshow test) and discrimination (the area under the receiver-operator characteristic curve, AUC). The AUC of our established Surgical Complication Score for Elderly Colorectal Cancer patients (SCSECC) model was 0.743 (sensitivity, 82.1%; specificity, 78.3%). There was no significant discrepancy between observed and predicted incidence rates of surgical complications (AUC, 0.820; P = .812). The Surgical Site Infection Score for Elderly Colorectal Cancer patients (SSISECC) model showed significantly better prediction power compared to the National Nosocomial Infections Surveillance index (NNIS) (AUC, 0.732; P < 0.001) and Efficacy of Nosocomial Infection Control index (SENIC) (AUC, 0.686; P < 0.001) models. The SCSECC and SSISECC models show good prediction power for postoperative surgical complication morbidity and surgical site infection in elderly colorectal cancer patients. Copyright © 2018 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
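
    For the discrimination measure used above (area under the receiver-operator characteristic curve), a brief sketch with scikit-learn; the covariates, labels, and resulting AUC are synthetic and unrelated to the SCSECC model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    n = 400
    X = np.column_stack([rng.uniform(65, 90, n),   # age (years)
                         rng.integers(0, 2, n),    # emergency surgery flag
                         rng.normal(35, 5, n)])    # serum albumin (g/L)
    logit = -5.5 + 0.06 * X[:, 0] + 1.0 * X[:, 1] - 0.04 * X[:, 2]
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    risk = model.predict_proba(X)[:, 1]
    print("apparent AUC:", round(roc_auc_score(y, risk), 3))
    ```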

  16. Event Rates, Hospital Utilization, and Costs Associated with Major Complications of Diabetes: A Multicountry Comparative Analysis

    PubMed Central

    Clarke, Philip M.; Glasziou, Paul; Patel, Anushka; Chalmers, John; Woodward, Mark; Harrap, Stephen B.; Salomon, Joshua A.

    2010-01-01

    Background Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Methods and Findings Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$), which represent a hypothetical currency that allows for the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%–96% across regions) and lowest for nephropathy (15%–26%). The average numbers of days in hospital given at least one admission were greatest for stroke (17–32 d across region) and heart failure (16–31 d) and lowest for nephropathy (12–23 d). Considering regional differences, probabilities of hospitalization were lowest in Asia and highest in Established Market Economies; on the other hand, lengths of stay were highest in Asia and lowest in Established Market Economies. Overall estimated annual hospital costs for patients with none of the specified events or event histories ranged from Int$76 in Asia to Int$296 in Established Market Economies. All complications included in this analysis led to significant increases in hospital costs; coronary events, cerebrovascular events, and heart failure were the most costly, at more than Int$1,800, Int$3,000, and Int$4,000 in Asia, Eastern Europe, and Established Market Economies, respectively. Conclusions Major complications of diabetes significantly increase hospital use and costs across various settings and are likely to impose a high economic burden on health care systems. PMID:20186272

  17. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
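
    The regressive effect described here can be reproduced with a few lines of simulation. The sketch below is an illustrative reading of the "probability theory plus noise" idea (a frequency estimate read from a noisy record), not the authors' implementation; the noise rate d is an arbitrary value.

      # Toy simulation: each remembered instance is misread with probability d,
      # so the mean frequency estimate regresses toward 0.5:
      # E[estimate] = p*(1-d) + (1-p)*d.
      import numpy as np

      rng = np.random.default_rng(1)
      d = 0.15                                   # assumed noise (misread) rate
      for p in [0.05, 0.2, 0.5, 0.8, 0.95]:
          events = rng.random(100_000) < p       # true occurrences of the event
          flips = rng.random(100_000) < d        # noisy reads flip membership
          est = np.where(flips, ~events, events).mean()
          print(f"true p={p:.2f}  noisy estimate={est:.3f}  model mean={(p*(1-d)+(1-p)*d):.3f}")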

  18. Hematoma formation during breast core needle biopsy in women taking antithrombotic therapy.

    PubMed

    Chetlen, Alison L; Kasales, Claudia; Mack, Julie; Schetter, Susann; Zhu, Junjia

    2013-07-01

    The purpose of this study was to compare hematoma formation after breast core needle biopsy performed on patients undergoing and those not undergoing concurrent antithrombotic therapy. A prospective assessment of core needle biopsies (stereotactic, ultrasound guided, or MRI guided) performed on patients enrolled between September 2011 and July 2012 formed the basis of this study. Postprocedure mediolateral and craniocaudal mammograms were evaluated for the presence and size of hematomas. Patients were clinically evaluated for complications 24-48 hours after the procedure through telephone call or face-to-face consultation. Needle size, type of biopsy, and presence of hematoma and documented complications were correlated with use of antithrombotic agents (including aspirin, warfarin, clopidogrel, and daily nonsteroidal antiinflammatory medications). No clinically significant hematomas or bleeding complications were found. Eighty-nine of 617 (14.4%) non-clinically significant hematomas were detected on postprocedure mammograms. The probability of development of a non-clinically significant hematoma was 21.6% for patients taking antithrombotics and 13.0% for those not taking antithrombotics. Concurrent antithrombotic therapy and larger needle gauge were significant factors contributing to the probability of hematoma formation. The volume of the hematoma was not related to needle gauge or presence of antithrombotic therapy. No clinically significant hematomas were found. Because there are potential life-threatening risks to stopping antithrombotic therapy before breast biopsy, withholding antithrombotic therapy for core needle breast biopsy is not recommended because the incidence of non-clinically significant hematoma is low.

  19. LASSO NTCP predictors for the incidence of xerostomia in patients with head and neck squamous cell carcinoma and nasopharyngeal carcinoma

    PubMed Central

    Lee, Tsair-Fwu; Liou, Ming-Hsiang; Huang, Yu-Jie; Chao, Pei-Ju; Ting, Hui-Min; Lee, Hsiao-Yi

    2014-01-01

    To predict the incidence of moderate-to-severe patient-reported xerostomia among head and neck squamous cell carcinoma (HNSCC) and nasopharyngeal carcinoma (NPC) patients treated with intensity-modulated radiotherapy (IMRT). Multivariable normal tissue complication probability (NTCP) models were developed using quality of life questionnaire datasets from 152 patients with HNSCC and 84 patients with NPC. The primary endpoint was defined as moderate-to-severe xerostomia after IMRT. The number of predictive factors for each multivariable logistic regression model was determined using the least absolute shrinkage and selection operator (LASSO) with a bootstrapping technique. Four predictive models were obtained with LASSO, each using the smallest number of factors that preserved predictive value, as measured by AUC. In all models, the mean doses to the contralateral and ipsilateral parotid glands were selected as the most significant dosimetric predictors, followed by clinical and socio-economic factors that differed between models, namely age, financial status, T stage, and education. Prediction of the incidence of xerostomia for HNSCC and NPC patients can be improved by using multivariable logistic regression models with the LASSO technique. The predictive model developed in HNSCC cannot be generalized to an NPC cohort treated with IMRT without validation, and vice versa. PMID:25163814
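
    A minimal sketch of the selection step described here is given below, assuming an L1-penalised (LASSO-type) logistic regression with cross-validated penalty strength stands in for the paper's LASSO-with-bootstrapping procedure; the feature names and simulated outcome are placeholders, not the study's data.

      # Hypothetical predictors standing in for the dosimetric/clinical factors named above.
      import numpy as np
      from sklearn.linear_model import LogisticRegressionCV
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)
      features = ["mean_dose_contra_parotid", "mean_dose_ipsi_parotid",
                  "age", "T_stage", "financial_status", "education"]
      X = rng.normal(size=(236, len(features)))
      logit = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.5        # xerostomia risk driven by parotid doses
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      # L1 penalty shrinks uninformative coefficients to exactly zero (the LASSO selection step).
      model = make_pipeline(StandardScaler(),
                            LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5))
      model.fit(X, y)
      coefs = model[-1].coef_.ravel()
      print("Selected predictors:", [f for f, c in zip(features, coefs) if abs(c) > 1e-8])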

  20. The Impact of Monte Carlo Dose Calculations on Intensity-Modulated Radiation Therapy

    NASA Astrophysics Data System (ADS)

    Siebers, J. V.; Keall, P. J.; Mohan, R.

    The effect of dose calculation accuracy for IMRT was studied by comparing different dose calculation algorithms. A head and neck IMRT plan was optimized using a superposition dose calculation algorithm. Dose was re-computed for the optimized plan using both Monte Carlo and pencil beam dose calculation algorithms to generate patient and phantom dose distributions. Tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP) were computed to estimate the plan outcome. For the treatment plan studied, Monte Carlo best reproduces phantom dose measurements, the TCP was slightly lower than the superposition and pencil beam results, and the NTCP values differed little.

  1. Transient Ischemic Rectitis as a Potential Complication after Prostatic Artery Embolization: Case Report and Review of the Literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreira, Airton Mota, E-mail: motamoreira@gmail.com; Marques, Carlos Frederico Sparapan, E-mail: sparapanmarques@gmail.com; Antunes, Alberto Azoubel, E-mail: antunesuro@uol.com.br

    Prostatic artery embolization (PAE) is an alternative treatment for benign prostatic hyperplasia. Complications are primarily related to non-target embolization. We report a case of ischemic rectitis, probably related to non-target embolization, in a 76-year-old man with significant lower urinary tract symptoms due to benign prostatic hyperplasia. Magnetic resonance imaging revealed an 85.5-g prostate, and urodynamic studies confirmed inferior vesical obstruction. PAE was performed bilaterally. During the first 3 days of follow-up, a small amount of blood mixed in the stool was observed. Colonoscopy identified rectal ulcers at day 4, which had disappeared by day 16 post-PAE without treatment. PAE is a safe, effective procedure with a low complication rate, but interventionalists should be aware of the risk of rectal non-target embolization.

  2. Insulin oedema in a child with newly diagnosed diabetes mellitus.

    PubMed

    Aravamudhan, Avinash; Gardner, Chris; Smith, Claire; Senniappan, Senthil

    2014-05-01

    Insulin oedema is a rare complication of insulin therapy for diabetes mellitus. It has been reported in type 1 diabetes mellitus, in poorly controlled type 2 diabetes mellitus following either the initiation or intensification of insulin therapy, and in underweight patients on large doses of insulin. There are only a few case reports since it was first described in 1928, showing that it is an uncommon and probably under-reported complication. The majority of those reports have been in the adult population. The generalised oedema tends to develop shortly after initiation or intensification of insulin therapy and resolves spontaneously within a few weeks. We present one of the youngest patients reported in the literature, a 9-year-old boy who developed insulin oedema within a few days of presenting with diabetic ketoacidosis. The case highlights the importance of recognising this generally transient and self-resolving complication and differentiating it from other serious causes of oedema.

  3. Correction of sampling bias in a cross-sectional study of post-surgical complications.

    PubMed

    Fluss, Ronen; Mandel, Micha; Freedman, Laurence S; Weiss, Inbal Salz; Zohar, Anat Ekka; Haklai, Ziona; Gordon, Ethel-Sherry; Simchen, Elisheva

    2013-06-30

    Cross-sectional designs are often used to monitor the proportion of infections and other post-surgical complications acquired in hospitals. However, conventional methods for estimating incidence proportions when applied to cross-sectional data may provide estimators that are highly biased, as cross-sectional designs tend to include a high proportion of patients with prolonged hospitalization. One common solution is to use sampling weights in the analysis, which adjust for the sampling bias inherent in a cross-sectional design. The current paper describes in detail a method to build weights for a national survey of post-surgical complications conducted in Israel. We use the weights to estimate the probability of surgical site infections following colon resection, and validate the results of the weighted analysis by comparing them with those obtained from a parallel study with a historically prospective design. Copyright © 2012 John Wiley & Sons, Ltd.
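
    The weighting idea can be illustrated with a small simulation: under a cross-sectional (prevalence) design the chance of sampling a patient is roughly proportional to length of stay, so weighting each sampled patient by the inverse of that inclusion probability recovers the admission-cohort incidence. The sketch below is a simplified, assumed weighting scheme, not the survey's actual weights.

      # Length-biased sampling demo: infected patients stay longer, so a naive
      # cross-sectional proportion overstates incidence; 1/LOS weights correct it.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000
      infected = rng.random(n) < 0.05                    # true incidence 5%
      los = rng.exponential(5.0, n) + 2.0 + 6.0 * infected

      sampled = rng.random(n) < los / los.max()          # inclusion probability proportional to LOS
      y, w = infected[sampled], 1.0 / los[sampled]
      print("naive cross-sectional estimate:", round(y.mean(), 3))
      print("inverse-LOS weighted estimate:", round(np.average(y, weights=w), 3))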

  4. Male greater sage-grouse detectability on leks

    Treesearch

    Aleshia L. Fremgen; Christopher P. Hansen; Mark A. Rumble; R. Scott Gamo; Joshua J. Millspaugh

    2016-01-01

    It is unlikely all male sage-grouse are detected during lek counts, which could complicate the use of lek counts as an index to population abundance. Understanding factors that influence detection probabilities will allow managers to more accurately estimate the number of males present on leks. We fitted 410 males with global positioning system and very high...

  5. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    In 1959 IJ Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like and climate-like forecasting tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability", a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess-playing program when it is only halfway through its analysis are an example. This case is contrasted with that of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities rationally in scientific support of decision-making? Mature probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter that information is not available, although one can fall back on the scientific basis upon which the model itself rests and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, that is, the probability that the information on which the model-based probability is conditioned holds. It is argued that no model-based climate-like probability forecast is complete without a quantitative estimate of its own irrelevance, that the clear identification of model-based probability forecasts as mature or immature is critical for maintaining the credibility of science-based decision support, and that these considerations can shape uncertainty quantification more widely.

  6. Probability Weighting Functions Derived from Hyperbolic Time Discounting: Psychophysical Models and Their Individual Level Testing.

    PubMed

    Takemura, Kazuhisa; Murakami, Hajime

    2016-01-01

    A probability weighting function (w(p)) is considered to be a nonlinear function of probability (p) in behavioral decision theory. This study proposes a psychophysical model of probability weighting functions derived from a hyperbolic time discounting model and a geometric distribution. The aim of the study is to show probability weighting functions from the point of view of waiting time for a decision maker. Since the expected value of a geometrically distributed random variable X is 1/p, we formulated the probability weighting function of the expected value model for hyperbolic time discounting as w(p) = (1 - k log p)^(-1). Moreover, the probability weighting function is derived from Loewenstein and Prelec's (1992) generalized hyperbolic time discounting model. The latter model is proved to be equivalent to the hyperbolic-logarithmic weighting function considered by Prelec (1998) and Luce (2001). In this study, we derive a model from the generalized hyperbolic time discounting model assuming Fechner's (1860) psychophysical law of time and a geometric distribution of trials. In addition, we develop median models of hyperbolic time discounting and generalized hyperbolic time discounting. To illustrate the fit of each model, a psychological experiment was conducted to assess the probability weighting and value functions at the level of the individual participant. The participants were 50 university students. The results of the individual-level analysis indicated that the expected value model of generalized hyperbolic discounting fitted better than previous probability weighting decision-making models. The theoretical implications of this finding are discussed.
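
    For readers who want to see the shape of this weighting function, the short sketch below simply evaluates w(p) = (1 - k log p)^(-1) for one arbitrary value of k; the choice k = 0.5 is an assumption for display only.

      # Evaluate the expected-value weighting function from this abstract.
      import numpy as np

      def w(p, k=0.5):
          """Probability weighting function w(p) = (1 - k*log(p))**(-1)."""
          return 1.0 / (1.0 - k * np.log(p))

      for p in [0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 0.99]:
          print(f"p = {p:0.2f}   w(p) = {w(p):0.3f}")
      # Note: w is increasing in p and w(1) = 1, as required of a weighting function.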

  7. Comparison between the four-field box and field-in-field techniques for conformal radiotherapy of the esophagus using dose-volume histograms and normal tissue complication probabilities.

    PubMed

    Allaveisi, Farzaneh; Moghadam, Amir Nami

    2017-06-01

    We evaluated and compared the performance of the field-in-field (FIF) to that of the four-field box (4FB) technique regarding dosimetric and radiobiological parameters for radiotherapy of esophageal carcinoma. Twenty patients with esophageal cancer were selected. For each patient, two treatment plans were created: 4FB and FIF. The parameters compared included the conformity index (CI), homogeneity index (HI), Dmean, Dmax, tumor control probability (TCP), V20Gy and V30Gy of the heart and lungs, normal tissue complication probability (NTCP), and monitor units per fraction (MU/fr). A paired t-test analysis did not show any significant differences (p > 0.05) between the two techniques in terms of the CI and TCP. However, the HI significantly improved when the FIF was applied. Dmax of the PTV, lung, and spinal cord were also significantly better with the FIF. Moreover, the lung V20Gy as well as the NTCPs of the lung and spinal cord were significantly reduced when the FIF was used, and the MU/fr was significantly decreased. The FIF showed evident advantages over 4FB: a more homogeneous dose distribution, lower Dmax values, and fewer required MUs, while it also retained PTV dose conformality. FIF should be considered as a simple technique to use clinically in cases with esophageal malignancies, especially in clinics with no IMRT.

  8. Analyzing cost-effectiveness of ulnar and median nerve transfers to regain forearm flexion.

    PubMed

    Wali, Arvin R; Park, Charlie C; Brown, Justin M; Mandeville, Ross

    2017-03-01

    OBJECTIVE Peripheral nerve transfers to regain elbow flexion via the ulnar nerve (Oberlin nerve transfer) and median nerves are surgical options that benefit patients. Prior studies have assessed the comparative effectiveness of ulnar and median nerve transfers for upper trunk brachial plexus injury, yet no study has examined the cost-effectiveness of this surgery to improve quality-adjusted life years (QALYs). The authors present a cost-effectiveness model of the Oberlin nerve transfer and median nerve transfer to restore elbow flexion in the adult population with upper brachial plexus injury. METHODS Using a Markov model, the authors simulated ulnar and median nerve transfers and conservative measures in terms of neurological recovery and improvements in quality of life (QOL) for patients with upper brachial plexus injury. Transition probabilities were collected from previous studies that assessed the surgical efficacy of ulnar and median nerve transfers, complication rates associated with comparable surgical interventions, and the natural history of conservative measures. Incremental cost-effectiveness ratios (ICERs), defined as cost in dollars per QALY, were calculated. Incremental cost-effectiveness ratios less than $50,000/QALY were considered cost-effective. One-way and 2-way sensitivity analyses were used to assess parameter uncertainty. Probabilistic sampling was used to assess ranges of outcomes across 100,000 trials. RESULTS The authors' base-case model demonstrated that ulnar and median nerve transfers, with an estimated cost of $5066.19, improved effectiveness by 0.79 QALY over a lifetime compared with conservative management. Without modeling the indirect cost due to loss of income over a lifetime associated with elbow function loss, surgical treatment had an ICER of $6453.41/QALY gained. Factoring in the loss of income as indirect cost, surgical treatment had an ICER of -$96,755.42/QALY gained, demonstrating an overall lifetime cost savings due to increased probability of returning to work. One-way sensitivity analysis demonstrated that the model was most sensitive to assumptions about cost of surgery, probability of good surgical outcome, and spontaneous recovery of neurological function with conservative treatment. Two-way sensitivity analysis demonstrated that surgical intervention was cost-effective with an ICER of $18,828.06/QALY even with the authors' most conservative parameters (surgical costs of $50,000 and a probability of success of 50%) when considering the potential income recovered through returning to work. Probabilistic sampling demonstrated that surgical intervention was cost-effective in 76% of cases at a willingness-to-pay threshold of $50,000/QALY gained. CONCLUSIONS The authors' model demonstrates that ulnar and median nerve transfers for upper brachial plexus injury improve QALYs in a cost-effective manner.
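
    The ICER arithmetic reported above reduces to a one-line calculation; the helper below is a generic sketch (the abstract's $6,453/QALY figure additionally reflects model costs not reproduced here).

      # Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
      def icer(cost_new, qaly_new, cost_old, qaly_old):
          return (cost_new - cost_old) / (qaly_new - qaly_old)

      # Using only the two base-case inputs quoted above (surgery cost $5066.19,
      # +0.79 QALY vs. conservative care) gives roughly $6,413/QALY; the paper's
      # full model reports $6,453/QALY.
      print(round(icer(5066.19, 0.79, 0.0, 0.0), 2))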

  9. Radiation and depression associated with complications of tissue expander reconstruction.

    PubMed

    Chuba, Paul J; Stefani, William A; Dul, Carrie; Szpunar, Susan; Falk, Jeffrey; Wagner, Rachael; Edhayan, Elango; Rabbani, Anna; Browne, Cynthia H; Aref, Amr

    2017-08-01

    Rates of implant failure, wound healing delay, and infection are higher in patients having radiation therapy (RT) after tissue expander (TE) and permanent implant reconstruction. We investigated pretreatment risk factors for TE implant complications. 127 breast cancer patients had TE reconstruction and radiation. For 85 cases of bilateral TE reconstruction, the non-irradiated breast provided an internal control. Differences in means for continuous variables were compared using analysis of variance, followed by multiple pairwise comparisons with Bonferroni correction of the p value. Mean age was 53 ± 10.1 years with 14.6% African-American. Twelve (9.4%) were BRCA positive (9 BRCA1, 4 BRCA2, 1 both). Complications were: Grade 0 (no complication; 43.9%), Grade 1 (tightness and/or drifting of implant or Baker Grade II capsular contracture; 30.9%), Grade 2 (infection, hypertrophic scarring, or incisional necrosis; 9.8%), Grade 3 (Baker Grade III capsular contracture, wound dehiscence, or impending exposure of implant; 5.7%), Grade 4 (implant failure, exchange of implant, or Baker Grade IV capsular contracture; 9.8%). 15.3% (19 cases) experienced a Grade 3 or 4 complication and 9.8% (12 cases) had a Grade 4 complication. Considering non-irradiated breasts, there were two (1.6%) Grade 3-4 complications. For BMI, there was no significant difference by category as defined by the CDC (p = 0.91). Patients with depression were more likely to experience a Grade 3 or 4 complication (29.4 vs 13.2%; p = 0.01). In a multiple logistic regression model predicting the probability of a Grade 3 or 4 complication, patients with depression were 4.2 times more likely to have such a complication (OR = 4.2, p = 0.03). Higher rates of TE reconstruction complications are expected in patients receiving radiotherapy. An unexpected finding was that patients reporting a medical history of depression showed a statistically significant increase in complication rates.

  10. Acute Parotitis after Lower Limb Amputation: A Case Report of a Rare Complication.

    PubMed

    Avgerinos, Konstantinos Ioannis; Degermetzoglou, Nikolaos; Theofanidou, Sofia; Kritikou, Georgia; Bountouris, Ioannis

    2018-01-01

    Postoperative parotitis is a rare complication that occurs usually after abdominal surgery. Parotitis has never been described as a complication of vascular operations, in literature. In the present article, we describe a case of a postamputation parotitis along with its management and its possible pathogenesis. An 83-year-old diabetic man was emergently admitted to hospital because of gangrene below the right ankle and sepsis. The patient underwent a lower limb amputation above the knee. On the 5th postoperative day, he was diagnosed with right parotitis probably because of dehydration, general anesthesia, and immunocompromisation. A CT scan confirmed the diagnosis. He received treatment with antibiotics and fluids. His condition gradually improved, and he was finally discharged on 15th postoperative day. Postoperative parotitis can possibly occur after any type of surgery including vascular. Clinicians should be aware of this complication although it is rare. Several risk factors such as dehydration, general anesthesia, drugs, immunocompromisation, head tilt during surgery, and stones in Stensen's duct may predispose to postoperative parotitis. Treatment consists of antibiotics and hydration.

  11. [Pentosidine: a new biomarker in diabetes mellitus complications].

    PubMed

    Morales, Sonia; García-Salcedo, José A; Muñoz-Torres, Manuel

    2011-03-19

    Diabetes mellitus causes an increase in morbidity and mortality. Advanced glycation end products (AGE) are formed by non-enzymatic glycation between proteins and reducing sugars such as glucose. Oxidative reactions (glycoxidations) are essential for the formation of some AGE, for example pentosidine. Increased concentrations of pentosidine can be found in pathological conditions associated with hyperglycaemia and also related to increased oxidative stress. In individuals with diabetes mellitus, pentosidine formation and accumulation proceed at an accelerated rate in cells whose glucose uptake is not controlled by insulin. Pentosidine has a pivotal role in diabetic complications, probably as a consequence of the diverse properties of this compound, which alters the structure and function of molecules in biological systems. The following review discusses the alterations in the concentration of pentosidine in the body, particularly in relation to changes occurring in diabetes and its complications such as vascular and bone disease, nephropathy, neuropathy and retinopathy. Novel therapeutic approaches which can prevent or ameliorate the toxic effects of AGE in the initiation and progression of diabetic complications are reviewed. Copyright © 2009 Elsevier España, S.L. All rights reserved.

  12. Pathological mandibular fracture: A severe complication of periimplantitis

    PubMed Central

    Rodriguez-Campo, Francisco; Naval-Parra, Beatriz; Sastre-Pérez, Jesús

    2015-01-01

    Nowadays, dental implant treatment is a very common option for patients, even in medically compromised conditions. Some complications related to these implants have been described. Periimplantitis (PI) is one of the most concerning complications of this kind of treatment and probably has a multifactorial aetiology. The usual consequences of PI are the loss of implants and prostheses, and the expense of money and time for dentists and patients. Very often PI implies the necessity of repeating the treatment. Pathological mandibular fracture due to PI is a severe but infrequent complication after dental implant treatment. In this study we present three cases of mandibular pathological fractures in patients with different medical and dental records but similar management: two of them had been treated years before for oral squamous cell carcinoma with surgery and radiotherapy, and the other patient received oral bisphosphonates for osteoporosis some years after implantation. We analyzed the causes, consequences and possible prevention of these fractures, as well as the special features of this kind of mandibular fracture and the different existing treatments. Key words: Periimplantitis, pathological mandibular fracture, mandibular atrophy, bicortical implants. PMID:26155355

  13. Pathological mandibular fracture: A severe complication of periimplantitis.

    PubMed

    Naval-Gías, Luis; Rodriguez-Campo, Francisco; Naval-Parra, Beatriz; Sastre-Pérez, Jesús

    2015-04-01

    Nowadays, dental implant treatment is a very common option for patients, even in medically compromised conditions. Some complications related to these implants have been described. Periimplantitis (PI) is one of the most concerning complications of this kind of treatment and probably has a multifactorial aetiology. The usual consequences of PI are the loss of implants and prostheses, and the expense of money and time for dentists and patients. Very often PI implies the necessity of repeating the treatment. Pathological mandibular fracture due to PI is a severe but infrequent complication after dental implant treatment. In this study we present three cases of mandibular pathological fractures in patients with different medical and dental records but similar management: two of them had been treated years before for oral squamous cell carcinoma with surgery and radiotherapy, and the other patient received oral bisphosphonates for osteoporosis some years after implantation. We analyzed the causes, consequences and possible prevention of these fractures, as well as the special features of this kind of mandibular fracture and the different existing treatments. Key words: Periimplantitis, pathological mandibular fracture, mandibular atrophy, bicortical implants.

  14. Sensitivity of NTCP parameter values against a change of dose calculation algorithm.

    PubMed

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-09-01

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.
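
    The abstract does not name the three NTCP models used, so purely as an illustration the sketch below implements one common choice, the Lyman probit model, NTCP = Phi((Deff - TD50)/(m*TD50)); the parameter values are placeholders rather than the fitted values discussed here.

      # Lyman NTCP: probit function of the effective (volume-reduced) dose Deff,
      # with TD50 the 50%-complication dose and m the relative slope.
      from math import erf, sqrt

      def lyman_ntcp(deff, td50, m):
          t = (deff - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))      # standard normal CDF

      # Illustrative numbers only: effective lung dose 18 Gy, TD50 = 30 Gy, m = 0.35.
      print(round(lyman_ntcp(18.0, 30.0, 0.35), 3))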

  15. Sensitivity of NTCP parameter values against a change of dose calculation algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-09-15

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models.

  16. Propagation as a mechanism of reorientation of the Juan de Fuca ridge

    NASA Technical Reports Server (NTRS)

    Wilson, D. S.; Hey, R. N.; Nishimura, C.

    1984-01-01

    A revised model is presented of the tectonic evolution of the Juan de Fuca ridge by propagating rifting. The new model has three different relative rotation poles, covering the time intervals 17.0-8.5 Ma, 8.5-5.0 Ma, and 5.0 Ma to the present. The rotation pole shifts at 8.5 and 5.0 Ma imply clockwise shifts in the direction of relative motion of 10 deg to 15 deg. At each of these shifts, the pattern of propagation reorganizes, and the new ridges formed by propagation are at an orientation closer to orthogonal to the new direction of motion than the orientation of the preexisting ridges. The model, containing a total of seven propagation sequences, shows excellent agreement with the isochrons inferred from the magnetic anomaly data, except in areas complicated by the separate Explorer and Gorda plates. The agreement between model and data near the Explorer plate breaks down abruptly at an age of about 5 Ma, indicating that the probable cause of the rotation pole shift at that time was the separation of the Explorer plate from the Juan de Fuca plate.

  17. The Quest for Evidence for Proton Therapy: Model-Based Approach and Precision Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widder, Joachim, E-mail: j.widder@umcg.nl; Schaaf, Arjen van der; Lambin, Philippe

    Purpose: Reducing dose to normal tissues is the advantage of protons versus photons. We aimed to describe a method for translating this reduction into a clinically relevant benefit. Methods and Materials: Dutch scientific and health care governance bodies have recently issued landmark reports regarding generation of relevant evidence for new technologies in health care including proton therapy. An approach based on normal tissue complication probability (NTCP) models has been adopted to select patients who are most likely to experience fewer (serious) adverse events achievable by state-of-the-art proton treatment. Results: By analogy with biologically targeted therapies, the technology needs to be tested in enriched cohorts of patients exhibiting the decisive predictive marker: difference in normal tissue dosimetric signatures between proton and photon treatment plans. Expected clinical benefit is then estimated by virtue of multifactorial NTCP models. In this sense, high-tech radiation therapy falls under precision medicine. As a consequence, randomizing nonenriched populations between photons and protons is predictably inefficient and likely to produce confusing results. Conclusions: Validating NTCP models in appropriately composed cohorts treated with protons should be the primary research agenda leading to urgently needed evidence for proton therapy.

  18. Robust versus consistent variance estimators in marginal structural Cox models.

    PubMed

    Enders, Dirk; Engel, Susanne; Linder, Roland; Pigeot, Iris

    2018-06-11

    In survival analyses, inverse-probability-of-treatment (IPT) and inverse-probability-of-censoring (IPC) weighted estimators of parameters in marginal structural Cox models are often used to estimate treatment effects in the presence of time-dependent confounding and censoring. In most applications, a robust variance estimator of the IPT and IPC weighted estimator is calculated leading to conservative confidence intervals. This estimator assumes that the weights are known rather than estimated from the data. Although a consistent estimator of the asymptotic variance of the IPT and IPC weighted estimator is generally available, applications and thus information on the performance of the consistent estimator are lacking. Reasons might be a cumbersome implementation in statistical software, which is further complicated by missing details on the variance formula. In this paper, we therefore provide a detailed derivation of the variance of the asymptotic distribution of the IPT and IPC weighted estimator and explicitly state the necessary terms to calculate a consistent estimator of this variance. We compare the performance of the robust and consistent variance estimators in an application based on routine health care data and in a simulation study. The simulation reveals no substantial differences between the 2 estimators in medium and large data sets with no unmeasured confounding, but the consistent variance estimator performs poorly in small samples or under unmeasured confounding, if the number of confounders is large. We thus conclude that the robust estimator is more appropriate for all practical purposes. Copyright © 2018 John Wiley & Sons, Ltd.
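
    For orientation, the sketch below shows how an IPT/IPC-weighted Cox model with a robust (sandwich) variance is typically fitted in Python using the lifelines package (assumed to be available); the data and the stabilized weights are simulated placeholders, and the weight-estimation models themselves are not shown.

      # Weighted Cox fit with a robust variance request; in practice the weights
      # come from treatment and censoring models estimated on the data.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(4)
      n = 500
      df = pd.DataFrame({
          "treated": rng.integers(0, 2, n),
          "time": rng.exponential(10.0, n),
          "event": rng.integers(0, 2, n),
          "sw": rng.uniform(0.5, 2.0, n),        # placeholder stabilized IPT*IPC weights
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="event", weights_col="sw", robust=True)
      cph.print_summary()                         # robust SEs give the conservative intervals discussed above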

  19. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    NASA Astrophysics Data System (ADS)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin, particularly for annual maxima of the FWI distribution and spatiotemporal autocorrelation of precipitation fields.
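
    As background, the sketch below shows plain univariate quantile mapping, the building block that MBCn generalizes to many variables jointly; it is a simplified, assumed implementation, not the MBCn algorithm or the CanRCM4 workflow.

      # Map model values onto the observed distribution by matching empirical quantiles.
      import numpy as np

      def quantile_map(model_hist, obs_hist, model_values):
          q = np.searchsorted(np.sort(model_hist), model_values) / float(len(model_hist))
          return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

      rng = np.random.default_rng(5)
      obs = rng.gamma(2.0, 2.0, 5000)         # "observed" precipitation-like series
      mod = rng.gamma(2.0, 3.0, 5000)         # biased model counterpart
      corrected = quantile_map(mod, obs, mod)
      print("means (model, corrected, observed):",
            round(mod.mean(), 2), round(corrected.mean(), 2), round(obs.mean(), 2))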

  20. Climate sensitivity estimated from temperature reconstructions of the Last Glacial Maximum

    NASA Astrophysics Data System (ADS)

    Schmittner, A.; Urban, N.; Shakun, J. D.; Mahowald, N. M.; Clark, P. U.; Bartlein, P. J.; Mix, A. C.; Rosell-Melé, A.

    2011-12-01

  1. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probabilities that each candidate is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
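
    A stripped-down sketch of the reweighting step described here is given below: samples are drawn once from a mixture of plausible candidate densities and then reweighted under each candidate, so the expensive response needs to be evaluated only once. The densities, model probabilities, and response function are invented for illustration; the information-theoretic model selection and Bayesian parameter estimation steps are not shown.

      # Importance-sampling reuse across candidate probability models.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      candidates = [stats.norm(10.0, 2.0), stats.lognorm(s=0.2, scale=10.0)]
      model_prob = np.array([0.6, 0.4])          # plausibility weight of each candidate model

      n = 20_000
      counts = rng.multinomial(n, model_prob)    # draw once from the probability-weighted mixture
      x = np.concatenate([d.rvs(size=k, random_state=rng) for d, k in zip(candidates, counts)])
      mix_pdf = sum(p * d.pdf(x) for p, d in zip(model_prob, candidates))

      def response(u):                           # stand-in for the expensive model response
          return np.sin(u) + 0.1 * u ** 2

      g = response(x)
      for i, d in enumerate(candidates):
          w = d.pdf(x) / mix_pdf                 # reweight the same samples under candidate i
          print(f"candidate {i}: E[response] ~ {np.average(g, weights=w):.3f}")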

  2. Rational choice of peritoneal dialysis catheter.

    PubMed

    Dell'Aquila, Roberto; Chiaramonte, Stefano; Rodighiero, Maria Pia; Spanó, Emilia; Di Loreto, Pierluigi; Kohn, Catalina Ocampo; Cruz, Dinna; Polanco, Natalia; Kuang, Dingwei; Corradi, Valentina; De Cal, Massimo; Ronco, Claudio

    2007-06-01

    The peritoneal catheter should be a permanent and safe access to the peritoneal cavity. Catheter-related problems are often the cause of permanent transfer to hemodialysis (HD) in up to 20% of peritoneal dialysis (PD) patients; in some cases, these problems require a temporary period on HD. Advances in connectology have reduced the incidence of peritonitis, and so catheter-related complications during PD have become a major concern. In the last few years, novel techniques have emerged in the field of PD: new dialysis solutions, better connectology, and cyclers for automated PD. However, extracorporeal dialysis has continued to improve in terms of methods and patient survival, but PD has failed to do so. The main reason is that peritoneal access has remained problematic. The peritoneal catheter is the major obstacle to widespread use of PD. Overcoming catheter-related problems means giving the peritoneal technique a real chance to develop. Catheters should be as efficient, safe, and acceptable as possible. Since its introduction in the mid-1960s, the Tenckhoff catheter has not become obsolete: dozens of new models have been proposed, but none has significantly reduced the predominance of the first catheter. No convincing prospective data demonstrate the superiority of any peritoneal catheter, and so it seems that factors other than choice of catheter are what affect survival and complication rates. Efforts to improve peritoneal catheter survival and complication rates should probably focus on factors other than the choice of catheter. The present article provides an overview of the characteristics of the best-known peritoneal catheters.

  3. Drought Water Right Curtailment

    NASA Astrophysics Data System (ADS)

    Walker, W.; Tweet, A.; Magnuson-Skeels, B.; Whittington, C.; Arnold, B.; Lund, J. R.

    2016-12-01

    California's water rights system allocates water based on priority, where lower priority, "junior" rights are curtailed first in a drought. The Drought Water Rights Allocation Tool (DWRAT) was developed to integrate water right allocation models with legal objectives to suggest water rights curtailments during drought. DWRAT incorporates water right use and priorities with a flow-forecasting model to mathematically represent water law and hydrology and suggest water allocations among water rights holders. DWRAT is compiled within an Excel workbook, with an interface and an open-source solver. By implementing California water rights law as an algorithm, DWRAT provides a precise and transparent framework for the complicated and often controversial technical aspects of curtailing water rights use during drought. DWRAT models have been developed for use in the Eel, Russian, and Sacramento river basins. In this study, an initial DWRAT model has been developed for the San Joaquin watershed, which incorporates all water rights holders in the basin and reference gage flows for major tributaries. The San Joaquin DWRAT can assess water allocation reliability by determining probability of rights holders' curtailment for a range of hydrologic conditions. Forecasted flow values can be input to the model to provide decision makers with the ability to make curtailment and water supply strategy decisions. Environmental flow allocations will be further integrated into the model to protect and improve ecosystem water reliability.

  4. Normal tissue complication probability (NTCP) modelling using spatial dose metrics and machine learning methods for severe acute oral mucositis resulting from head and neck radiotherapy.

    PubMed

    Dean, Jamie A; Wong, Kee H; Welsh, Liam C; Jones, Ann-Britt; Schick, Ulrike; Newbold, Kate L; Bhide, Shreerang A; Harrington, Kevin J; Nutting, Christopher M; Gulliford, Sarah L

    2016-07-01

    Severe acute mucositis commonly results from head and neck (chemo)radiotherapy. A predictive model of mucositis could guide clinical decision-making and inform treatment planning. We aimed to generate such a model using spatial dose metrics and machine learning. Predictive models of severe acute mucositis were generated using radiotherapy dose (dose-volume and spatial dose metrics) and clinical data. Penalised logistic regression, support vector classification and random forest classification (RFC) models were generated and compared. Internal validation was performed (with 100-iteration cross-validation), using multiple metrics, including area under the receiver operating characteristic curve (AUC) and calibration slope, to assess performance. Associations between covariates and severe mucositis were explored using the models. The dose-volume-based models (standard) performed as well as those incorporating spatial information. Discrimination was similar between models, but the RFC model built on the standard dose-volume metrics (RFC-standard) had the best calibration. The mean AUC and calibration slope for this model were 0.71 (s.d.=0.09) and 3.9 (s.d.=2.2), respectively. The volumes of oral cavity receiving intermediate and high doses were associated with severe mucositis. The performance of the RFC-standard model is modest to good, but should be improved, and the model requires external validation. Reducing the volumes of oral cavity receiving intermediate and high doses may reduce mucositis incidence. Copyright © 2016 The Author(s). Published by Elsevier Ireland Ltd. All rights reserved.
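
    A minimal sketch of the cross-validated random-forest workflow described above is shown below on simulated dose-volume features; the feature construction, outcome model, and hyperparameters are assumptions for illustration and not the study's data or tuning.

      # Random-forest classifier with repeated stratified cross-validation,
      # reporting AUC (discrimination), in the spirit of the 100-iteration scheme above.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

      rng = np.random.default_rng(8)
      X = rng.uniform(0.0, 120.0, size=(200, 6))              # e.g. oral-cavity volumes (cc) receiving 10..60 Gy
      risk = 1.0 / (1.0 + np.exp(-(0.05 * X[:, 3] - 3.0)))    # risk driven by intermediate-dose volume
      y = rng.binomial(1, risk)                                # severe mucositis indicator (simulated)

      cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=20, random_state=0)   # 100 folds in total
      auc = cross_val_score(RandomForestClassifier(n_estimators=500, random_state=0),
                            X, y, scoring="roc_auc", cv=cv)
      print(f"mean AUC {auc.mean():.2f} (sd {auc.std():.2f})")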

  5. Exenatide versus insulin glargine in patients with type 2 diabetes in the UK: a model of long-term clinical and cost outcomes.

    PubMed

    Ray, Joshua A; Boye, Kristina S; Yurgin, Nicole; Valentine, William J; Roze, Stéphane; McKendrick, Jan; Tucker, Daniel M D; Foos, Volker; Palmer, Andrew J

    2007-03-01

    The aim of this study was to evaluate the long-term clinical and economic outcomes associated with exenatide or insulin glargine, added to oral therapy in individuals with type 2 diabetes inadequately controlled with combination oral agents in the UK setting. A published and validated computer simulation model of diabetes was used to project long-term complications, life expectancy, quality-adjusted life expectancy and direct medical costs. Probabilities of diabetes-related complications were derived from published sources. Treatment effects and patient characteristics were extracted from a recent randomised controlled trial comparing exenatide with insulin glargine. Simulations incorporated published quality of life utilities and UK-specific costs from 2004. Pharmacy costs for exenatide were based on 20, 40, 60, 80 and 100% of the US value (as no price for the UK was available at the time of analysis). Future costs and clinical benefits were discounted at 3.5% annually. Sensitivity analyses were performed. In the base-case analysis exenatide was associated with improvements in life expectancy of 0.057 years and in quality-adjusted life expectancy of 0.442 quality-adjusted life years (QALYs) versus insulin glargine. Long-term projections demonstrated that exenatide was associated with a lower cumulative incidence of most cardiovascular disease (CVD) complications and CVD-related death than insulin glargine. Using the range of cost values, evaluation results showed that exenatide is likely to fall in a range between dominant (cost and life saving) at 20% of the US price and cost-effective (with an ICER of £22,420 per QALY gained) at 100% of the US price, versus insulin glargine. Based on the findings of a recent clinical trial, long-term projections indicated that exenatide is likely to be associated with improvement in life expectancy and quality-adjusted life expectancy compared to insulin glargine. The results from this modelling analysis suggest that exenatide is likely to represent good value for money by generally accepted standards in the UK setting in individuals with type 2 diabetes inadequately controlled on oral therapy.

  6. Survival and complications in thalassemia.

    PubMed

    Borgna-Pignatti, C; Cappellini, M D; De Stefano, P; Del Vecchio, G C; Forni, G L; Gamberini, M R; Ghilardi, R; Origa, R; Piga, A; Romeo, M A; Zhao, H; Cnaan, A

    2005-01-01

    The life expectancy of patients with thalassemia major has significantly increased in recent years, as reported by several groups in different countries. However, complications are still frequent and affect the patients' quality of life. In a recent study from the United Kingdom, it was found that 50% of the patients had died before age 35. At that age, 65% of the patients from an Italian long-term study were still alive. Heart disease is responsible for more than half of the deaths. The prevalence of complications in Italian patients born after 1970 includes heart failure in 7%, hypogonadism in 55%, hypothyroidism in 11%, and diabetes in 6%. Similar data were reported in patients from the United States. In the Italian study, lower ferritin levels were associated with a lower probability of experiencing heart failure and with prolonged survival. Osteoporosis and osteopenia are common and affect virtually all patients. Hepatitis C virus antibodies are present in 85% of multitransfused Italian patients, 23% of patients in the United Kingdom, 35% in the United States, 34% in France, and 21% in India. Hepatocellular carcinoma can complicate the course of hepatitis. A survey of Italian centers has identified 23 such cases in patients with a thalassemia syndrome. In conclusion, rates of survival and complication-free survival continue to improve, due to better treatment strategies. New complications are appearing in long-term survivors. Iron overload of the heart remains the main cause of morbidity and mortality.

  7. Comparison of Risk Scores for Prediction of Complications following Aortic Valve Replacement.

    PubMed

    Wang, Tom Kai Ming; Choi, David Hyun-Min; Haydock, David; Gamble, Greg; Stewart, Ralph; Ruygrok, Peter

    2015-06-01

    Risk models play an important role in stratification of patients for cardiac surgery, but their prognostic utilities for post-operative complications are rarely studied. We compared the EuroSCORE, EuroSCORE II, Society of Thoracic Surgeons (STS) Score and an Australasian model (Aus-AVR Score) for predicting morbidities after aortic valve replacement (AVR), and also evaluated seven STS complications models in this context. We retrospectively calculated risk scores for 620 consecutive patients undergoing isolated AVR at Auckland City Hospital during 2005-2012, assessing their discrimination and calibration for post-operative complications. Amongst mortality scores, the EuroSCORE was the best at discriminating stroke (c-statistic 0.845); the EuroSCORE II at deep sternal wound infection (c=0.748); and the STS Score at composite morbidity or mortality (c=0.666), renal failure (c=0.634), ventilation >24 hours (c=0.732), return to theatre (c=0.577) and prolonged hospital stay >14 days post-operatively (c=0.707). The individual STS complications models had a marginally higher c-statistic (c=0.634-0.846) for all complications except mediastinitis, and had good calibration (Hosmer-Lemeshow test P-value 0.123-0.915) for all complications. The STS Score was best overall at discriminating post-operative complications and their composite for AVR. All STS complications models except for deep sternal wound infection had good discrimination and calibration for post-operative complications. Copyright © 2014 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  8. A Lyme borreliosis diagnosis probability score - no relation with antibiotic treatment response.

    PubMed

    Briciu, Violeta T; Flonta, Mirela; Leucuţa, Daniel; Cârstina, Dumitru; Ţăţulescu, Doina F; Lupşe, Mihaela

    2017-05-01

    (1) To describe epidemiological and clinical data of patients presenting with suspected Lyme borreliosis (LB); (2) to evaluate a previously published score that classifies patients according to the probability of having LB, following up patients' clinical outcome after antibiotic therapy. Inclusion criteria: patients with clinical manifestations compatible with LB and Borrelia (B.) burgdorferi positive serology, hospitalized in a Romanian hospital between January 2011 and October 2012. Erythema migrans (EM) or suspicion of Lyme neuroborreliosis (LNB) with lumbar puncture performed for diagnosis. A questionnaire was completed for each patient regarding associated diseases, history of tick bites or EM, and clinical signs/symptoms at admission, at the end of treatment and 3 months later. Two-tier testing (TTT) used an ELISA followed by a Western Blot kit. The patients were classified into groups using the LB probability score and were evaluated by a multidisciplinary team. Antibiotic therapy followed guideline recommendations. Sixty-four patients were included, presenting diverse associated comorbidities. Fifty-seven patients had a positive TTT, and seven were positive on only the ELISA or the Western Blot test. No differences in outcome were found between the groups of patients classified as very probable, probable and little probable LB. Instead, a better post-treatment outcome was described in patients with positive TTT. The patients investigated for suspected LB present diverse clinical manifestations and comorbidities that complicate differential diagnosis. The LB diagnosis probability score used in our patients did not correlate with the antibiotic treatment response, suggesting that the probability score does not bring any benefit in diagnosis.

  9. Hybrid endovascular stent-grafting technique for patent ductus arteriosus in an adult.

    PubMed

    Kainuma, S; Kuratani, T; Sawa, Y

    2011-09-01

    A 51-year-old man was referred to our institution for patent ductus arteriosus (PDA) complicated by left ventricular dysfunction and pulmonary hypertension. Surgical closure of a PDA is usually carried out via a small posterior thoracotomy. However, thoracoscopic procedures are probably not appropriate in adults because of the frequency of calcification and the greater risk of rupture while ligating the ductus. To minimize surgical trauma, we used hybrid endovascular stent grafting combined with revascularization of the left subclavian artery, which enabled us to eliminate shunt flow to the pulmonary artery. At 11-month follow-up, the patient was asymptomatic and showed no complications. © Georg Thieme Verlag KG Stuttgart · New York.

  10. A global goodness-of-fit test for receiver operating characteristic curve analysis via the bootstrap method.

    PubMed

    Zou, Kelly H; Resnic, Frederic S; Talos, Ion-Florin; Goldberg-Zimring, Daniel; Bhagwat, Jui G; Haker, Steven J; Kikinis, Ron; Jolesz, Ferenc A; Ohno-Machado, Lucila

    2005-10-01

    Medical classification accuracy studies often yield continuous data based on predictive models for treatment outcomes. A popular method for evaluating the performance of diagnostic tests is the receiver operating characteristic (ROC) curve analysis. The main objective was to develop a global statistical hypothesis test for assessing the goodness-of-fit (GOF) for parametric ROC curves via the bootstrap. A simple log (or logit) and a more flexible Box-Cox normality transformations were applied to untransformed or transformed data from two clinical studies to predict complications following percutaneous coronary interventions (PCIs) and for image-guided neurosurgical resection results predicted by tumor volume, respectively. We compared a non-parametric with a parametric binormal estimate of the underlying ROC curve. To construct such a GOF test, we used the non-parametric and parametric areas under the curve (AUCs) as the metrics, with a resulting p value reported. In the interventional cardiology example, logit and Box-Cox transformations of the predictive probabilities led to satisfactory AUCs (AUC=0.888; p=0.78, and AUC=0.888; p=0.73, respectively), while in the brain tumor resection example, log and Box-Cox transformations of the tumor size also led to satisfactory AUCs (AUC=0.898; p=0.61, and AUC=0.899; p=0.42, respectively). In contrast, significant departures from GOF were observed without applying any transformation prior to assuming a binormal model (AUC=0.766; p=0.004, and AUC=0.831; p=0.03), respectively. In both studies the p values suggested that transformations were important to consider before applying any binormal model to estimate the AUC. Our analyses also demonstrated and confirmed the predictive values of different classifiers for determining the interventional complications following PCIs and resection outcomes in image-guided neurosurgery.
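
    A minimal sketch of the bootstrap goodness-of-fit idea described above: compare a non-parametric (Mann-Whitney) AUC with the binormal parametric AUC and use the bootstrap distribution of their difference to obtain a p value. The data, group sizes, and resampling details below are illustrative assumptions, not the study's implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def auc_nonparametric(x0, x1):
    # Mann-Whitney estimate of the area under the ROC curve
    greater = (x1[:, None] > x0[None, :]).mean()
    ties = (x1[:, None] == x0[None, :]).mean()
    return greater + 0.5 * ties

def auc_binormal(x0, x1):
    # Binormal model: AUC = Phi(a / sqrt(1 + b^2)) with a = (mu1 - mu0)/s1, b = s0/s1
    a = (x1.mean() - x0.mean()) / x1.std(ddof=1)
    b = x0.std(ddof=1) / x1.std(ddof=1)
    return norm.cdf(a / np.sqrt(1.0 + b ** 2))

# Illustrative logit-transformed predictive scores (assumed data, not the study's)
x0 = rng.normal(0.0, 1.0, 120)   # patients without the outcome
x1 = rng.normal(1.2, 1.1, 60)    # patients with the outcome

t_obs = auc_nonparametric(x0, x1) - auc_binormal(x0, x1)
boot = np.empty(2000)
for i in range(boot.size):
    b0 = rng.choice(x0, x0.size, replace=True)
    b1 = rng.choice(x1, x1.size, replace=True)
    boot[i] = auc_nonparametric(b0, b1) - auc_binormal(b0, b1)

# Two-sided bootstrap p-value for the departure of the binormal AUC from the
# non-parametric AUC; a small p suggests lack of fit of the binormal model.
p_value = np.mean(np.abs(boot - boot.mean()) >= np.abs(t_obs))
print(f"AUC difference {t_obs:.4f}, bootstrap GOF p = {p_value:.3f}")
```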

  11. Comparison of perioperative outcomes between open and robotic radical cystectomy: a population based analysis.

    PubMed

    Nazzani, Sebastiano; Mazzone, Elio; Preisser, Felix; Bandini, Marco; Tian, Zhe; Marchioni, Michele; Ratti, Dario; Motta, Gloria; Zorn, Kevin Christopher; Briganti, Alberto; Shariat, Shahrokh F; Montanari, Emanuele; Carmignani, Luca; Karakiewicz, Pierre I

    2018-05-30

    Radical cystectomy represents the standard of care for muscle-invasive bladder cancer (MIBC). Due to its novelty, the use of robotic radical cystectomy (RARC) is still under debate. We examined intraoperative and postoperative morbidity and mortality as well as the impact on length of stay (LOS) and total hospital charges (THCGs) of RARC compared to open radical cystectomy (ORC). Within the National Inpatient Sample (NIS, 2008-2013), we identified patients with non-metastatic bladder cancer treated with either ORC or RARC. We relied on inverse probability of treatment weighting (IPTW) to reduce the effect of inherent differences between ORC and RARC. Multivariable logistic regression (MLR) and multivariable Poisson regression (MPR) models were used. Of all 10 027 patients, 12.6% underwent RARC. Between 2008 and 2013, RARC rates increased from 0.8 to 20.4% [estimated annual percentage change (EAPC): +26.5%, CI: +11.1 to +48.3; p=0.035] and RARC THCGs decreased from 45 981 to 31 749 United States dollars (EAPC: -6.8%, CI: -9.6 to -3.9; p=0.01). In MLR models, RARC resulted in lower rates of overall complications (OR: 0.6; p<0.001) and transfusions (OR: 0.44; p<0.001). In MPR models, RARC was associated with shorter LOS [relative risk (RR): 0.91; p<0.001]. Finally, higher THCGs (OR: 1.09; p<0.001) were recorded for RARC. Data are retrospective and no tumor characteristics were available. RARC is associated with lower rates of overall complications and transfusions. Consequently, RARC is a safe and feasible technique in select muscle-invasive bladder cancer patients. Moreover, RARC is associated with shorter LOS albeit higher THCGs.
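
    A hedged sketch of the inverse probability of treatment weighting step named above, using a logistic propensity-score model and stabilized weights; the data frame and covariate names are hypothetical stand-ins rather than the NIS variables used by the authors.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def iptw_weights(df, treatment_col, covariate_cols):
    """Stabilized IPTW weights from a logistic propensity-score model."""
    ps_model = LogisticRegression(max_iter=1000)
    ps_model.fit(df[covariate_cols], df[treatment_col])
    ps = ps_model.predict_proba(df[covariate_cols])[:, 1]
    p_treat = df[treatment_col].mean()
    treated = df[treatment_col] == 1
    # Stabilized weights: P(T)/ps for treated, (1 - P(T))/(1 - ps) for controls
    return np.where(treated, p_treat / ps, (1.0 - p_treat) / (1.0 - ps))

# Usage with hypothetical columns:
# weights = iptw_weights(nis, "rarc", ["age", "comorbidity_index", "year"])
# The weights would then enter the weighted logistic / Poisson outcome models.
```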

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Z; Li, B; Department of Radiation Oncology, Shandong Cancer Hospital, Shandong Academy of Medical Sciences

    Purpose: The aim of this research was to investigate the feasibility of the Lyman-Kutcher-Burman (LKB) normal tissue complication probability (NTCP) model for analyzing hepatitis B virus (HBV) reactivation in patients with hepatocellular carcinoma (HCC) receiving conformal radiotherapy. Methods: Between June 2009 and June 2012, 108 HBV-related HCC patients (90 were selected and 18 were excluded) treated with conformal RT at three centers were enrolled in this retrospective study. All were diagnosed with HCC by pathology or cytology. All 90 patients were followed up to September 2013 with a median follow-up time of 25.2 months. The parameters (TD50(1), n, and m) of the modified LKB NTCP model were derived using maximum likelihood estimation. Bootstrap and leave-one-out methods were employed to test the generalizability of the results for use in a general population. Results: The incidences of complications in the study population were as follows: radiation-induced liver disease (RILD), 17.6%; HBV reactivation, 24.8%; and HBV reactivation-induced hepatitis, 22.7%. In multivariate analysis, the NTCP (p<0.001) and V20 were associated with HBV reactivation. TD50(1), m, and n were 42.9 Gy (95% CI 38.2–46.8), 0.14 (0.12–0.15), and 0.30 (0.20–0.33), respectively, for HBV reactivation. Bootstrap and leave-one-out results showed that the HBV parameter fits were extremely robust. Conclusion: A modified LKB NTCP model has been established to predict HBV reactivation for patients with HCC receiving conformal RT. The fitted parameter set can be used to predict the endpoint of HBV reactivation.
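
    A minimal sketch, under standard LKB assumptions, of the model and the maximum-likelihood fit of (TD50, m, n) described above; the DVH arrays, starting values, and optimizer choice are illustrative, not the authors' code.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def geud(doses, volumes, n):
    """Generalized EUD used to reduce a DVH to a single dose (Kutcher-Burman reduction)."""
    v = volumes / volumes.sum()
    return np.sum(v * doses ** (1.0 / n)) ** n

def lkb_ntcp(doses, volumes, td50, m, n):
    """LKB NTCP: probit of the normalized distance of the gEUD from TD50."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return norm.cdf(t)

def negative_log_likelihood(params, dvhs, events):
    td50, m, n = params
    eps = 1e-9
    nll = 0.0
    for (doses, volumes), event in zip(dvhs, events):
        p = np.clip(lkb_ntcp(doses, volumes, td50, m, n), eps, 1 - eps)
        nll -= event * np.log(p) + (1 - event) * np.log(1 - p)
    return nll

# dvhs: list of (dose_bins, volume_fractions) per patient; events: 0/1 reactivation flags
# fit = minimize(negative_log_likelihood, x0=[40.0, 0.15, 0.3],
#                args=(dvhs, events), method="Nelder-Mead")
```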

  13. Patient- and therapy-related factors associated with the incidence of xerostomia in nasopharyngeal carcinoma patients receiving parotid-sparing helical tomotherapy.

    PubMed

    Lee, Tsair-Fwu; Liou, Ming-Hsiang; Ting, Hui-Min; Chang, Liyun; Lee, Hsiao-Yi; Wan Leung, Stephen; Huang, Chih-Jen; Chao, Pei-Ju

    2015-08-20

    We investigated the incidence of moderate to severe patient-reported xerostomia among nasopharyngeal carcinoma (NPC) patients treated with helical tomotherapy (HT) and identified patient- and therapy-related factors associated with acute and chronic xerostomia toxicity. The least absolute shrinkage and selection operator (LASSO) normal tissue complication probability (NTCP) models were developed using quality-of-life questionnaire datasets from 67 patients with NPC. For acute toxicity, the dosimetric factors of the mean doses to the ipsilateral submandibular gland (Dis) and the contralateral submandibular gland (Dcs) were selected as the first two significant predictors. For chronic toxicity, four predictive factors were selected: age, mean dose to the oral cavity (Doc), education, and T stage. The resulting dose-sparing thresholds can be used to limit xerostomia toxicity. We suggest tolerance values corresponding to a 20% incidence of complications (TD20) of Dis = 39.0 Gy, Dcs = 38.4 Gy, and Doc = 32.5 Gy, provided the mean doses to the parotid glands meet the QUANTEC 25 Gy sparing guideline. To avoid patient-reported xerostomia toxicity, the mean doses to the parotid glands, submandibular glands, and oral cavity have to meet these sparing tolerances, and inherent patient characteristics also need to be taken into consideration.
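
    A hedged sketch of the LASSO-type selection step behind such an NTCP model, using L1-penalized logistic regression; the column names echo the candidate factors above, but the data frame and outcome label are hypothetical.

```python
from sklearn.linear_model import LogisticRegressionCV

predictors = ["D_is", "D_cs", "D_oc", "parotid_mean", "age", "education", "T_stage"]

def fit_lasso_ntcp(df, outcome="xerostomia_grade2plus"):
    """L1-penalized logistic NTCP model; non-zero coefficients are the selected factors."""
    model = LogisticRegressionCV(
        Cs=20, cv=5, penalty="l1", solver="liblinear", scoring="neg_log_loss"
    )
    model.fit(df[predictors], df[outcome])
    selected = [p for p, c in zip(predictors, model.coef_.ravel()) if c != 0.0]
    return model, selected

# Usage with a hypothetical questionnaire/dosimetry data frame:
# model, selected = fit_lasso_ntcp(quality_of_life_df)
```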

  14. White Complicity and Social Justice Education: Can One Be Culpable without Being Liable?

    ERIC Educational Resources Information Center

    Applebaum, Barbara

    2007-01-01

    In part of an ongoing study of white complicity, moral responsibility, and moral agency in social justice education, Barbara Applebaum asks in this essay what model or models of moral responsibility can help white students recognize their white complicity and which models of moral responsibility obscure such acknowledgment. To address this…

  15. Normal Tissue Complication Probability Modeling of Acute Hematologic Toxicity in Patients Treated With Intensity-Modulated Radiation Therapy for Squamous Cell Carcinoma of the Anal Canal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazan, Jose G.; Luxton, Gary; Mok, Edward C.

    2012-11-01

    Purpose: To identify dosimetric parameters that correlate with acute hematologic toxicity (HT) in patients with squamous cell carcinoma of the anal canal treated with definitive chemoradiotherapy (CRT). Methods and Materials: We analyzed 33 patients receiving CRT. Pelvic bone (PBM) was contoured for each patient and divided into subsites: ilium, lower pelvis (LP), and lumbosacral spine (LSS). The volume of each region receiving at least 5, 10, 15, 20, 30, and 40 Gy was calculated. Endpoints included grade ≥3 HT (HT3+) and hematologic event (HE), defined as any grade ≥2 HT with a modification in chemotherapy dose. Normal tissue complication probability (NTCP) was evaluated with the Lyman-Kutcher-Burman (LKB) model. Logistic regression was used to test associations between HT and dosimetric/clinical parameters. Results: Nine patients experienced HT3+ and 15 patients experienced HE. Constrained optimization of the LKB model for HT3+ yielded the parameters m = 0.175, n = 1, and TD50 = 32 Gy. With this model, mean PBM doses of 25 Gy, 27.5 Gy, and 31 Gy result in a 10%, 20%, and 40% risk of HT3+, respectively. Compared with patients with mean PBM dose of <30 Gy, patients with mean PBM dose ≥30 Gy had a 14-fold increase in the odds of developing HT3+ (p = 0.005). Several low-dose radiation parameters (i.e., PBM-V10) were associated with the development of HT3+ and HE. No association was found with the ilium, LP, or clinical factors. Conclusions: LKB modeling confirms the expectation that PBM acts like a parallel organ, implying that the mean dose to the organ is a useful predictor for toxicity. Low-dose radiation to the PBM was also associated with clinically significant HT. Keeping the mean PBM dose <22.5 Gy and <25 Gy is associated with a 5% and 10% risk of HT, respectively.
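
    Because n = 1 makes the LKB gEUD collapse to the mean organ dose, the fitted HT3+ model reduces to NTCP = Φ((Dmean − TD50)/(m·TD50)); the short check below uses only the parameters reported above.

```python
from scipy.stats import norm

TD50, m = 32.0, 0.175   # reported LKB parameters for grade >=3 hematologic toxicity

def ntcp_mean_dose(mean_dose_gy):
    # With n = 1 the gEUD equals the mean dose, so NTCP is a probit of the mean dose
    return norm.cdf((mean_dose_gy - TD50) / (m * TD50))

for d in (25.0, 27.5, 31.0):
    print(f"mean PBM dose {d:4.1f} Gy -> NTCP ~ {ntcp_mean_dose(d):.0%}")
# Prints roughly 11%, 21%, and 43%, consistent with the ~10%, 20%, and 40% quoted above.
```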

  16. A new way of thinking about complications of prematurity.

    PubMed

    Moore, Tiffany A; Berger, Ann M; Wilson, Margaret E

    2014-01-01

    The morbidity and mortality of preterm infants are impacted by their ability to maintain physiologic homeostasis using metabolic, endocrine, and immunologic mechanisms independent of the mother's placenta. Exploring McEwen's allostatic load model in preterm infants provides a new way to understand the altered physiologic processes associated with frequently occurring complications of prematurity such as bronchopulmonary dysplasia, intraventricular hemorrhage, necrotizing enterocolitis, and retinopathy of prematurity. The purpose of this article is to present a new model to enhance understanding of the altered physiologic processes associated with complications of prematurity. The model of allostatic load and complications of prematurity was derived to explore the relationship between general stress of prematurity and complications of prematurity. The proposed model uses the concepts of general stress of prematurity, allostasis, physiologic response patterns (adaptive-maladaptive), allostatic load, and complications of prematurity. These concepts are defined and theoretical relationships in the proposed model are interpreted using the four maladaptive response patterns of repeated hits, lack of adaptation, prolonged response, and inadequate response. Empirical evidence for cortisol, inflammation, and oxidative stress responses are used to support the theoretical relationships. The proposed model provides a new way of thinking about physiologic dysregulation in preterm infants. The ability to describe and understand complex physiologic mechanisms involved in complications of prematurity is essential for research. Advancing the knowledge of complications of prematurity will advance clinical practice and research and lead to testing of interventions to reduce negative outcomes in preterm infants.

  17. Neurologic complications after allogeneic hematopoietic stem cell transplantation in children: analysis of prognostic factors.

    PubMed

    Kang, Ji-Man; Kim, Yae-Jean; Kim, Ju Youn; Cho, Eun Joo; Lee, Jee Hun; Lee, Mun Hyang; Lee, Soo-Hyun; Sung, Ki Woong; Koo, Hong Hoe; Yoo, Keon Hee

    2015-06-01

    Neurologic complications are serious complications after hematopoietic stem cell transplantation (HSCT) and significantly contribute to morbidity and mortality. The purpose of this study was to investigate the clinical features and prognosis in pediatric patients who had neurologic complications after allogeneic HSCT. We retrospectively reviewed the medical records of children and adolescents (19 years old or younger) who underwent allogeneic HSCT at our institution from 2000 to 2012. A total of 383 patients underwent 430 allogeneic transplantations. Among them, 73 episodes of neurologic complications occurred in 70 patients. The cumulative incidence of neurologic complications at day 400 was 20.0%. Almost two thirds of the episodes (63.0%, 46 of 73) occurred within 100 days after transplantation. Calcineurin inhibitor-related neurotoxicity was observed as the most common cause of neurotoxicity (47.9%, 35 of 73) and was significantly associated with earlier onset neurologic complications, seizure, and tremor. It also showed a significant association with lower probability of headache, abnormality of cranial nerve, and neurologic sequelae. In a multivariate analysis, days to neutrophil engraftment after HSCT, extensive chronic graft-versus-host disease (GVHD) and the existence of neurologic sequelae were identified as risk factors for mortality in patients who had neurologic complications (hazard ratio [HR], 1.08; 95% confidence interval [CI], 1.02 to 1.15; P = .011; HR, 5.98; 95% CI, 1.71 to 20.90; P = .005; and HR, 4.37; 95% CI, 1.12 to 17.05; P = .034, respectively). However, there was no significant difference in the 5-year overall survival between the patients who had neurologic complications without sequelae and the patients who did not have any neurologic complications (57.3% versus 61.8%, P = .906). In conclusion, we found that the major significant risk factors for mortality in pediatric recipients with neurologic complications were the existence of neurologic sequelae and extensive chronic GVHD. Copyright © 2015 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.

  18. Estimation of parameters of dose volume models and their confidence limits

    NASA Astrophysics Data System (ADS)

    van Luijk, P.; Delvigne, T. C.; Schilstra, C.; Schippers, J. M.

    2003-07-01

    Predictions of the normal-tissue complication probability (NTCP) for the ranking of treatment plans are based on fits of dose-volume models to clinical and/or experimental data. In the literature several different fit methods are used. In this work, frequently used methods and techniques to fit NTCP models to dose-response data for establishing dose-volume effects are discussed. The techniques are tested for their usability with dose-volume data and NTCP models. Different methods to estimate the confidence intervals of the model parameters are part of this study. From a critical-volume (CV) model with biologically realistic parameters, a primary dataset was generated, serving as the reference for this study and describable by the NTCP model. The CV model was fitted to this dataset. From the resulting parameters and the CV model, 1000 secondary datasets were generated by Monte Carlo simulation. All secondary datasets were fitted to obtain 1000 parameter sets of the CV model. Thus the 'real' spread in fit results due to statistical spread in the data was obtained and compared with estimates of the confidence intervals obtained by different methods applied to the primary dataset. The confidence limits of the parameters of one dataset were estimated using the covariance matrix, the jackknife method and the likelihood landscape directly. These results were compared with the spread of the parameters obtained from the secondary parameter sets. For the estimation of confidence intervals on NTCP predictions, three methods were tested. Firstly, propagation of errors using the covariance matrix was used. Secondly, the width of the bundle of curves resulting from parameter sets within the one-standard-deviation region of the likelihood space was investigated. Thirdly, many parameter sets and their likelihoods were used to create a likelihood-weighted probability distribution of the NTCP. It is concluded that for the type of dose-response data used here, only a full likelihood analysis will produce reliable results. Often-used approximations, such as the covariance matrix, produce inconsistent confidence limits on both the parameter sets and the resulting NTCP values.
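
    A simplified sketch of the Monte Carlo procedure described above, with a plain logistic dose-response model standing in for the critical-volume NTCP model (a deliberate substitution); the design, sample sizes, and parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)
doses = np.repeat(np.linspace(10, 80, 8), 25)         # assumed dose design, 200 subjects
true_p = expit((doses - 50.0) / 8.0)
primary = rng.binomial(1, true_p)                      # primary (reference) dataset

def nll(params, d, y):
    d50, k = params
    p = np.clip(expit((d - d50) / k), 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

fit = minimize(nll, x0=[45.0, 10.0], args=(doses, primary), method="Nelder-Mead")

# Generate secondary datasets from the fitted model and refit each one;
# the spread of the refitted parameters is the "real" statistical spread.
secondary_fits = []
for _ in range(1000):
    y = rng.binomial(1, expit((doses - fit.x[0]) / fit.x[1]))
    refit = minimize(nll, x0=fit.x, args=(doses, y), method="Nelder-Mead")
    secondary_fits.append(refit.x)

spread = np.std(secondary_fits, axis=0)
print("Monte Carlo spread of (D50, k):", np.round(spread, 2))
# Covariance-matrix (delta-method) intervals could now be compared against this spread.
```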

  19. Comparison of treatment plans: a retrospective study by the method of radiobiological evaluation

    NASA Astrophysics Data System (ADS)

    Puzhakkal, Niyas; Kallikuzhiyil Kochunny, Abdullah; Manthala Padannayil, Noufal; Singh, Navin; Elavan Chalil, Jumanath; Kulangarakath Umer, Jamshad

    2016-09-01

    There are many situations in radiotherapy where multiple treatment plans need to be compared to select an optimal plan. In this study, we applied the radiobiological method of plan evaluation to verify the treatment plan comparison procedure used in our clinical practice. We estimated and correlated various radiobiological dose indices with physical dose metrics for a total of 30 patients representing typical cases of head and neck, prostate and brain tumors. Three sets of plans, along with the clinically approved plan (final plan), treated by either intensity-modulated radiation therapy (IMRT) or RapidArc (RA) techniques, were considered. The study yielded improved target coverage for the final plans; however, no appreciable differences in the doses and complication probabilities of organs at risk were noticed. Even though all four plans showed adequate dose distributions, from a dosimetric point of view the final plan had the most acceptable dose distribution. The estimated biological outcomes and dose-volume histogram data showed the least differences between plans for IMRT compared with RA. Our retrospective study, based on 120 plans, validated the radiobiological method of plan evaluation. The tumor cure or normal tissue complication probabilities were found to be correlated with the corresponding physical dose indices.

  20. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling

    PubMed Central

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as the experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision. PMID:27303323
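
    A minimal sketch of distorted subjective probabilities entering Bayesian belief revision; the one-parameter Prelec form is an assumed example of the (inverted) S-shaped weighting class discussed above, and all numerical values are illustrative.

```python
import numpy as np

def prelec_weight(p, gamma):
    """Prelec weighting w(p) = exp(-(-ln p)^gamma); gamma < 1 gives an inverse-S shape."""
    p = np.clip(p, 1e-12, 1.0)
    return np.exp(-((-np.log(p)) ** gamma))

def distorted_posterior(prior, lik_h, lik_alt, gamma):
    """Bayesian belief revision in a two-urn task with every input probability weighted."""
    w_prior = prelec_weight(prior, gamma)
    w_complement = prelec_weight(1.0 - prior, gamma)
    w_lik_h, w_lik_alt = prelec_weight(lik_h, gamma), prelec_weight(lik_alt, gamma)
    return (w_prior * w_lik_h) / (w_prior * w_lik_h + w_complement * w_lik_alt)

# Example: prior 0.3 for one urn, likelihoods 0.7 vs 0.4 for the drawn ball colour,
# and an individual distortion parameter gamma = 0.6 (all values illustrative).
print(distorted_posterior(0.3, 0.7, 0.4, gamma=0.6))
```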

  1. A randomized controlled trial of peeling and aspiration of Elschnig pearls and neodymium: yttrium-aluminium-garnet laser capsulotomy.

    PubMed

    Bhargava, Rahul; Kumar, Prachi; Sharma, Shiv Kumar; Kaur, Avinash

    2015-01-01

    To compare surgical peeling and aspiration with neodymium yttrium garnet laser capsulotomy for the pearl form of posterior capsule opacification (PCO). A prospective, randomized, double-blind study was done at Rotary Eye Hospital, Maranda, Palampur, India; Santosh Medical College Hospital, Ghaziabad, India; and Laser Eye Clinic, Noida, India. Consecutive patients with the pearl form of PCO following cataract surgery (phacoemulsification, manual small-incision cataract surgery, or conventional extracapsular cataract extraction (ECCE)) for age-related cataract were randomized to have peeling and aspiration or neodymium yttrium garnet laser capsulotomy. Corrected distance visual acuity (CDVA), intra-operative and post-operative complications were compared. A total of 634 patients participated in the study; 314 (49.5%) patients were randomized to the surgical peeling and aspiration group and 320 (50.5%) to the Nd:YAG laser group. The mean pre-procedural logMAR CDVA in the peeling and neodymium:yttrium-aluminium-garnet (Nd:YAG) laser groups was 0.80±0.25 and 0.86±0.22, respectively. The mean final CDVA in the peeling group (0.22±0.23) was comparable to the Nd:YAG group (0.24±0.28; t test, P=0.240). There was a significant improvement in vision after both procedures (P<0.001). A slightly higher percentage of patients in the Nd:YAG laser group (283/88.3%) than in the peeling group (262/83.4%) had a CDVA of 0.5 (20/63) or better at 9 months (P<0.001). Conversely, the proportion of patients having CDVA worse than 1.00 (20/200) was also significantly higher in the Nd:YAG laser group compared with the peeling group (25/7.7% vs 15/4.7%). On application of ANCOVA, there was a less than 0.001% risk that PCO thickness and total laser energy had no effect on the rate of complications in the Nd:YAG laser group, and a less than 0.001% risk that PCO thickness had no effect on complications in the peeling group. Sum-of-squares analysis suggests that in the Nd:YAG laser group, thick PCO had a stronger impact on complications (Fisher test probability, Pr<0.0001) than thin PCO and total laser energy (Fisher test probability, Pr<0.002); similarly, in the peeling group, thick PCO and preoperative vision had a stronger effect on complications than thin PCO (Fisher test probability, Pr<0.001). The rates of complications such as uveitis (P=0.527) and cystoid macular edema (P=0.068) did not differ significantly between the two groups. However, intraocular pressure spikes (P=0.046) and retinal detachment (P<0.001) were significantly higher in the Nd:YAG laser group compared with the peeling group. Retinal detachment was more common in patients having degenerative myopia (7/87.5%, P<0.001). Recurrence of pearls was the most common cause of reduction of vision in the peeling group (24/7.6%, P<0.001). There is no alternative to Nd:YAG laser capsulotomy for the fibrous subtype of PCO. For the pearl form of PCO, both techniques are comparable with regard to visual outcomes. Nd:YAG laser capsulotomy has a higher incidence of IOP spikes and retinal detachment, whereas recurrence of pearls may occur after successful peeling and aspiration. When posterior capsulotomy is needed in patients with retinal degenerations, retinopathies, and pre-existing retinal breaks, the clinician should be cautious about the increased risks of possible complications of Nd:YAG laser capsulotomy.

  2. A randomized controlled trial of peeling and aspiration of Elschnig pearls and neodymium: yttrium-aluminium-garnet laser capsulotomy

    PubMed Central

    Bhargava, Rahul; Kumar, Prachi; Sharma, Shiv Kumar; Kaur, Avinash

    2015-01-01

    AIM To compare surgical peeling and aspiration with neodymium yttrium garnet laser capsulotomy for the pearl form of posterior capsule opacification (PCO). METHODS A prospective, randomized, double-blind study was done at Rotary Eye Hospital, Maranda, Palampur, India; Santosh Medical College Hospital, Ghaziabad, India; and Laser Eye Clinic, Noida, India. Consecutive patients with the pearl form of PCO following cataract surgery (phacoemulsification, manual small-incision cataract surgery, or conventional extracapsular cataract extraction (ECCE)) for age-related cataract were randomized to have peeling and aspiration or neodymium yttrium garnet laser capsulotomy. Corrected distance visual acuity (CDVA), intra-operative and post-operative complications were compared. RESULTS A total of 634 patients participated in the study; 314 (49.5%) patients were randomized to the surgical peeling and aspiration group and 320 (50.5%) to the Nd:YAG laser group. The mean pre-procedural logMAR CDVA in the peeling and neodymium:yttrium-aluminium-garnet (Nd:YAG) laser groups was 0.80±0.25 and 0.86±0.22, respectively. The mean final CDVA in the peeling group (0.22±0.23) was comparable to the Nd:YAG group (0.24±0.28; t test, P=0.240). There was a significant improvement in vision after both procedures (P<0.001). A slightly higher percentage of patients in the Nd:YAG laser group (283/88.3%) than in the peeling group (262/83.4%) had a CDVA of 0.5 (20/63) or better at 9 months (P<0.001). Conversely, the proportion of patients having CDVA worse than 1.00 (20/200) was also significantly higher in the Nd:YAG laser group compared with the peeling group (25/7.7% vs 15/4.7%). On application of ANCOVA, there was a less than 0.001% risk that PCO thickness and total laser energy had no effect on the rate of complications in the Nd:YAG laser group, and a less than 0.001% risk that PCO thickness had no effect on complications in the peeling group. Sum-of-squares analysis suggests that in the Nd:YAG laser group, thick PCO had a stronger impact on complications (Fisher test probability, Pr<0.0001) than thin PCO and total laser energy (Fisher test probability, Pr<0.002); similarly, in the peeling group, thick PCO and preoperative vision had a stronger effect on complications than thin PCO (Fisher test probability, Pr<0.001). The rates of complications such as uveitis (P=0.527) and cystoid macular edema (P=0.068) did not differ significantly between the two groups. However, intraocular pressure spikes (P=0.046) and retinal detachment (P<0.001) were significantly higher in the Nd:YAG laser group compared with the peeling group. Retinal detachment was more common in patients having degenerative myopia (7/87.5%, P<0.001). Recurrence of pearls was the most common cause of reduction of vision in the peeling group (24/7.6%, P<0.001). CONCLUSION There is no alternative to Nd:YAG laser capsulotomy for the fibrous subtype of PCO. For the pearl form of PCO, both techniques are comparable with regard to visual outcomes. Nd:YAG laser capsulotomy has a higher incidence of IOP spikes and retinal detachment, whereas recurrence of pearls may occur after successful peeling and aspiration. When posterior capsulotomy is needed in patients with retinal degenerations, retinopathies, and pre-existing retinal breaks, the clinician should be cautious about the increased risks of possible complications of Nd:YAG laser capsulotomy. PMID:26086014

  3. p-adic stochastic hidden variable model

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrew

    1998-03-01

    We propose a stochastic hidden variable model in which the hidden variables have a p-adic probability distribution ρ(λ) and, at the same time, the conditional probability distributions P(U,λ), U=A,A',B,B', are ordinary probabilities defined on the basis of the Kolmogorov measure-theoretical axiomatics. The frequency definition of p-adic probability is quite similar to the ordinary frequency definition of probability: p-adic frequency probability is defined as the limit of the relative frequencies νn, but in the p-adic metric. We study a model with p-adic stochastics at the level of the hidden variable description. But, of course, the responses of macroapparatuses have to be described by ordinary stochastics. Thus our model describes a mixture of p-adic stochastics of the microworld and ordinary stochastics of macroapparatuses. In this model, probabilities for physical observables are ordinary probabilities. At the same time, Bell's inequality is violated.
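
    The frequency definition referenced above, restated in display form (nothing here goes beyond the abstract):

```latex
% Relative frequencies are the usual rational ratios; only the limit is taken
% in the p-adic metric |.|_p rather than in the real metric.
\[
  \nu_n = \frac{n(A)}{n}, \qquad
  P_p(A) = \lim_{n \to \infty} \nu_n \quad \text{(limit taken in } |\cdot|_p\text{)}.
\]
```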

  4. Deep Brain Stimulation for Parkinson’s Disease with Early Motor Complications: A UK Cost-Effectiveness Analysis

    PubMed Central

    Fundament, Tomasz; Eldridge, Paul R.; Green, Alexander L.; Whone, Alan L.; Taylor, Rod S.; Williams, Adrian C.; Schuepbach, W. M. Michael

    2016-01-01

    Background Parkinson’s disease (PD) is a debilitating illness associated with considerable impairment of quality of life and substantial costs to health care systems. Deep brain stimulation (DBS) is an established surgical treatment option for some patients with advanced PD. The EARLYSTIM trial has recently demonstrated its clinical benefit also in patients with early motor complications. We sought to evaluate the cost-effectiveness of DBS, compared to best medical therapy (BMT), among PD patients with early onset of motor complications, from a United Kingdom (UK) payer perspective. Methods We developed a Markov model to represent the progression of PD as rated using the Unified Parkinson's Disease Rating Scale (UPDRS) over time in patients with early PD. Evidence sources were a systematic review of clinical evidence; data from the EARLYSTIM study; and a UK Clinical Practice Research Datalink (CPRD) dataset including DBS patients. A mapping algorithm was developed to generate utility values based on UPDRS data for each intervention. The cost-effectiveness was expressed as the incremental cost per quality-adjusted life-year (QALY). One-way and probabilistic sensitivity analyses were undertaken to explore the effect of parameter uncertainty. Results Over a 15-year time horizon, DBS was predicted to lead to additional mean cost per patient of £26,799 compared with BMT (£73,077/patient versus £46,278/patient) and an additional mean 1.35 QALYs (6.69 QALYs versus 5.35 QALYs), resulting in an incremental cost-effectiveness ratio of £19,887 per QALY gained with a 99% probability of DBS being cost-effective at a threshold of £30,000/QALY. One-way sensitivity analyses suggested that the results were not significantly impacted by plausible changes in the input parameter values. Conclusion These results indicate that DBS is a cost-effective intervention in PD patients with early motor complications when compared with existing interventions, offering additional health benefits at acceptable incremental cost. This supports the extended use of DBS among patients with early onset of motor complications. PMID:27441637
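
    The ICER arithmetic behind the headline result can be reproduced from the reported per-patient means; the small gap to the published £19,887 reflects rounding of the inputs quoted above.

```python
# Incremental cost-effectiveness ratio from the reported 15-year means (a check only).
cost_dbs, cost_bmt = 73_077.0, 46_278.0      # GBP per patient
qaly_dbs, qaly_bmt = 6.69, 5.35              # QALYs per patient

icer = (cost_dbs - cost_bmt) / (qaly_dbs - qaly_bmt)
print(f"ICER ~ GBP {icer:,.0f} per QALY gained")   # ~20,000; the paper reports 19,887
```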

  5. Evaluation of a mixed beam therapy for post-mastectomy breast cancer patients: bolus electron conformal therapy combined with intensity modulated photon radiotherapy and volumetric modulated photon arc therapy.

    PubMed

    Zhang, Rui; Heins, David; Sanders, Mary; Guo, Beibei; Hogstrom, Kenneth

    2018-05-10

    The purpose of this study was to assess the potential benefits and limitations of a mixed beam therapy, which combined bolus electron conformal therapy (BECT) with intensity modulated photon radiotherapy (IMRT) and volumetric modulated photon arc therapy (VMAT), for left-sided post-mastectomy breast cancer patients. Mixed beam treatment plans were produced for nine post-mastectomy radiotherapy (PMRT) patients previously treated at our clinic with VMAT alone. The mixed beam plans consisted of 40 Gy to the chest wall area using BECT, 40 Gy to the supraclavicular area using parallel opposed IMRT, and 10 Gy to the total planning target volume (PTV) by optimizing VMAT on top of the BECT+IMRT dose distribution. The treatment plans were created in a commercial treatment planning system (TPS), and all plans were evaluated based on PTV coverage, dose homogeneity index (DHI), conformity index (CI), dose to organs at risk (OARs), normal tissue complication probability (NTCP), and secondary cancer complication probability (SCCP). The standard VMAT alone planning technique was used as the reference for comparison. Both techniques produced clinically acceptable PMRT plans but with a few significant differences: VMAT showed significantly better CI (0.70 vs. 0.53, p < 0.001) and DHI (0.12 vs. 0.20, p < 0.001) over mixed beam therapy. For normal tissues, mixed beam therapy showed better OAR sparing and significantly reduced NTCP for cardiac mortality (0.23% vs. 0.80%, p = 0.01) and SCCP for contralateral breast (1.7% vs. 3.1% based on the linear model, and 1.2% vs. 1.9% based on the linear-exponential model, p < 0.001 in both cases), but showed significantly higher mean (50.8 Gy vs. 49.3 Gy, p < 0.001) and maximum skin doses (59.7 Gy vs. 53.3 Gy, p < 0.001) compared with VMAT. Patients with more tissue between the distal PTV surface and the lung (minimum distance approximately >0.5 cm and volume of tissue between the distal PTV surface and the heart or lung approximately >250 cm3) may benefit the most from mixed beam therapy. This work has demonstrated that mixed beam therapy (BECT+IMRT : VMAT = 4:1) produces clinically acceptable plans having reduced OAR doses and risks of side effects compared with VMAT. Even though VMAT alone produces more homogeneous and conformal dose distributions, mixed beam therapy remains a viable option for treating post-mastectomy patients, possibly leading to reduced normal tissue complications. This article is protected by copyright. All rights reserved.

  6. Radiobiological Determination of Dose Escalation and Normal Tissue Toxicity in Definitive Chemoradiation Therapy for Esophageal Cancer

    PubMed Central

    Warren, Samantha; Partridge, Mike; Carrington, Rhys; Hurt, Chris; Crosby, Thomas; Hawkins, Maria A.

    2014-01-01

    Purpose This study investigated the trade-off in tumor coverage and organ-at-risk sparing when applying dose escalation for concurrent chemoradiation therapy (CRT) of mid-esophageal cancer, using radiobiological modeling to estimate local control and normal tissue toxicity. Methods and Materials Twenty-one patients with mid-esophageal cancer were selected from the SCOPE1 database (International Standard Randomised Controlled Trials number 47718479), with a mean planning target volume (PTV) of 327 cm3. A boost volume, PTV2 (GTV + 0.5 cm margin), was created. Radiobiological modeling of tumor control probability (TCP) estimated the dose required for a clinically significant (+20%) increase in local control as 62.5 Gy/25 fractions. A RapidArc (RA) plan with a simultaneously integrated boost (SIB) to PTV2 (RA62.5) was compared to a standard dose plan of 50 Gy/25 fractions (RA50). Dose-volume metrics and estimates of normal tissue complication probability (NTCP) for heart and lungs were compared. Results Clinically acceptable dose escalation was feasible for 16 of 21 patients, with significant gains (>18%) in tumor control from 38.2% (RA50) to 56.3% (RA62.5), and only a small increase in predicted toxicity: median heart NTCP 4.4% (RA50) versus 5.6% (RA62.5) P<.001 and median lung NTCP 6.5% (RA50) versus 7.5% (RA62.5) P<.001. Conclusions Dose escalation to the GTV to improve local control is possible when overlap between PTV and organ-at-risk (<8% heart volume and <2.5% lung volume overlap for this study) generates only negligible increase in lung or heart toxicity. These predictions from radiobiological modeling should be tested in future clinical trials. PMID:25304796

  7. Radiobiological evaluation of simultaneously dose-escalated versus non-escalated intensity-modulated radiation therapy for patients with upper thoracic esophageal cancer.

    PubMed

    Huang, Bao-Tian; Wu, Li-Li; Guo, Long-Jia; Xu, Liang-Yu; Huang, Rui-Hong; Lin, Pei-Xian; Chen, Jian-Zhou; Li, De-Rui; Chen, Chuang-Zhen

    2017-01-01

    To compare the radiobiological response between simultaneously dose-escalated and non-escalated intensity-modulated radiation therapy (DE-IMRT and NE-IMRT) for patients with upper thoracic esophageal cancer (UTEC). Computed tomography simulation data sets for 25 patients pathologically diagnosed with primary UTEC were used in this study. The DE-IMRT plan, with an escalated dose of 64.8 Gy/28 fractions to the gross tumor volume (GTV) and involved lymph nodes, was compared to an NE-IMRT plan of 50.4 Gy/28 fractions. Dose-volume metrics, tumor control probability (TCP), and normal tissue complication probabilities for the lung and spinal cord were compared. In addition, the risks of acute esophageal toxicity (AET) and late esophageal toxicity (LET) were analyzed. Compared with the NE-IMRT plan, we found the DE-IMRT plan resulted in a 14.6 Gy dose escalation to the GTV. Tumor control was predicted to increase by 31.8%, 39.1%, and 40.9% for three independent TCP models. The predicted incidence of radiation pneumonitis was similar (3.9% versus 3.6%), and the estimated risk of radiation-induced spinal cord injury was extremely low (<0.13%) in both groups. Regarding esophageal toxicities, the estimated grade ≥2 and grade ≥3 AET predicted by the Kwint model increased by 2.5% and 3.8%, respectively. Grade ≥2 AET predicted using the Wijsman model increased by 14.9%. The predicted incidence of LET was low (<0.51%) in both groups. Radiobiological evaluation reveals that the DE-IMRT dosing strategy is feasible for patients with UTEC, with significant gains in tumor control and minor or clinically acceptable increases in radiation-induced toxicities.

  8. Failure-to-rescue after injury is associated with preventability: The results of mortality panel review of failure-to-rescue cases in trauma.

    PubMed

    Kuo, Lindsay E; Kaufman, Elinore; Hoffman, Rebecca L; Pascual, Jose L; Martin, Niels D; Kelz, Rachel R; Holena, Daniel N

    2017-03-01

    Failure-to-rescue is defined as the conditional probability of death after a complication, and the failure-to-rescue rate reflects a center's ability to successfully "rescue" patients after complications. The validity of the failure-to-rescue rate as a quality measure is dependent on the preventability of death and the appropriateness of this measure for use in the trauma population is untested. We sought to evaluate the relationship between preventability and failure-to-rescue in trauma. All adjudications from a mortality review panel at an academic level I trauma center from 2005-2015 were merged with registry data for the same time period. The preventability of each death was determined by panel consensus as part of peer review. Failure-to-rescue deaths were defined as those occurring after any registry-defined complication. Univariate and multivariate logistic regression models between failure-to-rescue status and preventability were constructed and time to death was examined using survival time analyses. Of 26,557 patients, 2,735 (10.5%) had a complication, of whom 359 died for a failure-to-rescue rate of 13.2%. Of failure-to-rescue deaths, 272 (75.6%) were judged to be non-preventable, 65 (18.1%) were judged potentially preventable, and 22 (6.1%) were judged to be preventable by peer review. After adjusting for other patient factors, there remained a strong association between failure-to-rescue status and potentially preventable (odds ratio 2.32, 95% confidence interval, 1.47-3.66) and preventable (odds ratio 14.84, 95% confidence interval, 3.30-66.71) judgment. Despite a strong association between failure-to-rescue status and preventability adjudication, only a minority of deaths meeting the definition of failure to rescue were judged to be preventable or potentially preventable. Revision of the failure-to-rescue metric before use in trauma care benchmarking is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.
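
    The failure-to-rescue definition above amounts to a conditional proportion, reproducible from the reported counts; small differences from the published percentages presumably reflect rounding or slightly different denominators in the source.

```python
# Failure-to-rescue: deaths conditional on having had a registry-defined complication.
patients, with_complication, deaths_after_complication = 26_557, 2_735, 359

complication_rate = with_complication / patients
failure_to_rescue_rate = deaths_after_complication / with_complication
print(f"complication rate {complication_rate:.1%}, "
      f"failure-to-rescue rate {failure_to_rescue_rate:.1%}")
# ~10.3% and ~13.1% here; the abstract reports 10.5% and 13.2%.
```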

  9. Failure-to-rescue after injury is associated with preventability: The results of mortality panel review of failure-to-rescue cases in trauma

    PubMed Central

    Kuo, Lindsay E.; Kaufman, Elinore; Hoffman, Rebecca L.; Pascual, Jose L.; Martin, Niels D.; Kelz, Rachel R.; Holena, Daniel N.

    2018-01-01

    Background Failure-to-rescue is defined as the conditional probability of death after a complication, and the failure-to-rescue rate reflects a center’s ability to successfully “rescue” patients after complications. The validity of the failure-to-rescue rate as a quality measure is dependent on the preventability of death and the appropriateness of this measure for use in the trauma population is untested. We sought to evaluate the relationship between preventability and failure-to-rescue in trauma. Methods All adjudications from a mortality review panel at an academic level I trauma center from 2005–2015 were merged with registry data for the same time period. The preventability of each death was determined by panel consensus as part of peer review. Failure-to-rescue deaths were defined as those occurring after any registry-defined complication. Univariate and multivariate logistic regression models between failure-to-rescue status and preventability were constructed and time to death was examined using survival time analyses. Results Of 26,557 patients, 2,735 (10.5%) had a complication, of whom 359 died for a failure-to-rescue rate of 13.2%. Of failure-to-rescue deaths, 272 (75.6%) were judged to be non-preventable, 65 (18.1%) were judged potentially preventable, and 22 (6.1%) were judged to be preventable by peer review. After adjusting for other patient factors, there remained a strong association between failure-to-rescue status and potentially preventable (odds ratio 2.32, 95% confidence interval, 1.47–3.66) and preventable (odds ratio 14.84, 95% confidence interval, 3.30–66.71) judgment. Conclusion Despite a strong association between failure-to-rescue status and preventability adjudication, only a minority of deaths meeting the definition of failure to rescue were judged to be preventable or potentially preventable. Revision of the failure-to-rescue metric before use in trauma care benchmarking is warranted. PMID:27788924

  10. Interactions of allelic variance of PNPLA3 with nongenetic factors in predicting nonalcoholic steatohepatitis and nonhepatic complications of severe obesity.

    PubMed

    Guichelaar, M M J; Gawrieh, S; Olivier, M; Viker, K; Krishnan, A; Sanderson, S; Malinchoc, M; Watt, K D; Swain, J M; Sarr, M; Charlton, M R

    2013-09-01

    Allelic variation (rs738409C→G) in adiponutrin (patatin-like phospholipase domain-containing protein 3, PNPLA3) has been associated with hepatic steatosis and liver fibrosis. The physiologic impact of the PNPLA3 G allele may be exacerbated in patients with severe obesity. In this study, we investigated the interactions of PNPLA3 rs738409 with a broad panel of metabolic and histologic characteristics of nonalcoholic fatty liver disease and nonalcoholic steatohepatitis (NASH) in patients with medically complicated obesity. Consecutive patients undergoing bariatric surgery were selected for a prospective study. They underwent extensive laboratory and histologic (liver biopsy) assessment, as well as evaluation of the rs738409 polymorphism by TaqMan assay. Only 12 (8.3%) of the 144 patients had normal liver histology, while 72 (50%) had NASH, of whom 15 (10.4% of all patients) had fibrosis stage 2-3. The PNPLA3 GG genotype correlated positively (P < 0.05) with serum levels of alanine aminotransferase (ALT), aspartate aminotransferase (AST), glucose, and fibrinogen, as well as with insulin-dependent diabetes mellitus, homeostasis model assessment-insulin resistance, and the presence of NASH. Multivariate analysis indicated that the PNPLA3 rs738409 G versus C allele remained an (independent) risk factor for NASH, in addition to CK-18 >145 IU/l, glucose >100 mg/dl, and C-reactive protein (CRP) >0.8 mg/dl. The probability of NASH increased from 9% (no risk factor) to 82% if all four risk factors were present. In this cohort of patients with medically complicated obesity, PNPLA3 rs738409 G allelic expression is associated with hepatic (NASH) and nonhepatic complications of obesity, such as insulin resistance. These novel findings may be related to a greater impact of the PNPLA3 variant in magnitude and scope in patients with severe obesity than in less obese populations. Further studies are needed to characterize the nature of these associations. Copyright © 2013 The Obesity Society.

  11. Meta-analysis with missing study-level sample variance data.

    PubMed

    Chowdhry, Amit K; Dworkin, Robert H; McDermott, Michael P

    2016-07-30

    We consider a study-level meta-analysis with a normally distributed outcome variable and possibly unequal study-level variances, where the object of inference is the difference in means between a treatment and control group. A common complication in such an analysis is missing sample variances for some studies. A frequently used approach is to impute the weighted (by sample size) mean of the observed variances (mean imputation). Another approach is to include only those studies with variances reported (complete case analysis). Both mean imputation and complete case analysis are only valid under the missing-completely-at-random assumption, and even then the inverse variance weights produced are not necessarily optimal. We propose a multiple imputation method employing gamma meta-regression to impute the missing sample variances. Our method takes advantage of study-level covariates that may be used to provide information about the missing data. Through simulation studies, we show that multiple imputation, when the imputation model is correctly specified, is superior to competing methods in terms of confidence interval coverage probability and type I error probability when testing a specified group difference. Finally, we describe a similar approach to handling missing variances in cross-over studies. Copyright © 2016 John Wiley & Sons, Ltd.
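
    A sketch of the baseline approach the abstract argues against: fixed-effect inverse-variance pooling with mean imputation of missing study variances. The gamma meta-regression multiple-imputation method itself is not reproduced here, and all numbers are illustrative.

```python
import numpy as np

def pooled_difference(effects, variances, sample_sizes):
    """Fixed-effect inverse-variance pooled mean difference, with mean imputation."""
    effects = np.asarray(effects, dtype=float)
    variances = np.array(variances, dtype=float)
    n = np.asarray(sample_sizes, dtype=float)
    missing = np.isnan(variances)
    if missing.any():
        # Impute the sample-size-weighted mean of the observed variances
        variances[missing] = np.average(variances[~missing], weights=n[~missing])
    w = 1.0 / variances
    estimate = np.sum(w * effects) / np.sum(w)
    standard_error = np.sqrt(1.0 / np.sum(w))
    return estimate, standard_error

# Example with one missing study-level variance (illustrative numbers)
print(pooled_difference([1.2, 0.8, 1.5], [0.30, np.nan, 0.45], [40, 55, 30]))
```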

  12. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    NASA Astrophysics Data System (ADS)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using results of three NDI methods obtained by inspecting actual aircraft engine compressor disks which contained service induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size as well as the 90/95 percent crack length vary depending on the statistical method used and the type of data. The distribution function as well as the parameter estimation procedure used for determining POD and the confidence bound must be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that is not from the inspection data. The maximum likelihood estimators (MLE) method does not require such information and the POD results are more reasonable. The log-logistic function appears to model POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented on a common spreadsheet program.
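
    A hedged sketch of fitting the log-logistic POD model to hit/miss data by maximum likelihood, with the crack length at 90% POD read off the fitted curve; the inspection data below are simulated placeholders, not the engine-disk results discussed above.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(7)
crack_len = rng.uniform(0.2, 5.0, 200)                       # crack lengths in mm (assumed)
true_pod = expit(3.0 * (np.log(crack_len) - np.log(1.0)))    # assumed underlying POD curve
hits = rng.binomial(1, true_pod)                             # hit/miss inspection outcomes

def nll(params, a, y):
    # Log-logistic POD: logistic in the log of crack length
    alpha, beta = params
    p = np.clip(expit(alpha + beta * np.log(a)), 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

fit = minimize(nll, x0=[0.0, 1.0], args=(crack_len, hits), method="Nelder-Mead")
alpha_hat, beta_hat = fit.x
a90 = np.exp((np.log(0.9 / 0.1) - alpha_hat) / beta_hat)     # crack length with POD = 90%
print(f"a90 ~ {a90:.2f} mm (confidence bounds would come from the MLE covariance)")
```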

  13. Directed Research in Bone Discipline: Refining Previous Research Observations for Space Medicine

    NASA Technical Reports Server (NTRS)

    Sibonga, Jean D.

    2015-01-01

    Dual-energy X-ray absorptiometry bone mass density, as a sole index, is an insufficient surrogate for fracture. Clinical Practice Guidelines using bone mass density (both World Health Organization and FRAX) are not specific for complicated subjects such as young, healthy persons following prolonged exposure to skeletal unloading (i.e., an attribute of spaceflight). Research data suggest that spaceflight induces changes to astronaut bones that could be profound, possibly irreversible, and unlike age-related bone loss on Earth. There is a need to objectively assess factors across human physiology that are also influenced by spaceflight (e.g., muscle) and that contribute to fracture risk; some of these objective assessments may require innovative technologies, analyses, and modeling. Astronauts are also exposed to novel situations that may overload their bones, highlighting a need to integrate the biomechanics of physical activities into risk assessments. As we accumulate data that reflect the biomechanical competence of bone under specific mechanically loaded scenarios (even activities of daily living), BONE expects the Bone Fracture Module to be more sensitive and/or to have less uncertainty in its assessments of fracture probability. Fracture probability drives the requirement for countermeasures. The desired level of evidence is unlikely to be obtained; hence, the Bone Research and Clinical Advisory Panel (like a Data Safety Monitoring Board) will provide the recommendations.

  14. Inference of timber harvest effects on survival of stream amphibians is complicated by movement

    USGS Publications Warehouse

    Chelgren, Nathan; Adams, Michael J.

    2017-01-01

    The effects of contemporary logging practices on headwater stream amphibians have received considerable study but with conflicting or ambiguous results. We posit that focusing inference on demographic rates of aquatic life stages may help refine understanding, as aquatic and terrestrial impacts may differ considerably. We investigated in-stream survival and movement of two stream-breeding amphibian species within a before-after timber harvest experiment in the Oregon Coast Range. We used recaptures of marked individuals and a joint probability model of survival, movement, and capture probability, to measure variation in these rates attributed to stream reach, stream gradient, pre- and post-harvest periods, and the timber harvest intensity. Downstream biased movement occurred in both species but was greater for Coastal Tailed Frog (Ascaphus truei) larvae than aquatic Coastal Giant Salamanders (Dicamptodon tenebrosus). For D. tenebrosus, downstream biased movement occurred early in life, soon after an individual's first summer. Increasing timber harvest intensity reduced downstream movement bias and reduced survival of D. tenebrosus, but neither of these effects were detected for larvae of A. truei. Our findings provide insight into the demographic mechanisms underlying previous nuanced studies of amphibian responses to timber harvest based on biomass or counts of larvae.

  15. Scintigraphy for pulmonary capillary protein leak. Final report, 1 October 1981-30 September 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugerman, H.J.; Tatum, J.L.; Hirsch, J.I.

    1986-06-01

    Computerized scintigraphy, employing the gamma camera, has been used to study the dynamics of the pulmonary capillary membrane leak of 99m-technetium-tagged human serum albumin (Tc-HSA). In preliminary canine studies, the severity of an oleic acid-induced albumin leak was proportional to the slope of the lung:heart radioactivity ratio and was a more sensitive indicator than arterial blood gases or standard chest roentgenograms. The slope of this rising ratio is called the injury slope index. A number of agents were studied in an attempt to prevent oleic acid-induced pulmonary microvascular injury. Following a series of five control dogs, five dogs were studied with each of the following agents: methylprednisolone, ibuprofen, and the superoxide radical scavenger MK-44; calcium gluconate was studied in three dogs. None of these agents was able to alter the rise in the lung:heart radioactivity ratio following oleic acid injury. A septic pig model was developed for the study of bacterially induced ARDS. Sepsis-induced ARDS and multi-system organ failure are probably secondary to the systemic release of several mediators of inflammation; treatment will probably require a combination of anti-inflammatory agents. This should have a significant impact on the mortality and morbidity of septic complications in traumatized combat soldiers.

  16. Pharmacologic Hemostatic Agents in Total Joint Arthroplasty-A Cost-Effectiveness Analysis.

    PubMed

    Ramkumar, Dipak B; Ramkumar, Niveditta; Tapp, Stephanie J; Moschetti, Wayne E

    2018-03-03

    Total knee and hip arthroplasties can be associated with substantial blood loss, affecting morbidity and even mortality. Two pharmacological antifibrinolytics, ε-aminocaproic acid (EACA) and tranexamic acid (TXA), have been used to minimize perioperative blood loss, but both have associated morbidity. Given the added cost of these medications and the risks associated with them, a cost-effectiveness analysis was undertaken to ascertain the best strategy. A cost-effectiveness model was constructed using the payoffs of cost (in United States dollars) and effectiveness (quality-adjusted life expectancy, in days). The medical literature was used to ascertain various complications, their probabilities, utility values, and direct medical costs associated with various health states. A time horizon of 10 years and a willingness-to-pay threshold of $100,000 were used. The total costs and effectiveness (quality-adjusted life expectancy, in days) were $459.77, $951.22, and $1174.87 and 3411.19, 3248.02, and 3342.69 days for TXA, no pharmacologic hemostatic agent, and EACA, respectively. Because TXA is less expensive and more effective than the competing alternatives, it was the favored strategy. One-way sensitivity analyses for the probability of transfusion and myocardial infarction for all 3 strategies revealed that TXA remains the dominant strategy across all clinically plausible values. TXA, when compared with no pharmacologic hemostatic agent and with EACA, is the most cost-effective strategy to minimize intraoperative blood loss in hip and knee total joint arthroplasties. These findings are robust to sensitivity analyses using clinically plausible probabilities. Copyright © 2018 Elsevier Inc. All rights reserved.
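
    A quick dominance check using the payoffs reported above: a strategy is dominated when an alternative is both cheaper and more effective, which is why no ICER is needed for TXA.

```python
# Strategy payoffs as reported above: cost in US dollars, effectiveness in QALE days.
strategies = {
    "TXA":      {"cost": 459.77,  "qale_days": 3411.19},
    "no agent": {"cost": 951.22,  "qale_days": 3248.02},
    "EACA":     {"cost": 1174.87, "qale_days": 3342.69},
}

for name, s in strategies.items():
    dominated = any(o["cost"] < s["cost"] and o["qale_days"] > s["qale_days"]
                    for other, o in strategies.items() if other != name)
    print(f"{name:8s} dominated: {dominated}")
# TXA is the cheapest and most effective strategy, so it dominates the alternatives.
```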

  17. Preoperative paravertebral blocks for the management of acute pain following mastectomy: a cost-effectiveness analysis.

    PubMed

    Offodile, Anaeze C; Sheckter, Clifford C; Tucker, Austin; Watzker, Anna; Ottino, Kevin; Zammert, Martin; Padula, William V

    2017-10-01

    Preoperative paravertebral blocks (PPVBs) are routinely used for treating post-mastectomy pain, yet uncertainties remain about the cost-effectiveness of this modality. We aim to evaluate the cost-effectiveness of PPVBs at common willingness-to-pay (WTP) thresholds. A decision analytic model compared two strategies: general anesthesia (GA) alone versus GA with multilevel PPVB. For the GA plus PPVB limb, patients were modeled as having either successful block placement or complications of varying severity, based on literature-derived probabilities. The need for rescue pain medication was the terminal node for all postoperative scenarios. Patient-reported pain scores sourced from published meta-analyses measured treatment effectiveness. Costing was derived from wholesale acquisition costs, the Medicare fee schedule, and publicly available hospital charge masters. Charges were converted to costs and adjusted to 2016 US dollars. A commercial payer perspective was adopted. Incremental cost-effectiveness ratios (ICERs) were evaluated against WTP thresholds of $500 and $50,000 for postoperative pain control. The ICER for preoperative paravertebral blocks was $154.49 per point reduction in pain score. A 15% variation in inpatient costs resulted in ICER values ranging from $124.40 to $180.66 per point reduction in pain score. Altering the probability of block success by 5% generated ICER values of $144.71 to $163.81 per point reduction in pain score. Probabilistic sensitivity analysis yielded cost-effective trials 69.43% of the time at the $500 WTP threshold. Over a broad range of probabilities, PPVB in mastectomy reduces postoperative pain at an acceptable incremental cost compared to GA. Commercial payers should be persuaded to reimburse this technique based on convincing evidence of cost-effectiveness.

  18. Population variability complicates the accurate detection of climate change responses.

    PubMed

    McCain, Christy; Szewczyk, Tim; Bracy Knight, Kevin

    2016-06-01

    The rush to assess species' responses to anthropogenic climate change (CC) has underestimated the importance of interannual population variability (PV). Researchers assume sampling rigor alone will lead to an accurate detection of response regardless of the underlying population fluctuations of the species under consideration. Using population simulations across a realistic, empirically based gradient in PV, we show that moderate to high PV can lead to opposite and biased conclusions about CC responses. Between pre- and post-CC sampling bouts of modeled populations as in resurvey studies, there is: (i) A 50% probability of erroneously detecting the opposite trend in population abundance change and nearly zero probability of detecting no change. (ii) Across multiple years of sampling, it is nearly impossible to accurately detect any directional shift in population sizes with even moderate PV. (iii) There is up to 50% probability of detecting a population extirpation when the species is present, but in very low natural abundances. (iv) Under scenarios of moderate to high PV across a species' range or at the range edges, there is a bias toward erroneous detection of range shifts or contractions. Essentially, the frequency and magnitude of population peaks and troughs greatly impact the accuracy of our CC response measurements. Species with moderate to high PV (many small vertebrates, invertebrates, and annual plants) may be inaccurate 'canaries in the coal mine' for CC without pertinent demographic analyses and additional repeat sampling. Variation in PV may explain some idiosyncrasies in CC responses detected so far and urgently needs more careful consideration in design and analysis of CC responses. © 2016 John Wiley & Sons Ltd.
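
    A toy resurvey simulation (illustrative only, not the authors' empirically calibrated model) shows how interannual variability alone can produce apparent directional change between two sampling bouts of a stationary population:

    ```python
    import numpy as np

    # Toy resurvey simulation: a stationary population with lognormal interannual
    # variability is sampled once "pre" and once "post" climate change; we count how
    # often the two bouts suggest a spurious directional change even though the true
    # mean abundance never shifts. All parameter values are hypothetical.
    rng = np.random.default_rng(0)
    mean_abundance, cv, n_sims = 100, 0.5, 10_000      # cv sets the population variability

    sigma = np.sqrt(np.log(1 + cv**2))                 # lognormal sigma matching the CV
    mu = np.log(mean_abundance) - sigma**2 / 2
    pre = rng.lognormal(mu, sigma, n_sims)
    post = rng.lognormal(mu, sigma, n_sims)

    apparent_change = (post - pre) / pre
    print("fraction showing an apparent decline  >20%:", np.mean(apparent_change < -0.2))
    print("fraction showing an apparent increase >20%:", np.mean(apparent_change > 0.2))
    ```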

  19. Establishment method of a mixture model and its practical application for transmission gears in an engineering vehicle

    NASA Astrophysics Data System (ADS)

    Wang, Jixin; Wang, Zhenyu; Yu, Xiangjun; Yao, Mingyao; Yao, Zongwei; Zhang, Erping

    2012-09-01

    Highly versatile machines, such as wheel loaders, forklifts, and mining haulers, are subject to many kinds of working conditions, as well as indefinite factors that lead to the complexity of the load. The load probability distribution function (PDF) of transmission gears has many distribution centers; thus, its PDF cannot be well represented by just a single-peak function. For the purpose of representing the distribution characteristics of the complicated phenomenon accurately, this paper proposes a novel method to establish a mixture model. Based on linear regression models and correlation coefficients, the proposed method can be used to automatically select the best-fitting function in the mixture model. The coefficient of determination, the mean square error, and the maximum deviation are chosen and then used as judging criteria to describe the fitting precision between the theoretical distribution and the corresponding histogram of the available load data. The applicability of this modeling method is illustrated by the field testing data of a wheel loader. Meanwhile, the load spectra based on the mixture model are compiled. The comparison results show that the mixture model is more suitable for the description of the load-distribution characteristics. The proposed research improves the flexibility and intelligence of modeling, reduces the statistical error and enhances the fitting accuracy, and the load spectra compiled by this method can better reflect the actual load characteristics of the gear component.
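
    A hedged sketch of the general idea, fitting a two-component Gaussian mixture to synthetic bimodal load data and scoring the fit against the histogram; the paper's automatic selection of the best-fitting component functions via regression and correlation coefficients is not reproduced:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Illustrative sketch only: fit a two-component Gaussian mixture to synthetic
    # bimodal "load" data and report simple goodness-of-fit summaries against the
    # histogram (mean square error and maximum deviation, as judging criteria).
    rng = np.random.default_rng(1)
    load = np.concatenate([rng.normal(200, 20, 3000), rng.normal(350, 40, 2000)])

    gmm = GaussianMixture(n_components=2, random_state=1).fit(load.reshape(-1, 1))
    hist, edges = np.histogram(load, bins=50, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    pdf = np.exp(gmm.score_samples(centers.reshape(-1, 1)))

    mse = np.mean((hist - pdf) ** 2)            # mean square error vs. histogram
    max_dev = np.max(np.abs(hist - pdf))        # maximum deviation
    print(f"weights={gmm.weights_.round(2)}, MSE={mse:.2e}, max deviation={max_dev:.2e}")
    ```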

  20. Fit to predict? Eco-informatics for predicting the catchability of a pelagic fish in near real time.

    PubMed

    Scales, Kylie L; Hazen, Elliott L; Maxwell, Sara M; Dewar, Heidi; Kohin, Suzanne; Jacox, Michael G; Edwards, Christopher A; Briscoe, Dana K; Crowder, Larry B; Lewison, Rebecca L; Bograd, Steven J

    2017-12-01

    The ocean is a dynamic environment inhabited by a diverse array of highly migratory species, many of which are under direct exploitation in targeted fisheries. The timescales of variability in the marine realm coupled with the extreme mobility of ocean-wandering species such as tuna and billfish complicates fisheries management. Developing eco-informatics solutions that allow for near real-time prediction of the distributions of highly mobile marine species is an important step towards the maturation of dynamic ocean management and ecological forecasting. Using 25 yr (1990-2014) of NOAA fisheries' observer data from the California drift gillnet fishery, we model relative probability of occurrence (presence-absence) and catchability (total catch per gillnet set) of broadbill swordfish Xiphias gladius in the California Current System. Using freely available environmental data sets and open source software, we explore the physical drivers of regional swordfish distribution. Comparing models built upon remotely sensed data sets with those built upon a data-assimilative configuration of the Regional Ocean Modelling System (ROMS), we explore trade-offs in model construction, and address how physical data can affect predictive performance and operational capacity. Swordfish catchability was found to be highest in deeper waters (>1,500 m) with surface temperatures in the 14-20°C range, isothermal layer depth (ILD) of 20-40 m, positive sea surface height (SSH) anomalies, and during the new moon (<20% lunar illumination). We observed a greater influence of mesoscale variability (SSH, wind speed, isothermal layer depth, eddy kinetic energy) in driving swordfish catchability (total catch) than was evident in predicting the relative probability of presence (presence-absence), confirming the utility of generating spatiotemporally dynamic predictions. Data-assimilative ROMS circumvent the limitations of satellite remote sensing in providing physical data fields for species distribution models (e.g., cloud cover, variable resolution, subsurface data), and facilitate broad-scale prediction of dynamic species distributions in near real time. © 2017 by the Ecological Society of America.
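
    The presence-absence side of such a model can be sketched generically with a logistic regression on synthetic environmental covariates (the study itself used richer models fitted to observer and ROMS/remote-sensing data):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hedged sketch of a presence-absence species distribution model on synthetic
    # covariates (SST, isothermal layer depth, SSH anomaly, lunar illumination).
    # This only shows the general shape of fitting and predicting occurrence
    # probability; it is not the study's model, data, or covariate set.
    rng = np.random.default_rng(2)
    n = 2000
    sst = rng.uniform(10, 25, n)          # deg C
    ild = rng.uniform(5, 60, n)           # m
    ssh = rng.normal(0, 0.1, n)           # m anomaly
    lunar = rng.uniform(0, 1, n)          # fraction illuminated

    # Synthetic "truth" loosely echoing the abstract: occurrence favored by 14-20 C,
    # ILD 20-40 m, positive SSH anomalies, and low lunar illumination.
    logit = (-1.0 + 1.5 * ((sst > 14) & (sst < 20)) + 1.0 * ((ild > 20) & (ild < 40))
             + 5.0 * ssh - 1.0 * lunar)
    presence = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([sst, ild, ssh, lunar])
    model = LogisticRegression(max_iter=1000).fit(X, presence)
    print("predicted occurrence probability for a favourable cell:",
          model.predict_proba([[17.0, 30.0, 0.15, 0.05]])[0, 1].round(2))
    ```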

  1. Bayesian Recurrent Neural Network for Language Modeling.

    PubMed

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

    A language model (LM) assigns a probability to a word sequence and provides the basis for word prediction in a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and applies it to continuous speech recognition. We aim to penalize an overly complex RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to a Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
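
    A minimal sketch of the MAP idea, assuming PyTorch is available: a tiny RNN language model trained with a regularized cross-entropy objective in which the zero-mean Gaussian prior on the weights appears as weight decay. The paper's rapid Hessian approximation and marginal-likelihood estimation of the hyperparameter are not shown:

    ```python
    import torch
    import torch.nn as nn

    # Minimal sketch (assumption: PyTorch). MAP training of an RNN language model:
    # the regularized cross-entropy objective is implemented here simply as weight
    # decay, i.e. a zero-mean Gaussian prior on the parameters.
    vocab, embed, hidden = 1000, 64, 128

    class TinyRNNLM(nn.Module):
        def __init__(self):
            super().__init__()
            self.emb = nn.Embedding(vocab, embed)
            self.rnn = nn.RNN(embed, hidden, batch_first=True)
            self.out = nn.Linear(hidden, vocab)

        def forward(self, x):
            h, _ = self.rnn(self.emb(x))
            return self.out(h)                 # next-word logits at each position

    model = TinyRNNLM()
    # weight_decay plays the role of the Gaussian prior precision (hypothetical value).
    opt = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    tokens = torch.randint(0, vocab, (8, 20))   # dummy batch of word-id sequences
    logits = model(tokens[:, :-1])
    loss = loss_fn(logits.reshape(-1, vocab), tokens[:, 1:].reshape(-1))
    loss.backward()
    opt.step()
    print(float(loss))
    ```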

  2. Validation of automatic segmentation of ribs for NTCP modeling.

    PubMed

    Stam, Barbara; Peulen, Heike; Rossi, Maddalena M G; Belderbos, José S A; Sonke, Jan-Jakob

    2016-03-01

    Determination of a dose-effect relation for rib fractures in a large patient group has been limited by the time consuming manual delineation of ribs. Automatic segmentation could facilitate such an analysis. We determine the accuracy of automatic rib segmentation in the context of normal tissue complication probability modeling (NTCP). Forty-one patients with stage I/II non-small cell lung cancer treated with SBRT to 54 Gy in 3 fractions were selected. Using the 4DCT derived mid-ventilation planning CT, all ribs were manually contoured and automatically segmented. Accuracy of segmentation was assessed using volumetric, shape and dosimetric measures. Manual and automatic dosimetric parameters Dx and EUD were tested for equivalence using the Two One-Sided T-test (TOST), and assessed for agreement using Bland-Altman analysis. NTCP models based on manual and automatic segmentation were compared. Automatic segmentation was comparable with the manual delineation in radial direction, but larger near the costal cartilage and vertebrae. Manual and automatic Dx and EUD were significantly equivalent. The Bland-Altman analysis showed good agreement. The two NTCP models were very similar. Automatic rib segmentation was significantly equivalent to manual delineation and can be used for NTCP modeling in a large patient group. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
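
    The equivalence and agreement checks named above can be sketched on synthetic paired dose metrics; the ±1 Gy equivalence margin below is a hypothetical choice, not the study's:

    ```python
    import numpy as np
    from scipy import stats

    # Sketch of the two checks named in the abstract, on synthetic paired dose
    # metrics (e.g. manual vs automatic EUD per patient). The +/-1 Gy equivalence
    # margin is a hypothetical choice, not the study's.
    rng = np.random.default_rng(3)
    manual = rng.normal(20, 5, 41)
    auto = manual + rng.normal(0.1, 0.4, 41)     # small, noisy differences
    diff = auto - manual
    margin = 1.0

    # Two one-sided t-tests (TOST): both nulls must be rejected to claim equivalence.
    _, p_low = stats.ttest_1samp(diff, -margin, alternative="greater")
    _, p_high = stats.ttest_1samp(diff, margin, alternative="less")
    print("TOST p-values:", round(p_low, 4), round(p_high, 4))

    # Bland-Altman summary: mean bias and 95% limits of agreement.
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"bias={bias:.2f} Gy, limits of agreement=({bias-1.96*sd:.2f}, {bias+1.96*sd:.2f}) Gy")
    ```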

  3. Predictive Modeling of Risk Factors and Complications of Cataract Surgery

    PubMed Central

    Gaskin, Gregory L; Pershing, Suzann; Cole, Tyler S; Shah, Nigam H

    2016-01-01

    Purpose To quantify the relationship between aggregated preoperative risk factors and cataract surgery complications, as well as to build a model predicting outcomes at the individual level, given a constellation of demographic, baseline, preoperative, and intraoperative patient characteristics. Setting Stanford Hospital and Clinics between 1994 and 2013. Design Retrospective cohort study. Methods Patients age 40 or older who received cataract surgery between 1994 and 2013. Risk factors, complications, and demographic information were extracted from the Electronic Health Record (EHR), based on International Classification of Diseases, 9th edition (ICD-9) codes, Current Procedural Terminology (CPT) codes, drug prescription information, and text data mining using natural language processing. We used a bootstrapped least absolute shrinkage and selection operator (LASSO) model to identify highly predictive variables. We built random forest classifiers for each complication to create predictive models. Results Our data corroborated existing literature on postoperative complications, including the association of intraoperative complications, complex cataract surgery, black race, and/or prior eye surgery with an increased risk of any postoperative complications. We also found a number of other, less well-described risk factors, including systemic diabetes mellitus, young age (<60 years old), and hyperopia as risk factors for complex cataract surgery and intra- and post-operative complications. Our predictive models based on aggregated risk factors outperformed existing published models. Conclusions The constellations of risk factors and complications described here can guide new avenues of research and provide specific, personalized risk assessment for a patient considering cataract surgery. The predictive capacity of our models can enable risk stratification of patients, which has utility as a teaching tool as well as informing quality/value-based reimbursements. PMID:26692059
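
    A generic sketch of the two-stage approach described (L1-penalized screening of candidate predictors followed by a random forest classifier), on synthetic data rather than the authors' EHR-derived features or bootstrap scheme:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier

    # Generic sketch of the two-stage approach in the abstract: L1-penalized
    # screening of candidate predictors, then a random forest on the retained ones.
    # Synthetic data; not the authors' features, EHR pipeline, or bootstrap scheme.
    X, y = make_classification(n_samples=2000, n_features=50, n_informative=8,
                               random_state=4)

    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
    keep = np.flatnonzero(lasso.coef_[0] != 0)   # variables surviving the L1 penalty

    rf = RandomForestClassifier(n_estimators=200, random_state=4).fit(X[:, keep], y)
    print(f"{keep.size} predictors retained; training accuracy (illustrative only) "
          f"{rf.score(X[:, keep], y):.2f}")
    ```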

  4. Diagnostic accuracy of the postoperative ratio of C-reactive protein to albumin for complications after colorectal surgery.

    PubMed

    Ge, Xiaolong; Cao, Yu; Wang, Hongkan; Ding, Chao; Tian, Hongliang; Zhang, Xueying; Gong, Jianfeng; Zhu, Weiming; Li, Ning

    2017-01-10

    The ratio of C-reactive protein to albumin, as a novel inflammation-based prognostic score, is associated with outcomes in cancer and septic patients. The diagnostic accuracy of the CRP/albumin ratio has not been assessed in colorectal surgery for postoperative complications. A total of 359 patients undergoing major colorectal surgery between 2012 and 2015 were eligible for this study. Uni- and multivariate analyses were performed to identify risk factors for postoperative complications. Receiver operating characteristic curves were developed to examine the cutoff values and diagnostic accuracy of the CRP/albumin ratio and postoperative CRP levels. Among all the patients, 139 (38.7%) were reported to have postoperative complications. The CRP/albumin ratio was an independent risk factor for complications (OR 4.413; 95% CI 2.463-7.906; P < 0.001), and the cutoff value was 2.2, which had a higher area under the curve compared to CRP on postoperative day 3 (AUC 0.779 vs 0.756). The CRP/albumin ratio also had a higher positive predictive value than CRP levels on postoperative day 3. Patients with CRP/albumin ≥2.2 suffered more postoperative complications (60.8% vs 18.6%, P < 0.001), longer postoperative stays (10 (4-71) vs 7 (3-78) days, P < 0.001), and increased surgical site infections (SSIs) (21.1% vs 4.8%, P < 0.001) than those with CRP/albumin <2.2. The ratio of C-reactive protein to albumin could help to identify patients who have a high probability of postoperative complications, and the ratio has higher diagnostic accuracy than C-reactive protein alone for postoperative complications in colorectal surgery.
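
    The cutoff-finding step can be sketched generically: compute the ROC curve for a continuous marker and pick the threshold maximizing Youden's J. The values below are synthetic; the study's cutoff of 2.2 and its AUCs come from its own patient data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    # Generic sketch of deriving a cutoff for a continuous marker (a synthetic
    # CRP/albumin-like ratio) from an ROC curve via the Youden index.
    rng = np.random.default_rng(5)
    complication = rng.binomial(1, 0.4, 400)
    ratio = rng.normal(1.5 + 1.2 * complication, 0.8)   # marker higher with complications

    auc = roc_auc_score(complication, ratio)
    fpr, tpr, thresholds = roc_curve(complication, ratio)
    best = thresholds[np.argmax(tpr - fpr)]             # Youden's J statistic
    print(f"AUC={auc:.3f}, Youden-optimal cutoff={best:.2f}")
    ```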

  5. Atrial septum defect closure device in a beating heart, from the perspective of a researcher in artificial organs.

    PubMed

    Tomizawa, Yasuko

    2012-12-01

    Transcatheter closure of atrial septum defect (ASD) with a closure device is increasing, but the history of clinical use of this procedure is still short, and the efficacy and long-term safety remain unproved. The total number of closure devices implanted throughout the world has not been counted accurately. Therefore, the probability of complications occurring after implantation is uncertain. Device-related complications that occur suddenly late after implantation are life-threatening, and quite often necessitate emergency surgical intervention. In Japanese medical journals, authors reporting on closure devices have reported no complications or problems at their facilities. Detailed studies of device-related complications and device removal have not been reported in Japan. In fact, this literature search found an unexpectedly large number of reports of various adverse events from many overseas countries. When follow-up duration is short and the number of patients is small, the incidence of complications cannot be determined. Rare complications may emerge in a large series with a long observation period. Consequently, the actual number of incidents related to ASD closure devices is possibly several times higher than the number reported. Guidelines for the long-term management of patients with an implanted closure device are necessary, and post-marketing surveillance is appropriate. Development of a national database, a worldwide registration system, and continuous information disclosure will improve the quality of treatment. The devices currently available are not ideal in view of reports of late complications requiring urgent surgery and the need for life-long follow-up. An ideal device should be free from complications throughout the patient's life, and reliability is indispensable.

  6. When good operations go bad: The additive effect of comorbidity and postoperative complications on readmission after pulmonary lobectomy.

    PubMed

    Jean, Raymond A; Chiu, Alexander S; Boffa, Daniel J; Detterbeck, Frank C; Blasberg, Justin D; Kim, Anthony W

    2018-05-22

    Hospital readmission after major thoracic surgery has a marked effect on health care delivery, particularly in the era of value-based reimbursement. We sought to investigate the additive impact of comorbidity and postoperative complications on the risk of readmission after thoracic lobectomy. We queried the Nationwide Readmission Database of the Healthcare Cost and Utilization Project between 2010 and 2014 for discharges after pulmonary lobectomy with a primary diagnosis of lung cancer. We compared 90-day all-cause readmission rates across the presence of Elixhauser comorbidities and postoperative complications. Adjusted logistic and linear regression models, accounting for patient and hospital factors, were used to calculate the mean change in readmission rate by the number of comorbidities and postoperative complications. A total of 87,894 patients undergoing pulmonary lobectomies were identified during the study period, of whom 15,858 (18.0%) were readmitted for any cause within 90 days of discharge. After adjusting for other factors, each additional comorbidity and postoperative complication was associated with a 2.0% and 2.7% increased probability of readmission, respectively (both P < .0001). Patients with a low burden of comorbidities were readmitted more frequently for postoperative complications, while those with a high burden of comorbidities were readmitted more frequently for chronic disease. Among patients with the lowest risk profile, there was an 11.7% readmission rate. Adjusting for other factors, each additional comorbidity and complication increased this rate by approximately 2.0% and 2.7%, respectively. These results demonstrate that the avoidance of postoperative complications may represent an effective mechanism for decreasing readmissions after thoracic surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Antenatal depressive symptoms and perinatal complications: a prospective study in rural Ethiopia.

    PubMed

    Bitew, Tesera; Hanlon, Charlotte; Kebede, Eskinder; Honikman, Simone; Fekadu, Abebaw

    2017-08-22

    Antenatal depressive symptoms affect around 12.3% of women in low- and middle-income countries (LMICs), and data are accumulating about associations with adverse outcomes for mother and child. Studies from rural, low-income country community samples are limited. This paper aims to investigate whether antenatal depressive symptoms predict perinatal complications in a rural Ethiopian setting. A population-based prospective study was conducted in Sodo district, southern Ethiopia. A total of 1240 women recruited in the second and third trimesters of pregnancy were followed up until 4 to 12 weeks postpartum. Antenatal depressive symptoms were assessed using a locally validated version of the Patient Health Questionnaire (PHQ-9), for which a cut-off score of five or more indicates probable depression. Self-reports of perinatal complications, categorised as maternal and neonatal, were collected using structured interviewer-administered questionnaires at a median of eight weeks post-partum. Multivariate analysis was conducted to examine the association between antenatal depressive symptoms and self-reported perinatal complications. A total of 28.7% of women had antenatal depressive symptoms (PHQ-9 score ≥ 5). Women with antenatal depressive symptoms had more than twice the odds of self-reported complications in pregnancy (OR=2.44, 95% CI: 1.84, 3.23), labour (OR=1.84, 95% CI: 1.34, 2.53) and the postpartum period (OR=1.70, 95% CI: 1.23, 2.35) compared to women without these symptoms. There was no association between antenatal depressive symptoms and pregnancy loss or neonatal death. Antenatal depressive symptoms are associated prospectively with self-reports of perinatal complications. Further research is necessary to confirm these findings in a rural and poor context using objective measures of complications and to investigate whether early detection and treatment of depressive symptoms reduces these complications.

  8. Impact of geometric uncertainties on dose calculations for intensity modulated radiation therapy of prostate cancer

    NASA Astrophysics Data System (ADS)

    Jiang, Runqing

    Intensity-modulated radiation therapy (IMRT) uses non-uniform beam intensities within a radiation field to provide patient-specific dose shaping, resulting in a dose distribution that conforms tightly to the planning target volume (PTV). Unavoidable geometric uncertainty arising from patient repositioning and internal organ motion can lead to a lower conformality index (CI) during treatment delivery, a decrease in tumor control probability (TCP) and an increase in normal tissue complication probability (NTCP). The CI of the IMRT plan depends heavily on steep dose gradients between the PTV and organ at risk (OAR). Geometric uncertainties reduce the planned dose gradients and result in a less steep or "blurred" dose gradient. The blurred dose gradients can be maximized by constraining the dose objective function in the static IMRT plan or by reducing geometric uncertainty during treatment with corrective verification imaging. Internal organ motion and setup error were evaluated simultaneously for 118 individual patients with implanted fiducials and MV electronic portal imaging (EPI). A Gaussian probability density function (PDF) is reasonable for modeling geometric uncertainties, as indicated by the group of 118 patients. The Gaussian PDF is patient specific, and the group standard deviation (SD) should not be used for accurate treatment planning for individual patients. In addition, individual SD should not be determined or predicted from small imaging samples because of the random nature of the fluctuations. Frequent verification imaging should be employed in situations where geometric uncertainties are expected. Cumulative PDF data can be used for re-planning to assess the accuracy of the delivered dose. Group data are useful for determining the worst-case discrepancy between planned and delivered dose. The margins for the PTV should ideally represent true geometric uncertainties. The measured geometric uncertainties were used in this thesis to assess PTV coverage, dose to OAR, equivalent uniform dose per fraction (EUDf) and NTCP. The dose distribution including geometric uncertainties was determined from integration of the convolution of the static dose gradient with the PDF. Integration of the convolution of the static dose and the derivative of the PDF can also be used to determine the dose including geometric uncertainties, although this method was not investigated in detail. The local maximum dose gradient (LMDG) was determined via optimization of the dose objective function by manually adjusting DVH control points or selecting beam numbers and directions during IMRT treatment planning. Minimum SD (SDmin) is used when geometric uncertainty is corrected with verification imaging. Maximum SD (SDmax) is used when the geometric uncertainty is known to be large and difficult to manage. SDmax was 4.38 mm in the anterior-posterior (AP) direction, 2.70 mm in the left-right (LR) direction and 4.35 mm in the superior-inferior (SI) direction; SDmin was 1.1 mm in all three directions if a threshold of less than 2 mm was used for uncorrected fractions in every direction. EUDf is a useful QA parameter for interpreting the biological impact of geometric uncertainties on the static dose distribution. The EUDf has been used as the basis for the time-course NTCP evaluation in the thesis. Relative NTCP values are useful for comparative QA checking by normalizing known complications (e.g. reported in the RTOG studies) to specific DVH control points.
For prostate cancer patients, rectal complications were evaluated using specific RTOG clinical trials and a detailed evaluation of the treatment techniques (e.g. dose prescription, DVH, number of beams, beam angles). Treatment plans that did not meet DVH constraints represented additional complication risk. Geometric uncertainties improved or worsened rectal NTCP depending on individual internal organ motion within the patient.
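
    A one-dimensional sketch of the convolution step described in the abstract: the dose including geometric uncertainty is approximated by convolving a static dose profile with the Gaussian positional PDF. The grid spacing and idealized profile are hypothetical; the SD value reuses the reported SDmax in the AP direction:

    ```python
    import numpy as np

    # 1-D sketch of the convolution described above: the delivered dose profile is
    # approximated by convolving the static dose with the Gaussian PDF of positional
    # uncertainty. Grid spacing and the idealized profile are hypothetical; the SD
    # reuses the reported SDmax in the AP direction.
    dx = 1.0                                           # mm voxel spacing
    x = np.arange(-50, 50 + dx, dx)
    static_dose = np.where(np.abs(x) < 25, 70.0, 0.0)  # idealized 70 Gy flat field

    sd = 4.38                                          # mm, SDmax in the AP direction
    kernel = np.exp(-0.5 * (x / sd) ** 2)
    kernel /= kernel.sum()                             # discrete Gaussian PDF, sums to 1

    blurred_dose = np.convolve(static_dose, kernel, mode="same")
    print("dose at field edge (x=25 mm): static", static_dose[x == 25][0],
          "Gy, blurred", round(blurred_dose[x == 25][0], 1), "Gy")
    ```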

  9. Gastrobronchial fistula following minimally invasive esophagectomy for esophageal cancer in a patient with myotonic dystrophy: Case report

    PubMed Central

    Hugin, Silje; Johnson, Egil; Johannessen, Hans-Olaf; Hofstad, Bjørn; Olafsen, Kjell; Mellem, Harald

    2015-01-01

    Introduction Myotonic dystrophies are inherited multisystemic diseases characterized by musculopathy, cardiac arrhythmias and cognitive disorders. These patients are at increased risk for fatal post-surgical complications from pulmonary hypoventilation. We present a patient with myotonic dystrophy and esophageal cancer who had a minimally invasive esophagectomy complicated by gastrobronchial fistulisation. Presentation of case A 44-year-old male with myotonic dystrophy type 1 and esophageal cancer had a minimally invasive esophagectomy performed instead of open surgery in order to reduce the risk for pulmonary complications. At day 15 respiratory failure occurred from a gastrobronchial fistula between the right intermediary bronchus (defect 7–8 mm) and the esophagogastric anastomosis (defect 10 mm). In order to minimize large leakage of air into the gastric conduit, the anastomosis was stented and ventilation maintained at low airway pressures. His general condition improved and allowed extubation at day 29 and stent removal at day 35. Bronchoscopy confirmed that the fistula was healed. The patient was discharged from hospital at day 37 without further complications. Discussion The fistula was probably caused by bronchial necrosis from thermal injury during close dissection using the Ligasure instrument. Fistula treatment by non-surgical intervention was considered safer than surgery, which could be followed by potentially life-threatening respiratory complications. Indications for stenting of gastrobronchial fistulas will be discussed. Conclusions Minimally invasive esophagectomy was performed instead of open surgery in a myotonic dystrophy patient as these patients are particularly vulnerable to respiratory complications. Gastrobronchial fistula, a major complication, was safely treated by stenting and low airway pressure ventilation. PMID:26520033

  10. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
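
    A brief sketch of working with the Johnson SU family as implemented in SciPy (shape parameters below are hypothetical; the paper's moment-matching fit to the 3-layer radiative transfer model is not reproduced):

    ```python
    from scipy import stats

    # Sketch of the Johnson SU family (SciPy parameterization: a, b, loc, scale).
    # The parameter values are hypothetical; the paper's moment-matching fit to
    # the 3-layer radiative transfer model is not reproduced here.
    a, b, loc, scale = -1.0, 1.5, 0.0, 1.0
    dist = stats.johnsonsu(a, b, loc=loc, scale=scale)

    mean, var, skew, kurt = dist.stats(moments="mvsk")
    print(f"mean={float(mean):.3f}, var={float(var):.3f}, "
          f"skew={float(skew):.3f}, excess kurtosis={float(kurt):.3f}")
    print("P(radiance > mean):", 1 - dist.cdf(float(mean)))
    ```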

  11. Effect of Chiranjeevi Yojana on institutional deliveries and neonatal and maternal outcomes in Gujarat, India: a difference-in-differences analysis.

    PubMed

    Mohanan, Manoj; Bauhoff, Sebastian; La Forgia, Gerard; Babiarz, Kimberly Singer; Singh, Kultar; Miller, Grant

    2014-03-01

    To evaluate the effect of the Chiranjeevi Yojana programme, a public-private partnership to improve maternal and neonatal health in Gujarat, India. A household survey (n = 5597 households) was conducted in Gujarat to collect retrospective data on births within the preceding 5 years. In an observational study using a difference-in-differences design, the relationship between the Chiranjeevi Yojana programme and the probability of delivery in health-care institutions, the probability of obstetric complications and mean household expenditure for deliveries was subsequently examined. In multivariate regressions, individual and household characteristics as well as district and year fixed effects were controlled for. Data from the most recent District Level Household and Facility Survey (DLHS-3) wave conducted in Gujarat (n = 6484 households) were used in parallel analyses. Between 2005 and 2010, the Chiranjeevi Yojana programme was not associated with a statistically significant change in the probability of institutional delivery (2.42 percentage points; 95% confidence interval, CI: -5.90 to 10.74) or of birth-related complications (6.16 percentage points; 95% CI: -2.63 to 14.95). Estimates using DLHS-3 data were similar. Analyses of household expenditures indicated that mean household expenditure for private-sector deliveries had either not fallen or had fallen very little under the Chiranjeevi Yojana programme. The Chiranjeevi Yojana programme appears to have had no significant impact on institutional delivery rates or maternal health outcomes. The absence of estimated reductions in household spending for private-sector deliveries deserves further study.
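
    A generic difference-in-differences sketch on synthetic household data, where the coefficient on the treated-by-post interaction estimates the program effect; the study's individual and household controls and district and year fixed effects are not reproduced:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Generic difference-in-differences sketch on synthetic household data.
    # 'treated' marks program eligibility, 'post' the period after rollout; the
    # coefficient on treated:post is the DiD estimate of the program effect.
    rng = np.random.default_rng(6)
    n = 4000
    df = pd.DataFrame({
        "treated": rng.binomial(1, 0.5, n),
        "post": rng.binomial(1, 0.5, n),
    })
    effect = 0.02                                  # hypothetical 2-percentage-point effect
    p = 0.4 + 0.05 * df.treated + 0.03 * df.post + effect * df.treated * df.post
    df["institutional_delivery"] = rng.binomial(1, p)

    model = smf.ols("institutional_delivery ~ treated * post", data=df).fit()
    print("DiD estimate:", round(model.params["treated:post"], 3),
          "SE:", round(model.bse["treated:post"], 3))
    ```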

  12. Models based on value and probability in health improve shared decision making.

    PubMed

    Ortendahl, Monica

    2008-10-01

    Diagnostic reasoning and treatment decisions are a key competence of doctors. A model based on values and probability provides a conceptual framework for clinical judgments and decisions, and also facilitates the integration of clinical and biomedical knowledge into a diagnostic decision. Both value and probability are usually estimated values in clinical decision making. Therefore, model assumptions and parameter estimates should be continually assessed against data, and models should be revised accordingly. Introducing parameter estimates for both value and probability, which usually pertain in clinical work, gives the model labelled subjective expected utility. Estimated values and probabilities are involved sequentially for every step in the decision-making process. Introducing decision-analytic modelling gives a more complete picture of variables that influence the decisions carried out by the doctor and the patient. A model revised for perceived values and probabilities by both the doctor and the patient could be used as a tool for engaging in a mutual and shared decision-making process in clinical work.

  13. Discovering Diabetes Complications: an Ontology Based Model.

    PubMed

    Daghistani, Tahani; Shammari, Riyad Al; Razzak, Muhammad Imran

    2015-12-01

    Diabetes is a serious disease that is spreading dramatically worldwide. A diabetes patient has, on average, a risk of experiencing complications. Taking advantage of recorded information to build an ontology as an information technology solution can help to predict which patients are at risk of a certain complication, and makes it easier to search and present a patient's history regarding different risk factors. Discovering diabetes complications could be useful to prevent or delay them. We designed an ontology-based model, using adult diabetes patients' data, to discover rules linking diabetes to its complications as disease-to-disease relationships. Various rules between different risk factors of diabetes patients and certain complications were generated. Furthermore, new complications (diseases) might be discovered as a new finding of this study; again, discovering diabetes complications could be useful to prevent or delay them. The system can identify patients who have certain risk factors, such as a high body mass index (obesity), and start a control and maintenance plan.

  14. Discovering Diabetes Complications: an Ontology Based Model

    PubMed Central

    Daghistani, Tahani; Shammari, Riyad Al; Razzak, Muhammad Imran

    2015-01-01

    Background: Diabetes is a serious disease that is spreading dramatically worldwide. A diabetes patient has, on average, a risk of experiencing complications. Taking advantage of recorded information to build an ontology as an information technology solution can help to predict which patients are at risk of a certain complication, and makes it easier to search and present a patient's history regarding different risk factors. Discovering diabetes complications could be useful to prevent or delay them. Method: We designed an ontology-based model, using adult diabetes patients' data, to discover rules linking diabetes to its complications as disease-to-disease relationships. Result: Various rules between different risk factors of diabetes patients and certain complications were generated. Furthermore, new complications (diseases) might be discovered as a new finding of this study; again, discovering diabetes complications could be useful to prevent or delay them. Conclusion: The system can identify patients who have certain risk factors, such as a high body mass index (obesity), and start a control and maintenance plan. PMID:26862251

  15. Methods to assess performance of models estimating risk of death in intensive care patients: a review.

    PubMed

    Cook, D A

    2006-04-01

    Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purposes of estimating these probabilities. The literature contains a range of approaches for assessment which are reviewed and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment and recommendations are made for evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
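
    The two attributes can be computed directly; the sketch below uses synthetic predictions rather than any real severity-of-illness model, with the C-statistic for discrimination and a binned observed-versus-predicted summary for calibration:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from sklearn.calibration import calibration_curve

    # Sketch of the two attributes discussed: discrimination (C-statistic / ROC AUC)
    # and calibration (observed vs predicted mortality in risk bins), on synthetic
    # ICU-style predictions rather than any real severity-of-illness model.
    rng = np.random.default_rng(7)
    true_p = rng.beta(2, 8, 1000)                  # "true" mortality risks
    died = rng.binomial(1, true_p)
    predicted = np.clip(true_p + rng.normal(0, 0.05, 1000), 0.001, 0.999)

    print("C-statistic:", round(roc_auc_score(died, predicted), 3))
    obs, pred = calibration_curve(died, predicted, n_bins=5)
    for o, p in zip(obs, pred):
        print(f"predicted {p:.2f} -> observed {o:.2f}")
    ```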

  16. Participatory Games: Experiential learning to bridge disciplines

    NASA Astrophysics Data System (ADS)

    Coughlan, E.; Suarez, P.; Mendler de Suarez, J.; Bachofen, C.

    2014-12-01

    While the benefits of multi-disciplinary education have been extolled, there is more to success than producing students who are able to articulate the theorems of all pertinent disciplines. Here, we will describe case studies in which participatory scenario exercises and games can make the difference between memorizing information from an "outside" discipline, and actually internalizing the priorities and complications of the issue from an alien perspective. Case studies include teaching Red Cross community-based volunteers the Probability Distribution Function of seasonal rainfall forecasts, as well as requiring students of Columbia University's Master's Program in Climate and Society to study both natural and social aspects of climate. Games create a model system of the world, in which players assume a role and make decisions with consequences, facing complex feedback loops. Taking such roles catalyzes "AHA" moments that effectively bring home the intricacies of disciplinary paradigms outside of one's own.

  17. CT guided interstitial therapy of pancreatic carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haaga, J.R.; Owens, D.B.; Kellermeyer, R.W.

    1987-11-01

    We describe the use of percutaneous CT guidance for localization and placement of ¹⁹²Ir sources into a patient with pancreatic carcinoma. We have shown the feasibility of this procedure and the lack of complications, which is probably due to minimal damage to the tissue involved. Computed tomography is ideally suited for percutaneous implantation because it provides the most accurate method for needle placement within the abdomen.

  18. Classical-Quantum Correspondence by Means of Probability Densities

    NASA Technical Reports Server (NTRS)

    Vegas, Gabino Torres; Morales-Guzman, J. D.

    1996-01-01

    Within the framework of the recently introduced phase space representation of nonrelativistic quantum mechanics, we propose a Lagrangian from which the phase space Schrödinger equation can be derived. From that Lagrangian, the associated conservation equations, according to Noether's theorem, are obtained. This shows that one can analyze quantum systems completely in phase space as is done in coordinate space, without additional complications.

  19. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
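
    A sketch of the basic constant-occupancy, constant-detection likelihood, which is the zero-inflated binomial form the article builds on (the heterogeneous-detection mixtures it analyzes are not implemented here):

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import binom

    # Sketch of the basic occupancy likelihood: each site is occupied with probability
    # psi; if occupied, detections across J visits are Binomial(J, p). This is the
    # zero-inflated binomial the article builds on; the heterogeneous detection
    # mixtures it analyzes are not implemented here.
    rng = np.random.default_rng(8)
    n_sites, J, psi_true, p_true = 200, 5, 0.6, 0.3
    occupied = rng.binomial(1, psi_true, n_sites)
    detections = rng.binomial(J, p_true * occupied)

    def neg_log_lik(theta):
        psi, p = 1 / (1 + np.exp(-theta))          # parameters on the logit scale
        lik = psi * binom.pmf(detections, J, p)
        lik += (1 - psi) * (detections == 0)       # zero inflation from unoccupied sites
        return -np.sum(np.log(lik))

    fit = minimize(neg_log_lik, x0=[0.0, 0.0])
    psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
    print(f"psi_hat={psi_hat:.2f}, p_hat={p_hat:.2f}")
    ```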

  20. Sri Lankan FRAX model and country-specific intervention thresholds.

    PubMed

    Lekamwasam, Sarath

    2013-01-01

    There is a wide variation in fracture probabilities estimated by Asian FRAX models, although the outputs of South Asian models are concordant. Clinicians can choose either fixed or age-specific intervention thresholds when making treatment decisions in postmenopausal women. The cost-effectiveness of such an approach, however, needs to be addressed. This study examined suitable fracture probability intervention thresholds (ITs) for Sri Lanka, based on the Sri Lankan FRAX model. Fracture probabilities were estimated using all Asian FRAX models for a postmenopausal woman with a BMI of 25 kg/m² and no clinical risk factors apart from a fragility fracture, and the outputs were compared. Age-specific ITs were estimated based on the Sri Lankan FRAX model using the method followed by the National Osteoporosis Guideline Group in the UK. Using the age-specific ITs as the reference standard, suitable fixed ITs were also estimated. Fracture probabilities estimated by different Asian FRAX models varied widely. The Japanese and Taiwanese models showed higher fracture probabilities, while the Chinese, Philippine, and Indonesian models gave lower fracture probabilities. Outputs of the remaining FRAX models were generally similar. Age-specific ITs of major osteoporotic fracture probabilities (MOFP) based on the Sri Lankan FRAX model varied from 2.6 to 18% between 50 and 90 years. ITs of hip fracture probabilities (HFP) varied from 0.4 to 6.5% between 50 and 90 years. In finding fixed ITs, an MOFP of 11% and an HFP of 3.5% gave the lowest misclassification and highest agreement. The Sri Lankan FRAX model behaves similarly to other Asian FRAX models such as the Indian, Singapore-Indian, Thai, and South Korean models. Clinicians may use either the fixed or age-specific ITs in making therapeutic decisions in postmenopausal women. The economic aspects of such decisions, however, need to be considered.

  1. Delayed complications of sulfur mustard poisoning in the skin and the immune system of Iranian veterans 16-20 years after exposure.

    PubMed

    Hefazi, Mehrdad; Maleki, Masoud; Mahmoudi, Mahmoud; Tabatabaee, Abbas; Balali-Mood, Mahdi

    2006-09-01

    Extensive cutaneous burns caused by the alkylating chemical warfare agent sulfur mustard (SM) have been associated with severe suppression of the immune system in humans. We aimed to study the association between late cutaneous and immunological complications of SM poisoning. Skin examination was performed on all SM-poisoned Iranian veterans in the province of Khorasan, Iran, who had significant clinical complications, and their SM intoxication was confirmed by toxicological analysis. Light microscopy was performed on eight skin biopsies. Blood cell counts and serum immunoglobulin and complement factor analyses, as well as flow cytometric analyses, were performed on all the patients. The severity of cutaneous complications was classified into four grades and compared with hematological and immunological parameters, using Spearman's rank correlation test. Forty male subjects, confirmed with SM poisoning 16-20 years earlier, were studied. The main objective findings were hyperpigmentation (55%), dry skin (40%), multiple cherry angiomas (37.5%), atrophy (27.5%), and hypopigmentation (25%). Histopathologic findings were nonspecific and compatible with hyperpigmented old atrophic scars. Except for the hematocrit and C4 levels, hematological and immunological parameters revealed no significant correlation with the severity grades of cutaneous complications. Sulfur mustard is an alkylating agent with prolonged adverse effects on both the skin and the immune system. Although the skin is a major transporting system for SM's systemic absorption, there is probably no correlation between the severity of late cutaneous and immunological complications of SM poisoning.

  2. A Young Child with Eosinophilia, Rash, and Multisystem Illness: Drug Rash, Eosinophilia, and Systemic Symptoms Syndrome After Receipt of Fluoxetine.

    PubMed

    Vignesh, Pandiarajan; Kishore, Janak; Kumar, Ankur; Vinay, Keshavamurthy; Dogra, Sunil; Sreedharanunni, Sreejesh; Prasun Giri, Prabhas; Pal, Priyankar; Ghosh, Apurba

    2017-05-01

    Drug rash, eosinophilia, and systemic symptoms (DRESS) syndrome is a severe systemic hypersensitivity reaction that usually occurs within 6 weeks of exposure to the offending drug. Diagnosis is usually straightforward in patients with pyrexia, skin rash, hepatitis, and eosinophilia with a preceding history of exposure to agents often associated with DRESS syndrome, such as aromatic anticonvulsants and sulfa drugs, but diagnosis of DRESS may still be a challenge. We report a 4-year-old child with probable DRESS syndrome complicated by multiple hematologic complications that developed 1 month after exposure to fluoxetine, a drug not known to be associated with such severe reactions. © 2017 Wiley Periodicals, Inc.

  3. Gut failure in critical care: old school versus new school

    PubMed Central

    Sertaridou, Eleni; Papaioannou, Vasilios; Kolios, George; Pneumatikos, Ioannis

    2015-01-01

    The concept of bacterial translocation and gut-origin sepsis as causes of systemic infectious complications and multiple organ deficiency syndrome in surgical and critically ill patients has been a recurring issue over the last decades attracting the scientific interest. Although gastrointestinal dysfunction seemingly arises frequently in intensive care unit patients, it is usually underdiagnosed or underestimated, because the pathophysiology involved is incompletely understood and its exact clinical relevance still remains controversial with an unknown yet probably adverse impact on the patients’ outcome. The purpose of this review is to define gut-origin sepsis and related terms, to describe the mechanisms leading to gut-derived complications, and to illustrate the therapeutic options to prevent or limit these untoward processes. PMID:26130136

  4. [Ischemic stroke in childhood. A complication of tonsillectomy].

    PubMed

    Matilla Álvarez, A; García Serrano, E; González de la Huebra Labrador, T; Morales Martín, A C; Yusta Martín, G; Vaquero Roncero, L M

    2016-02-01

    Tonsillectomy is one of the most frequently performed otorhinolaryngological procedures on children. Postoperative complications are classified as primary or intermediate, which generally appear within 24 h, and secondary or delayed, which appear after 48 h. We present the case of an ischemic stroke in a 3-year-old boy following tonsillectomy, diagnosed in the immediate postoperative period. Using brain echo-doppler and angio-CT, an intraluminal clot was observed in the left internal carotid artery, probably as a result of direct vessel injury during arterial ligature for hemostasis. Copyright © 2015 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España, S.L.U. All rights reserved.

  5. MAP Fault Localization Based on Wide Area Synchronous Phasor Measurement Information

    NASA Astrophysics Data System (ADS)

    Zhang, Yagang; Wang, Zengping

    2015-02-01

    In research on complicated electrical engineering systems, the emergence of phasor measurement units (PMUs) is a landmark event. The establishment and application of wide area measurement systems (WAMS) in power systems has had a widespread and profound influence on the safe and stable operation of complicated power systems. In this paper, taking full advantage of the wide area synchronous phasor measurement information provided by PMUs, we carry out precise fault localization based on the principle of maximum a posteriori (MAP) probability. A large number of simulation experiments confirm that the results of MAP fault localization are accurate and reliable. Even in the presence of white Gaussian noise, the MAP classification results remain consistent with the actual situation.
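
    A toy illustration of the MAP rule (hypothetical numbers, unrelated to the paper's network or measurements): each candidate fault location has a prior and predicts a residual for the synchronized measurement, and the located fault maximizes the product of prior and Gaussian likelihood:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Toy MAP fault-location sketch: candidate fault hypotheses have prior
    # probabilities, and each predicts a residual for the synchronized PMU
    # measurement; the located fault maximizes prior * likelihood. All numbers
    # are hypothetical and unrelated to the paper's network or data.
    priors = np.array([0.25, 0.25, 0.25, 0.25])      # four candidate line sections
    predicted_residual = np.array([0.0, 0.8, 1.6, 2.4])
    sigma = 0.3                                      # SD of the Gaussian measurement noise

    measured = 1.5                                   # observed residual from PMUs
    posterior = priors * norm.pdf(measured, loc=predicted_residual, scale=sigma)
    posterior /= posterior.sum()
    print("posterior over sections:", posterior.round(3))
    print("MAP fault location: section", int(np.argmax(posterior)))
    ```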

  6. Neurological complications of human immunodeficiency virus infection.

    PubMed Central

    Kennedy, P. G.

    1988-01-01

    The protean neurological manifestations of human immunodeficiency virus (HIV) infection are reviewed. Both the central nervous system and peripheral nervous system may be affected and many of the complications may occur in individuals with acquired immunodeficiency syndrome (AIDS)-related complex, or who are seropositive for HIV alone as well as those with the established AIDS syndrome. Specific therapy is available for certain of these neurological conditions, but the clinical course in others is untreatable and progressive. Although it seems likely that the pathogenesis of some of these syndromes such as the AIDS-dementia complex are due to the direct effect of HIV on the nervous system, in others the neurological injury probably occurs as a consequence of the immunosuppression which HIV induces, or immune-mediated mechanisms. PMID:3050940

  7. Reduced rates of non-union with modified periacetabular osteotomy using peracetic-acid sterilized cancellous allografts.

    PubMed

    Wassilew, Georgi I; Janz, Viktor; Renner, Lisa; Perka, Carsten; Pruss, Axel

    2016-12-01

    The objective of the present study was to analyze the clinical and radiological results of periacetabular osteotomies (PAO) using Kirschner wire fixation and an allogeneic cancellous bone graft. This retrospective cohort study included 73 patients (85 PAOs). The allografts were processed from the distal femur of cadaveric donors, defatted, sterilized with a peracetic-acid ethanol solution and freeze-dried. The clinical outcome, as measured by the Harris Hip Scores (HHS), the complication rate and the acetabular correction, as measured by radiological parameters, were compared. The postoperative femoral head coverage and HHS were significantly improved. Major complications occurred in five cases (6 %), but in no case did we observe a non-union or a graft-associated adverse effect. Fixation of the acetabular fragment with Kirschner wires in combination with an allogeneic cancellous bone graft is a safe method with a low complication rate and no loss of correction, and can prevent the occurrence of non-union with a high degree of probability.

  8. [Low-dose aspirin in patients with diabetes mellitus: risks and benefits regarding macro- and microvascular complications].

    PubMed

    Camargo, Eduardo G; Gross, Jorge Luiz; Weinert, Letícia S; Lavinsky, Joel; Silveiro, Sandra P

    2007-04-01

    Aspirin is recommended for cardiovascular disease prevention in patients with diabetes mellitus. Due to the increased risk of bleeding and the hypothesis that aspirin could worsen microvascular complications, an important underutilization of the drug has been observed. However, it is now known that aspirin is not associated with a deleterious effect on diabetic retinopathy, and there is evidence indicating that it also does not affect renal function at usual doses (150 mg/d). On the other hand, higher doses may prove necessary, since recent data suggest that diabetic patients present so-called "aspirin resistance". The mechanisms of this resistance are not yet fully understood, but are probably related to abnormal intrinsic platelet activity. The employment of alternative antiplatelet strategies or the administration of higher aspirin doses (150-300 mg/d) should be better evaluated regarding effective cardiovascular disease prevention in diabetes as well as the possible effects on microvascular complications.

  9. Examination of the gamma equilibrium point hypothesis when applied to single degree of freedom movements performed with different inertial loads.

    PubMed

    Bellomo, A; Inbar, G

    1997-01-01

    One of the theories of human motor control is the gamma Equilibrium Point Hypothesis. It is an attractive theory since it offers an easy control scheme where the planned trajectory shifts monotonically from an initial to a final equilibrium state. The feasibility of this model was tested by reconstructing the virtual trajectory and the stiffness profiles for movements performed with different inertial loads and examining them. Three types of movements were tested: passive movements, targeted movements, and repetitive movements. Each of the movements was performed with five different inertial loads. Plausible virtual trajectories and stiffness profiles were reconstructed based on the gamma Equilibrium Point Hypothesis for the three different types of movements performed with different inertial loads. However, the simple control strategy supported by the model, where the planned trajectory shifts monotonically from an initial to a final equilibrium state, could not be supported for targeted movements performed with added inertial load. To test the feasibility of the model further, we must examine the probability that the human motor control system would choose a control trajectory more complicated than the actual trajectory.

  10. Smoking, death, and Alzheimer disease: a case of competing risks.

    PubMed

    Chang, Chung-Chou H; Zhao, Yongyun; Lee, Ching-Wen; Ganguli, Mary

    2012-01-01

    If smoking is a risk factor for Alzheimer disease (AD) but a smoker dies of another cause before developing or manifesting AD, smoking-related mortality may mask the relationship between smoking and AD. This phenomenon, referred to as competing risk, complicates efforts to model the effect of smoking on AD. Typical survival regression models assume that censoring from the analysis is unrelated to an individual's probability of developing AD (ie, censoring is noninformative). However, if individuals who die before developing AD are younger than those who survive long enough to develop AD, and if they include a higher percentage of smokers than nonsmokers, the incidence of AD will appear to be higher in older individuals and in nonsmokers. Further, age-specific mortality rates are higher in smokers because they die earlier than nonsmokers. Therefore, if we fail to take into account the competing risk of death when we estimate the effect of smoking on AD, we bias the results and are in fact only comparing the incidence of AD in nonsmokers with that in the healthiest smokers. In this study, we demonstrate that the effect of smoking on AD differs in models that are and are not adjusted for competing risks.
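
    A small simulation (illustrative only, with made-up hazards) of the competing-risk point: even when smoking raises the AD hazard, higher other-cause mortality among smokers can make naively observed AD incidence look similar in the two groups:

    ```python
    import numpy as np

    # Illustrative competing-risks simulation (not the study's data or model):
    # smoking raises the hazard of AD, but raises the hazard of death from other
    # causes even more. Counting AD only when it is observed before death masks
    # the true effect. All hazard values are hypothetical.
    rng = np.random.default_rng(9)
    n = 100_000
    smoker = rng.binomial(1, 0.5, n)

    # Exponential times (in years) to AD onset and to death from other causes.
    t_ad = rng.exponential(1 / np.where(smoker, 0.020, 0.015))     # smoking raises AD hazard
    t_death = rng.exponential(1 / np.where(smoker, 0.060, 0.030))  # and mortality even more

    follow_up = 20.0
    observed_ad = (t_ad < t_death) & (t_ad < follow_up)  # AD seen only if it precedes death
    for grp, name in ((0, "non-smokers"), (1, "smokers")):
        print(name, "observed 20-yr AD incidence:",
              round(observed_ad[smoker == grp].mean(), 3))
    ```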

  11. Unexpected Complications of Novel Deep Brain Stimulation Treatments: Ethical Issues and Clinical Recommendations

    PubMed Central

    Cheeran, Binith; Pugh, Jonathan; Pycroft, Laurie; Boccard, Sandra; Prangnell, Simon; Green, Alexander L.; FitzGerald, James; Savulescu, Julian; Aziz, Tipu

    2017-01-01

    Background Innovative neurosurgical treatments present a number of known risks, the natures and probabilities of which can be adequately communicated to patients via the standard procedures governing obtaining informed consent. However, due to their novelty, these treatments also come with unknown risks, which require an augmented approach to obtaining informed consent. Objective This paper aims to discuss and provide concrete procedural guidance on the ethical issues raised by serious unexpected complications of novel deep brain stimulation treatments. Approach We illustrate our analysis using a case study of the unexpected development of recurrent stereotyped events in patients following the use of deep brain stimulation (DBS) to treat severe chronic pain. Examining these unexpected complications in light of medical ethical principles, we argue that serious complications of novel DBS treatments do not necessarily make it unethical to offer the intervention to eligible patients. However, the difficulty the clinician faces in determining whether the intervention is in the patient's best interests generates reasons to take extra steps to promote the autonomous decision making of these patients. Conclusion and recommendations We conclude with clinical recommendations, including details of an augmented consent process for novel DBS treatment. PMID:28557242

  12. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    PubMed

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies describing the development of seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended when building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of the models, including univariate pre-screening of variables, categorization of continuous risk predictors, and model validation. The use of inappropriate methods affects the reliability and the accuracy of the probability estimates of predicting postoperative pancreatic fistula.

  13. A combined ultrasound and clinical scoring model for the prediction of peripartum complications in pregnancies complicated by placenta previa.

    PubMed

    Yoon, So-Yeon; You, Ji Yeon; Choi, Suk-Joo; Oh, Soo-Young; Kim, Jong-Hwa; Roh, Cheong-Rae

    2014-09-01

    To generate a combined ultrasound and clinical model predictive of peripartum complications in pregnancies complicated by placenta previa. This study included 110 singleton pregnant women with placenta previa delivered by cesarean section (CS) from July 2011 to November 2013. We prospectively collected ultrasound and clinical data before CS and observed the occurrence of blood transfusion, uterine artery embolization and cesarean hysterectomy. We formulated a scoring model including type of previa (0: partialis, 2: totalis), lacunae (0: none, 1: 1-3, 2: 4-6, 3: whole), uteroplacental hypervascularity (0: normal, 1: moderate, 2: severe), multiparity (0: no, 1: yes), history of CS (0: none, 1: once, 2: ≥ twice) and history of placenta previa (0: no, 1: yes) to predict the risk of peripartum complications. In our study population, the risks of perioperative transfusion, uterine artery embolization, and cesarean hysterectomy were 26.4%, 1.8% and 6.4%, respectively. The type of previa, lacunae, uteroplacental hypervascularity, parity, history of CS, and history of placenta previa were associated with complications in univariable analysis. However, no factor was independently predictive of any complication in exact logistic regression analysis. Using the scoring model, we found that the total score was significantly correlated with perioperative transfusion, cesarean hysterectomy and the composite complication (p<0.0001, Cochran-Armitage test). Notably, all patients with a total score ≥7 needed cesarean hysterectomy. When the total score was ≥6, three-fourths of patients needed blood transfusion. This combined scoring model may provide useful information for the prediction of peripartum complications in women with placenta previa. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
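
    The point scheme described in this record is simple enough to tally directly. The sketch below is a minimal illustration that reproduces the component weights quoted in the abstract; the threshold comments only restate the reported observations for total scores ≥6 and ≥7 and should not be read as a validated decision rule.

    ```python
    # Minimal sketch of the combined ultrasound/clinical scoring scheme described
    # above. Point values follow the abstract; the notes for scores >=6 and >=7
    # simply restate the reported observations, not a validated decision rule.

    def previa_risk_score(previa_totalis, lacunae_grade, hypervascularity_grade,
                          multiparous, prior_cesareans, prior_previa):
        """Return the total peripartum-complication score (0-10)."""
        score = 0
        score += 2 if previa_totalis else 0   # type of previa: partialis 0, totalis 2
        score += lacunae_grade                # 0 none, 1: 1-3, 2: 4-6, 3: whole placenta
        score += hypervascularity_grade       # 0 normal, 1 moderate, 2 severe
        score += 1 if multiparous else 0
        score += min(prior_cesareans, 2)      # 0 none, 1 once, 2 for >= twice
        score += 1 if prior_previa else 0
        return score

    if __name__ == "__main__":
        s = previa_risk_score(previa_totalis=True, lacunae_grade=2,
                              hypervascularity_grade=1, multiparous=True,
                              prior_cesareans=1, prior_previa=False)
        print("total score:", s)   # e.g., 7 -> in the study cohort all such
                                   # patients required cesarean hysterectomy
    ```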

  14. Prediction Models for 30-Day Mortality and Complications After Total Knee and Hip Arthroplasties for Veteran Health Administration Patients With Osteoarthritis.

    PubMed

    Harris, Alex Hs; Kuo, Alfred C; Bowe, Thomas; Gupta, Shalini; Nordin, David; Giori, Nicholas J

    2018-05-01

    Statistical models to preoperatively predict patients' risk of death and major complications after total joint arthroplasty (TJA) could improve the quality of preoperative management and informed consent. Although risk models for TJA exist, they have limitations including poor transparency and/or unknown or poor performance. Thus, it is currently impossible to know how well currently available models predict short-term complications after TJA, or if newly developed models are more accurate. We sought to develop and conduct cross-validation of predictive risk models, and report details and performance metrics as benchmarks. Over 90 preoperative variables were used as candidate predictors of death and major complications within 30 days for Veterans Health Administration patients with osteoarthritis who underwent TJA. Data were split into 3 samples: for selection of model tuning parameters, model development, and cross-validation. C-indexes (discrimination) and calibration plots were produced. A total of 70,569 patients diagnosed with osteoarthritis who received primary TJA were included. C-statistics and bootstrapped confidence intervals for the cross-validation of the boosted regression models were highest for cardiac complications (0.75; 0.71-0.79) and 30-day mortality (0.73; 0.66-0.79) and lowest for deep vein thrombosis (0.59; 0.55-0.64) and return to the operating room (0.60; 0.57-0.63). Moderately accurate predictive models of 30-day mortality and cardiac complications after TJA in Veterans Health Administration patients were developed and internally cross-validated. By reporting model coefficients and performance metrics, other model developers can test these models on new samples and have a procedure and indication-specific benchmark to surpass. Published by Elsevier Inc.

  15. Probability models for growth and aflatoxin B1 production as affected by intraspecies variability in Aspergillus flavus.

    PubMed

    Aldars-García, Laila; Berman, María; Ortiz, Jordi; Ramos, Antonio J; Marín, Sonia

    2018-06-01

    The probability of growth and aflatoxin B1 (AFB1) production of 20 isolates of Aspergillus flavus was studied using a full factorial design with eight water activity levels (0.84-0.98 aw) and six temperature levels (15-40 °C). Binary data obtained from growth studies were modelled using linear logistic regression analysis as a function of temperature, water activity and time for each isolate. In parallel, AFB1 was extracted at different times from newly formed colonies (up to 20 mm in diameter). Although a total of 950 AFB1 values over time for all conditions studied were recorded, they were not considered to be enough to build probability models over time, and therefore, only models at 30 days were built. The confidence intervals of the regression coefficients of the probability of growth models showed some differences among the 20 growth models. Further, to assess the growth/no-growth and AFB1/no-AFB1 production boundaries, 0.05 and 0.5 probabilities were plotted at 30 days for all of the isolates. The boundaries for growth and AFB1 showed that, in general, the conditions for growth were wider than those for AFB1 production. The probability of growth and AFB1 production seemed to be less variable among isolates than AFB1 accumulation. Apart from the AFB1 production probability models, using growth probability models for AFB1 probability predictions could be, although conservative, a suitable alternative. Predictive mycology should include a number of isolates to generate data to build predictive models and take into account the genetic diversity of the species and thus make predictions as similar as possible to real fungal food contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.
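
    For readers who want to reproduce the modelling step, the sketch below fits a linear logistic growth/no-growth model in temperature, water activity and time to synthetic data and reads off the 0.05 and 0.5 probability boundaries. It is an illustration under assumed coefficients, not the study's per-isolate models.

    ```python
    # Illustrative sketch (synthetic data): a linear logistic growth/no-growth model
    # in temperature, water activity and time, of the form used per isolate above.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 600
    temp = rng.uniform(15, 40, n)          # degrees C
    aw = rng.uniform(0.84, 0.98, n)        # water activity
    day = rng.integers(1, 31, n)           # incubation time, days

    # Hypothetical "true" response used only to generate binary growth outcomes.
    logit = -60 + 0.25 * temp + 55 * aw + 0.15 * day
    growth = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([temp, aw, day])
    model = LogisticRegression(C=1e6, max_iter=10000).fit(X, growth)

    # Water-activity boundary where P(growth) crosses 0.05 and 0.5 at 30 C, day 30.
    b0 = model.intercept_[0]
    bt, ba, bd = model.coef_[0]
    for p in (0.05, 0.5):
        aw_boundary = (np.log(p / (1 - p)) - b0 - bt * 30 - bd * 30) / ba
        print(f"P={p}: aw boundary at 30 C, day 30 ~ {aw_boundary:.3f}")
    ```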

  16. Comparison of the Mortality Probability Admission Model III, National Quality Forum, and Acute Physiology and Chronic Health Evaluation IV hospital mortality models: implications for national benchmarking*.

    PubMed

    Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E

    2014-03-01

    To examine the accuracy of the original Mortality Probability Admission Model III, ICU Outcomes Model/National Quality Forum modification of Mortality Probability Admission Model III, and Acute Physiology and Chronic Health Evaluation IVa models for comparing observed and risk-adjusted hospital mortality predictions. Retrospective paired analyses of day 1 hospital mortality predictions using three prognostic models. Fifty-five ICUs at 38 U.S. hospitals from January 2008 to December 2012. Among 174,001 intensive care admissions, 109,926 met model inclusion criteria and 55,304 had data for mortality prediction using all three models. None. We compared patient exclusions and the discrimination, calibration, and accuracy for each model. Acute Physiology and Chronic Health Evaluation IVa excluded 10.7% of all patients, ICU Outcomes Model/National Quality Forum 20.1%, and Mortality Probability Admission Model III 24.1%. Discrimination of Acute Physiology and Chronic Health Evaluation IVa was superior, with an area under the receiver operating characteristic curve of 0.88 compared with Mortality Probability Admission Model III (0.81) and ICU Outcomes Model/National Quality Forum (0.80). Acute Physiology and Chronic Health Evaluation IVa was better calibrated (lowest Hosmer-Lemeshow statistic). The accuracy of Acute Physiology and Chronic Health Evaluation IVa was superior (adjusted Brier score = 31.0%) to that for Mortality Probability Admission Model III (16.1%) and ICU Outcomes Model/National Quality Forum (17.8%). Compared with observed mortality, Acute Physiology and Chronic Health Evaluation IVa overpredicted mortality by 1.5% and Mortality Probability Admission Model III by 3.1%; ICU Outcomes Model/National Quality Forum underpredicted mortality by 1.2%. Calibration curves showed that Acute Physiology and Chronic Health Evaluation performed well over the entire risk range, unlike the Mortality Probability Admission Model and ICU Outcomes Model/National Quality Forum models. Acute Physiology and Chronic Health Evaluation IVa had better accuracy within patient subgroups and for specific admission diagnoses. Acute Physiology and Chronic Health Evaluation IVa offered the best discrimination and calibration on a large common dataset and excluded fewer patients than Mortality Probability Admission Model III or ICU Outcomes Model/National Quality Forum. The choice of ICU performance benchmarks should be based on a comparison of model accuracy using data for identical patients.
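
    The three performance dimensions compared in this record (discrimination, calibration, accuracy) can be computed from any set of predicted probabilities. The sketch below is a generic illustration on synthetic predictions; the severity-of-illness scoring systems themselves are not reimplemented here.

    ```python
    # Sketch of the comparison dimensions used above -- discrimination (AUC),
    # calibration (Hosmer-Lemeshow-style decile table), and accuracy (Brier score) --
    # applied to synthetic predicted hospital-mortality probabilities.
    import numpy as np
    from sklearn.metrics import roc_auc_score, brier_score_loss

    rng = np.random.default_rng(0)
    true_p = rng.beta(2, 18, 5000)                  # synthetic "true" mortality risks
    died = rng.random(5000) < true_p
    pred_p = np.clip(true_p + rng.normal(0, 0.05, 5000), 0.001, 0.999)  # one model's predictions

    print("AUC   :", round(roc_auc_score(died, pred_p), 3))
    print("Brier :", round(brier_score_loss(died, pred_p), 4))

    # Hosmer-Lemeshow-style summary: observed vs expected deaths by risk decile.
    deciles = np.quantile(pred_p, np.linspace(0, 1, 11))
    bins = np.clip(np.digitize(pred_p, deciles[1:-1]), 0, 9)
    for b in range(10):
        mask = bins == b
        print(f"decile {b}: observed={died[mask].sum():4d}  expected={pred_p[mask].sum():7.1f}")
    ```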

  17. Time‐dependent renewal‐model probabilities when date of last earthquake is unknown

    USGS Publications Warehouse

    Field, Edward H.; Jordan, Thomas H.

    2015-01-01

    We derive time-dependent, renewal-model earthquake probabilities for the case in which the date of the last event is completely unknown, and compare these with the time-independent Poisson probabilities that are customarily used as an approximation in this situation. For typical parameter values, the renewal-model probabilities exceed Poisson results by more than 10% when the forecast duration exceeds ~20% of the mean recurrence interval. We also derive probabilities for the case in which the last event is further constrained to have occurred before historical record keeping began (the historic open interval), which can only serve to increase earthquake probabilities for typically applied renewal models. We conclude that accounting for the historic open interval can improve long-term earthquake rupture forecasts for California and elsewhere.
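
    The comparison described above can be illustrated numerically. The sketch below uses a lognormal recurrence distribution as a stand-in renewal model (the paper's own renewal models are not reproduced), averages over the equilibrium distribution of elapsed time since the unknown last event, and compares the result with the Poisson approximation; all parameter values are hypothetical.

    ```python
    # Numerical illustration (not the paper's exact computation): probability of an
    # event in the next dT years when the date of the last event is unknown, using a
    # lognormal recurrence model as a stand-in, versus the Poisson approximation.
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    mean_recurrence = 200.0      # years (hypothetical)
    aperiodicity = 0.5           # coefficient of variation (hypothetical)

    # Lognormal with the requested mean and coefficient of variation.
    sigma = np.sqrt(np.log(1 + aperiodicity**2))
    mu = np.log(mean_recurrence) - 0.5 * sigma**2
    rec = stats.lognorm(s=sigma, scale=np.exp(mu))

    def renewal_prob_unknown_date(dT):
        # Elapsed time since the last event follows the equilibrium density S(t)/mean,
        # so P(event in dT) = (1/mean) * integral of [F(t+dT) - F(t)] dt.
        integrand = lambda t: rec.cdf(t + dT) - rec.cdf(t)
        val, _ = quad(integrand, 0, 20 * mean_recurrence, limit=200)
        return val / mean_recurrence

    for dT in (10, 30, 50, 100):
        poisson = 1 - np.exp(-dT / mean_recurrence)
        print(f"dT={dT:3d} yr  renewal={renewal_prob_unknown_date(dT):.3f}  Poisson={poisson:.3f}")
    ```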

  18. The anticipation and management of air leaks and residual spaces post lung resection

    PubMed Central

    Marzluf, Beatrice A.

    2014-01-01

    The incidence of any kind of air leak after lung resection is reportedly around 50% of patients. The majority of these leaks do not require any specific intervention and cease within a few hours or days. The recent literature defines a prolonged air leak (PAL) as an air leak lasting beyond postoperative day 5. PAL is associated with a generally worse outcome, with a more complicated postoperative course, prolonged hospital stay and increased costs. Some authors therefore consider any PAL a surgical complication. PAL is the most prevalent postoperative complication following lung resection and the most important determinant of postoperative length of hospital stay. A low predicted postoperative forced expiratory volume in 1 second (ppoFEV1) and upper lobe disease have been identified as significant risk factors involved in developing air leaks. Infectious conditions have also been reported to increase the risk of PAL. In contrast to the problem of PAL, there is only limited information in the literature regarding apical spaces after lung resection, probably because this common finding rarely leads to clinical consequences. This article addresses the pathogenesis of PAL and apical spaces, their prediction, prevention and treatment, with a special focus on surgery for infectious conditions. Different predictive models to identify patients at higher risk for the development of PAL are provided. The discussion of surgical treatment options includes the use of pneumoperitoneum, blood patch, intrabronchial valves (IBV) and the flutter valve, and addresses the old question of whether or not to apply suction to chest tubes. The discussed prophylactic armamentarium comprises pleural tenting, prophylactic intraoperative pneumoperitoneum, sealing of the lung, buttressing of staple lines, capitonnage after resection of hydatid cysts, and plastic surgical options. PMID:24624291

  19. Multi-institutional analysis of radiation modality use and postoperative outcomes of neoadjuvant chemoradiation for esophageal cancer.

    PubMed

    Lin, Steven H; Merrell, Kenneth W; Shen, Jincheng; Verma, Vivek; Correa, Arlene M; Wang, Lu; Thall, Peter F; Bhooshan, Neha; James, Sarah E; Haddock, Michael G; Suntharalingam, Mohan; Mehta, Minesh P; Liao, Zhongxing; Cox, James D; Komaki, Ritsuko; Mehran, Reza J; Chuong, Michael D; Hallemeier, Christopher L

    2017-06-01

    Relative radiation dose exposure to vital organs in the thorax could influence clinical outcomes in esophageal cancer (EC). We assessed whether the type of radiation therapy (RT) modality used was associated with postoperative outcomes after neoadjuvant chemoradiation (nCRT). Contemporary data from 580 EC patients treated with nCRT at 3 academic institutions from 2007 to 2013 were reviewed. 3D conformal RT (3D), intensity modulated RT (IMRT) and proton beam therapy (PBT) were used for 214 (37%), 255 (44%), and 111 (19%) patients, respectively. Postoperative outcomes included pulmonary, GI, cardiac, and wound healing complications, length of in-hospital stay (LOS), and 90-day postoperative mortality. Cox model fits and log-rank tests, both with and without inverse probability of treatment weighting (IPW), were used to correct for bias due to non-randomization. RT modality was significantly associated with the incidence of pulmonary, cardiac and wound complications, which was also borne out on multivariate analysis. Mean LOS was also significantly associated with treatment modality: 13.2 days for 3D (95% CI 11.7-14.7), 11.6 days for IMRT (95% CI 10.9-12.7), and 9.3 days for PBT (95% CI 8.2-10.3) (p<0.0001). The 90-day postoperative mortality rates were 4.2%, 4.3%, and 0.9%, respectively, for 3D, IMRT and PBT (p=0.264). Advanced RT technologies (IMRT and PBT) were associated with a significantly reduced rate of postoperative complications and shorter LOS compared to 3D, with PBT displaying the greatest benefit across a number of clinical endpoints. An ongoing prospective randomized trial will be needed to validate these results. Copyright © 2017 Elsevier B.V. All rights reserved.
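
    The inverse probability of treatment weighting (IPW) step mentioned above is commonly implemented with propensity scores from a logistic model. The sketch below, with made-up covariates and toy data, shows one way stabilized weights could be formed before being passed to a weighted Cox model; it is not the authors' analysis code.

    ```python
    # Sketch of an inverse-probability-of-treatment-weighting (IPW) step: estimate
    # propensity scores for receiving an advanced RT modality with a logistic model,
    # then form stabilized weights for the outcome analysis. Variable names and
    # covariates are illustrative, not those of the study.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 500
    df = pd.DataFrame({
        "age": rng.normal(65, 9, n),
        "stage": rng.integers(1, 4, n),
        "advanced_rt": rng.integers(0, 2, n),   # 1 = IMRT/PBT, 0 = 3D conformal (toy data)
    })

    X = df[["age", "stage"]].values
    ps = LogisticRegression(max_iter=1000).fit(X, df["advanced_rt"]).predict_proba(X)[:, 1]

    p_treated = df["advanced_rt"].mean()
    df["ipw"] = np.where(df["advanced_rt"] == 1,
                         p_treated / ps,               # stabilized weight for treated
                         (1 - p_treated) / (1 - ps))   # stabilized weight for controls

    # The weights would then be passed to a weighted Cox model or weighted log-rank
    # test (e.g., lifelines' CoxPHFitter(...).fit(..., weights_col="ipw")).
    print(df["ipw"].describe())
    ```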

  20. Scale-invariant structure of energy fluctuations in real earthquakes

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Chang, Zhe; Wang, Huanyu; Lu, Hong

    2017-11-01

    Earthquakes are obviously complex phenomena associated with complicated spatiotemporal correlations, and they are generally characterized by two power laws: the Gutenberg-Richter (GR) and the Omori-Utsu laws. However, an important challenge has been to explain two apparently contrasting features: the GR and Omori-Utsu laws are scale-invariant and unaffected by energy or time scales, whereas earthquakes occasionally exhibit a characteristic energy or time scale, such as with asperity events. In this paper, three high-quality datasets on earthquakes were used to calculate the earthquake energy fluctuations at various spatiotemporal scales, and the results reveal the correlations between seismic events regardless of their critical or characteristic features. The probability density functions (PDFs) of the fluctuations exhibit evidence of another scaling that behaves as a q-Gaussian rather than random process. The scaling behaviors are observed for scales spanning three orders of magnitude. Considering the spatial heterogeneities in a real earthquake fault, we propose an inhomogeneous Olami-Feder-Christensen (OFC) model to describe the statistical properties of real earthquakes. The numerical simulations show that the inhomogeneous OFC model shares the same statistical properties with real earthquakes.

  1. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method can feasibly be applied to the analysis of stability failure risk for gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
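
    The Monte Carlo component of such an analysis can be sketched without the credibility-theory treatment of fuzziness. The toy example below estimates the probability that a standard shear-friction sliding safety factor falls below 1 for a dam section; all distributions and loads are hypothetical, not the paper's data.

    ```python
    # Plain Monte Carlo sketch of the simulation component only: estimate the
    # probability that the sliding safety factor of a gravity dam section drops
    # below 1. The credibility-theory treatment of fuzzy variables in the paper is
    # not reproduced; all parameter distributions below are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 200_000

    friction = rng.normal(0.70, 0.07, N)        # friction coefficient f'
    cohesion = rng.normal(0.9e6, 0.15e6, N)     # cohesion c' (Pa)
    weight = rng.normal(4.0e8, 0.2e8, N)        # effective vertical force W (N per m)
    uplift = rng.normal(0.6e8, 0.1e8, N)        # uplift U (N per m)
    horizontal = rng.normal(2.0e8, 0.1e8, N)    # horizontal water thrust P (N per m)
    base_area = 60.0                            # sliding-plane area per unit width (m^2)

    # Shear-friction safety factor: K = (f'(W - U) + c'A) / P
    K = (friction * (weight - uplift) + cohesion * base_area) / horizontal
    print("P(failure) =", np.mean(K < 1.0))
    ```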

  2. Community resilience and decision theory challenges for catastrophic events.

    PubMed

    Cox, Louis Anthony

    2012-11-01

    Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact, and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance of catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events. © 2012 Society for Risk Analysis.

  3. A spatially explicit model for an Allee effect: why wolves recolonize so slowly in Greater Yellowstone.

    PubMed

    Hurford, Amy; Hebblewhite, Mark; Lewis, Mark A

    2006-11-01

    A reduced probability of finding mates at low densities is a frequently hypothesized mechanism for a component Allee effect. At low densities dispersers are less likely to find mates and establish new breeding units. However, many mathematical models for an Allee effect do not make a distinction between breeding group establishment and subsequent population growth. Our objective is to derive a spatially explicit mathematical model, where dispersers have a reduced probability of finding mates at low densities, and parameterize the model for wolf recolonization in the Greater Yellowstone Ecosystem (GYE). In this model, only the probability of establishing new breeding units is influenced by the reduced probability of finding mates at low densities. We analytically and numerically solve the model to determine the effect of a decreased probability in finding mates at low densities on population spread rate and density. Our results suggest that a reduced probability of finding mates at low densities may slow recolonization rate.

  4. An interval chance-constrained fuzzy modeling approach for supporting land-use planning and eco-environment planning at a watershed level.

    PubMed

    Ou, Guoliang; Tan, Shukui; Zhou, Min; Lu, Shasha; Tao, Yinghui; Zhang, Zuo; Zhang, Lu; Yan, Danping; Guan, Xingliang; Wu, Gang

    2017-12-15

    An interval chance-constrained fuzzy land-use allocation (ICCF-LUA) model is proposed in this study to support solving land resource management problems associated with various environmental and ecological constraints at a watershed level. The ICCF-LUA model is based on the ICCF (interval chance-constrained fuzzy) model, which couples an interval mathematical model, a chance-constrained programming model and a fuzzy linear programming model and can be used to deal with uncertainties expressed as intervals, probabilities and fuzzy sets. Therefore, the ICCF-LUA model can reflect the tradeoff between decision makers and land stakeholders and the tradeoff between economic benefits and eco-environmental demands. The ICCF-LUA model has been applied to the land-use allocation of the Wujiang watershed, Guizhou Province, China. The results indicate that under highly land-suitable conditions, the optimized areas of cultivated land, forest land, grass land, construction land, water land, unused land and landfill in the Wujiang watershed will be [5015, 5648] hm2, [7841, 7965] hm2, [1980, 2056] hm2, [914, 1423] hm2, [70, 90] hm2, [50, 70] hm2 and [3.2, 4.3] hm2, and the corresponding system economic benefit will be between 6831 and 7219 billion yuan. Consequently, the ICCF-LUA model can effectively support optimized land-use allocation in various complicated conditions involving uncertainties, risks, economic objectives and eco-environmental constraints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Factors influencing reporting and harvest probabilities in North American geese

    USGS Publications Warehouse

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69-0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.
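
    The core reward-band logic behind such estimates can be illustrated with simple arithmetic: if the highest-reward bands are assumed to be reported with probability close to 1, the reporting probability of standard bands is approximately the ratio of the two direct recovery rates. The counts below are invented, and the paper's spatially explicit multinomial model is not reproduced.

    ```python
    # Sketch of the basic reward-band logic only (made-up counts). Assuming the
    # highest-reward ($100) bands are reported with probability ~1, the reporting
    # probability of standard ($0) bands is roughly the ratio of direct recovery
    # rates, and the reward-band recovery rate approximates the harvest probability
    # (ignoring band loss and other corrections handled by the full model).
    released = {"standard": 4000, "reward_100": 800}
    reported_first_year = {"standard": 180, "reward_100": 50}   # direct recoveries

    rate_standard = reported_first_year["standard"] / released["standard"]
    rate_reward = reported_first_year["reward_100"] / released["reward_100"]

    reporting_prob = rate_standard / rate_reward
    harvest_prob = rate_reward
    print(f"estimated reporting probability: {reporting_prob:.2f}")   # 0.045/0.0625 = 0.72
    print(f"estimated harvest probability  : {harvest_prob:.3f}")
    ```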

  6. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into the propagation of parameter uncertainty due to limited observation data. To examine the pollution prediction capacity of the developed probability-based DRASTIC model, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, which indicate anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing parameter uncertainty via the probability estimation processes.

  7. Probability Modeling and Thinking: What Can We Learn from Practice?

    ERIC Educational Resources Information Center

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  8. A simplified model for the assessment of the impact probability of fragments.

    PubMed

    Gubinelli, Gianfilippo; Zanelli, Severino; Cozzani, Valerio

    2004-12-31

    A model was developed for the assessment of fragment impact probability on a target vessel, following the collapse and fragmentation of a primary vessel due to internal pressure. The model provides the probability of impact of a fragment with defined shape, mass and initial velocity on a target of a known shape and at a given position with respect to the source point. The model is based on the ballistic analysis of the fragment trajectory and on the determination of impact probabilities by the analysis of initial direction of fragment flight. The model was validated using available literature data.

  9. The effect of microscopic friction and size distributions on conditional probability distributions in soft particle packings

    NASA Astrophysics Data System (ADS)

    Saitoh, Kuniyasu; Magnanimo, Vanessa; Luding, Stefan

    2017-10-01

    Employing two-dimensional molecular dynamics (MD) simulations of soft particles, we study their non-affine responses to quasi-static isotropic compression where the effects of microscopic friction between the particles in contact and particle size distributions are examined. To quantify complicated restructuring of force-chain networks under isotropic compression, we introduce the conditional probability distributions (CPDs) of particle overlaps such that a master equation for distribution of overlaps in the soft particle packings can be constructed. From our MD simulations, we observe that the CPDs are well described by q-Gaussian distributions, where we find that the correlation for the evolution of particle overlaps is suppressed by microscopic friction, while it significantly increases with the increase of poly-dispersity.

  10. Viterbi sparse spike detection and a compositional origin to ultralow-velocity zones

    NASA Astrophysics Data System (ADS)

    Brown, Samuel Paul

    Accurate interpretation of seismic travel times and amplitudes at both the exploration and global scales is complicated by the band-limited nature of seismic data. We present a stochastic method, Viterbi sparse spike detection (VSSD), to reduce a seismic waveform into a most probable constituent spike train. Model waveforms are constructed from a set of candidate spike trains convolved with a source wavelet estimate. For each model waveform, a profile hidden Markov model (HMM) is constructed to represent the waveform as a stochastic generative model with a linear topology corresponding to a sequence of samples. The Viterbi algorithm is employed to simultaneously find the optimal nonlinear alignment between a model waveform and the seismic data, and to assign a score to each candidate spike train. The most probable travel times and amplitudes are inferred from the alignments of the highest scoring models. Our analyses show that the method can resolve closely spaced arrivals below traditional resolution limits and that travel time estimates are robust in the presence of random noise and source wavelet errors. We applied the VSSD method to constrain the elastic properties of an ultralow-velocity zone (ULVZ) at the core-mantle boundary beneath the Coral Sea. We analyzed vertical-component short-period ScP waveforms for 16 earthquakes occurring in the Tonga-Fiji trench recorded at the Alice Springs Array (ASAR) in central Australia. These waveforms show strong pre- and post-cursory seismic arrivals consistent with ULVZ layering. We used the VSSD method to measure differential travel times and amplitudes of the post-cursor arrival ScSP and the precursor arrival SPcP relative to ScP. We compare our measurements to a database of approximately 340,000 synthetic seismograms, finding that these data are best fit by a ULVZ model with an S-wave velocity reduction of 24%, a P-wave velocity reduction of 23%, a thickness of 8.5 km, and a density increase of 6%. We simultaneously constrain both P- and S-wave velocity reductions as a 1:1 ratio inside this ULVZ. This 1:1 ratio is not consistent with a partial melt origin to ULVZs. Rather, we demonstrate that a compositional origin is more likely.
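
    The scoring step rests on the Viterbi dynamic program. The sketch below is a generic log-space Viterbi over a toy hidden Markov model, included only to illustrate the recursion; the profile-HMM construction for seismic waveforms described above is not reproduced, and all states, emissions and probabilities are arbitrary.

    ```python
    # Generic log-space Viterbi recursion, illustrating the core of the scoring and
    # alignment step referenced above. Toy values only.
    import numpy as np

    def viterbi(log_pi, log_A, log_B, obs):
        """Most probable state path for observation indices `obs`."""
        n_states = log_pi.size
        T = len(obs)
        delta = np.full((T, n_states), -np.inf)   # best log-prob ending in state j at t
        back = np.zeros((T, n_states), dtype=int)
        delta[0] = log_pi + log_B[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + log_A            # indexed (from, to)
            back[t] = np.argmax(scores, axis=0)
            delta[t] = scores[back[t], np.arange(n_states)] + log_B[:, obs[t]]
        path = np.zeros(T, dtype=int)
        path[-1] = int(np.argmax(delta[-1]))
        for t in range(T - 2, -1, -1):
            path[t] = back[t + 1, path[t + 1]]
        return path, float(delta[-1].max())

    # Toy 2-state, 3-symbol HMM.
    log_pi = np.log([0.6, 0.4])
    log_A = np.log([[0.7, 0.3], [0.4, 0.6]])
    log_B = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    path, score = viterbi(log_pi, log_A, log_B, obs=[0, 1, 2, 2, 1])
    print("best path:", path, " log-probability:", round(score, 3))
    ```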

  11. Compensatory changes in CYP expression in three different toxicology mouse models: CAR-null, Cyp3a-null, and Cyp2b9/10/13-null mice

    PubMed Central

    Kumar, Ramiya; Mota, Linda C.; Litoff, Elizabeth J.; Rooney, John P.; Boswell, W. Tyler; Courter, Elliott; Henderson, Charles M.; Hernandez, Juan P.; Corton, J. Christopher; Moore, David D.

    2017-01-01

    Targeted mutant models are common in mechanistic toxicology experiments investigating the absorption, metabolism, distribution, or elimination (ADME) of chemicals from individuals. Key models include those for xenosensing transcription factors and cytochrome P450s (CYP). Here we investigated changes in transcript levels, protein expression, and steroid hydroxylation of several xenobiotic detoxifying CYPs in constitutive androstane receptor (CAR)-null and two CYP-null mouse models that have subfamily members regulated by CAR; the Cyp3a-null and a newly described Cyp2b9/10/13-null mouse model. Compensatory changes in CYP expression that occur in these models may also occur in polymorphic humans, or may complicate interpretation of ADME studies performed using these models. The loss of CAR causes significant changes in several CYPs probably due to loss of CAR-mediated constitutive regulation of these CYPs. Expression and activity changes include significant repression of Cyp2a and Cyp2b members with corresponding drops in 6α- and 16β-testosterone hydroxylase activity. Further, the ratio of 6α-/15α-hydroxylase activity, a biomarker of sexual dimorphism in the liver, indicates masculinization of female CAR-null mice, suggesting a role for CAR in the regulation of sexually dimorphic liver CYP profiles. The loss of Cyp3a causes fewer changes than CAR. Nevertheless, there are compensatory changes including gender-specific increases in Cyp2a and Cyp2b. Cyp2a and Cyp2b were down-regulated in CAR-null mice, suggesting activation of CAR and potentially PXR following loss of the Cyp3a members. However, the loss of Cyp2b causes few changes in hepatic CYP transcript levels and almost no significant compensatory changes in protein expression or activity with the possible exception of 6α-hydroxylase activity. This lack of a compensatory response in the Cyp2b9/10/13-null mice is probably due to low CYP2B hepatic expression, especially in male mice. Overall, compensatory and regulatory CYP changes followed the order CAR-null > Cyp3a-null > Cyp2b-null mice. PMID:28350814

  12. Compensatory changes in CYP expression in three different toxicology mouse models: CAR-null, Cyp3a-null, and Cyp2b9/10/13-null mice.

    PubMed

    Kumar, Ramiya; Mota, Linda C; Litoff, Elizabeth J; Rooney, John P; Boswell, W Tyler; Courter, Elliott; Henderson, Charles M; Hernandez, Juan P; Corton, J Christopher; Moore, David D; Baldwin, William S

    2017-01-01

    Targeted mutant models are common in mechanistic toxicology experiments investigating the absorption, metabolism, distribution, or elimination (ADME) of chemicals from individuals. Key models include those for xenosensing transcription factors and cytochrome P450s (CYP). Here we investigated changes in transcript levels, protein expression, and steroid hydroxylation of several xenobiotic detoxifying CYPs in constitutive androstane receptor (CAR)-null and two CYP-null mouse models that have subfamily members regulated by CAR; the Cyp3a-null and a newly described Cyp2b9/10/13-null mouse model. Compensatory changes in CYP expression that occur in these models may also occur in polymorphic humans, or may complicate interpretation of ADME studies performed using these models. The loss of CAR causes significant changes in several CYPs probably due to loss of CAR-mediated constitutive regulation of these CYPs. Expression and activity changes include significant repression of Cyp2a and Cyp2b members with corresponding drops in 6α- and 16β-testosterone hydroxylase activity. Further, the ratio of 6α-/15α-hydroxylase activity, a biomarker of sexual dimorphism in the liver, indicates masculinization of female CAR-null mice, suggesting a role for CAR in the regulation of sexually dimorphic liver CYP profiles. The loss of Cyp3a causes fewer changes than CAR. Nevertheless, there are compensatory changes including gender-specific increases in Cyp2a and Cyp2b. Cyp2a and Cyp2b were down-regulated in CAR-null mice, suggesting activation of CAR and potentially PXR following loss of the Cyp3a members. However, the loss of Cyp2b causes few changes in hepatic CYP transcript levels and almost no significant compensatory changes in protein expression or activity with the possible exception of 6α-hydroxylase activity. This lack of a compensatory response in the Cyp2b9/10/13-null mice is probably due to low CYP2B hepatic expression, especially in male mice. Overall, compensatory and regulatory CYP changes followed the order CAR-null > Cyp3a-null > Cyp2b-null mice.

  13. A model-based Bayesian solution for characterization of complex damage scenarios in aerospace composite structures.

    PubMed

    Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J

    2018-01-01

    Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been an increased development of guided wave based methods. In real materials and structures, these dispersive waves result in complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. A study is undertaken to assess the efficacy of the proposed damage model and comparative metrics between the experimental and simulated output. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.
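
    The Bayesian verification step can be illustrated with a minimal random-walk Metropolis sampler. In the sketch below a cheap stand-in forward model replaces the 3D finite element model, and a single material parameter is calibrated against noisy synthetic "measurements"; the transdimensional sampler used for the damage geometry is not reproduced.

    ```python
    # Minimal random-walk Metropolis sketch of a Bayesian calibration step: sample
    # the posterior of one material parameter given noisy synthetic data and a
    # cheap stand-in forward model (not the study's FEM or damage model).
    import numpy as np

    rng = np.random.default_rng(7)

    def forward(stiffness, freqs):
        # Stand-in for the FEM-predicted dispersion: wavenumber ~ freq / sqrt(stiffness).
        return freqs / np.sqrt(stiffness)

    freqs = np.linspace(50e3, 300e3, 20)                     # Hz
    true_stiffness = 70.0                                     # arbitrary units
    sigma = 0.02                                              # relative noise level
    data = forward(true_stiffness, freqs) * (1 + rng.normal(0, sigma, freqs.size))

    def log_post(theta):
        if theta <= 0:
            return -np.inf
        resid = (data - forward(theta, freqs)) / (sigma * data)
        return -0.5 * np.sum(resid**2)          # flat prior on theta > 0

    samples, theta = [], 50.0
    for _ in range(20_000):
        prop = theta + rng.normal(0, 2.0)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta)

    post = np.array(samples[5_000:])            # discard burn-in
    print(f"posterior mean {post.mean():.1f}, 95% CI "
          f"({np.percentile(post, 2.5):.1f}, {np.percentile(post, 97.5):.1f})")
    ```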

  14. Faraday dispersion functions of galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ideguchi, Shinsuke; Tashiro, Yuichi; Takahashi, Keitaro

    2014-09-01

    The Faraday dispersion function (FDF), which can be derived from an observed polarization spectrum by Faraday rotation measure synthesis, is a profile of polarized emissions as a function of Faraday depth. We study intrinsic FDFs along sight lines through face-on Milky Way like galaxies by means of a sophisticated galactic model incorporating three-dimensional MHD turbulence, and investigate how much information the FDF intrinsically contains. Since the FDF reflects distributions of thermal and cosmic-ray electrons as well as magnetic fields, it has been expected that the FDF could be a new probe to examine internal structures of galaxies. We, however, find that an intrinsic FDF along a sight line through a galaxy is very complicated, depending significantly on actual configurations of turbulence. We perform 800 realizations of turbulence and find no universal shape of the FDF even if we fix the global parameters of the model. We calculate the probability distribution functions of the standard deviation, skewness, and kurtosis of FDFs and compare them for models with different global parameters. Our models predict that the presence of vertical magnetic fields and the large-scale height of cosmic-ray electrons tend to make the standard deviation relatively large. In contrast, the differences in skewness and kurtosis are relatively less significant.

  15. Recent results and persisting problems in modeling flow induced coalescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fortelný, I., E-mail: fortelny@imc.cas.cz; Jůza, J., E-mail: juza@imc.cas.cz

    2014-05-15

    The contribution summarizes recent results on the description of flow-induced coalescence in immiscible polymer blends and addresses problems that still call for solution. The theory of coalescence based on the switch between equations for matrix drainage between spherical or deformed droplets provides good agreement with more complicated modeling and with available experimental data for the probability, Pc, that the collision of droplets will be followed by their fusion. A new equation for the description of matrix drainage between deformed droplets, applicable to the whole range of viscosity ratios, p, of the droplets and matrices, is proposed. The theory makes it possible to consider the effect of matrix elasticity on coalescence. Pc decreases with the matrix relaxation time, but this decrease is not pronounced for relaxation times typical of most commercial polymers. Modeling of flow-induced coalescence in concentrated systems is needed for prediction of the dependence of the coalescence rate on the volume fraction of droplets. The effect of droplet anisometry on Pc should be studied for better understanding of coalescence in flow fields with high and moderate deformation rates. A reliable description of coalescence in mixing and processing devices requires proper modeling of complex flow fields.

  16. Laparoscopy Improves Short-term Outcomes After Surgery for Diverticular Disease

    PubMed Central

    RUSS, ANDREW J.; OBMA, KARI L.; RAJAMANICKAM, VICTORIA; WAN, YIN; HEISE, CHARLES P.; FOLEY, EUGENE F.; HARMS, BRUCE; KENNEDY, GREGORY D.

    2012-01-01

    BACKGROUND & AIMS Observational studies and small randomized controlled trials have shown that the use of laparoscopy in colon resection for diverticular disease is feasible and results in fewer complications. We analyzed data from a large, prospectively maintained, multicenter database (National Surgical Quality Improvement Program) to determine whether the use of laparoscopy in the elective treatment of diverticular disease decreases rates of complications compared with open surgery, independent of preoperative comorbid factors. METHODS The analysis included data from 6970 patients who underwent elective surgeries for diverticular disease from 2005 to 2008. Patients with diverticular disease were identified by International Classification of Diseases, 9th revision codes and then categorized into open or laparoscopic groups based on Current Procedural Terminology codes. Preoperative, intraoperative, and postoperative data were analyzed to determine factors associated with increased risk for postoperative complications. RESULTS Data were analyzed from 3468 patients who underwent open surgery and 3502 patients who underwent laparoscopic procedures. After correcting for probability of morbidity, American Society of Anesthesiology class, and ostomy creation, overall complications (including superficial surgical site infections, deep incisional surgical site infections, sepsis, and septic shock) occurred with significantly lower incidence among patients who underwent laparoscopic procedures compared with those who received open operations. CONCLUSIONS The use of laparoscopy for treating diverticular disease, in the absence of absolute contraindications, results in fewer postoperative complications compared with open surgery. PMID:20193685

  17. COST-EFFECTIVENESS OF STRUCTURED EDUCATION IN CHILDREN WITH TYPE-1 DIABETES MELLITUS.

    PubMed

    Basarir, Hasan; Brennan, Alan; Jacques, Richard; Pollard, Daniel; Stevens, Katherine; Freeman, Jennifer; Wales, Jerry; Price, Katherine

    2016-01-01

    Kids in Control OF Food (KICk-OFF) is a 5-day structured education program for 11- to 16-year-olds with type 1 diabetes mellitus (T1DM) who are using multiple daily insulin injections. This study evaluates the cost-effectiveness of the KICk-OFF education program compared with the usual care using data from the KICk-OFF trial. The short-term within-trial analysis covers the 2-year postintervention period. Data on glycated hemoglobin (HbA1c), severe hypoglycemia, and diabetic ketoacidosis (DKA) were collected over a 2-year follow-up period. Sub-group analyses have been defined on the basis of baseline HbA1c being below 7.5 percent (58.5 mmol/mol) (low group), between 7.5 percent and 9.5 percent (80.3 mmol/mol) (medium group), and over 9.5 percent (high group). The long-term cost-effectiveness evaluation has been conducted by using The Sheffield Type 1 Diabetes Policy Model, which is a patient-level simulation model on T1DM. It includes long-term microvascular (retinopathy, neuropathy, and nephropathy) and macrovascular (myocardial infarction, stroke, revascularization, and angina) diabetes-related complications and acute adverse events (severe hypoglycemia and DKA). The most favorable within-trial scenario for the KICk-OFF arm led to an incremental cost-effectiveness ratio (ICER) of £23,688 (base year 2009) with a cost-effectiveness probability of 41.3 percent. Simulating the long-term complications using the full cohort data, the mean ICER for the base case was £28,813 (base year 2011) and the probability of the KICk-OFF intervention being cost-effective at £20,000/QALY threshold was 42.6 percent, with considerable variation due to treatment effect duration. For the high HbA1c sub-group, the KICk-OFF arm was "dominant" (meaning it provided better health gains at lower costs than usual care) over the usual care arm in each scenario considered. For the whole study population, the cost-effectiveness of KICk-OFF depends on the assumption for treatment effect duration. For the high baseline HbA1c sub-group, KICk-OFF arm was estimated to be dominant over the usual care arm regardless of the assumption on the treatment effect duration.
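
    The two headline quantities in this record, the ICER and the probability of cost-effectiveness at a £20,000/QALY threshold, follow from simple arithmetic on probabilistic sensitivity analysis (PSA) samples. The sketch below uses synthetic incremental costs and QALYs, not the trial's outputs.

    ```python
    # Worked example of the incremental cost-effectiveness ratio (ICER) and the
    # probability of cost-effectiveness at a 20,000 GBP/QALY threshold, computed
    # from probabilistic-sensitivity-analysis (PSA) samples. All numbers synthetic.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 10_000

    # PSA draws of incremental costs (GBP) and incremental QALYs, intervention vs usual care.
    d_cost = rng.normal(1500, 600, n)
    d_qaly = rng.normal(0.055, 0.045, n)

    icer = d_cost.mean() / d_qaly.mean()
    threshold = 20_000
    net_monetary_benefit = threshold * d_qaly - d_cost
    prob_cost_effective = np.mean(net_monetary_benefit > 0)

    print(f"ICER: {icer:,.0f} GBP per QALY")
    print(f"P(cost-effective at {threshold:,} GBP/QALY): {prob_cost_effective:.2f}")
    ```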

  18. The probability of growth of Listeria monocytogenes in cooked salmon and tryptic soy broth as affected by salt, smoke compound, and storage temperature.

    PubMed

    Hwang, Cheng-An

    2009-05-01

    The objectives of this study were to examine and model the probability of growth of Listeria monocytogenes in cooked salmon containing salt and a smoke (phenol) compound and stored at various temperatures. A growth probability model was developed, and the model was compared to a model developed from tryptic soy broth (TSB) to assess the possibility of using TSB as a substitute for salmon. A 6-strain mixture of L. monocytogenes was inoculated into minced cooked salmon and TSB containing 0-10% NaCl and 0-34 ppm phenol to levels of 10^2-10^3 cfu/g, and the samples were vacuum-packed and stored at 0-25 °C for up to 42 days. A total of 32 treatments, each with 16 samples, selected by central composite designs were tested. Logistic regression was used to model the probability of growth of L. monocytogenes as a function of the concentrations of salt and phenol and the storage temperature. The resulting models showed that the probability of growth of L. monocytogenes in both salmon and TSB decreased when the salt and/or phenol concentrations increased, and at lower storage temperatures. In general, the growth probabilities of L. monocytogenes were affected more profoundly by salt and storage temperature than by phenol. The growth probabilities of L. monocytogenes estimated by the TSB model were higher than those given by the salmon model at the same salt/phenol concentrations and storage temperatures. The growth probabilities predicted by the salmon and TSB models were comparable at higher storage temperatures, indicating that the use of TSB as a model system to substitute for salmon in studying the growth behavior of L. monocytogenes may be suitable only at higher storage temperatures (e.g., >12 °C). The model for salmon demonstrated the effects of salt, phenol, and storage temperature and their interactions on the growth probabilities of L. monocytogenes, and may be used to determine the growth probability of L. monocytogenes in smoked seafood.
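
    Once such a logistic model is fitted, screening a formulation is a one-line calculation. The sketch below applies hypothetical coefficients (not the published parameters) of the stated salt/phenol/temperature form to a few candidate conditions.

    ```python
    # Sketch of how a fitted logistic growth-probability model of the form described
    # above could be used to screen a smoked-salmon formulation. The coefficients
    # below are hypothetical placeholders, not the published model parameters.
    import math

    # logit P(growth) = b0 + b1*salt(%) + b2*phenol(ppm) + b3*temperature(C)
    b0, b1, b2, b3 = -4.0, -0.9, -0.05, 0.45

    def growth_probability(salt_pct, phenol_ppm, temp_c):
        logit = b0 + b1 * salt_pct + b2 * phenol_ppm + b3 * temp_c
        return 1.0 / (1.0 + math.exp(-logit))

    for salt, phenol, temp in [(3.0, 10, 4), (3.0, 10, 12), (1.5, 0, 25)]:
        p = growth_probability(salt, phenol, temp)
        print(f"salt={salt}%  phenol={phenol} ppm  {temp} C  ->  P(growth)={p:.3f}")
    ```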

  19. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given the fact we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether of not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggest this is not the case. Even with very good models (good in an Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").

  20. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    USGS Publications Warehouse

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.

    2009-01-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01 annual exceedance probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adapted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades. © 2009 Elsevier Ltd.
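
    A drastically simplified stand-in for the GAMLSS analysis is sketched below: a Gumbel distribution is fitted to synthetic annual peaks with a linear time trend in the location parameter by maximum likelihood, and the time-varying 0.01 annual exceedance probability quantile is evaluated. The paper's cubic-spline GAMLSS models and real data are not reproduced.

    ```python
    # Simplified nonstationary flood-frequency sketch: Gumbel distribution with a
    # *linear* time trend in the location parameter, fitted by maximum likelihood
    # to synthetic annual peaks (a stand-in for the cubic-spline GAMLSS analysis).
    import numpy as np
    from scipy.stats import gumbel_r
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    years = np.arange(1925, 2008)
    t = (years - years[0]) / len(years)
    peaks = gumbel_r.rvs(loc=150 + 120 * t, scale=60, size=years.size, random_state=rng)

    def neg_log_lik(params):
        a, b, log_scale = params
        loc = a + b * t
        return -np.sum(gumbel_r.logpdf(peaks, loc=loc, scale=np.exp(log_scale)))

    fit = minimize(neg_log_lik, x0=[np.mean(peaks), 0.0, np.log(np.std(peaks))],
                   method="Nelder-Mead")
    a, b, log_scale = fit.x

    # 0.01-AEP ("100-year") discharge under the fitted nonstationary model.
    for yr in (1930, 1970, 2007):
        loc = a + b * (yr - years[0]) / len(years)
        q99 = gumbel_r.ppf(0.99, loc=loc, scale=np.exp(log_scale))
        print(f"{yr}: 0.01-AEP peak ~ {q99:.0f} m3/s")
    ```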

  1. Designing occupancy studies when false-positive detections occur

    USGS Publications Warehouse

    Clement, Matthew

    2016-01-01

    1. Recently, estimators have been developed to estimate occupancy probabilities when false-positive detections occur during presence-absence surveys. Some of these estimators combine different types of survey data to improve estimates of occupancy. With these estimators, there is a tradeoff between the number of sample units surveyed, and the number and type of surveys at each sample unit. Guidance on efficient design of studies when false positives occur is unavailable. 2. For a range of scenarios, I identified survey designs that minimized the mean square error of the estimate of occupancy. I considered an approach that uses one survey method and two observation states and an approach that uses two survey methods. For each approach, I used numerical methods to identify optimal survey designs when model assumptions were met and parameter values were correctly anticipated, when parameter values were not correctly anticipated, and when the assumption of no unmodelled detection heterogeneity was violated. 3. Under the approach with two observation states, false positive detections increased the number of recommended surveys, relative to standard occupancy models. If parameter values could not be anticipated, pessimism about detection probabilities avoided poor designs. Detection heterogeneity could require more or fewer repeat surveys, depending on parameter values. If model assumptions were met, the approach with two survey methods was inefficient. However, with poor anticipation of parameter values, with detection heterogeneity, or with removal sampling schemes, combining two survey methods could improve estimates of occupancy. 4. Ignoring false positives can yield biased parameter estimates, yet false positives greatly complicate the design of occupancy studies. Specific guidance for major types of false-positive occupancy models, and for two assumption violations common in field data, can conserve survey resources. This guidance can be used to design efficient monitoring programs and studies of species occurrence, species distribution, or habitat selection, when false positives occur during surveys.
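
    The kind of design comparison described above can be sketched by simulation. The code below simulates detection histories under a single-season occupancy model with false positives, fits it by maximum likelihood, and compares the mean square error of the occupancy estimate for two designs with equal total survey effort; parameter values and designs are illustrative only, not the paper's optimization.

    ```python
    # Simulation sketch: single-season occupancy with false positives (occupied
    # sites detected with prob p11, unoccupied sites give false positives with
    # prob p10), fitted by maximum likelihood; compare MSE of the occupancy
    # estimate for two designs with the same total number of surveys.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(9)
    psi_true, p11, p10 = 0.4, 0.6, 0.05

    def simulate(n_sites, n_visits):
        occupied = rng.random(n_sites) < psi_true
        p = np.where(occupied[:, None], p11, p10)
        return rng.random((n_sites, n_visits)) < p

    def neg_log_lik(params, y):
        psi, a, b = 1 / (1 + np.exp(-np.asarray(params)))   # logit scale -> (0, 1)
        k, J = y.sum(axis=1), y.shape[1]
        lik = (psi * a**k * (1 - a)**(J - k) +
               (1 - psi) * b**k * (1 - b)**(J - k))
        return -np.sum(np.log(lik + 1e-300))

    def mse_of_design(n_sites, n_visits, n_sims=300):
        errs = []
        for _ in range(n_sims):
            y = simulate(n_sites, n_visits)
            fit = minimize(neg_log_lik, x0=[0.0, 0.5, -2.0], args=(y,), method="Nelder-Mead")
            psi_hat = 1 / (1 + np.exp(-fit.x[0]))
            errs.append((psi_hat - psi_true) ** 2)
        return np.mean(errs)

    # Same budget of 600 surveys: many sites with few visits vs fewer sites with more.
    print("150 sites x 4 visits : MSE =", round(mse_of_design(150, 4), 4))
    print(" 60 sites x 10 visits: MSE =", round(mse_of_design(60, 10), 4))
    ```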

  2. Periprosthetic Joint Infections: Clinical and Bench Research

    PubMed Central

    Legout, Laurence; Senneville, Eric

    2013-01-01

    Prosthetic joint infection is a devastating complication with high morbidity and substantial cost. The incidence is low but probably underestimated. Despite significant basic and clinical research in this field, many questions concerning the definition of prosthetic infection as well as the diagnosis and management of these infections remain unanswered. We review the current literature on new diagnostic methods, the management and the prevention of prosthetic joint infections. PMID:24288493

  3. [Gene method for inconsistent hydrological frequency calculation. I: Inheritance, variability and evolution principles of hydrological genes].

    PubMed

    Xie, Ping; Wu, Zi Yi; Zhao, Jiang Yan; Sang, Yan Fang; Chen, Jie

    2018-04-01

    A stochastic hydrological process is influenced by both stochastic and deterministic factors. A hydrological time series contains not only pure random components reflecting its inheritance characteristics, but also deterministic components reflecting variability characteristics, such as jump, trend, period, and stochastic dependence. As a result, the stochastic hydrological process presents complicated evolution phenomena and rules. To better understand these complicated phenomena and rules, this study described the inheritance and variability characteristics of an inconsistent hydrological series from two aspects: stochastic process simulation and time series analysis. In addition, several frequency analysis approaches for inconsistent time series were compared to reveal the main problems in inconsistency studies. We then proposed a new concept of hydrological genes, originating from biological genes, to describe inconsistent hydrological processes. The hydrological genes were constructed using moment methods, such as general moments, weight function moments, probability weighted moments and L-moments. Meanwhile, the five components of a stochastic hydrological process, namely the jump, trend, periodic, dependence and pure random components, were defined as five hydrological bases. With this method, the inheritance and variability of inconsistent hydrological time series were synthetically considered and the inheritance, variability and evolution principles were fully described. Our study contributes to revealing the inheritance, variability and evolution principles of the probability distributions of hydrological elements.
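
    One of the moment constructions named above, probability weighted moments and the L-moments built from them, can be computed with the standard unbiased sample estimators. The sketch below applies them to a synthetic annual series; the full hydrological-gene construction (jump, trend, period, dependence and pure-random bases) is not reproduced.

    ```python
    # Sample probability weighted moments (PWMs) and the first four L-moments of an
    # annual series, using the standard unbiased estimators. Data are synthetic.
    import numpy as np

    def l_moments(x):
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        i = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((i - 1) / (n - 1) * x) / n
        b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
        b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
        l1 = b0
        l2 = 2 * b1 - b0
        l3 = 6 * b2 - 6 * b1 + b0
        l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
        return l1, l2, l3 / l2, l4 / l2      # mean, L-scale, L-skewness, L-kurtosis

    rng = np.random.default_rng(8)
    annual_runoff = rng.gamma(shape=4.0, scale=250.0, size=60)   # synthetic series
    l1, l2, t3, t4 = l_moments(annual_runoff)
    print(f"L-mean={l1:.1f}  L-scale={l2:.1f}  L-skew={t3:.3f}  L-kurt={t4:.3f}")
    ```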

  4. Severe Hypertriglyceridemia Induced by Sirolimus Treated With Medical Management Without Plasmapheresis: A Case Report.

    PubMed

    Kido, Kazuhiko; Evans, Rickey A; Gopinath, Anil; Flynn, Jeremy D

    2018-02-01

    Hypertriglyceridemia and hyperlipidemia are the most remarkable metabolic complications seen with long-term sirolimus therapy. We report the case of a 36-year-old woman, status post bilateral lung transplantation on a maintenance immunosuppression regimen of sirolimus, tacrolimus, and prednisone, who presented with status migrainosus, chest pain, abdominal discomfort, and triglyceride levels greater than 4425 mg/dL. In previously reported cases of severe hypertriglyceridemia that developed on maintenance sirolimus therapy, plasmapheresis was used as an early strategy to rapidly lower triglycerides and minimize the risk of acute complications such as pancreatitis, but our case was managed medically without plasmapheresis. The most recent triglyceride level was down to 520 mg/dL 2 months after discontinuation of sirolimus. Based on a score of 5 points on the Naranjo scale, we classify this adverse reaction to sirolimus as probable. To our knowledge, this is the first case report highlighting the sole use of oral lipid-lowering agents to treat severe hypertriglyceridemia secondary to sirolimus without the use of plasmapheresis. Sirolimus-induced severe hypertriglyceridemia can be managed with oral lipid-lowering agents without plasmapheresis. Clinicians need to be aware of the importance of baseline and regular triglyceride monitoring in patients on sirolimus.

  5. Convergence of Transition Probability Matrix in CLV-Markov Models

    NASA Astrophysics Data System (ADS)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its behaviour far into the future, which is derived from a property of the n-step transition probability matrix, namely the convergence of the n-step transition matrix as n tends to infinity. Mathematically, this means finding the limit of the transition matrix raised to the power n as n goes to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find this limit is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept from linear algebra, namely diagonalization of the matrix. This method has a higher level of complexity because it requires diagonalizing the matrix, but it has the advantage of yielding a general closed form for the nth power of the transition probability matrix, which makes it possible to examine the transition matrix before it becomes stationary. Example cases are taken from a CLV model based on an MCM, called the CLV-Markov model. Several transition probability matrices from this model are examined to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with that obtained by the commonly used limiting-distribution method.
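
    As an illustration of the diagonalization approach described in this record, the sketch below computes the n-step transition matrix as P^n = V D^n V^-1 and compares its limit with the stationary distribution. The 3-state matrix is made up for the example; it is not taken from the paper's CLV-Markov model.

```python
# Sketch: n-step transition matrix via diagonalization, P^n = V D^n V^{-1}.
# The 3-state transition matrix below is illustrative only.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])

eigvals, V = np.linalg.eig(P)       # P = V diag(eigvals) V^{-1}
V_inv = np.linalg.inv(V)

def n_step(n):
    """Closed form for P^n obtained from the eigendecomposition."""
    return (V @ np.diag(eigvals**n) @ V_inv).real

for n in (1, 5, 20, 100):           # the rows converge as n grows
    print(f"n = {n}\n{np.round(n_step(n), 4)}")

# For an ergodic chain the rows of P^n converge to the stationary distribution
# pi, the normalized left eigenvector of P associated with eigenvalue 1.
w, U = np.linalg.eig(P.T)
pi = U[:, np.argmin(np.abs(w - 1.0))].real
pi /= pi.sum()
print("stationary distribution:", np.round(pi, 4))
```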

  6. The alfa and beta of tumours: a review of parameters of the linear-quadratic model, derived from clinical radiotherapy studies.

    PubMed

    van Leeuwen, C M; Oei, A L; Crezee, J; Bel, A; Franken, N A P; Stalpers, L J A; Kok, H P

    2018-05-16

    Prediction of radiobiological response is a major challenge in radiotherapy. Of several radiobiological models, the linear-quadratic (LQ) model has been best validated by experimental and clinical data. Clinically, the LQ model is mainly used to estimate equivalent radiotherapy schedules (e.g. to calculate the equivalent dose in 2 Gy fractions, EQD2), but increasingly also to predict tumour control probability (TCP) and normal tissue complication probability (NTCP) using logistic models. The selection of accurate LQ parameters α, β and α/β is pivotal for a reliable estimate of radiation response. The aim of this review is to provide an overview of published values for the LQ parameters of human tumours as a guideline for radiation oncologists and radiation researchers to select appropriate radiobiological parameter values for LQ modelling in clinical radiotherapy. We performed a systematic literature search and found sixty-four clinical studies reporting α, β and α/β for tumours. Tumour site, histology, stage, number of patients, type of LQ model, radiation type, TCP model, clinical endpoint and radiobiological parameter estimates were extracted. Next, we stratified by tumour site and by tumour histology. Study heterogeneity was expressed by the I² statistic, i.e. the percentage of variance in reported values not explained by chance. A large heterogeneity in LQ parameters was found within and between studies (I² > 75%). For the same tumour site, differences in histology partially explain differences in the LQ parameters: epithelial tumours have higher α/β values than adenocarcinomas. For tumour sites with different histologies, such as oesophageal cancer, the α/β estimates correlate well with histology. However, many other factors contribute to the study heterogeneity of LQ parameters, e.g. tumour stage, type of LQ model, TCP model and clinical endpoint (i.e. survival, tumour control or biochemical control). The values of LQ parameters for tumours as published in clinical radiotherapy studies depend on many clinical and methodological factors. Therefore, for clinical use of the LQ model, LQ parameters for tumours should be selected carefully, based on tumour site, histology and the applied LQ model. To account for uncertainties in LQ parameter estimates, exploring a range of values is recommended.
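
    For readers applying these parameters, the EQD2 conversion mentioned in this abstract follows directly from the LQ model: EQD2 = D · (d + α/β) / (2 + α/β), where D is the total dose and d the dose per fraction. The sketch below evaluates it for an illustrative schedule and two commonly assumed α/β values; neither is taken from the review.

```python
# Sketch: equivalent dose in 2 Gy fractions (EQD2) from the LQ model,
#   EQD2 = D * (d + alpha/beta) / (2 + alpha/beta)
# with D the total dose, d the dose per fraction, and alpha/beta the LQ ratio.
# The schedule and alpha/beta values below are illustrative only.

def eqd2(total_dose_gy: float, dose_per_fraction_gy: float, alpha_beta_gy: float) -> float:
    return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

# Example: a hypofractionated schedule of 20 x 2.75 Gy = 55 Gy, evaluated for
# alpha/beta = 10 Gy (a typical tumour assumption) and 3 Gy (late-reacting tissue).
for ab in (10.0, 3.0):
    print(f"alpha/beta = {ab:>4.1f} Gy  ->  EQD2 = {eqd2(55.0, 2.75, ab):.1f} Gy")
```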

  7. Detection probability of least tern and piping plover chicks in a large river system

    USGS Publications Warehouse

    Roche, Erin A.; Shaffer, Terry L.; Anteau, Michael J.; Sherfy, Mark H.; Stucker, Jennifer H.; Wiltermuth, Mark T.; Dovichin, Colin M.

    2014-01-01

    Monitoring the abundance and stability of populations of conservation concern is often complicated by an inability to perfectly detect all members of the population. Mark-recapture offers a flexible framework in which one may identify factors contributing to imperfect detection, while at the same time estimating demographic parameters such as abundance or survival. We individually color-marked, recaptured, and re-sighted 1,635 federally listed interior least tern (Sternula antillarum; endangered) chicks and 1,318 piping plover (Charadrius melodus; threatened) chicks from 2006 to 2009 at 4 study areas along the Missouri River and investigated effects of observer-, subject-, and site-level covariates suspected of influencing detection. Increasing the time spent searching and the crew size increased the probability of detecting both species regardless of study area, and detection methods were not associated with decreased survival. However, associations between detection probability and the investigated covariates were highly variable across study area and species combinations, indicating that a universal mark-recapture design may not be appropriate.

  8. Informational need of emotional stress

    NASA Astrophysics Data System (ADS)

    Simonov, P. V.; Frolov, M. V.

    According to the informational theory of emotions [1], emotions in humans depend on the strength of a need (motivation) and the subject's estimate of the probability (possibility) of satisfying that need (achieving the goal). A low probability of need satisfaction leads to negative emotions, which the subject actively minimizes. An increased probability of satisfaction, compared with an earlier forecast, generates positive emotions, which the subject tries to maximize, i.e. to enhance, prolong, and repeat. The informational theory of emotions encompasses their reflective function, the laws of their appearance, the regulatory significance of emotions, and their role in the organization of behavior. The level of emotional stress influences the operator's performance. A decrease in emotional tonus leads to drowsiness, lack of vigilance, missing of significant signals, and slower reactions. An extremely high stress level disorganizes activity and complicates it with a trend toward incorrect actions and reactions to insignificant signals (false alarms). The neurophysiological mechanisms of the influence of emotions on perceptual activity and operator performance, as well as the significance of individuality, are discussed.

  9. A prospective analysis for prevalence of complications in Thai nontransfusion-dependent Hb E/β-thalassemia and α-thalassemia (Hb H disease).

    PubMed

    Ekwattanakit, Supachai; Siritanaratkul, Noppadol; Viprakasit, Vip

    2018-05-01

    Recently, complications in patients with nontransfusion-dependent thalassemia (NTDT), in particular those with β-thalassemia intermedia (β-TI), were found to be significantly different from those in patients with transfusion-dependent thalassemia (TDT), mainly β-thalassemia major (β-TM). However, this information is rather limited for other forms of NTDT. In this prospective study, adult Thai NTDT patients were interviewed and clinically evaluated for thalassemia-related complications. Fifty-seven NTDT patients (age 18-74 years), 59.6% with Hb E/β-thalassemia and 40.4% with Hb H disease, were recruited; 26.4% were splenectomized. The most common complications were gallstones (68.4%), osteoporosis (26.3%), and pulmonary hypertension (15.8%). Splenectomy was associated with a higher rate of gallstones and serious infection (P = .001 and .052, respectively), consistent with a multivariate analysis (RR = 9.5, P = .044, and RR = 15.1, P = .043, respectively). In addition, a higher hemoglobin level was inversely associated with gallstones in both univariate and multivariate analyses (P = .01 and .022, respectively). Serum ferritin was associated with abnormal liver function (P = .002). In contrast to previous studies, the prevalence of thrombosis was lower in our population (1.7%), probably because of differences in transfusion therapy, ethnicity, and underlying genotypes. For the first time, this prospective study provides the current prevalence of NTDT-related complications in a Southeast Asian population with a different underlying genetic basis compared with previous studies. Although the individual prevalence of each complication might differ from other studies, several important clinical factors such as splenectomy, degree of anemia, and iron overload seem to determine the risks of developing these complications consistently across ethnicities. © 2018 Wiley Periodicals, Inc.

  10. Poor Performance on a Preoperative Cognitive Screening Test Predicts Postoperative Complications in Older Orthopedic Surgical Patients.

    PubMed

    Culley, Deborah J; Flaherty, Devon; Fahey, Margaret C; Rudolph, James L; Javedan, Houman; Huang, Chuan-Chin; Wright, John; Bader, Angela M; Hyman, Bradley T; Blacker, Deborah; Crosby, Gregory

    2017-11-01

    The American College of Surgeons and the American Geriatrics Society have suggested that preoperative cognitive screening should be performed in older surgical patients. We hypothesized that unrecognized cognitive impairment in patients without a history of dementia is a risk factor for the development of postoperative complications. We enrolled 211 patients aged 65 years or older without a diagnosis of dementia who were scheduled for an elective hip or knee replacement. Patients were cognitively screened preoperatively using the Mini-Cog, and demographic, medical, functional, and emotional/social data were gathered using standard instruments or review of the medical record. Outcomes included discharge to a place other than home (primary outcome), delirium, in-hospital medical complications, hospital length of stay, 30-day emergency room visits, and mortality. Data were analyzed using univariate and multivariate analyses. Fifty of 211 (24%) patients screened positive for probable cognitive impairment (Mini-Cog less than or equal to 2). On age-adjusted multivariate analysis, patients with a Mini-Cog score less than or equal to 2 were more likely to be discharged to a place other than home (67% vs. 34%; odds ratio = 3.88, 95% CI = 1.58 to 9.55), develop postoperative delirium (21% vs. 7%; odds ratio = 4.52, 95% CI = 1.30 to 15.68), and have a longer hospital length of stay (hazard ratio = 0.63, 95% CI = 0.42 to 0.95) compared to those with a Mini-Cog score greater than 2. Many older elective orthopedic surgical patients have probable cognitive impairment preoperatively. Such impairment is associated with the development of postoperative delirium, a longer hospital stay, and a lower likelihood of going home upon hospital discharge.

  11. Benefits of exercise for older adults. A review of existing evidence and current recommendations for the general population.

    PubMed

    Elward, K; Larson, E B

    1992-02-01

    Currently available data are of variable rigor and from a variety of sources, yet they do support several conclusions about the potential value of exercise for whole groups of elderly persons. (1) Exercise of moderate intensity may benefit many elderly persons in numerous and complementary ways (e.g., cardiovascular status, fracture risk, functional ability, and mental processing). (2) There are few complications associated with such increases in activity. Indeed, a remarkable aspect of research on exercise in the elderly has been the virtual absence of reports of serious cardiovascular or musculoskeletal complications in any published trials. Cardiac rehabilitation programs, enrolling many persons over 65 years of age with known coronary artery disease, also report few major cardiovascular complications. Thus, exercise should be viewed as safe for most older adults. (3) Exercise in the elderly probably needs to be tailored, and when possible, individualized, with the specific objectives of the person or group in mind. Some benefits are probably related to the intensity of exercise (e.g., cardiovascular disease), others due to the type of exercise (weight-bearing versus nonweight-bearing for osteoporosis, or racket sports for hand-eye coordination), and still others possibly relate to the setting in which exercise occurs (social-psychological benefits). (4) The authors believe the known physiologic effects of exercise on age-related changes and existing clinical research support the general notion that vigorous weight-bearing exercises such as walking are the safest, cheapest, easiest, and most widely beneficial for the average senior. (5) Research has yet to define good ways by which to stimulate large numbers of sedentary elderly persons to exercise regularly. The potential and complementary benefits appear to be great enough to justify widespread efforts at the community and individual level, however.

  12. Impact of thrombus length on recanalization and clinical outcome following mechanical thrombectomy in acute ischemic stroke.

    PubMed

    Seker, Fatih; Pfaff, Johannes; Wolf, Marcel; Schönenberger, Silvia; Nagel, Simon; Herweh, Christian; Pham, Mirko; Bendszus, Martin; Möhlenbruch, Markus A

    2017-10-01

    The impact of thrombus length on recanalization in IV thrombolysis for acute intracranial artery occlusion has been well studied. Here we analyzed the influence of thrombus length on the number of thrombectomy maneuvers needed for recanalization, intraprocedural complications, recanalization success, and clinical outcome after mechanical thrombectomy. We retrospectively analyzed angiographic and clinical data from 72 consecutive patients with acute occlusion of the M1 segment of the middle cerebral artery who were treated with mechanical thrombectomy using stent retrievers. Successful recanalization was defined as a Thrombolysis in Cerebral Infarction score of 2b or 3. Good neurological outcome was defined as a modified Rankin Scale score of ≤2 at 90 days after stroke onset. Mean thrombus length was 13.4±5.2 mm. Univariate binary logistic regression did not show an association of thrombus length with the probability of a good clinical outcome (OR 0.95, 95% CI 0.84 to 1.03, p=0.176) or successful recanalization (OR 0.92, 95% CI 0.81 to 1.05, p=0.225). There was no significant correlation between thrombus length and the number of thrombectomy maneuvers needed for recanalization (p=0.112). Furthermore, thrombus length was not correlated with the probability of intraprocedural complications (p=0.813), including embolization in a new territory (n=3). In this study, thrombus length had no relevant impact on recanalization, neurological outcome, or intraprocedural complications following mechanical thrombectomy of middle cerebral artery occlusions. Therefore, mechanical thrombectomy with stent retrievers can be attempted with large clots.

  13. Ipsilateral femoral autograft reconstruction after resection of a pelvic tumor.

    PubMed

    Biau, David J; Thévenin, Fabrice; Dumaine, Valérie; Babinet, Antoine; Tomeno, Bernard; Anract, Philippe

    2009-01-01

    Reconstruction of bone after the resection of a pelvic tumor is challenging. The purpose of the present study was to evaluate the use of the ipsilateral femur as the graft material for reconstruction. We performed a retrospective review of thirteen patients with a malignant pelvic lesion who underwent resection followed by reconstruction with an ipsilateral femoral autograft and insertion of a total hip replacement. The study group included nine men and four women with a median age of fifty-one years at the time of the reconstruction. The diagnosis was chondrosarcoma in eight patients, metastasis in three, and myeloma and radiation-induced malignant disease in one each. The surviving patients were assessed functionally and radiographically; the cumulative probability of revision was estimated while taking into account competing risks. The median duration of follow-up was forty-nine months. At the time of the latest follow-up, seven patients were alive and disease-free and six had died from metastatic disease. Four patients had had revision of the reconstruction, two for the treatment of mechanical complications and two for the treatment of infection. Three other patients had mechanical complications but had not had a revision. The cumulative probability of revision of the reconstruction for mechanical failure was 8% (95% confidence interval, 0% to 23%), 8% (95% confidence interval, 0% to 23%), and 16% (95% confidence interval, 0% to 39%) at one, two, and four years, respectively. Although it has attendant complications consistent with pelvic tumor surgery, an ipsilateral femoral autograft reconstruction may be an option for reconstruction of pelvic discontinuity in a subgroup of patients following tumor resection. This innovative procedure requires longer-term follow-up studies.

  14. [Survival analysis of patients with pneumoconiosis from 1956 to 2010 in Changsha].

    PubMed

    Xue, Jing; Chen, Lizhang

    2012-01-01

    To investigate the survival rate and life expectancy of patients with pneumoconiosis in Changsha from 1956 to 2010 and the factors influencing them. A total of 3685 patients with pneumoconiosis were diagnosed and reported from 1956 to 2010 in Changsha. The fatality rate and life expectancy were analyzed with life tables, and the causes of death were analyzed using the Kaplan-Meier method and a Cox regression model. The death rate increased markedly with age, and age and cumulative death probability showed a linear relationship (Ŷ=1.271+0.041X, r=0.989). The life expectancy was 60.12 years. The leading cause of death in patients with pneumoconiosis was pulmonary tuberculosis. After excluding the influence of pulmonary tuberculosis, pneumoconiosis, and pulmonary heart disease, the life expectancy of patients with pneumoconiosis was extended by an average of 0.83, 0.99, and 0.02 years, respectively. The death rate of patients with pneumoconiosis complicated by tuberculosis differed significantly from that of patients without tuberculosis (P<0.01). Cox regression analysis revealed that the main risk factors for the survival of patients with pneumoconiosis were type of work (smashing workers), complication with tuberculosis, and type of pneumoconiosis (silicosis), with hazard ratios (relative risks) of 1.927, 1.749, and 1.609, respectively. Prevention of pneumoconiosis in Changsha should focus on smashing workers, while treatment should give priority to the complications of tuberculosis and lung infection.

  15. Patient- and therapy-related factors associated with the incidence of xerostomia in nasopharyngeal carcinoma patients receiving parotid-sparing helical tomotherapy

    PubMed Central

    Lee, Tsair-Fwu; Liou, Ming-Hsiang; Ting, Hui-Min; Chang, Liyun; Lee, Hsiao-Yi; Wan Leung, Stephen; Huang, Chih-Jen; Chao, Pei-Ju

    2015-01-01

    We investigated the incidence of moderate to severe patient-reported xerostomia among nasopharyngeal carcinoma (NPC) patients treated with helical tomotherapy (HT) and identified patient- and therapy-related factors associated with acute and chronic xerostomia toxicity. Normal tissue complication probability (NTCP) models based on the least absolute shrinkage and selection operator (LASSO) were developed using quality-of-life questionnaire datasets from 67 patients with NPC. For acute toxicity, the dosimetric factors of the mean doses to the ipsilateral submandibular gland (Dis) and the contralateral submandibular gland (Dcs) were selected as the first two significant predictors. For chronic toxicity, four predictive factors were selected: age, mean dose to the oral cavity (Doc), education, and T stage. These organ-sparing dose data can be used to reduce the risk of xerostomia. We suggest tolerance doses corresponding to a 20% incidence of complications (TD20) of Dis = 39.0 Gy, Dcs = 38.4 Gy, and Doc = 32.5 Gy, provided the mean doses to the parotid glands meet the QUANTEC 25 Gy sparing guideline. To avoid patient-reported xerostomia toxicity, the mean doses to the parotid glands, submandibular glands, and oral cavity have to meet these sparing tolerances, although inherent patient characteristics also need to be taken into consideration. PMID:26289304
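
    To illustrate what a TD20 means in NTCP terms, the sketch below uses a generic logistic (sigmoid) dose-response curve parameterized by TD50 and a normalized slope g50; this is not the authors' LASSO-based model, and the TD50 and g50 values are illustrative assumptions rather than fitted values from the study.

```python
# Sketch: generic logistic NTCP curve,
#   NTCP(D) = 1 / (1 + exp(-4 * g50 * (D / TD50 - 1)))
# where TD50 is the mean dose giving a 50% complication probability and g50 is
# the normalized slope at TD50. TD20 is the dose giving a 20% incidence, the
# quantity quoted in the abstract. Parameter values below are illustrative only.
import math

def ntcp(mean_dose_gy, td50_gy, g50):
    return 1.0 / (1.0 + math.exp(-4.0 * g50 * (mean_dose_gy / td50_gy - 1.0)))

def dose_for_ntcp(target, td50_gy, g50):
    """Invert the curve: mean dose at which NTCP equals `target`."""
    return td50_gy * (1.0 - math.log(1.0 / target - 1.0) / (4.0 * g50))

td50, g50 = 50.0, 1.5                     # illustrative organ-at-risk parameters
print("NTCP at a mean dose of 39 Gy:", round(ntcp(39.0, td50, g50), 3))
print("TD20:", round(dose_for_ntcp(0.20, td50, g50), 1), "Gy")
```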

  16. [Late onset, non-infectious pulmonary complications after haematological stem cell transplantation].

    PubMed

    Bergeron, A; Feuillet, S; Meignin, V; Socie, G; Tazi, A

    2008-02-01

    Non-infectious pulmonary complications, which frequently occur during the late follow-up of haemopoietic stem cell transplant (HSCT) recipients, account for increased morbidity and mortality. Different histological entities have been described, among which bronchiolitis obliterans is the most common. Because of the absence of prospective epidemiological studies and the difficulty of obtaining surgical lung biopsies from these frail patients, little is known about these conditions. Although their pathogenesis is poorly understood, they probably result from chronic pulmonary graft-versus-host disease (GVHD). The introduction of, or increase in, systemic immunosuppressive treatment, usually indicated for controlling extra-thoracic manifestations of GVHD, may lead to the resolution of an organising pneumonia but is usually ineffective in the treatment of bronchiolitis obliterans. Current prospective cohort studies, together with randomised prospective studies evaluating more targeted treatments, should help determine the frequency, risk factors, and precise characteristics of the different entities of late non-infectious pulmonary disease following HSCT, and should also improve their management. Furthermore, the recent demonstration of lung abnormalities in animal models of chronic GVHD, similar to those observed in humans, should allow a better understanding of the pathogenesis. The prevalence of these diseases is increasing throughout the world; more precise analysis, the identification of risk factors, and study of the pathophysiological mechanisms involved should allow better understanding and management than at present.

  17. Can abdominal surgical emergencies be treated in an ambulatory setting?

    PubMed

    Genser, L; Vons, C

    2015-12-01

    The performance of emergency abdominal surgery in an outpatient setting is increasingly common in France. This review evaluates the feasibility and reliability of ambulatory surgical treatment of the most common abdominal emergencies: appendectomy for acute appendicitis and cholecystectomy for acute complications of gallstone disease (acute cholecystitis and gallstone pancreatitis). The study considers surgical procedures performed on an ambulatory basis according to the international definition (admission in the morning, discharge in the evening, with a hospital stay of less than 12 hours). Just as for elective surgery, eligibility of patients for an ambulatory approach depends on the capacity of the surgical and anesthesia team to manage the risks, particularly the risk of deferring surgery until the morning, and to prevent or treat postoperative symptoms such as pain, nausea, and vomiting and to ensure early re-ambulation, in order to permit rapid postoperative discharge. Recent studies have shown that appendectomy for non-complicated acute appendicitis can be deferred for up to 12 hours without any increase in risk. Many other studies have shown that early discharge after appendectomy for acute non-complicated appendicitis is feasible and safe. Nonetheless, there is only one published series of truly ambulatory appendectomies; its results were excellent, with patients who presented in the afternoon brought back for operation the following morning. The appropriate timing of cholecystectomy in patients with acute calculous cholecystitis or gallstone pancreatitis has not been well defined, but it is always somewhat delayed relative to the onset of symptoms. To minimize operative complications, cholecystectomy for acute calculous cholecystitis should probably be performed between 24 and 72 hours after diagnosis. Cholecystectomy for gallstone pancreatitis should probably not be delayed longer than a week; the need to keep the patient hospitalized during the interval has not been demonstrated. Early discharge after cholecystectomy is usually possible, even in series where acute cholecystitis was diagnosed intra-operatively. Ambulatory cholecystectomy for acute cholecystitis and gallstone pancreatitis therefore seems feasible, but no reports specifically support this approach. In summary, emergency abdominal surgery seems to be feasible in an ambulatory setting for non-complicated acute appendicitis, acute calculous cholecystitis, and gallstone pancreatitis, although only a single French series on ambulatory appendectomy for acute appendicitis has been reported. Copyright © 2015. Published by Elsevier Masson SAS.

  18. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of hindcast data on wave height, wind speed, and current velocity in the Bohai Sea are sampled for a case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
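
    To make the copula construction concrete, the sketch below samples a bivariate Clayton copula (one of the four Archimedean families listed above) and maps the uniform margins to Pearson Type III marginals, then estimates a simple joint exceedance probability. The dependence parameter and marginal parameters are assumptions chosen for illustration, not values fitted to the Bohai Sea data, and only two of the three variables are shown.

```python
# Sketch: bivariate Clayton copula sample (wave height vs. wind speed) with
# Pearson Type III margins. All parameter values are illustrative assumptions.
import numpy as np
from scipy.stats import pearson3

rng = np.random.default_rng(0)
theta = 2.0            # Clayton dependence parameter (theta > 0)
n = 10_000

# Conditional-inversion sampler for the Clayton copula:
#   u2 = ( u1^(-theta) * (w^(-theta/(1+theta)) - 1) + 1 )^(-1/theta),  w ~ U(0,1)
u1 = rng.random(n)
w = rng.random(n)
u2 = (u1**(-theta) * (w**(-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)

# Map the uniform margins to illustrative Pearson Type III marginal distributions.
wave_height = pearson3.ppf(u1, skew=1.0, loc=2.0, scale=0.8)    # metres
wind_speed = pearson3.ppf(u2, skew=0.5, loc=12.0, scale=3.0)    # m/s

# Joint exceedance probability P(H > h and V > v) estimated from the sample.
h, v = 4.0, 18.0
p_joint = np.mean((wave_height > h) & (wind_speed > v))
print(f"P(H > {h} m and V > {v} m/s) ~ {p_joint:.4f}")
```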

  19. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low probability values is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small for any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, aimed at testing the possibility of preparing in advance the runoff coefficient tables to be used with the rational method, and a comparison between peak discharges obtained with the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
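
    The Green-Ampt component of the GABS framework can be sketched as follows: under ponded conditions the cumulative infiltration F(t) satisfies the implicit equation F − ψΔθ ln(1 + F/(ψΔθ)) = Ks t, which is readily solved by fixed-point iteration. The soil parameters below are illustrative, loam-like values, not those of the Sicilian watershed, and the kinematic wave and IDF components are omitted.

```python
# Sketch: cumulative infiltration F(t) from the Green-Ampt equation under
# ponded conditions, solved by fixed-point iteration of
#   F = Ks*t + psi*d_theta*ln(1 + F/(psi*d_theta))
# Soil parameters are illustrative (roughly loam-like) assumptions.
import math

Ks = 1.04          # saturated hydraulic conductivity [cm/h]
psi = 8.89         # wetting-front suction head [cm]
d_theta = 0.35     # moisture deficit, which depends on antecedent soil moisture [-]

def green_ampt_F(t_hours, tol=1e-8, max_iter=200):
    """Cumulative infiltration F [cm] at time t, assuming ponding from t = 0."""
    F = max(Ks * t_hours, 1e-6)    # starting guess
    for _ in range(max_iter):
        F_new = Ks * t_hours + psi * d_theta * math.log(1.0 + F / (psi * d_theta))
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

for t in (0.5, 1.0, 2.0, 4.0):
    F = green_ampt_F(t)
    rate = Ks * (1.0 + psi * d_theta / F)   # infiltration capacity f(t) [cm/h]
    print(f"t = {t:>3.1f} h   F = {F:6.2f} cm   f = {rate:5.2f} cm/h")
```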

  20. Tornado damage risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinhold, T.A.; Ellingwood, B.

    1982-09-01

    Several proposed models were evaluated for predicting tornado wind speed probabilities at nuclear plant sites as part of a program to develop statistical data on tornadoes needed for probability-based load combination analysis. A unified model was developed that synthesizes the desired aspects of tornado occurrence and damage potential. The sensitivity of wind speed probability estimates to various tornado modeling assumptions is examined, and the probability distributions of tornado wind speed needed for load combination studies are presented.
