Sample records for statistical failure analysis

  1. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
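
    The PFA structure described in this series of abstracts combines engineering failure models with parameter uncertainty to produce failure probability distributions. As a hedged illustration of that general idea only (not the PFA software documented in these volumes), the sketch below propagates assumed stress and fatigue-strength uncertainties through a simple stress/strength criterion by Monte Carlo sampling; all distributions and numerical values are hypothetical.

    ```python
    # Illustrative sketch only -- not the PFA code documented in these reports.
    # It shows the general idea of propagating parameter uncertainty through an
    # engineering failure model to a failure-probability estimate via Monte Carlo.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical uncertain inputs for a fatigue failure mode (assumed values):
    # applied stress amplitude (MPa) and material fatigue strength (MPa).
    stress = rng.normal(loc=300.0, scale=25.0, size=n)
    strength = rng.lognormal(mean=np.log(380.0), sigma=0.08, size=n)

    # Failure occurs when the modeled stress exceeds the modeled strength.
    failures = stress >= strength
    p_fail = failures.mean()
    print(f"Estimated failure probability per mission: {p_fail:.4e}")
    ```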

  2. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  3. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  4. Analysis of Emergency Diesel Generators Failure Incidents in Nuclear Power Plants

    NASA Astrophysics Data System (ADS)

    Hunt, Ronderio LaDavis

    Emergency diesel generators (EDGs) are designed to supply backup power when a plant's main source of electricity is disrupted, and in their early years of operation they exhibited a minimal rate of demand failures. More recently, EDGs at nuclear power plants (NPPs) around the United States have been failing on demand, causing either station blackouts or loss of onsite and offsite power. This thesis evaluated the concern raised in the nuclear industry by the increase from an average of one EDG demand failure per year in 1997 to an excessive event of four EDG demand failures in a single year in 2011. To estimate when such an extreme event might recur and to identify its possible causes, two analyses were conducted: a statistical analysis and a root cause analysis. In the statistical analysis, an extreme event probability approach was applied to estimate the year of the next excessive event and the probability of its occurrence. In the root cause analysis, potential causes of the excessive event were evaluated with respect to EDG manufacturer, aging, policy changes and maintenance practices, and failure components, and the correlation between the demand failure data and historical data was investigated. The statistical analysis indicated the expectation of an excessive event within a fixed range of probability, with a wider range of probability obtained from the extreme event probability approach. The demand failure data followed historical statistics for EDG manufacturer, aging, and policy changes and maintenance practices, but the failure components indicated a possible cause of the excessive event. The conclusions showed that predicting, with an acceptable confidence level, the probability and the year of the next excessive demand failure was difficult, but that this type of failure is unlikely to be a 100-year event. Notably, the majority of EDG demand failures since 2005 occurred within the main components. Based on the percentages obtained in this study, it is reasonable to state that the excessive event was caused by the overall age (wear and tear) of the emergency diesel generators in nuclear power plants. Future work will refine the return period of the excessive event, once it has occurred a second time, using the extreme event probability approach.
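
    One simple calculation consistent with the extreme-event framing above (though not necessarily the thesis's exact method) treats yearly EDG demand failures as Poisson events and asks how rare a year with four or more failures would be. The assumed mean of one failure per year is taken from the abstract; everything else is illustrative.

    ```python
    # Hedged illustration of an "extreme event" exceedance calculation:
    # if demand failures average about 1 per year, how rare is a year with 4 or more?
    from scipy.stats import poisson

    lam = 1.0                      # assumed mean demand failures per year
    p_exceed = poisson.sf(3, lam)  # P(X >= 4) = 1 - P(X <= 3)
    return_period = 1.0 / p_exceed
    print(f"P(>= 4 failures in a year) = {p_exceed:.4f}")
    print(f"Approximate return period  = {return_period:.0f} years")
    ```

    Under these assumptions the return period comes out well below a century, which is at least consistent with the abstract's conclusion that the excessive event is unlikely to be a 100-year event.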

  5. Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual

    DTIC Science & Technology

    1974-10-31

    Report number DNA 3336F-1. Title: Failure Analysis by Statistical Techniques (FAST), Volume I, User's ... SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the

  6. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  7. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  8. Statistical analysis of field data for aircraft warranties

    NASA Astrophysics Data System (ADS)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
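
    The abstract describes fitting failure distributions by maximum likelihood and checking goodness of fit. A minimal sketch of that kind of workflow, on synthetic data rather than the Air Force/Navy maintenance records, might look like this:

    ```python
    # Sketch of the distribution-fitting step described above (assumed data);
    # fits a two-parameter Weibull by maximum likelihood and runs a K-S
    # goodness-of-fit check. Not the algorithms developed in the study itself.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    times_to_failure = rng.weibull(1.8, size=200) * 1200.0  # synthetic flight hours

    # Maximum likelihood fit with the location parameter fixed at zero.
    shape, loc, scale = stats.weibull_min.fit(times_to_failure, floc=0)
    ks_stat, p_value = stats.kstest(times_to_failure, "weibull_min", args=(shape, loc, scale))

    print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")
    print(f"K-S statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
    ```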

  9. Preoperative radiation and free flap outcomes for head and neck reconstruction: a systematic review and meta-analysis.

    PubMed

    Herle, Pradyumna; Shukla, Lipi; Morrison, Wayne A; Shayan, Ramin

    2015-03-01

    There is a general consensus among reconstructive surgeons that preoperative radiotherapy is associated with a higher risk of flap failure and complications in head and neck surgery. Opinion is also divided regarding the effects of radiation dose on free flap outcomes and timing of preoperative radiation to minimize adverse outcomes. Our meta-analysis will attempt to address these issues. A systematic review of the literature was conducted in concordance to PRISMA protocol. Data were combined using STATA 12 and Open Meta-Analyst software programmes. Twenty-four studies were included comparing 2842 flaps performed in irradiated fields and 3491 flaps performed in non-irradiated fields. Meta-analysis yielded statistically significant risk ratios for flap failure (RR 1.48, P = 0.004), complications (RR 1.84, P < 0.001), reoperation (RR 2.06, P < 0.001) and fistula (RR 2.05, P < 0.001). Mean radiation dose demonstrated a trend towards increased risk of flap failure, but this was not statistically significant. On subgroup analysis, flaps with >60 Gy radiation had a non-statistically significant higher risk of flap failure (RR 1.61, P = 0.145). Preoperative radiation is associated with a statistically significant increased risk of flap complications, failure and fistula. Preoperative radiation in excess of 60 Gy after radiotherapy represents a potential risk factor for increased flap loss and should be avoided where possible. © 2014 Royal Australasian College of Surgeons.

  10. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
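    A hedged sketch of the core calculation named above: a two-parameter Weibull fast-fracture failure probability accumulated over finite elements under the principle of independent action. The element stresses, volumes, and Weibull parameters below are assumed values, and the SCARE/MSC/NASTRAN coupling is not reproduced.

    ```python
    # Minimal sketch of a two-parameter Weibull volume-flaw calculation with the
    # principle of independent action over finite elements (assumed, simplified
    # element stresses and volumes -- not the SCARE postprocessor itself).
    import numpy as np

    m = 10.0          # assumed Weibull modulus
    sigma_0 = 350.0   # assumed Weibull scale parameter, MPa * (mm^3)^(1/m)

    # Hypothetical element maximum principal stresses (MPa) and volumes (mm^3).
    stresses = np.array([210.0, 180.0, 250.0, 140.0])
    volumes = np.array([2.0, 3.5, 1.2, 4.0])

    # Element survival probabilities; component survival is their product.
    risk = volumes * (stresses / sigma_0) ** m
    p_survive = np.exp(-risk).prod()
    print(f"Component failure probability: {1.0 - p_survive:.3e}")
    ```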

  11. Statistical analysis of early failures in electromigration

    NASA Astrophysics Data System (ADS)

    Gall, M.; Capasso, C.; Jawarani, D.; Hernandez, R.; Kawasaki, H.; Ho, P. S.

    2001-07-01

    The detection of early failures in electromigration (EM) and the complicated statistical nature of this important reliability phenomenon have been difficult issues to treat in the past. A satisfactory experimental approach for the detection and the statistical analysis of early failures has not yet been established. This is mainly due to the rare occurrence of early failures and difficulties in testing of large sample populations. Furthermore, experimental data on the EM behavior as a function of varying number of failure links are scarce. In this study, a technique utilizing large interconnect arrays in conjunction with the well-known Wheatstone Bridge is presented. Three types of structures with a varying number of Ti/TiN/Al(Cu)/TiN-based interconnects were used, starting from a small unit of five lines in parallel. A serial arrangement of this unit enabled testing of interconnect arrays encompassing 480 possible failure links. In addition, a Wheatstone Bridge-type wiring using four large arrays in each device enabled simultaneous testing of 1920 interconnects. In conjunction with a statistical deconvolution to the single interconnect level, the results indicate that the electromigration failure mechanism studied here follows perfect lognormal behavior down to the four sigma level. The statistical deconvolution procedure is described in detail. Over a temperature range from 155 to 200 °C, a total of more than 75 000 interconnects were tested. None of the samples have shown an indication of early, or alternate, failure mechanisms. The activation energy of the EM mechanism studied here, namely the Cu incubation time, was determined to be Q=1.08±0.05 eV. We surmise that interface diffusion of Cu along the Al(Cu) sidewalls and along the top and bottom refractory layers, coupled with grain boundary diffusion within the interconnects, constitutes the Cu incubation mechanism.
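    The statistical deconvolution to the single-interconnect level mentioned above rests on the series-system (weakest-link) relation between an array of N nominally identical links and one link. The sketch below applies that relation to assumed array-level failure fractions and fits a lognormal at the single-line level; the numbers are illustrative, not the paper's data.

    ```python
    # Sketch of weakest-link deconvolution: recovering a single-interconnect
    # failure fraction from an array of N identical links tested together,
    # followed by a simple lognormal (probit) fit. Illustrative values only.
    import numpy as np
    from scipy import stats

    N = 480                                              # failure links per array, per the abstract
    f_array = np.array([0.01, 0.05, 0.20, 0.50, 0.80])   # assumed array-level CDF points
    t_hours = np.array([120., 180., 260., 340., 430.])   # assumed failure times

    # Deconvolve to the single-line level: F_1 = 1 - (1 - F_N)**(1/N)
    f_single = 1.0 - (1.0 - f_array) ** (1.0 / N)

    # Lognormal fit at the single-line level via linear regression on probit scale.
    z = stats.norm.ppf(f_single)
    slope, intercept = np.polyfit(np.log(t_hours), z, 1)
    sigma, t50 = 1.0 / slope, np.exp(-intercept / slope)
    print(f"single-line t50 ~ {t50:.0f} h, sigma ~ {sigma:.2f}")
    ```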

  12. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.

  13. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System.

    PubMed

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al with penetrating keratoplasty (PKP) graft failure at 1y postoperatively and among each factor in the RSS with the risk of PKP graft failure using univariate and multivariate analysis. The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. Spearman coefficient was calculated for the relationship between the score obtained and risk of failure at 1y. Univariate and multivariate analyses were calculated for the impact of every single risk factor included in the RSS on graft failure at 1y. Spearman coefficient showed statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis and lens status with graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had a previous blood transfusion; thus, it had no impact. After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y.

  14. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System

    PubMed Central

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D.; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    AIM To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al with penetrating keratoplasty (PKP) graft failure at 1y postoperatively and among each factor in the RSS with the risk of PKP graft failure using univariate and multivariate analysis. METHODS The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. Spearman coefficient was calculated for the relationship between the score obtained and risk of failure at 1y. Univariate and multivariate analyses were calculated for the impact of every single risk factor included in the RSS on graft failure at 1y. RESULTS Spearman coefficient showed statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis and lens status with graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had a previous blood transfusion; thus, it had no impact. CONCLUSION After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y. PMID:28393027

  15. PV System Component Fault and Failure Compilation and Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Gooding, Renee Lynne

    This report describes data collection and analysis of solar photovoltaic (PV) equipment events, which consist of faults and failures that occur during the normal operation of a distributed PV system or PV power plant. We present summary statistics from locations where maintenance data is being collected at various intervals, as well as reliability statistics gathered from that data, consisting of fault/failure distributions and repair distributions for a wide range of PV equipment types.

  16. Statistics-related and reliability-physics-related failure processes in electronics devices and products

    NASA Astrophysics Data System (ADS)

    Suhir, E.

    2014-05-01

    The well-known and widely used experimental reliability "passport" of a mass manufactured electronic or a photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability physics evaluations, when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations on how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability physics techniques, to model the operational reliability of electronic and photonic products.
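
    A hedged numerical sketch of the decomposition described above: an assumed Rayleigh (statistics-related) hazard is deducted from an assumed post-burn-in bathtub curve to approximate the physics-of-failure contribution. Both curves are invented for illustration and are not taken from the paper.

    ```python
    # Toy decomposition of an observed failure-rate curve into a Rayleigh
    # (statistics-related) part and a residual physics-related (wear-out) part.
    import numpy as np

    t = np.linspace(0.5, 10.0, 20)            # years in service (assumed grid)
    bathtub_rate = 0.02 + 0.004 * t**2        # assumed observed post-burn-in curve

    sigma = 8.0                               # assumed Rayleigh scale parameter
    statistical_rate = t / sigma**2           # Rayleigh hazard: h(t) = t / sigma^2

    physics_rate = np.clip(bathtub_rate - statistical_rate, 0.0, None)
    for ti, hb, hp in zip(t[::5], bathtub_rate[::5], physics_rate[::5]):
        print(f"t={ti:4.1f} y  bathtub={hb:.4f}  physics-related={hp:.4f}")
    ```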

  17. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and how the phenomenon appears in different materials are also presented. The new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. A comparison between these two results confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model established considers scaling through the traditional Black equation and four major use conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. The thesis starts with an overall review of current design types, common flows, and the necessary verification and reliability checking steps used in the IC design industry. Furthermore, the scripting automation used to integrate the diversified EDA tools in this research is described in detail with several examples, and the completed code is provided in the appendix for reference. This structure should give readers a thorough understanding of the work, from the automation of EDA tools to the statistical data generation, from the nature of EM to the construction of the statistical model, and the comparisons between the traditional and statistical EM analysis approaches.
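
    The chip-level roll-up the abstract describes can be sketched, under strong simplifying assumptions, as Black's-equation lifetimes evaluated per segment from a temperature map, converted to lognormal failure probabilities, and combined as a series system. Every constant below (prefactor, current-density exponent, activation energy, lognormal sigma) is an assumed placeholder, not a value from the thesis.

    ```python
    # Hedged sketch: per-segment Black's-equation MTTFs from a temperature map,
    # lognormal per-segment failure probabilities, and a series-system roll-up.
    import numpy as np
    from scipy import stats

    k_B = 8.617e-5                       # Boltzmann constant, eV/K
    A, n_exp, Ea = 2.5e-7, 2.0, 0.9      # assumed Black's-equation constants
    sigma = 0.4                          # assumed lognormal shape parameter
    t_use = 10 * 365 * 24.0              # ten-year use life, hours

    # Hypothetical per-segment current densities (MA/cm^2) and temperatures (K)
    # read from a temperature map; a single-temperature run would use one T value.
    j = np.array([1.2, 0.8, 1.5, 1.0])
    T = np.array([358.0, 371.0, 378.0, 366.0])

    mttf = A * j ** (-n_exp) * np.exp(Ea / (k_B * T))        # Black's equation
    f_seg = stats.lognorm.cdf(t_use, s=sigma, scale=mttf)    # per-segment failure CDF
    p_chip = 1.0 - np.prod(1.0 - f_seg)                      # weakest-link roll-up
    print(f"Chip-level EM failure probability at 10 years: {p_chip:.2e}")
    ```

    With these placeholder numbers the hottest segment dominates the chip-level probability, which is the qualitative reason a realistic temperature map changes the risk estimate relative to a single uniform temperature.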

  18. Study of deformation evolution during failure of rock specimens using laser-based vibration measurements

    NASA Astrophysics Data System (ADS)

    Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.

    2017-12-01

    The aim of the paper is to analyze experimental data on the dynamic response of a marble specimen in uniaxial compression. To do so, we use methods of mathematical statistics. The lateral surface velocity evolution obtained by the laser Doppler vibrometer represents the data for analysis. The registered data were regarded as a time series that reflects the deformation evolution of the specimen loaded up to failure. The changes revealed in statistical parameters were considered precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects the states of dynamic chaos and self-organized criticality.
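
    One statistic of the kind tracked in such analyses is the lag-1 autocorrelation of the velocity record computed in sliding windows; its growth toward failure is the sort of precursor the abstract describes. The sketch below uses a synthetic signal in place of the vibrometer data and is not the authors' procedure.

    ```python
    # Toy precursor analysis: lag-1 autocorrelation of a synthetic "velocity"
    # time series whose correlation grows toward the end of loading.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 5000
    noise = rng.normal(size=n)
    trend = np.linspace(0.0, 0.9, n)      # AR coefficient increasing with "load"
    signal = np.empty(n)
    signal[0] = noise[0]
    for i in range(1, n):
        signal[i] = trend[i] * signal[i - 1] + noise[i]

    def lag1_autocorr(x):
        x = x - x.mean()
        return np.dot(x[:-1], x[1:]) / np.dot(x, x)

    window = 500
    for start in range(0, n - window + 1, 1500):
        r1 = lag1_autocorr(signal[start:start + window])
        print(f"samples {start:4d}-{start + window:4d}: lag-1 autocorrelation = {r1:.2f}")
    ```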

  19. Statistical analysis of cascading failures in power grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chertkov, Michael; Pfitzner, Rene; Turitsyn, Konstantin

    2010-12-01

    We introduce a new microscopic model of cascading failures in transmission power grids. This model accounts for automatic response of the grid to load fluctuations that take place on the scale of minutes, when optimum power flow adjustments and load shedding controls are unavailable. We describe extreme events, caused by load fluctuations, which cause cascading failures of loads, generators and lines. Our model is quasi-static in the causal, discrete time and sequential resolution of individual failures. The model, in its simplest realization based on the Direct Current description of the power flow problem, is tested on three standard IEEE systems consisting of 30, 39 and 118 buses. Our statistical analysis suggests a straightforward classification of cascading and islanding phases in terms of the ratios between average number of removed loads, generators and links. The analysis also demonstrates sensitivity to variations in line capacities. Future research challenges in modeling and control of cascading outages over real-world power networks are discussed.

  20. Failure Mode Identification Through Clustering Analysis

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Research has shown that nearly 80% of the costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as FMEA (Failure Modes and Effects Analysis), FMECA (Failure Modes, Effects and Criticality Analysis) and FTA (Fault Tree Analysis)) and design of experiments are being used for quality control and for the detection of potential failure modes during the detail design stage or post-product launch. Though all of these methods have their own advantages, they do not indicate which predominant failures a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, which hypothesizes that similarities exist between different failure modes based on the functionality of the product/component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
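
    As a loose illustration of the clustering idea only (not the authors' dataset, features, or clustering algorithm), the sketch below groups hypothetical product functions by their failure-mode count profiles with k-means.

    ```python
    # Toy function-by-failure-mode clustering; all names and counts are made up.
    import numpy as np
    from sklearn.cluster import KMeans

    functions = ["transfer torque", "seal fluid", "convert energy", "guide motion"]
    failure_modes = ["fatigue", "wear", "corrosion", "fracture"]

    # Hypothetical counts of reported failure modes per product function.
    counts = np.array([
        [12,  3, 1,  9],
        [ 1, 10, 8,  2],
        [ 2,  9, 7,  1],
        [11,  4, 2, 10],
    ])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(counts)
    for func, lab in zip(functions, labels):
        print(f"{func:16s} -> cluster {lab}")
    ```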

  1. Graft survival of diabetic versus nondiabetic donor tissue after initial keratoplasty.

    PubMed

    Vislisel, Jesse M; Liaboe, Chase A; Wagoner, Michael D; Goins, Kenneth M; Sutphin, John E; Schmidt, Gregory A; Zimmerman, M Bridget; Greiner, Mark A

    2015-04-01

    To compare corneal graft survival using tissue from diabetic and nondiabetic donors in patients undergoing initial Descemet stripping automated endothelial keratoplasty (DSAEK) or penetrating keratoplasty (PKP). A retrospective chart review of pseudophakic eyes that underwent DSAEK or PKP was performed. The primary outcome measure was graft failure. Cox proportional hazard regression and Kaplan-Meier survival analyses were used to compare diabetic versus nondiabetic donor tissue for all keratoplasty cases. A total of 183 eyes (136 DSAEK, 47 PKP) were included in the statistical analysis. Among 24 procedures performed using diabetic donor tissue, there were 4 cases (16.7%) of graft failure (3 DSAEK, 1 PKP), and among 159 procedures performed using nondiabetic donor tissue, there were 18 cases (11.3%) of graft failure (12 DSAEK, 6 PKP). Cox proportional hazard ratio of graft failure for all cases comparing diabetic with nondiabetic donor tissue was 1.69, but this difference was not statistically significant (95% confidence interval, 0.56-5.06; P = 0.348). There were no significant differences in Kaplan-Meier curves comparing diabetic with nondiabetic donor tissue for all cases (P = 0.380). Statistical analysis of graft failure by donor diabetes status within each procedure type was not possible because of the small number of graft failure events involving diabetic tissue. We found similar rates of graft failure in all keratoplasty cases when comparing tissue from diabetic and nondiabetic donors, but further investigation is needed to determine whether diabetic donor tissue results in different graft failure rates after DSAEK compared with PKP.

  2. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.

  3. Two-sample statistics for testing the equality of survival functions against improper semi-parametric accelerated failure time alternatives: an application to the analysis of a breast cancer clinical trial.

    PubMed

    Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry

    2004-06-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.

  4. Mechanistic Considerations Used in the Development of the PROFIT PCI Failure Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pankaskie, P. J.

    A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) Interactions (PCI) failure model for estimating the probability of failure in transient increases in power (PROFIT) was developed. PROFIT is based on 1) standard statistical methods applied to available PCI fuel failure data and 2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate dependent strain energy absorption to failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding. Assuming that the power ramping rate is the operating corollary of strain-rate in the Zircaloy cladding, then the variables of first order importance in the PCI fuel failure phenomenon are postulated to be: 1. pre-transient fuel rod power, P_I, 2. transient increase in fuel rod power, ΔP, 3. fuel burnup, Bu, and 4. the constitutive material property of the Zircaloy cladding, SEAF.

  5. Progressive Failure And Life Prediction of Ceramic and Textile Composites

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Shi, Yucheng; Katikala, Madhu; Johnston, William M., Jr.; Card, Michael F.

    1998-01-01

    An engineering approach to predict the fatigue life and progressive failure of multilayered composite and textile laminates is presented. Analytical models which account for matrix cracking, statistical fiber failures and nonlinear stress-strain behavior have been developed for both composites and textiles. The analysis method is based on a combined micromechanics, fracture mechanics and failure statistics analysis. Experimentally derived empirical coefficients are used to account for the interface of fiber and matrix, fiber strength, and fiber-matrix stiffness reductions. Similar approaches were applied to textiles using Repeating Unit Cells. In composite fatigue analysis, Walker's equation is applied for matrix fatigue cracking and Heywood's formulation is used for fiber strength fatigue degradation. The analysis has been compared with experiment with good agreement. Comparisons were made with Graphite-Epoxy, C/SiC and Nicalon/CAS composite materials. For textile materials, comparisons were made with triaxial braided and plain weave materials under biaxial or uniaxial tension. Fatigue predictions were compared with test data obtained from plain weave C/SiC materials tested at AS&M. Computer codes were developed to perform the analysis. Composite Progressive Failure Analysis for Laminates is contained in the code CPFail. Micromechanics Analysis for Textile Composites is contained in the code MicroTex. Both codes were adapted to run as subroutines for the finite element code ABAQUS, as CPFail-ABAQUS and MicroTex-ABAQUS. A graphical user interface (GUI) was developed to connect CPFail and MicroTex with ABAQUS.

  6. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
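
    A minimal sketch of the RISMC-style comparison and the frequentist interval discussed above: estimate P(load >= capacity) by Monte Carlo under assumed load and capacity distributions, and attach a Clopper-Pearson confidence interval to see how statistical accuracy improves with the number of simulations. The distributions and values are placeholders, and this is one common interval choice rather than necessarily the report's.

    ```python
    # Monte Carlo load/capacity failure probability with a Clopper-Pearson
    # (exact binomial) confidence interval; all distributions are assumed.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def failure_prob_with_ci(n_sims, alpha=0.05):
        load = rng.normal(1800.0, 120.0, n_sims)       # e.g. peak temperature, K (assumed)
        capacity = rng.normal(2200.0, 60.0, n_sims)    # capacity limit, K (assumed)
        k = int(np.sum(load >= capacity))
        p_hat = k / n_sims
        lo = stats.beta.ppf(alpha / 2, k, n_sims - k + 1) if k > 0 else 0.0
        hi = stats.beta.ppf(1 - alpha / 2, k + 1, n_sims - k)
        return p_hat, lo, hi

    for n in (1_000, 10_000, 100_000):
        p, lo, hi = failure_prob_with_ci(n)
        print(f"n={n:>7d}: p_fail={p:.2e}  95% CI=({lo:.2e}, {hi:.2e})")
    ```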

  7. Blowout Prevention System Events and Equipment Component Failures : 2016 SafeOCS Annual Report

    DOT National Transportation Integrated Search

    2017-09-22

    The SafeOCS 2016 Annual Report, produced by the Bureau of Transportation Statistics (BTS), summarizes blowout prevention (BOP) equipment failures on marine drilling rigs in the Outer Continental Shelf. It includes an analysis of equipment component f...

  8. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  9. Risk management of key issues of FPSO

    NASA Astrophysics Data System (ADS)

    Sun, Liping; Sun, Hai

    2012-12-01

    Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of the floating production, storage and offloading (FPSO) unit. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on information already researched on modules of Relex Reliability Studio (RRS). Equipment failures were also analyzed qualitatively by establishing a fault tree and a Boolean structure function, given the shortage of failure cases and statistical data, and risk control measures were examined. Failure modes of fire accidents were classified according to the different areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative FTA gave basic insight into how the failure modes of FPSO offloading form, and the fire FMEA gave the priorities and suggested processes. The research has practical importance for FPSO security analysis problems.

  10. Bruxism and dental implant failures: a multilevel mixed effects parametric survival analysis approach.

    PubMed

    Chrcanovic, B R; Kisch, J; Albrektsson, T; Wennerberg, A

    2016-11-01

    Recent studies have suggested that the insertion of dental implants in patients diagnosed with bruxism negatively affected the implant failure rates. The aim of the present study was to investigate the association between bruxism and the risk of dental implant failure. This retrospective study is based on 2670 patients who received 10 096 implants at one specialist clinic. Implant- and patient-related data were collected. Descriptive statistics were used to describe the patients and implants. Multilevel mixed effects parametric survival analysis was used to test the association between bruxism and risk of implant failure adjusting for several potential confounders. Criteria from a recent international consensus (Lobbezoo et al., J Oral Rehabil, 40, 2013, 2) and from the International Classification of Sleep Disorders (International classification of sleep disorders, revised: diagnostic and coding manual, American Academy of Sleep Medicine, Chicago, 2014) were used to define and diagnose the condition. The number of implants with information available for all variables totalled 3549, placed in 994 patients, with 179 implants reported as failures. The implant failure rates were 13·0% (24/185) for bruxers and 4·6% (155/3364) for non-bruxers (P < 0·001). The statistical model showed that bruxism was a statistically significant risk factor for implant failure (HR 3·396; 95% CI 1·314, 8·777; P = 0·012), as well as implant length, implant diameter, implant surface, bone quantity D in relation to quantity A, bone quality 4 in relation to quality 1 (Lekholm and Zarb classification), smoking and the intake of proton pump inhibitors. It is suggested that bruxism may be associated with an increased risk of dental implant failure. © 2016 John Wiley & Sons Ltd.

  11. Two-Sample Statistics for Testing the Equality of Survival Functions Against Improper Semi-parametric Accelerated Failure Time Alternatives: An Application to the Analysis of a Breast Cancer Clinical Trial

    PubMed Central

    BROËT, PHILIPPE; TSODIKOV, ALEXANDER; DE RYCKE, YANN; MOREAU, THIERRY

    2010-01-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627

  12. Quantifying the added value of BNP in suspected heart failure in general practice: an individual patient data meta-analysis.

    PubMed

    Kelder, Johannes C; Cowie, Martin R; McDonagh, Theresa A; Hardman, Suzanna M C; Grobbee, Diederick E; Cost, Bernard; Hoes, Arno W

    2011-06-01

    Diagnosing early stages of heart failure with mild symptoms is difficult. B-type natriuretic peptide (BNP) has promising biochemical test characteristics, but its diagnostic yield on top of readily available diagnostic knowledge has not been sufficiently quantified in early stages of heart failure. To quantify the added diagnostic value of BNP for the diagnosis of heart failure in a population relevant to GPs and validate the findings in an independent primary care patient population. Individual patient data meta-analysis followed by external validation. The additional diagnostic yield of BNP above standard clinical information was compared with ECG and chest x-ray results. Derivation was performed on two existing datasets from Hillingdon (n=127) and Rotterdam (n=149) while the UK Natriuretic Peptide Study (n=306) served as validation dataset. Included were patients with suspected heart failure referred to a rapid-access diagnostic outpatient clinic. Case definition was according to the ESC guideline. Logistic regression was used to assess discrimination (with the c-statistic) and calibration. Of the 276 patients in the derivation set, 30.8% had heart failure. The clinical model (encompassing age, gender, known coronary artery disease, diabetes, orthopnoea, elevated jugular venous pressure, crackles, pitting oedema and S3 gallop) had a c-statistic of 0.79. Adding, respectively, chest x-ray results, ECG results or BNP to the clinical model increased the c-statistic to 0.84, 0.85 and 0.92. Neither ECG nor chest x-ray added significantly to the 'clinical plus BNP' model. All models had adequate calibration. The 'clinical plus BNP' diagnostic model performed well in an independent cohort with comparable inclusion criteria (c-statistic=0.91 and adequate calibration). Using separate cut-off values for 'ruling in' (typically implying referral for echocardiography) and for 'ruling out' heart failure--creating a grey zone--resulted in insufficient proportions of patients with a correct diagnosis. BNP has considerable diagnostic value in addition to signs and symptoms in patients suspected of heart failure in primary care. However, using BNP alone with the currently recommended cut-off levels is not sufficient to make a reliable diagnosis of heart failure.
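
    The discrimination measure quoted above, the c-statistic, is the area under the ROC curve of a model's predicted probabilities. A toy computation with made-up outcomes and predictions (not the study's data) shows the kind of comparison reported:

    ```python
    # c-statistic (ROC AUC) comparison between two hypothetical diagnostic models.
    from sklearn.metrics import roc_auc_score

    # 1 = heart failure confirmed at the clinic, 0 = heart failure excluded.
    outcome        = [1, 0, 0, 1, 1, 0, 0, 1, 0, 1]
    p_clinical     = [0.62, 0.20, 0.52, 0.38, 0.70, 0.15, 0.45, 0.48, 0.30, 0.66]
    p_clinical_bnp = [0.80, 0.12, 0.65, 0.60, 0.88, 0.08, 0.33, 0.69, 0.22, 0.83]

    print(f"c-statistic, clinical model:       {roc_auc_score(outcome, p_clinical):.2f}")
    print(f"c-statistic, clinical + BNP model: {roc_auc_score(outcome, p_clinical_bnp):.2f}")
    ```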

  13. rpsftm: An R Package for Rank Preserving Structural Failure Time Models

    PubMed Central

    Allison, Annabel; White, Ian R; Bond, Simon

    2018-01-01

    Treatment switching in a randomised controlled trial occurs when participants change from their randomised treatment to the other trial treatment during the study. Failure to account for treatment switching in the analysis (i.e. by performing a standard intention-to-treat analysis) can lead to biased estimates of treatment efficacy. The rank preserving structural failure time model (RPSFTM) is a method used to adjust for treatment switching in trials with survival outcomes. The RPSFTM is due to Robins and Tsiatis (1991) and has been developed by White et al. (1997, 1999). The method is randomisation based and uses only the randomised treatment group, observed event times, and treatment history in order to estimate a causal treatment effect. The treatment effect, ψ, is estimated by balancing counter-factual event times (that would be observed if no treatment were received) between treatment groups. G-estimation is used to find the value of ψ such that a test statistic Z(ψ) = 0. This is usually the test statistic used in the intention-to-treat analysis, for example, the log rank test statistic. We present an R package that implements the method of rpsftm. PMID:29564164

  14. rpsftm: An R Package for Rank Preserving Structural Failure Time Models.

    PubMed

    Allison, Annabel; White, Ian R; Bond, Simon

    2017-12-04

    Treatment switching in a randomised controlled trial occurs when participants change from their randomised treatment to the other trial treatment during the study. Failure to account for treatment switching in the analysis (i.e. by performing a standard intention-to-treat analysis) can lead to biased estimates of treatment efficacy. The rank preserving structural failure time model (RPSFTM) is a method used to adjust for treatment switching in trials with survival outcomes. The RPSFTM is due to Robins and Tsiatis (1991) and has been developed by White et al. (1997, 1999). The method is randomisation based and uses only the randomised treatment group, observed event times, and treatment history in order to estimate a causal treatment effect. The treatment effect, ψ, is estimated by balancing counter-factual event times (that would be observed if no treatment were received) between treatment groups. G-estimation is used to find the value of ψ such that a test statistic Z(ψ) = 0. This is usually the test statistic used in the intention-to-treat analysis, for example, the log rank test statistic. We present an R package that implements the method of rpsftm.
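
    A much-simplified sketch of the G-estimation idea behind the RPSFTM, using synthetic data, no censoring, and a plain difference-in-means statistic in place of the log-rank statistic the rpsftm package would normally use; the counter-factual transformation U(ψ) = T_off + exp(ψ)·T_on follows the description above.

    ```python
    # Toy RPSFTM-style G-estimation: find psi that balances counter-factual
    # event times between arms. Synthetic data, no censoring, simple statistic.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 2000
    arm = rng.integers(0, 2, n)          # 0 = control, 1 = experimental
    true_psi = -0.5                      # exp(psi) shrinks treated time back to untreated scale

    u = rng.exponential(12.0, n)         # untreated (counter-factual) event times

    # Experimental arm: on treatment from randomisation.
    # Control arm: half switch to treatment part-way through follow-up.
    switch = (arm == 0) & (rng.random(n) < 0.5)
    s = rng.random(n) * u                # untreated time at which switchers cross over

    t_off = np.where(arm == 1, 0.0, np.where(switch, s, u))
    t_on = np.where(arm == 1, u * np.exp(-true_psi),
                    np.where(switch, (u - s) * np.exp(-true_psi), 0.0))

    def z_stat(psi):
        # Counter-factual times under candidate psi; compare arms (randomisation-based).
        u_hat = t_off + np.exp(psi) * t_on
        return u_hat[arm == 1].mean() - u_hat[arm == 0].mean()

    psis = np.linspace(-1.5, 0.5, 401)
    zs = np.array([z_stat(p) for p in psis])
    psi_hat = psis[np.argmin(np.abs(zs))]    # psi where the statistic crosses zero
    print(f"G-estimate of psi: {psi_hat:.2f}  (true value {true_psi})")
    ```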

  15. Acoustic emission spectral analysis of fiber composite failure mechanisms

    NASA Technical Reports Server (NTRS)

    Egan, D. M.; Williams, J. H., Jr.

    1978-01-01

    The acoustic emission of graphite fiber polyimide composite failure mechanisms was investigated with emphasis on frequency spectrum analysis. Although visual examination of spectral densities could not distinguish among fracture sources, a paired-sample t statistical analysis of mean normalized spectral densities did provide quantitative discrimination among acoustic emissions from 10 deg, 90 deg, ±45 deg, and (±45 deg)s specimens. Comparable discrimination was not obtained for 0 deg specimens.

  16. [Actuarial analysis of time-failure data and its relevance for interpretation of results. Audit of the journal "Strahlentherapie und Onkologie" (Radiotherapy and Oncology)].

    PubMed

    Dubben, H H; Beck-Bornholdt, H P

    2000-12-01

    The statistical quality of the contributions to "Strahlentherapie und Onkologie" is assessed, aiming for improvement of the journal and consequently its impact factor. All 181 articles published during 1998 and 1999 in the categories "review", "original contribution", and "short communication" were analyzed concerning actuarial analysis of time-failure data. One hundred and twenty-three publications without time-failure data were excluded from analysis. Forty-five of the remaining 58 publications with time-failure data were evaluated actuarially. This corresponds to 78% (95% confidence interval: 64 to 88%) of papers in which data were adequately analyzed. Complications were reported in 16 of 58 papers, but in only 3 cases actuarially. The number of patients at risk during the course of follow-up was documented adequately in 22 of the 45 publications with actuarial analysis. Authors, peer reviewers, and editors could contribute to improving the quality of the journal by setting value on actuarial analysis of time-failure data.

  17. Influence of Endodontic Treatment and Retreatment on the Fatigue Failure Load, Numbers of Cycles for Failure, and Survival Rates of Human Canine Teeth.

    PubMed

    Missau, Taiane; De Carlo Bello, Mariana; Michelon, Carina; Mastella Lang, Pauline; Kalil Pereira, Gabriel; Baldissara, Paolo; Valandro, Luiz Felipe; Souza Bier, Carlos Alexandre; Pivetta Rippe, Marília

    2017-12-01

    This study evaluated the effects of endodontic treatment and retreatment on the fatigue failure load, numbers of cycles for failure, and survival rates of canine teeth. Sixty extracted canine teeth, each with a single root canal, were selected and randomly divided into 4 groups (n = 15): untreated, teeth without endodontic intervention; prepared, teeth subjected only to rotary instrumentation; filled, teeth receiving complete endodontic treatment; and retreated, teeth retreated endodontically. After the different endodontic interventions, the specimens were subjected to fatigue testing by the stepwise method: 200 N (× 5000 load pulses), 300 N, 400 N, 500 N, 600 N, 800 N, and 900 N at a maximum of 30,000 load pulses each or the occurrence of fracture. Data from load to failure and numbers of cycles for fracture were recorded and subjected to Kaplan-Meier and Log Rank tests (P < .05), in addition to Weibull analysis. The fractures of the specimens were classified as repairable or catastrophic. The retreated, filled, and untreated groups presented statistically significantly higher fatigue failure loads and numbers of cycles for failure than did the prepared group. Weibull analysis showed no statistically significant difference among the treatments for characteristic load to failure and characteristic number of cycles for failure, although, for number of cycles, a higher Weibull modulus was observed in filled and retreated conditions. The predominant mode of failure was catastrophic. Teeth subjected to complete endodontic treatment and retreatment behaved similarly in terms of fatigue failure load and number of cycles to failure when compared with untreated teeth. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  18. An efficient scan diagnosis methodology according to scan failure mode for yield enhancement

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok

    2008-12-01

    Yield has always been a driving consideration in modern semiconductor fabrication. Statistically, the largest portion of wafer yield loss is due to scan failures. This paper presents efficient failure analysis methods, based on scan diagnosis, for initial yield ramp-up and ongoing products. Our analysis shows that more than 60% of scan failure dies fall into the category of shift mode in very deep submicron (VDSM) devices. However, localization of a scan shift mode failure is much more difficult than for a capture mode failure because it is caused by a malfunction of the scan chain itself. Addressing this challenge, we propose the most suitable analysis method for each scan failure mode (capture/shift) for yield enhancement. For the capture failure mode, this paper describes a method that integrates the scan diagnosis flow with backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution, efficient backside probing, and a signal analysis method. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of the shift failure mode. The combination of the two methods contributes to yield enhancement. We confirm the failure candidates with physical failure analysis (PFA). The direct feedback from defect visualization is useful for bringing devices to mass production in a shorter time. Experimental data on mass products show that our approach reduces defective SCAN & SRAM-BIST failure rates by an average of 13.7% and improves wafer yield by 18.2%.

  19. Reliability Analysis of the Gradual Degradation of Semiconductor Devices.

    DTIC Science & Technology

    1983-07-20

    The report analyzes the gradual degradation of semiconductor devices; the literature on linear statistical models is not used. Assuming a catastrophic failure model, the system loss formula is first modified and the analysis then proceeds. Observed failure times are analyzed by simple linear regression under an assumed log-normal/Arrhenius activation model.

  20. Small sample estimation of the reliability function for technical products

    NASA Astrophysics Data System (ADS)

    Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.

    2017-12-01

    It is demonstrated that, in the absence of big statistic samples obtained as a result of testing complex technical products for failure, statistic estimation of the reliability function of initial elements can be made by the moments method. A formal description of the moments method is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for the implementation of the moments method with the use of only the moments at which the failures of initial elements occur.
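
    The paper's modified moments algorithm for censored samples is not reproduced here; as a baseline, the sketch below shows the ordinary method of moments for a two-parameter Weibull reliability function fitted to a complete (uncensored) sample. The sample values and function names are hypothetical.

    ```python
    import numpy as np
    from math import gamma
    from scipy.optimize import brentq

    def weibull_moments_fit(x):
        """Method-of-moments fit of a 2-parameter Weibull.

        Matches the sample mean and coefficient of variation to the
        theoretical moments; returns (shape k, scale lam).
        """
        x = np.asarray(x, dtype=float)
        cv = x.std(ddof=1) / x.mean()

        # theoretical CV of Weibull(k): sqrt(G(1+2/k)/G(1+1/k)^2 - 1)
        def cv_gap(k):
            g1, g2 = gamma(1 + 1 / k), gamma(1 + 2 / k)
            return np.sqrt(g2 / g1**2 - 1.0) - cv

        k = brentq(cv_gap, 0.1, 20.0)        # solve for the shape parameter
        lam = x.mean() / gamma(1 + 1 / k)    # scale from the sample mean
        return k, lam

    def reliability(t, k, lam):
        return np.exp(-(np.asarray(t) / lam) ** k)

    # Hypothetical small sample of times to failure (hours)
    sample = [120.0, 150.0, 180.0, 200.0, 260.0, 310.0]
    k, lam = weibull_moments_fit(sample)
    print(k, lam, reliability([100, 200, 300], k, lam))
    ```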

  1. Analysis of Loss-of-Offsite-Power Events 1997-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Nancy Ellen; Schroeder, John Alton

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  2. Statistical analysis of lithium iron sulfide status cell cycle life and failure mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, E.C.; Battles, J.E.; Miller, W.E.

    1983-08-01

    A statistical model for the cycle-life testing of electrochemical cells was developed and verified experimentally. The Weibull distribution was selected to predict the end of life for a cell, based on a 20 percent loss of initial stabilized capacity or a decrease to less than 95 percent coulombic efficiency. Groups of 12 or more Li-alloy/FeS cells were cycled to determine the mean time to failure (MTTF) and also to identify the failure modes. The cells were all full-size electric vehicle cells with 150-350 A-hr capacity. The Weibull shape factors were determined and verified in the prediction of the number of cell failures in two 10-cell modules. The short-circuit failures in the cells with BN-felt and MgO powder separators were found to be caused by the formation of Li-Al protrusions that penetrated the BN-felt separators, and by the extrusion of active material at the edge of the electrodes.
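
    A minimal sketch of the kind of Weibull life analysis described above, assuming hypothetical cycles-to-failure data and using SciPy's maximum-likelihood fit rather than the authors' procedure; the MTTF follows from the fitted shape and scale.

    ```python
    import numpy as np
    from scipy.stats import weibull_min
    from scipy.special import gamma

    # Hypothetical cycles-to-failure for a group of identical cells
    cycles = np.array([310, 420, 455, 500, 540, 610, 640, 700, 760, 880, 910, 1020])

    # Fit a 2-parameter Weibull (location fixed at zero)
    shape, loc, scale = weibull_min.fit(cycles, floc=0)

    # Mean cycles to failure for a Weibull distribution
    mttf = scale * gamma(1.0 + 1.0 / shape)

    # Probability of surviving 500 cycles under the fitted model
    r_500 = weibull_min.sf(500, shape, loc=0, scale=scale)

    print(f"shape={shape:.2f}, scale={scale:.0f}, MTTF={mttf:.0f}, R(500)={r_500:.2f}")
    ```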

  3. Analysis of Failures of High Speed Shaft Bearing System in a Wind Turbine

    NASA Astrophysics Data System (ADS)

    Wasilczuk, Michał; Gawarkiewicz, Rafał; Bastian, Bartosz

    2018-01-01

    During the operation of wind turbines with a gearbox of traditional configuration, consisting of one planetary stage and two helical stages, a high failure rate of high speed shaft bearings is observed. Such a high failure frequency is not reflected in the results of standard calculations of bearing durability. Most probably it can be attributed to an atypical failure mechanism. The authors studied these problems in 1.5 MW wind turbines at one of Poland's wind farms. The analysis showed that problems of high failure rate are commonly met all over the world and that the statistics for the analysed turbines were very similar. After a study of the potential failure mechanism and its possible causes, a modification of the existing bearing system was proposed. Various options with different bearing types were investigated, and the different versions were examined for expected durability increase, the extent of necessary gearbox modifications, and the possibility of solving existing problems in operation.

  4. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2006-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  5. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  6. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    PubMed

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
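
    A hedged sketch of the model comparison described above, using scikit-learn on synthetic data in place of the trial data; the C statistic for a binary outcome is computed as the ROC AUC. Feature counts, sample sizes, and hyperparameters are illustrative only.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in for trial data: 30-day readmission as a binary outcome
    X, y = make_classification(n_samples=2000, n_features=40, n_informative=10,
                               weights=[0.75, 0.25], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        p = model.predict_proba(X_te)[:, 1]
        # The C statistic for a binary outcome equals the ROC AUC
        print(f"{name}: C statistic = {roc_auc_score(y_te, p):.3f}")
    ```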

  7. SU-F-R-20: Image Texture Features Correlate with Time to Local Failure in Lung SBRT Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, M; Abazeed, M; Woody, N

    Purpose: To explore possible correlation between CT image-based texture and histogram features and time-to-local-failure in early stage non-small cell lung cancer (NSCLC) patients treated with stereotactic body radiotherapy (SBRT). Methods and Materials: From an IRB-approved lung SBRT registry for patients treated between 2009–2013 we selected 48 (20 male, 28 female) patients with local failure. Median patient age was 72.3±10.3 years. Mean time to local failure was 15 ± 7.1 months. Physician-contoured gross tumor volumes (GTV) on the planning CT images were processed and 3D gray-level co-occurrence matrix (GLCM) based texture and histogram features were calculated in Matlab. Data were exported to R and a multiple linear regression model was used to examine the relationship between texture features and time-to-local-failure. Results: Multiple linear regression revealed that entropy (p=0.0233, multiple R2=0.60) from GLCM-based texture analysis and the standard deviation (p=0.0194, multiple R2=0.60) from the histogram-based features were statistically significantly correlated with the time-to-local-failure. Conclusion: Image-based texture analysis can be used to predict certain aspects of treatment outcomes of NSCLC patients treated with SBRT. We found that entropy and standard deviation calculated for the GTV on the CT images displayed a statistically significant correlation with time-to-local-failure in lung SBRT patients.
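
    A minimal, pure-NumPy sketch of the two features the study found predictive, GLCM entropy and the histogram standard deviation, computed on a hypothetical 2D image patch; the original work used Matlab and 3D GLCMs, so this is only an illustration of the idea.

    ```python
    import numpy as np

    def glcm_entropy(img, levels=16):
        """Entropy of a gray-level co-occurrence matrix (offset = 1 pixel right)."""
        # Quantize intensities to a fixed number of gray levels
        q = np.digitize(img, np.linspace(img.min(), img.max(), levels + 1)[1:-1])
        glcm = np.zeros((levels, levels))
        for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
            glcm[a, b] += 1
        p = glcm / glcm.sum()
        nz = p[p > 0]
        return -np.sum(nz * np.log2(nz))

    rng = np.random.default_rng(0)
    tumor_slice = rng.normal(40.0, 15.0, size=(64, 64))   # hypothetical CT patch (HU)

    entropy = glcm_entropy(tumor_slice)
    hist_sd = tumor_slice.std()                           # histogram standard deviation
    print(entropy, hist_sd)

    # Features like these would then enter a multiple linear regression
    # against time-to-local-failure, e.g. with numpy.linalg.lstsq.
    ```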

  8. Effectiveness of Quantitative Real Time PCR in Long-Term Follow-up of Chronic Myeloid Leukemia Patients.

    PubMed

    Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin

    2015-08-01

    To determine the use of the Quantitative Real Time PCR (RQ-PCR) assay in the follow-up of Chronic Myeloid Leukemia (CML) patients. Cross-sectional observational study. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH, and RQ-PCR test results from the material of 177 CML patients, collected between 2009 and 2013, were selected for comparative analysis. Statistical analysis was performed to compare the FISH, karyotype, and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis compared with RQ-PCR results. The chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6 - 98% for karyotyping (p=0.118, p > 0.05) and 22.5 - 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, but a strong correlation was found between them (p < 0.001). The RQ-PCR test failure rate did not correlate with that of the other two tests (p > 0.05); however, the correlation between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Apart from situations requiring karyotype analysis, the RQ-PCR assay can be used alone in the follow-up of CML.

  9. Is psychology suffering from a replication crisis? What does "failure to replicate" really mean?

    PubMed

    Maxwell, Scott E; Lau, Michael Y; Howard, George S

    2015-09-01

    Psychology has recently been viewed as facing a replication crisis because efforts to replicate past study findings frequently do not show the same result. Often, the first study showed a statistically significant result but the replication does not. Questions then arise about whether the first study results were false positives, and whether the replication study correctly indicates that there is truly no effect after all. This article suggests these so-called failures to replicate may not be failures at all, but rather are the result of low statistical power in single replication studies, and the result of failure to appreciate the need for multiple replications in order to have enough power to identify true effects. We provide examples of these power problems and suggest some solutions using Bayesian statistics and meta-analysis. Although the need for multiple replication studies may frustrate those who would prefer quick answers to psychology's alleged crisis, the large sample sizes typically needed to provide firm evidence will almost always require concerted efforts from multiple investigators. As a result, it remains to be seen how many of the recently claimed failures to replicate will be supported or instead may turn out to be artifacts of inadequate sample sizes and single study replications. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
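
    The power argument can be made concrete with a small calculation. The sketch below uses a normal approximation for a two-sided, two-sample comparison; the effect size, sample sizes, and alpha are illustrative, not taken from the article.

    ```python
    from scipy.stats import norm

    def power_two_sample(n_per_group, d, alpha=0.05):
        """Approximate power of a two-sided, two-sample test of effect size d."""
        se = (2.0 / n_per_group) ** 0.5          # SE of the standardized difference
        z_crit = norm.ppf(1.0 - alpha / 2.0)
        return norm.sf(z_crit - d / se) + norm.cdf(-z_crit - d / se)

    # A single replication of a small true effect is badly underpowered ...
    print(power_two_sample(n_per_group=30, d=0.3))   # roughly 0.2

    # ... while detecting d = 0.3 with about 80% power needs far larger samples
    for n in (100, 175, 250):
        print(n, round(power_two_sample(n, 0.3), 2))
    ```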

  10. Strength and life criteria for corrugated fiberboard by three methods

    Treesearch

    Thomas J. Urbanik

    1997-01-01

    The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...

  11. Multivariate analysis of fears in dental phobic patients according to a reduced FSS-II scale.

    PubMed

    Hakeberg, M; Gustafsson, J E; Berggren, U; Carlsson, S G

    1995-10-01

    This study analyzed and assessed dimensions of a questionnaire developed to measure general fears and phobias. A previous factor analysis among 109 dental phobics had revealed a five-factor structure with 22 items and an explained total variance of 54%. The present study analyzed the same material using a multivariate statistical procedure (LISREL) to reveal structural latent variables. The LISREL analysis, based on the correlation matrix, yielded a chi-square of 216.6 with 195 degrees of freedom (P = 0.138) and showed a model with seven latent variables. One was a general fear factor correlated to all 22 items. The other six factors concerned "Illness & Death" (5 items), "Failures & Embarrassment" (5 items), "Social situations" (5 items), "Physical injuries" (4 items), "Animals & Natural phenomena" (4 items). One item (opposite sex) was included in both "Failures & Embarrassment" and "Social situations". The last factor, "Social interaction", combined all the items in "Failures & Embarrassment" and "Social situations" (9 items). In conclusion, this multivariate statistical analysis (LISREL) revealed and confirmed a factor structure similar to our previous study, but added two important dimensions not shown with a traditional factor analysis. This reduced FSS-II version measures general fears and phobias and may be used on a routine clinical basis as well as in dental phobia research.

  12. Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.

    PubMed

    Li, Wen-Chin; Harris, Don; Yu, Chung-San

    2008-03-01

    The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents, however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents show great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents.

  13. A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components

    NASA Technical Reports Server (NTRS)

    Abernethy, K.

    1986-01-01

    The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods, using samples with a very small number of failures and extensive censoring, are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
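
    A hedged sketch of the random-censoring idea described above (not the original program): Weibull failure times are censored by independent uniform times, producing samples with few observed failures. All parameter values are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_censored_sample(n, shape, scale, censor_max):
        """Weibull failure times subject to random (uniform) censoring."""
        t_fail = scale * rng.weibull(shape, size=n)
        t_cens = rng.uniform(0.0, censor_max, size=n)
        time = np.minimum(t_fail, t_cens)       # observed time
        event = t_fail <= t_cens                # True if a failure was observed
        return time, event

    # One simulated sample: heavy censoring and few observed failures,
    # mimicking the sparse failure data described above.
    time, event = simulate_censored_sample(n=20, shape=1.5, scale=100.0, censor_max=80.0)
    print(f"{event.sum()} failures observed out of {len(time)} units")

    # In a Monte Carlo study this would be repeated many times and the
    # Weibull parameters re-estimated for each replicate to judge how the
    # estimators and likelihood-ratio confidence intervals behave.
    ```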

  14. A method for developing design diagrams for ceramic and glass materials using fatigue data

    NASA Technical Reports Server (NTRS)

    Heslin, T. M.; Magida, M. B.; Forrest, K. A.

    1986-01-01

    The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress that is parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test that is used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution, and using either median value or homologous ratio analysis of the test results.

  15. Methodologies for the Statistical Analysis of Memory Response to Radiation

    NASA Astrophysics Data System (ADS)

    Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi

    2016-08-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  16. Does high-flow nasal cannula oxygen improve outcome in acute hypoxemic respiratory failure? A systematic review and meta-analysis.

    PubMed

    Lin, Si-Ming; Liu, Kai-Xiong; Lin, Zhi-Hong; Lin, Pei-Hong

    2017-10-01

    To evaluate the efficacy of high-flow nasal cannula (HFNC) in the rate of intubation and mortality for patients with acute hypoxemic respiratory failure. We searched Pubmed, EMBASE, and the Cochrane Library for relevant studies. Two reviewers extracted data and reviewed the quality of the studies independently. The primary outcome was the rate of intubation; the secondary outcome was mortality in the hospital. Study-level data were pooled using a random-effects model when I2 was >50% or a fixed-effects model when I2 was <50%. Eight randomized controlled studies with a total of 1,818 patients were considered. Pooled analysis showed no statistically significant difference between groups in the rate of intubation (odds ratio [OR] = 0.79; 95% confidence interval [CI]: 0.60-1.04; P = 0.09; I2 = 36%) and no statistically significant difference between groups in hospital mortality (OR = 0.89; 95% CI: 0.62-1.27; P = 0.51; I2 = 47%). The use of HFNC showed a trend toward a reduction in the intubation rate, which did not reach statistical significance, in patients with acute respiratory failure compared with conventional oxygen therapy (COT) and noninvasive ventilation (NIV), and no difference in mortality. Large, well-designed, randomized, multi-center trials are therefore needed to confirm the effects of HFNC in acute hypoxemic respiratory failure patients. Copyright © 2017 Elsevier Ltd. All rights reserved.
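
    A minimal sketch of the pooling arithmetic behind such a meta-analysis: inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling of log odds ratios, with I2 used to choose between them. The per-study values are hypothetical, not the review's data.

    ```python
    import numpy as np

    def pool_odds_ratios(or_values, ci_low, ci_high):
        """Inverse-variance pooling of study odds ratios on the log scale."""
        y = np.log(or_values)                                  # log OR per study
        se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from the 95% CI
        w = 1.0 / se**2                                        # fixed-effect weights

        # Heterogeneity: Cochran's Q and I^2
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)
        df = len(y) - 1
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

        # DerSimonian-Laird random-effects pooling
        tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (se**2 + tau2)
        y_re = np.sum(w_re * y) / np.sum(w_re)

        pooled = y_re if i2 > 50 else y_fixed      # model choice used in the review
        return np.exp(pooled), i2

    # Hypothetical per-study ORs for intubation (HFNC vs control) with 95% CIs
    ors  = np.array([0.65, 0.85, 1.10, 0.70, 0.95])
    low  = np.array([0.40, 0.55, 0.70, 0.45, 0.60])
    high = np.array([1.05, 1.30, 1.75, 1.10, 1.50])
    print(pool_odds_ratios(ors, low, high))
    ```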

  17. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    PubMed

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  18. Selecting statistical model and optimum maintenance policy: a case study of hydraulic pump.

    PubMed

    Ruhi, S; Karim, M R

    2016-01-01

    A proper maintenance policy can play a vital role in the effective investigation of product reliability. Every engineered object such as a product, plant, or infrastructure needs preventive and corrective maintenance. In this paper we look at a real case study dealing with the maintenance of hydraulic pumps used in excavators by a mining company. We obtain the data that the owner had collected and carry out an analysis and build models for pump failures. The data consist of both failure and censored lifetimes of the hydraulic pump. Different competitive mixture models are applied to analyze a set of maintenance data of a hydraulic pump. Various characteristics of the mixture models, such as the cumulative distribution function, reliability function, mean time to failure, etc., are estimated to assess the reliability of the pump. The Akaike Information Criterion, adjusted Anderson-Darling test statistic, Kolmogorov-Smirnov test statistic, and root mean square error are considered to select the suitable models among a set of competitive models. The maximum likelihood estimation method via the EM algorithm is applied mainly for estimating the parameters of the models and reliability related quantities. In this study, it is found that a threefold mixture model (Weibull-Normal-Exponential) fits the hydraulic pump failure data set well. This paper also illustrates how a suitable statistical model can be applied to estimate the optimum maintenance period at a minimum cost for a hydraulic pump.
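
    Fitting the threefold mixture by EM is beyond a short sketch; the fragment below illustrates only the model-selection step, comparing single-component candidate lifetime distributions by AIC and the Kolmogorov-Smirnov statistic on hypothetical, uncensored lifetimes.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical pump lifetimes (hours); real data would include censored times
    life = np.array([210, 340, 385, 460, 520, 610, 700, 820, 950, 1100, 1240, 1500])

    candidates = {
        "weibull": stats.weibull_min,
        "lognormal": stats.lognorm,
        "exponential": stats.expon,
    }

    for name, dist in candidates.items():
        params = dist.fit(life, floc=0)                 # MLE fit, location fixed at 0
        ll = np.sum(dist.logpdf(life, *params))
        aic = 2 * (len(params) - 1) - 2 * ll            # floc=0 is not a free parameter
        ks = stats.kstest(life, dist.cdf, args=params).statistic
        print(f"{name:12s} AIC={aic:7.1f}  KS={ks:.3f}")
    ```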

  19. PRETREATMENT NUTRITIONAL STATUS AND LOCOREGIONAL FAILURE IN PATIENTS WITH HEAD AND NECK CANCER UNDERGOING DEFINITIVE CONCURRENT CHEMORADIATION THERAPY

    PubMed Central

    Platek, Mary E.; Reid, Mary E.; Wilding, Gregory E.; Jaggernauth, Wainwright; Rigual, Nestor R.; Hicks, Wesley L.; Popat, Saurin R.; Warren, Graham W.; Sullivan, Maureen; Thorstad, Wade L.; Khan, Mohamed K.; Loree, Thom R.; Singh, Anurag K.

    2015-01-01

    Background This study was carried out to determine if markers of nutritional status predict for locoregional failure following intensity-modulated radiation therapy (IMRT) with concurrent chemoradiotherapy (CCRT) for squamous cell carcinoma of the head and neck (SCCHN). Methods We performed a retrospective chart review of 78 patients with SCCHN who received definitive CCRT. We compared patient factors, tumor characteristics, and nutritional status indicators between patients with and without locoregional failure. Results Fifteen of 78 patients (19%) experienced locoregional failure. Median follow-up for live patients was 38 months. On univariate analysis, pretreatment percentage of ideal body weight (%IBW) (p < .01), pretreatment hemoglobin (p = .04), and treatment duration (p < .01) were significant predictors of failure. On multivariate analysis, pretreatment %IBW (p = .04) and treatment time (p < .01) remained statistically significant. Conclusions Although treatment time is an accepted risk factor for failure, differences in outcome for patients with head and neck cancer undergoing definitive CCRT based on pre-treatment %IBW should be examined further. PMID:21990220

  20. Sensor Failure Detection of FASSIP System using Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina

    2018-02-01

    In the nuclear reactor accident at Fukushima Daiichi in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Research on passive cooling systems for nuclear power plants is therefore performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. Accurate sensor measurements in the FASSIP system are essential, because they are the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failures can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling's T2 statistic as criteria for detecting a sensor failure indication. The results show that the PCA method is capable of detecting the occurrence of a failure in any sensor.
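
    A minimal sketch of PCA-based sensor monitoring with the SPE (Q) and Hotelling's T2 statistics, on simulated correlated sensor data; sensor counts, component counts, and the fault magnitude are hypothetical and not taken from the FASSIP work.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical training data: 8 correlated sensors under normal operation
    n, m, k = 500, 8, 3                        # samples, sensors, retained components
    latent = rng.normal(size=(n, k))
    X = latent @ rng.normal(size=(k, m)) + 0.1 * rng.normal(size=(n, m))

    mu, sd = X.mean(0), X.std(0)
    Xs = (X - mu) / sd                         # standardize
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    P = Vt[:k].T                               # loading matrix (m x k)
    lam = (S[:k] ** 2) / (n - 1)               # variances of retained scores

    def monitor(x):
        """Return (T2, SPE) for one sample vector x."""
        xs = (x - mu) / sd
        t = P.T @ xs                           # scores in the PCA subspace
        t2 = np.sum(t**2 / lam)                # Hotelling's T^2
        residual = xs - P @ t
        spe = residual @ residual              # squared prediction error (Q statistic)
        return t2, spe

    normal_sample = X[0]
    faulty_sample = X[1].copy()
    faulty_sample[4] += 6.0                    # simulated stuck/biased sensor
    print("normal :", monitor(normal_sample))
    print("faulty :", monitor(faulty_sample))
    ```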

  1. NiCd cell reliability in the mission environment

    NASA Technical Reports Server (NTRS)

    Denson, William K.; Klein, Glenn C.

    1993-01-01

    This paper summarizes an effort by Gates Aerospace Batteries (GAB) and the Reliability Analysis Center (RAC) to analyze survivability data for both General Electric and GAB NiCd cells utilized in various spacecraft. For simplicity's sake, all mission environments are described as either low Earth orbit (LEO) or geosynchronous Earth orbit (GEO). 'Extreme value statistical methods' are applied to this database because of the longevity of the numerous missions while encountering relatively few failures. Every attempt was made to include all known instances of cell-induced failures of the battery and to exclude battery-induced failures of the cell. While this distinction may be somewhat limited by the availability of in-flight data, we have accepted the learned opinion of the specific customer contacts to ensure integrity of the common databases. This paper advances the preliminary analysis reported at the 1991 NASA Battery Workshop. That prior analysis was concerned with an estimated 278 million cell-hours of operation encompassing 183 satellites and cited 'no reported failures to date.' The present analysis reports on 428 million cell-hours of operation encompassing 212 satellites, including seven 'cell-induced failures.'

  2. A Study of Specific Fracture Energy at Percussion Drilling

    NASA Astrophysics Data System (ADS)

    A, Shadrina; T, Kabanova; V, Krets; L, Saruev

    2014-08-01

    The paper presents experimental studies of rock failure during percussion drilling. Quantitative and qualitative analyses were carried out to estimate critical values of rock failure depending on the hammer pre-impact velocity, the type of drill bit, the cylindrical hammer parameters (weight, length, diameter), and the turn angle of the drill bit. The data obtained in this work were compared with results obtained by other researchers. The particle-size distribution in granite-cutting sludge was also analyzed. A statistical approach (Spearman's rank-order correlation, multiple regression analysis with dummy variables, and the Kruskal-Wallis nonparametric test) was used to analyze the drilling process. The experimental data will be useful for specialists engaged in the simulation and illustration of rock failure.

  3. Multiresolution Wavelet Analysis of Heartbeat Intervals Discriminates Healthy Patients from Those with Cardiac Pathology

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Feurstein, Markus C.; Teich, Malvin C.

    1998-02-01

    We applied multiresolution wavelet analysis to the sequence of times between human heartbeats ( R-R intervals) and have found a scale window, between 16 and 32 heartbeat intervals, over which the widths of the R-R wavelet coefficients fall into disjoint sets for normal and heart-failure patients. This has enabled us to correctly classify every patient in a standard data set as belonging either to the heart-failure or normal group with 100% accuracy, thereby providing a clinically significant measure of the presence of heart failure from the R-R intervals alone. Comparison is made with previous approaches, which have provided only statistically significant measures.

  4. A retrospective survey of the causes of bracket- and tube-bonding failures.

    PubMed

    Roelofs, Tom; Merkens, Nico; Roelofs, Jeroen; Bronkhorst, Ewald; Breuning, Hero

    2017-01-01

    To investigate the causes of bonding failures of orthodontic brackets and tubes and the effect of premedication for saliva reduction. Premedication with atropine sulfate was administered randomly. The failure rate of brackets and tubes placed in a group of 158 consecutive patients was evaluated after a mean period of 67 weeks after bonding. The failure rate in the group without atropine sulfate premedication was 2.4%; in the group with premedication, the failure rate was 2.7%. Cox regression analysis of these groups showed that atropine application did not lead to a reduction in bond failures. Statistically significant differences in the hazard ratio were found for the bracket regions and for the dental assistants who prepared for the bonding procedure. Premedication did not lead to fewer bracket failures. The roles of the dental assistant and the patient in preventing failures were relevant. A significantly higher failure rate for orthodontic appliances was found in the posterior regions.

  5. Student failures on first-year medical basic science courses and the USMLE step 1: a retrospective study over a 20-year period.

    PubMed

    Burns, E Robert; Garrett, Judy

    2015-01-01

    Correlates of achievement in the basic science years in medical school and on the Step 1 of the United States Medical Licensing Examination® (USMLE®), (Step 1) in relation to preadmission variables have been the subject of considerable study. Preadmissions variables such as the undergraduate grade point average (uGPA) and Medical College Admission Test® (MCAT®) scores, solely or in combination, have previously been found to be predictors of achievement in the basic science years and/or on the Step 1. The purposes of this retrospective study were to: (1) determine if our statistical analysis confirmed previously published relationships between preadmission variables (MCAT, uGPA, and applicant pool size), and (2) study correlates of the number of failures in five M1 courses with those preadmission variables and failures on Step 1. Statistical analysis confirmed previously published relationships between all preadmission variables. Only one course, Microscopic Anatomy, demonstrated significant correlations with all variables studied including the Step 1 failures. Physiology correlated with three of the four variables studied, but not with the Step 1 failures. Analyses such as these provide a tool by which administrators will be able to identify what courses are or are not responding in appropriate ways to changes in the preadmissions variables that signal student performance on the Step 1. © 2014 American Association of Anatomists.

  6. An empirical comparison of statistical tests for assessing the proportional hazards assumption of Cox's model.

    PubMed

    Ng'andu, N H

    1997-03-30

    In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics to check the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and the following types of departures from the proportional hazard assumption: increasing relative hazards; decreasing relative hazards; crossing hazards; diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test and the linear correlation test have equally good power for detection of non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.

  7. Influence of bisphosphonates on alveolar bone loss around osseointegrated implants.

    PubMed

    Zahid, Talal M; Wang, Bing-Yan; Cohen, Robert E

    2011-06-01

    The relationship between bisphosphonates (BP) and dental implant failure has not been fully elucidated. The purpose of this retrospective radiographic study was to examine whether patients who take BP are at greater risk of implant failure than patients not using those agents. Treatment records of 362 consecutively treated patients receiving endosseous dental implants were reviewed. The patient population consisted of 227 women and 135 men with a mean age of 56 years (range: 17-87 years), treated in the University at Buffalo Postgraduate Clinic from 1997-2008. Demographic information collected included age, gender, smoking status, as well as systemic conditions and medication use. Implant characteristics reviewed included system, date of placement, date of follow-up radiographs, surgical complications, number of exposed threads, and implant failure. The relationship between BP and implant failure was analyzed using generalized estimating equation (GEE) analysis. Twenty-six patients using BP received a total of 51 dental implants. Three implants failed, yielding success rates of 94.11% and 88.46% for the implant-based and subject-based analyses, respectively. Using the GEE statistical method we found a statistically significant (P  =  .001; OR  =  3.25) association between the use of BP and implant thread exposure. None of the other variables studied were statistically associated with implant failure or thread exposure. In conclusion, patients taking BP may be at higher risk for implant thread exposure.

  8. DEPEND - A design environment for prediction and evaluation of system dependability

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.; Iyer, Ravishankar K.

    1990-01-01

    The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.

  9. Argonne National Laboratory Li-alloy/FeS cell testing and R and D programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, E.C.

    1982-01-01

    Groups of 12 or more identical Li-alloy/FeS cells fabricated by Eagle-Picher Industries, Inc. and Gould Inc. were operated at Argonne National Laboratory (ANL) in the status cell test program to obtain data for statistical analysis of cell cycle life and failure modes. The cells were full-size electric vehicle battery cells (150 to 350 Ah capacity) and they were cycled at the 4-h discharge rate and 8-h charge rate. The end of life was defined as a 20% loss of capacity or a decrease in the coulombic efficiency to less than 95%. Seventy-four cells (six groups of identical cells) were cycle-life tested and the results were analyzed statistically. The ultimate goal of this analysis was to predict cell and battery reliability. Testing of groups of identical cells also provided a means of identifying common failure modes which were eliminated by cell design changes. Mean time to failure (MTTF) for the cells based on the Weibull distribution is presented.

  10. Detonation Failure Thickness Measurement in an Annular Geometry

    NASA Astrophysics Data System (ADS)

    Mack, D. B.; Petel, O. E.; Higgins, A. J.

    2007-12-01

    The failure thickness of neat nitromethane in aluminum confinement was measured using a novel experimental technique. The thickness was approximated in an annular geometry by the gap between a concentric aluminum tube and rod. This technique was motivated by the desire to have a periodic boundary condition in the direction orthogonal to the annulus thickness, rather than a free surface occurring in typical rectangular geometry experiments. This results in a two-dimensional charge analogous to previous failure thickness setups but with infinite effective width (i.e. infinite aspect ratio). Detonation propagation or failure was determined by the observation of failure patterns engraved on the aluminum rod by the passing detonation. Analysis of these engraved patterns provides a statistical measurement of the spatial density of failure waves. Failure was observed as far as 180 thicknesses downstream. The failure thickness was measured to be 1.45 mm±0.15 mm.

  11. Analysis and comparison of the biomechanical properties of univalved and bivalved cast models.

    PubMed

    Crickard, Colin V; Riccio, Anthony I; Carney, Joseph R; Anderson, Terrence D

    2011-01-01

    Fiberglass casts are frequently valved to relieve the pressure associated with upper extremity swelling after a surgical procedure or when applied after reduction of a displaced fracture in a child. Although different opinions exist regarding the valving of casts, no research to date has explored the biomechanical effects of this commonly used technique. As cast integrity is essential for the maintenance of fracture reduction, it is important to understand whether casts are structurally compromised after valving. Understanding the effects of valving on cast integrity may help guide clinicians in the technique of valving while minimizing the potential for a loss of fracture reduction. Thirty standardized cylindrical fiberglass cast models were created. Ten models were left intact, 10 were univalved, and 10 were bivalved. All the models were mechanically tested by a 3-point bending apparatus secured to a biaxial materials testing system. Load to failure and bending stiffness were recorded for each sample. Differences in load of failure and bending stiffness were compared among the groups. Unvalved cast models had the highest failure load and bending stiffness, whereas bivalved casts showed the lowest value for both failure load and bending stiffness. Univalved casts had a failure load measured to be between those of unvalved and bivalved cast models. Analysis of variance showed significance when failure load and bending stiffness data among all the groups were compared. A post hoc Bonferroni statistical analysis showed significance in bending stiffness between intact and bivalved models (P < 0.01), intact and univalved models (P < 0.01), but no significant difference in bending stiffness between univalved and bivalved models (P > 0.01). Differences in measured failure load values were found to be statistically significant among all cast models (P < 0.01). Valving significantly decreases the bending stiffness and load to failure of fiberglass casts. Univalved casts have a higher load to failure than bivalved casts. Valving adversely alters the structural integrity of fiberglass casts. This may impair a cast's ability to effectively immobilize an extremity or maintain a fracture reduction.
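
    A hedged sketch of the statistical comparison described above: one-way ANOVA across the three cast configurations followed by pairwise t tests with a Bonferroni correction. The failure-load values are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical failure loads (N) for the three cast configurations
    unvalved  = np.array([410, 432, 398, 445, 420, 415, 428, 440, 405, 425])
    univalved = np.array([360, 372, 355, 381, 365, 350, 377, 368, 359, 370])
    bivalved  = np.array([300, 315, 295, 322, 305, 298, 318, 310, 302, 312])

    # One-way ANOVA across the three groups
    f, p = stats.f_oneway(unvalved, univalved, bivalved)
    print(f"ANOVA: F={f:.1f}, p={p:.2g}")

    # Pairwise t tests with a Bonferroni correction (3 comparisons)
    pairs = [("unvalved vs univalved", unvalved, univalved),
             ("unvalved vs bivalved",  unvalved, bivalved),
             ("univalved vs bivalved", univalved, bivalved)]
    for label, a, b in pairs:
        t, p = stats.ttest_ind(a, b)
        print(f"{label}: corrected p = {min(1.0, 3 * p):.3g}")
    ```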

  12. Statistical Models and Inference Procedures for Structural and Materials Reliability

    DTIC Science & Technology

    1990-12-01

    Some general stress-strength models were developed and applied to the failure of systems subject to cyclic loading. The work also involved statistical process control ideas and sequential design and analysis methods, and smooth nonparametric quantile function estimators were studied.

  13. A statistical model of operational impacts on the framework of the bridge crane

    NASA Astrophysics Data System (ADS)

    Antsev, V. Yu; Tolokonnikov, A. S.; Gorynin, A. D.; Reutov, A. A.

    2017-02-01

    The technical regulations of the Customs Union demand that a risk analysis of bridge crane operation be performed at the design stage. A statistical model has been developed for randomized risk calculations, allowing possible operational influences on the bridge crane metal structure to be modeled in various combinations. The statistical model is implemented in a software product for automated calculation of the risk of bridge crane failures.

  14. Analysis of risk factors for cluster behavior of dental implant failures.

    PubMed

    Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    2017-08-01

    Some studies have indicated that implant failures are commonly concentrated in a few patients. To identify and analyze cluster behavior of dental implant failures among subjects of a retrospective study. This retrospective study included only patients who received at least three implants. Patients presenting at least three implant failures were classified as presenting cluster behavior. Univariate and multivariate logistic regression models and generalized estimating equations analysis evaluated the effect of explanatory variables on the cluster behavior. There were 1406 patients with three or more implants (8337 implants, 592 failures). Sixty-seven (4.77%) patients presented cluster behavior, accounting for 56.8% of all implant failures. The intake of antidepressants and bruxism were identified as potential negative factors exerting a statistically significant influence on cluster behavior at the patient level. The negative factors at the implant level were turned implants, short implants, poor bone quality, age of the patient, the intake of medicaments to reduce gastric acid production, smoking, and bruxism. A cluster pattern among patients with implant failure is highly probable. A number of systemic and local factors could be of interest as predictors of implant failure, although a direct causal relationship cannot be ascertained. © 2017 Wiley Periodicals, Inc.

  15. A novel risk assessment method for landfill slope failure: Case study application for Bhalswa Dumpsite, India.

    PubMed

    Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh

    2017-03-01

    Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to landfills. These extraordinary landfills are facing high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario is then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.

  16. Multinational Assessment of Accuracy of Equations for Predicting Risk of Kidney Failure: A Meta-analysis.

    PubMed

    Tangri, Navdeep; Grams, Morgan E; Levey, Andrew S; Coresh, Josef; Appel, Lawrence J; Astor, Brad C; Chodick, Gabriel; Collins, Allan J; Djurdjev, Ognjenka; Elley, C Raina; Evans, Marie; Garg, Amit X; Hallan, Stein I; Inker, Lesley A; Ito, Sadayoshi; Jee, Sun Ha; Kovesdy, Csaba P; Kronenberg, Florian; Heerspink, Hiddo J Lambers; Marks, Angharad; Nadkarni, Girish N; Navaneethan, Sankar D; Nelson, Robert G; Titze, Stephanie; Sarnak, Mark J; Stengel, Benedicte; Woodward, Mark; Iseki, Kunitoshi

    2016-01-12

    Identifying patients at risk of chronic kidney disease (CKD) progression may facilitate more optimal nephrology care. Kidney failure risk equations, including such factors as age, sex, estimated glomerular filtration rate, and calcium and phosphate concentrations, were previously developed and validated in 2 Canadian cohorts. Validation in other regions and in CKD populations not under the care of a nephrologist is needed. To evaluate the accuracy of the risk equations across different geographic regions and patient populations through individual participant data meta-analysis. Thirty-one cohorts, including 721,357 participants with CKD stages 3 to 5 in more than 30 countries spanning 4 continents, were studied. These cohorts collected data from 1982 through 2014. Cohorts participating in the CKD Prognosis Consortium with data on end-stage renal disease. Data were obtained and statistical analyses were performed between July 2012 and June 2015. Using the risk factors from the original risk equations, cohort-specific hazard ratios were estimated and combined using random-effects meta-analysis to form new pooled kidney failure risk equations. Original and pooled kidney failure risk equation performance was compared, and the need for regional calibration factors was assessed. Kidney failure (treatment by dialysis or kidney transplant). During a median follow-up of 4 years of 721,357 participants with CKD, 23,829 cases of kidney failure were observed. The original risk equations achieved excellent discrimination (ability to differentiate those who developed kidney failure from those who did not) across all cohorts (overall C statistic, 0.90; 95% CI, 0.89-0.92 at 2 years; C statistic at 5 years, 0.88; 95% CI, 0.86-0.90); discrimination in subgroups by age, race, and diabetes status was similar. There was no improvement with the pooled equations. Calibration (the difference between observed and predicted risk) was adequate in North American cohorts, but the original risk equations overestimated risk in some non-North American cohorts. Addition of a calibration factor that lowered the baseline risk by 32.9% at 2 years and 16.5% at 5 years improved the calibration in 12 of 15 and 10 of 13 non-North American cohorts at 2 and 5 years, respectively (P = .04 and P = .02). Kidney failure risk equations developed in a Canadian population showed high discrimination and adequate calibration when validated in 31 multinational cohorts. However, in some regions the addition of a calibration factor may be necessary.

  17. Conservative Allowables Determined by a Tsai-Hill Equivalent Criterion for Design of Satellite Composite Parts

    NASA Astrophysics Data System (ADS)

    Pommatau, Gilles

    2014-06-01

    The present paper deals with the industrial application, via software developed by Thales Alenia Space, of a new failure criterion named the "Tsai-Hill equivalent criterion" for composite structural parts of satellites. The first part of the paper briefly describes the main hypotheses and the failure analysis capabilities of the software. The second part recalls the quadratic and conservative nature of the new failure criterion, already presented at an ESA conference in a previous paper. The third part presents the statistical calculation possibilities of the software and the associated sensitivity analysis, via results obtained on different composites. A methodology proposed to customers and agencies is then presented, with its limitations and advantages. It is concluded that this methodology is an efficient industrial way to perform mechanical analysis on quasi-isotropic composite parts.

  18. The vulnerability of electric equipment to carbon fibers of mixed lengths: An analysis

    NASA Technical Reports Server (NTRS)

    Elber, W.

    1980-01-01

    The susceptibility of a stereo amplifier to damage from a spectrum of lengths of graphite fibers was calculated. A simple analysis was developed by which such calculations can be based on test results with fibers of uniform lengths. A statistical analysis was applied for the conversion of data for various logical failure criteria.

  19. Statistical forecasting of repetitious dome failures during the waning eruption of Redoubt Volcano, Alaska, February-April 1990

    USGS Publications Warehouse

    Page, R.A.; Lahr, J.C.; Chouet, B.A.; Power, J.A.; Stephens, C.D.

    1994-01-01

    The waning phase of the 1989-1990 eruption of Redoubt Volcano in the Cook Inlet region of south-central Alaska comprised a quasi-regular pattern of repetitious dome growth and destruction that lasted from February 15 to late April 1990. The dome failures produced ash plumes hazardous to airline traffic. In response to this hazard, the Alaska Volcano Observatory sought to forecast these ash-producing events using two approaches. One approach built on early successes in issuing warnings before major eruptions on December 14, 1989 and January 2, 1990. These warnings were based largely on changes in seismic activity related to the occurrence of precursory swarms of long-period seismic events. The search for precursory swarms of long-period seismicity was continued through the waning phase of the eruption and led to warnings before tephra eruptions on March 23 and April 6. The observed regularity of dome failures after February 15 suggested that a statistical forecasting method based on a constant-rate failure model might also be successful. The first statistical forecast was issued on March 16 after seven events had occurred, at an average interval of 4.5 days. At this time, the interval between dome failures abruptly lengthened. Accordingly, the forecast was unsuccessful and further forecasting was suspended until the regularity of subsequent failures could be confirmed. Statistical forecasting resumed on April 12, after four dome failure episodes separated by an average of 7.8 days. One dome failure (April 15) was successfully forecast using a 70% confidence window, and a second event (April 21) was narrowly missed before the end of the activity. The cessation of dome failures after April 21 resulted in a concluding false alarm. Although forecasting success during the eruption was limited, retrospective analysis shows that early and consistent application of the statistical method using a constant-rate failure model and a 90% confidence window could have yielded five successful forecasts and two false alarms; no events would have been missed. On closer examination, the intervals between successive dome failures are not uniform but tend to increase with time. This increase attests to the continuous, slowly decreasing supply of magma to the surface vent during the waning phase of the eruption. The domes formed in a precarious position in a breach in the summit crater rim where they were susceptible to gravitational collapse. The instability of the February 15-April 21 domes relative to the earlier domes is attributed to reaming the lip of the vent by a laterally directed explosion during the major dome-destroying eruption of February 15, a process which would leave a less secure foundation for subsequent domes. © 1994.
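
    The USGS forecasting procedure is not reproduced here; as a simple illustration of interval forecasting from repose times, the sketch below places a central confidence window around the mean observed interval using a normal approximation. The interval values and confidence level are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def forecast_window(intervals_days, last_event_day, confidence=0.90):
        """Central forecast window for the next event from past repose intervals.

        The intervals are summarized by their mean and standard deviation and
        a normal window is placed around the expected next event time.  This
        is an illustrative simplification, not the original procedure.
        """
        intervals = np.asarray(intervals_days, dtype=float)
        mean, sd = intervals.mean(), intervals.std(ddof=1)
        z = stats.norm.ppf(0.5 + confidence / 2.0)
        return (last_event_day + mean - z * sd,
                last_event_day + mean + z * sd)

    # Hypothetical repose intervals (days) between successive dome failures
    intervals = [7.2, 8.1, 7.9, 8.4]
    print(forecast_window(intervals, last_event_day=100.0, confidence=0.90))
    ```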

  20. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied to many areas, such as optical communications, gyroscopes, and biomedical components. In the optical communication field, MEMS technologies are essential, especially in multidimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs have been fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because the device is composed of many structures with various materials, it is difficult to make it reliable. We have developed MEMS-type VOAs with many failure mode considerations (FMEA: Failure Mode and Effects Analysis) in the initial design step, predicted critical failure factors and revised the design, and confirmed the reliability by preliminary tests. These predicted failure factors were moisture, the bonding strength of the wire bonded between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-tests, and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. To sum up, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  2. Study of the Rock Mass Failure Process and Mechanisms During the Transformation from Open-Pit to Underground Mining Based on Microseismic Monitoring

    NASA Astrophysics Data System (ADS)

    Zhao, Yong; Yang, Tianhong; Bohnhoff, Marco; Zhang, Penghai; Yu, Qinglei; Zhou, Jingren; Liu, Feiyue

    2018-05-01

    To quantitatively understand the failure process and failure mechanism of a rock mass during the transformation from open-pit mining to underground mining, the Shirengou Iron Mine was selected as an engineering case study. The study area was determined using the rock mass basic quality classification method and the kinematic analysis method. Based on the analysis of the variations in apparent stress and apparent volume over time, the rock mass failure process was analyzed. Based on the temporal and spatial changes of microseismic events in location, energy, apparent stress, and displacement, the migration characteristics of rock mass damage were studied. A hybrid moment tensor inversion method was used to determine the rock mass fracture source mechanisms, the fracture orientations, and the fracture scales. The fracture area can be divided into three zones: Zone A, Zone B, and Zone C. A statistical analysis of the fracture plane orientations was carried out, and four dominant fracture planes were obtained. Finally, the slip tendency analysis method was employed, and the unstable fracture planes were identified. The results show that: (1) microseismic monitoring and hybrid moment tensor analysis can effectively characterize the failure process and failure mechanism of the rock mass; (2) during the transformation from open-pit to underground mining, the failure type of the rock mass is mainly shear failure, and tensile failure is mostly concentrated in the roofs of goafs; and (3) the rock mass at the pit bottom and in the upper part of goaf No. 18 may sustain further damage.

  3. A Multiscale Progressive Failure Modeling Methodology for Composites that Includes Fiber Strength Stochastics

    NASA Technical Reports Server (NTRS)

    Ricks, Trenton M.; Lacy, Thomas E., Jr.; Bednarcyk, Brett A.; Arnold, Steven M.; Hutchins, John W.

    2014-01-01

    A multiscale modeling methodology was developed for continuous fiber composites that incorporates a statistical distribution of fiber strengths into coupled multiscale micromechanics/finite element (FE) analyses. A modified two-parameter Weibull cumulative distribution function, which accounts for the effect of fiber length on the probability of failure, was used to characterize the statistical distribution of fiber strengths. A parametric study using the NASA Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) was performed to assess the effect of variable fiber strengths on local composite failure within a repeating unit cell (RUC) and subsequent global failure. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a unidirectional SCS-6/TIMETAL 21S metal matrix composite tensile dogbone specimen at 650 °C. Multiscale progressive failure analyses were performed to quantify the effect of spatially varying fiber strengths on the RUC-averaged and global stress-strain responses and failure. The ultimate composite strengths and distribution of failure locations (predominantly within the gage section) reasonably matched the experimentally observed failure behavior. The predicted composite failure behavior suggests that use of macroscale models that exploit global geometric symmetries is inappropriate for cases where the actual distribution of local fiber strengths displays no such symmetries. This issue has not received much attention in the literature. Moreover, the model discretization at a specific length scale can have a profound effect on the computational costs associated with multiscale simulations, motivating models that yield accurate yet tractable results.
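
    A length-adjusted two-parameter Weibull distribution of the weakest-link form P_f(s) = 1 - exp(-(L/L0)(s/s0)^m) can be sampled by inverting its CDF. The sketch below is a minimal illustration of that sampling step under stated assumptions; the shape, scale, and length values are invented placeholders, not the SCS-6/TIMETAL 21S parameters.

```python
# Minimal sketch of sampling fiber strengths from a length-adjusted
# two-parameter Weibull distribution (weakest-link scaling with fiber length).
# Parameter values below are illustrative placeholders.
import numpy as np

def sample_fiber_strengths(n, m, s0, L, L0, rng=None):
    """Draw n fiber strengths by inverting the length-scaled Weibull CDF."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    return s0 * ((L0 / L) * (-np.log(1.0 - u))) ** (1.0 / m)

# Longer fibers sample more flaws, so their strengths shift downward.
short = sample_fiber_strengths(10_000, m=10.0, s0=4000.0, L=25.0, L0=25.0, rng=1)
long_ = sample_fiber_strengths(10_000, m=10.0, s0=4000.0, L=250.0, L0=25.0, rng=1)
print(short.mean(), long_.mean())
```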

  4. Failure statistics for commercial lithium ion batteries: A study of 24 pouch cells

    NASA Astrophysics Data System (ADS)

    Harris, Stephen J.; Harris, David J.; Li, Chen

    2017-02-01

    There are relatively few publications that assess capacity decline in enough commercial cells to quantify cell-to-cell variation, but those that do show a surprisingly wide variability. Capacity curves cross each other often, a challenge for efforts to measure the state of health and predict the remaining useful life (RUL) of individual cells. We analyze capacity fade statistics for 24 commercial pouch cells, providing an estimate for the time to 5% failure. Our data indicate that RUL predictions based on remaining capacity or internal resistance are accurate only once the cells have already sorted themselves into "better" and "worse" ones. Analysis of our failure data, using maximum likelihood techniques, provides uniformly good fits for a variety of definitions of failure with normal and with 2- and 3-parameter Weibull probability density functions, but we argue against using a 3-parameter Weibull function for our data. The pdf fitting parameters appear to converge after about 15 failures, although business objectives should ultimately determine whether data from a given number of batteries provides sufficient confidence to end lifecycle testing. Increased efforts to make batteries with more consistent lifetimes should lead to improvements in battery cost and safety.
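
    The kind of maximum-likelihood comparison described above can be sketched as follows, assuming synthetic stand-in failure times rather than the 24-cell data set, and a 2-parameter Weibull (location fixed at zero) against a normal fit.

```python
# Minimal sketch of fitting 2-parameter Weibull and normal models to a set of
# cell failure times by maximum likelihood; the data are synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
failure_cycles = stats.weibull_min.rvs(c=4.0, scale=1200.0, size=24, random_state=rng)

# 2-parameter Weibull: fix the location at zero so only shape/scale are fit.
shape, loc, scale = stats.weibull_min.fit(failure_cycles, floc=0)
mu, sigma = stats.norm.fit(failure_cycles)

# Compare fits by log-likelihood (higher is better).
ll_weib = stats.weibull_min.logpdf(failure_cycles, shape, loc, scale).sum()
ll_norm = stats.norm.logpdf(failure_cycles, mu, sigma).sum()
print(f"Weibull shape={shape:.2f}, scale={scale:.0f}, logL={ll_weib:.1f}")
print(f"Normal  mu={mu:.0f}, sigma={sigma:.0f}, logL={ll_norm:.1f}")
```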

  5. Three-year clinical follow-up of posterior teeth restored with leucite-reinforced ips empress onlays and partial veneer crowns.

    PubMed

    Murgueitio, Rafael; Bernal, Guillermo

    2012-07-01

    The aim of this study was to analyze the survival rate and failure mode of IPS leucite-reinforced ceramic onlays and partial veneer crowns with regard to thickness under the following clinical conditions: vital versus nonvital teeth, tooth location, and type of opposing dentition. Teeth were prepared according to established guidelines for ceramic onlays and partial veneer crowns. Before cementation, the restorations were measured for occlusal thickness at the central fossa, mesial and distal marginal ridges, and functional and nonfunctional cusps. A total of 210 ceramic restorations were cemented in 99 patients, with a mean observation period of 2.9 ± 1.89 years. The mode of failure was classified and evaluated as (1) adhesive, (2) cohesive, (3) combined failure, (4) decementation, (5) tooth sensitivity, and (6) pulpal necrosis. Kaplan-Meier, log-rank, and Cox regression tests were used for statistical analysis. The failure rate was 3.33% (7/210). Increased material thickness was associated with a lower probability of failure. Vital teeth were less likely to fail than nonvital teeth. Second molars were five times more susceptible to failure than first molars. Tooth sensitivity postcementation and the type of opposing dentition were not statistically significant in this study. In this study, the thickness of the restorations, tooth vitality, and the location of teeth in the dental arch influenced restoration failures. © 2012 by the American College of Prosthodontists.

  6. Statistical Analysis on the Mechanical Properties of Magnesium Alloys

    PubMed Central

    Liu, Ruoyu; Jiang, Xianquan; Zhang, Hongju; Zhang, Dingfei; Wang, Jingfeng; Pan, Fusheng

    2017-01-01

    Knowledge of the statistical characteristics of mechanical properties is very important for the practical application of structural materials. Unfortunately, the scatter characteristics of magnesium alloys for mechanical performance have remained poorly understood until now. In this study, the mechanical reliability of magnesium alloys is systematically estimated using Weibull statistical analysis. Interestingly, the Weibull modulus, m, of strength for magnesium alloys is as high as that for aluminum and steels, confirming the very high reliability of magnesium alloys. The high predictability in the tensile strength of magnesium alloys represents the capability of preventing catastrophic premature failure during service, which is essential for safety and reliability assessment. PMID:29113116

  7. A comprehensive analysis of the performance characteristics of the Mount Laguna solar photovoltaic installation

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Sollock, S. G.

    1981-01-01

    This paper represents the first comprehensive survey of the Mount Laguna Photovoltaic Installation. The novel techniques used for performing the field tests have been effective in locating and characterizing defective modules. A comparative analysis on the two types of modules used in the array indicates that they have significantly different failure rates, different distributions in degradational space and very different failure modes. A life cycle model is presented to explain a multimodal distribution observed for one module type. A statistical model is constructed and it is shown to be in good agreement with the field data.

  8. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lifetimes of many kinds of items and has a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull family of distributions. In this paper, our aim is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and to present the associated analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. The likelihood function is described, followed by the posterior distribution and the point, interval, hazard function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
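
    A minimal sketch of such an analysis is given below, assuming independent exponential causes, a vague Gamma(0.001, 0.001) prior as one common stand-in for a non-informative prior (the paper's exact prior may differ), and illustrative failure counts and exposure time.

```python
# Minimal sketch of a Bayesian exponential competing-risks analysis with
# independent causes and a vague Gamma prior; all data values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
deaths = {"cause_1": 7, "cause_2": 3}   # failures attributed to each cause
total_time = 5000.0                     # cumulative exposure time (hours)
a0, b0 = 0.001, 0.001                   # vague Gamma prior (shape, rate)
t_eval = 500.0                          # horizon for failure probabilities
n_draws = 100_000

# Conjugate update: posterior rate_j ~ Gamma(a0 + d_j, b0 + T)
post = {c: rng.gamma(a0 + d, 1.0 / (b0 + total_time), n_draws)
        for c, d in deaths.items()}

lam_total = sum(post.values())
reliability = np.exp(-lam_total * t_eval)              # P(no failure by t)
for c, lam in post.items():
    # Crude probability: fail from cause c by t in the presence of other causes
    crude = (lam / lam_total) * (1.0 - np.exp(-lam_total * t_eval))
    print(c, "posterior mean crude prob:", crude.mean().round(3))
print("posterior mean reliability:", reliability.mean().round(3))
```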

  9. Statistics of acoustic emissions and stress drops during granular shearing using a stick-slip fiber bundle model

    NASA Astrophysics Data System (ADS)

    Cohen, D.; Michlmayr, G.; Or, D.

    2012-04-01

    Shearing of dense granular materials appears in many engineering and Earth science applications. Under a constant strain rate, the shearing stress at steady state oscillates with slow rises followed by rapid drops that are linked to the build-up and failure of force chains. Experiments indicate that these drops display exponential statistics. Measurements of acoustic emissions during shearing indicate that the energy liberated by failure of these force chains has power-law statistics. Representing force chains as fibers, we use a stick-slip fiber bundle model to obtain analytical solutions for the statistical distributions of stress drops and failure energy. In the model, fibers stretch, fail, and regain strength during deformation. Fibers have Weibull-distributed threshold strengths with either quenched or annealed disorder. The shapes of the distributions of drops and energy obtained from the model are similar to those measured during shearing experiments. This simple model may be useful to identify failure events linked to force chain failures. Future generalizations of the model that include different types of fiber failure may also allow identification of different types of granular failures that have distinct statistical acoustic emission signatures.
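
    For intuition, the sketch below simulates the classic equal-load-sharing fiber bundle model with quenched Weibull thresholds and records burst (avalanche) sizes during quasi-static loading. It deliberately omits the stick-slip healing (regain-strength) mechanism of the paper's model, so it is only a simplified relative of it.

```python
# Minimal sketch of a classic equal-load-sharing fiber bundle model with
# quenched Weibull-distributed thresholds, recording burst sizes under
# quasi-static loading.  The stick-slip healing mechanism is omitted.
import numpy as np

def burst_sizes(n_fibers=100_000, weibull_shape=2.0, seed=0):
    rng = np.random.default_rng(seed)
    thresholds = np.sort(rng.weibull(weibull_shape, n_fibers))
    # External force needed to break the k-th weakest fiber once the k-1
    # weaker fibers have failed (equal load sharing).
    force = thresholds * np.arange(n_fibers, 0, -1)
    bursts = []
    k = 0
    while k < n_fibers:
        # A burst triggered at fiber k continues while the force needed to
        # break subsequent fibers does not exceed the triggering force.
        j = k + 1
        while j < n_fibers and force[j] <= force[k]:
            j += 1
        bursts.append(j - k)
        k = j
    return np.array(bursts)

sizes = burst_sizes()
vals, counts = np.unique(sizes, return_counts=True)
print(list(zip(vals[:5], counts[:5])))   # burst-size distribution (power-law tail)
```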

  10. Development and evaluation of a composite risk score to predict kidney transplant failure.

    PubMed

    Moore, Jason; He, Xiang; Shabir, Shazia; Hanvesakul, Rajesh; Benavente, David; Cockwell, Paul; Little, Mark A; Ball, Simon; Inston, Nicholas; Johnston, Atholl; Borrows, Richard

    2011-05-01

    Although risk factors for kidney transplant failure are well described, prognostic risk scores to estimate risk in prevalent transplant recipients are limited. Development and validation of risk-prediction instruments. The development data set included 2,763 prevalent patients more than 12 months posttransplant enrolled into the LOTESS (Long Term Efficacy and Safety Surveillance) Study. The validation data set included 731 patients who underwent transplant at a single UK center. Estimated glomerular filtration rate (eGFR) and other risk factors were evaluated using Cox regression. Scores for death-censored and overall transplant failure were based on the summed hazard ratios for baseline predictor variables. Predictive performance was assessed using calibration (Hosmer-Lemeshow statistic), discrimination (C statistic), and clinical reclassification (net reclassification improvement) compared with eGFR alone. In the development data set, 196 patients died and another 225 experienced transplant failure. eGFR, recipient age, race, serum urea and albumin levels, declining eGFR, and prior acute rejection predicted death-censored transplant failure. eGFR, recipient age, sex, serum urea and albumin levels, and declining eGFR predicted overall transplant failure. In the validation data set, 44 patients died and another 101 experienced transplant failure. The weighted scores comprising these variables showed adequate discrimination and calibration for death-censored (C statistic, 0.83; 95% CI, 0.75-0.91; Hosmer-Lemeshow χ² P = 0.8) and overall (C statistic, 0.70; 95% CI, 0.64-0.77; Hosmer-Lemeshow χ² P = 0.5) transplant failure. However, the scores failed to reclassify risk compared with eGFR alone (net reclassification improvements of 7.6% [95% CI, -0.2 to 13.4; P = 0.09] and 4.3% [95% CI, -2.7 to 11.8; P = 0.3] for death-censored and overall transplant failure, respectively). Retrospective analysis of predominantly cyclosporine-treated patients; limited study size and categorization of variables may limit power to detect effect. Although the scores performed well regarding discrimination and calibration, clinically relevant risk reclassification over eGFR alone was not evident, emphasizing the stringent requirements for such scores. Further studies are required to develop and refine this process. Copyright © 2011 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  11. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, there are some points about their joint application that have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies on the design of control charts consider just the economic aspect, while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, together with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
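
    The statistical side of a joint X-bar/S chart can be sketched as below: 3-sigma control limits computed from subgroup data using the exact c4 unbiasing constant. The economic optimization (sampling costs, PM policy, failure-time distribution) described in the paper is not reproduced here, and the data are illustrative.

```python
# Minimal sketch: 3-sigma X-bar and S chart limits from m subgroups of size n,
# with the c4 unbiasing constant computed exactly.  Data are illustrative.
import numpy as np
from scipy.special import gammaln

def xbar_s_limits(subgroups):
    x = np.asarray(subgroups, dtype=float)   # shape (m, n)
    m, n = x.shape
    xbar = x.mean(axis=1)
    s = x.std(axis=1, ddof=1)
    c4 = np.sqrt(2.0 / (n - 1)) * np.exp(gammaln(n / 2.0) - gammaln((n - 1) / 2.0))
    xbarbar, sbar = xbar.mean(), s.mean()
    sigma_hat = sbar / c4
    xbar_lim = (xbarbar - 3 * sigma_hat / np.sqrt(n), xbarbar + 3 * sigma_hat / np.sqrt(n))
    s_lim = (max(0.0, sbar - 3 * sigma_hat * np.sqrt(1 - c4**2)),
             sbar + 3 * sigma_hat * np.sqrt(1 - c4**2))
    return xbar_lim, s_lim

rng = np.random.default_rng(3)
data = rng.normal(10.0, 0.5, size=(25, 5))   # 25 subgroups of 5 illustrative measurements
print(xbar_s_limits(data))
```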

  12. Characterization of the behavior of three definitions of prostate-specific antigen-based biochemical failure in relation to detection and follow-up biases: comparison with the American Society for Therapeutic Radiology and Oncology consensus definition.

    PubMed

    Williams, Scott G

    2006-03-01

    To examine the impact of detection biases on three prostate cancer biochemical failure (bF) definitions in comparison with the existing American Society for Therapeutic Radiology and Oncology Consensus Definition (ACD). Three alternative bF definitions were tested against the ACD: three rises in prostate-specific antigen (PSA) level without backdating, nadir plus 2 ng/mL, and a threshold PSA level of >3 ng/mL, according to data from 1050 men. The mean time between PSA tests (MTBT), regularity of collection, and calendar year of analysis were examined in each bF definition. The MTBT produced a statistically significant difference in the derived hazard ratio for identification of bF in all definitions. The influence of test regularity was statistically significant beyond the median level of regularity in all definitions. The year of analysis impacted greatly on the ACD, whereas the three alternative definitions exhibited minor follow-up duration variations by comparison. The alternative definitions had reliable follow-up when the crude median time to censoring was at least 1.6 times greater than that of failure. Detection biases will always be a significant issue in defining bF. A number of alternative failure definitions have more predictable interactions with these biases than the existing ACD.

  13. How Miniature/Microminiature (2M) Repair Capabilities Can Reduce the Impact of No Evidence of Failure (NEOF) Among Repairables on the Navy’s Operations and Maintenance Account

    DTIC Science & Technology

    1988-06-01

    and PCBs. The pilot program involved screening, testing, and repairing of EMs/PCBs for both COMNAVSEASYSCOM and Commander, Naval Electronic Systems... were chosen from the Support and Test Equipment Engineering Program (STEEP) tests performed by SIMA San Diego during 1987. A statistical analysis and a Level...

  14. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
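
    The general workflow can be sketched as follows: propagate random inputs through a physics-style limit-state model, estimate the failure probability by Monte Carlo, and rank input sensitivities. The "margin" model, variable names, and distributions below are invented placeholders, not an Ares I scenario.

```python
# Minimal sketch of a PDA-style Monte Carlo failure-probability estimate with
# a crude sensitivity ranking.  The limit-state model is a placeholder.
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Illustrative driving parameters, each treated as a random variable.
thrust_dispersion = rng.normal(0.0, 1.0, n)                    # normalized
structural_capacity = rng.lognormal(mean=0.1, sigma=0.08, size=n)
dynamic_pressure = rng.normal(1.0, 0.05, n)

# Hypothetical limit state: failure when demand exceeds capacity.
demand = dynamic_pressure * (1.0 + 0.05 * thrust_dispersion)
margin = structural_capacity - demand
failed = margin < 0.0

p_fail = failed.mean()
se = np.sqrt(p_fail * (1.0 - p_fail) / n)
print(f"P(failure) ~ {p_fail:.4f} +/- {se:.4f}")

# Crude sensitivity ranking: correlation of each input with the margin.
inputs = {"thrust_dispersion": thrust_dispersion,
          "structural_capacity": structural_capacity,
          "dynamic_pressure": dynamic_pressure}
for name, x in inputs.items():
    print(name, "corr with margin:", np.corrcoef(x, margin)[0, 1].round(2))
```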

  15. Is it possible to identify a trend in problem/failure data

    NASA Technical Reports Server (NTRS)

    Church, Curtis K.

    1990-01-01

    One of the major obstacles in identifying and interpreting a trend is the small number of data points. Future trending reports will begin with 1983 data. As the problem/failure data are aggregated by year, there are just seven observations (1983 to 1989) for the 1990 reports. Any statistical inferences with a small amount of data will have a large degree of uncertainty. Consequently, a regression technique approach to identify a trend is limited. Though trend determination by failure mode may be unrealistic, the data may be explored for consistency or stability and the failure rate investigated. Various alternative data analysis procedures are briefly discussed. Techniques that could be used to explore problem/failure data by failure mode are addressed. The data used are taken from Section One, Space Shuttle Main Engine, of the Calspan Quarterly Report dated April 2, 1990.

  16. Potential surrogate endpoints for prostate cancer survival: analysis of a phase III randomized trial.

    PubMed

    Ray, Michael E; Bae, Kyounghwa; Hussain, Maha H A; Hanks, Gerald E; Shipley, William U; Sandler, Howard M

    2009-02-18

    The identification of surrogate endpoints for prostate cancer-specific survival may shorten the length of clinical trials for prostate cancer. We evaluated distant metastasis and general clinical treatment failure as potential surrogates for prostate cancer-specific survival by use of data from the Radiation Therapy and Oncology Group 92-02 randomized trial. Patients (n = 1554 randomly assigned and 1521 evaluable for this analysis) with locally advanced prostate cancer had been treated with 4 months of neoadjuvant and concurrent androgen deprivation therapy with external beam radiation therapy and then randomly assigned to no additional therapy (control arm) or 24 additional months of androgen deprivation therapy (experimental arm). Data from landmark analyses at 3 and 5 years for general clinical treatment failure (defined as documented local disease progression, regional or distant metastasis, initiation of androgen deprivation therapy, or a prostate-specific antigen level of 25 ng/mL or higher after radiation therapy) and/or distant metastasis were tested as surrogate endpoints for prostate cancer-specific survival at 10 years by use of Prentice's four criteria. All statistical tests were two-sided. At 3 years, 1364 patients were alive and contributed data for analysis. Both distant metastasis and general clinical treatment failure at 3 years were consistent with all four of Prentice's criteria for being surrogate endpoints for prostate cancer-specific survival at 10 years. At 5 years, 1178 patients were alive and contributed data for analysis. Although prostate cancer-specific survival was not statistically significantly different between treatment arms at 5 years (P = .08), both endpoints were consistent with Prentice's remaining criteria. Distant metastasis and general clinical treatment failure at 3 years may be candidate surrogate endpoints for prostate cancer-specific survival at 10 years. These endpoints, however, must be validated in other datasets.

  17. A new casemix adjustment index for hospital mortality among patients with congestive heart failure.

    PubMed

    Polanczyk, C A; Rohde, L E; Philbin, E A; Di Salvo, T G

    1998-10-01

    Comparative analysis of hospital outcomes requires reliable adjustment for casemix. Although congestive heart failure is one of the most common indications for hospitalization, congestive heart failure casemix adjustment has not been widely studied. The purposes of this study were (1) to describe and validate a new congestive heart failure-specific casemix adjustment index to predict in-hospital mortality and (2) to compare its performance to the Charlson comorbidity index. Data from all 4,608 admissions to the Massachusetts General Hospital from January 1990 to July 1996 with a principal ICD-9-CM discharge diagnosis of congestive heart failure were evaluated. Massachusetts General Hospital patients were randomly divided into a derivation and a validation set. By logistic regression, odds ratios for in-hospital death were computed and weights were assigned to construct a new predictive index in the derivation set. The performance of the index was tested in an internal Massachusetts General Hospital validation set and in a non-Massachusetts General Hospital external validation set incorporating data from all 1995 New York state hospital discharges with a primary discharge diagnosis of congestive heart failure. Overall in-hospital mortality was 6.4%. Based on the new index, patients were assigned to six categories with incrementally increasing hospital mortality rates ranging from 0.5% to 31%. By logistic regression, the "c" statistics of the congestive heart failure-specific index (0.83 and 0.78 in the derivation and validation sets, respectively) were significantly superior to that of the Charlson index (0.66). Similar incrementally increasing hospital mortality rates were observed in the New York database with the congestive heart failure-specific index ("c" statistic 0.75). In an administrative database, this congestive heart failure-specific index may be a more adequate casemix adjustment tool to predict hospital mortality in patients hospitalized for congestive heart failure.

  18. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
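
    As a small illustration of the importance-sampling idea mentioned above, the sketch below estimates a rare tail probability for a standard normal by sampling from a proposal shifted into the failure region and reweighting by the density ratio; the threshold and sample size are arbitrary choices.

```python
# Minimal sketch of importance sampling for a rare failure probability:
# estimate P(X > 4.5) for a standard normal by sampling from a shifted
# proposal and reweighting.  Threshold and sample size are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
threshold, n = 4.5, 50_000
exact = stats.norm.sf(threshold)

# Naive Monte Carlo: almost no samples land in the failure region.
naive = (rng.standard_normal(n) > threshold).mean()

# Importance sampling: propose from N(threshold, 1) and reweight.
y = rng.normal(loc=threshold, scale=1.0, size=n)
weights = stats.norm.pdf(y) / stats.norm.pdf(y, loc=threshold, scale=1.0)
is_est = np.mean(weights * (y > threshold))

print(f"exact={exact:.2e}  naive={naive:.2e}  importance sampling={is_est:.2e}")
```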

  19. A review of failure models for unidirectional ceramic matrix composites under monotonic loads

    NASA Technical Reports Server (NTRS)

    Tripp, David E.; Hemann, John H.; Gyekenyesi, John P.

    1989-01-01

    Ceramic matrix composites offer significant potential for improving the performance of turbine engines. In order to achieve their potential, however, improvements in design methodology are needed. In the past most components using structural ceramic matrix composites were designed by trial and error since the emphasis of feasibility demonstration minimized the development of mathematical models. To understand the key parameters controlling response and the mechanics of failure, the development of structural failure models is required. A review of short term failure models with potential for ceramic matrix composite laminates under monotonic loads is presented. Phenomenological, semi-empirical, shear-lag, fracture mechanics, damage mechanics, and statistical models for the fast fracture analysis of continuous fiber unidirectional ceramic matrix composites under monotonic loads are surveyed.

  20. ANN based Performance Evaluation of BDI for Condition Monitoring of Induction Motor Bearings

    NASA Astrophysics Data System (ADS)

    Patel, Raj Kumar; Giri, V. K.

    2017-06-01

    One of the critical parts in rotating machines is the bearing, and most failures arise from defective bearings. Bearing failure leads to failure of the machine and unpredicted productivity loss in performance. Therefore, bearing fault detection and prognosis is an integral part of preventive maintenance procedures. In this paper, vibration signals for four conditions of a deep groove ball bearing, namely normal (N), inner race defect (IRD), ball defect (BD), and outer race defect (ORD), were acquired from a customized bearing test rig under four different conditions and three different fault sizes. Two approaches were adopted for statistical feature extraction from the vibration signal. In the first approach, the raw signal is used for statistical feature extraction, and in the second approach the statistical features extracted are based on a bearing damage index (BDI). The proposed BDI technique uses a wavelet packet node energy coefficient analysis method. Both sets of features are used as inputs to an ANN classifier to evaluate its performance. A comparison of ANN performance is made based on the raw vibration data and the data chosen by using the BDI. The ANN performance has been found to be fairly higher when the BDI-based signals were used as inputs to the classifier.

  1. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  2. A Survey of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, Chris; Holloway, C. M.

    2003-01-01

    Mishap investigations provide important information about adverse events and near-miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety-critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive, and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.

  3. DATMAN: A reliability data analysis program using Bayesian updating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M.; Feltus, M.A.

    1996-12-31

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular of system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. System reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
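
    The Bayesian updating idea behind a tool of this kind can be sketched as below (this is not DATMAN's actual code): a Gamma prior on a component failure rate is updated with newly observed failures and operating time, and the posterior feeds a reliability estimate. The prior parameters, failure counts, and mission time are illustrative.

```python
# Minimal sketch of Gamma-Poisson Bayesian updating of a failure rate and the
# resulting reliability estimate.  All numbers are illustrative placeholders.
import numpy as np
from scipy import stats

# Gamma(shape, rate) prior on the failure rate lambda [failures per hour]
prior_shape, prior_rate = 2.0, 20_000.0      # prior mean 1e-4 per hour

# Newly acquired plant data: failures observed over accumulated run time
new_failures, new_hours = 3, 15_000.0

post_shape = prior_shape + new_failures
post_rate = prior_rate + new_hours
posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)

lam_mean = posterior.mean()
lam_lo, lam_hi = posterior.ppf([0.05, 0.95])
mission_time = 1000.0                         # hours
reliability = np.exp(-posterior.rvs(100_000, random_state=1) * mission_time)

print(f"posterior mean rate = {lam_mean:.2e}/h  (90% interval {lam_lo:.2e}-{lam_hi:.2e})")
print(f"mean reliability over {mission_time:.0f} h = {reliability.mean():.3f}")
```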

  4. Sensory redundancy management: The development of a design methodology for determining threshold values through a statistical analysis of sensor output data

    NASA Technical Reports Server (NTRS)

    Scalzo, F.

    1983-01-01

    Sensor redundancy management (SRM) requires a system which will detect failures and reconfigure the avionics accordingly. A probability density function to determine false alarm rates was generated using an algorithmic approach. Microcomputer software was developed which will print out tables of values for the cumulative probability of being in the domain of failure; system reliability; and false alarm probability, given a signal is in the domain of failure. The microcomputer software was applied to the sensor output data for various AFTI F-16 flights and sensor parameters. Practical recommendations for further research were made.
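
    The threshold-setting step can be sketched as follows, assuming (for illustration only) a zero-mean Gaussian model for the sensor residual; the residual standard deviation and target false alarm probabilities are invented values, not taken from the AFTI F-16 data.

```python
# Minimal sketch: choose a failure-detection threshold on a Gaussian residual
# to achieve a specified false alarm probability, then check it.
import numpy as np
from scipy import stats

def threshold_for_false_alarm(sigma, p_fa):
    """Two-sided threshold on a zero-mean Gaussian residual."""
    return sigma * stats.norm.isf(p_fa / 2.0)

sigma = 0.8                       # residual standard deviation (sensor units)
for p_fa in (1e-2, 1e-3, 1e-4):
    thr = threshold_for_false_alarm(sigma, p_fa)
    # Probability a healthy residual exceeds the threshold (should equal p_fa)
    p_exceed = 2.0 * stats.norm.sf(thr, scale=sigma)
    print(f"P_fa target={p_fa:.0e}  threshold={thr:.3f}  check={p_exceed:.1e}")
```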

  5. Real-time forecasting and predictability of catastrophic failure events: from rock failure to volcanoes and earthquakes

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Bell, A. F.; Naylor, M.; Atkinson, M.; Filguera, R.; Meredith, P. G.; Brantut, N.

    2012-12-01

    Accurate prediction of catastrophic brittle failure in rocks and in the Earth presents a significant challenge on theoretical and practical grounds. The governing equations are not known precisely, but are known to produce highly non-linear behavior similar to that of near-critical dynamical systems, with a large and irreducible stochastic component due to material heterogeneity. In a laboratory setting, mechanical, hydraulic and rock physical properties are known to change in systematic ways prior to catastrophic failure, often with significant non-Gaussian fluctuations about the mean signal at a given time, for example in the rate of remotely-sensed acoustic emissions. The effectiveness of such signals in real-time forecasting has never been tested before in a controlled laboratory setting, and previous work has often been qualitative in nature, and subject to retrospective selection bias, though it has often been invoked as a basis for forecasting natural hazard events such as volcanoes and earthquakes. Here we describe a collaborative experiment in real-time data assimilation to explore the limits of predictability of rock failure in a best-case scenario. Data are streamed from a remote rock deformation laboratory to a user-friendly portal, where several proposed physical/stochastic models can be analysed in parallel in real time, using a variety of statistical fitting techniques, including least squares regression, maximum likelihood fitting, Markov-chain Monte-Carlo and Bayesian analysis. The results are posted and regularly updated on the web site prior to catastrophic failure, to ensure a true and verifiable prospective test of forecasting power. Preliminary tests on synthetic data with known non-Gaussian statistics show how forecasting power is likely to evolve in the live experiments. In general the predicted failure time does converge on the real failure time, illustrating the bias associated with the 'benefit of hindsight' in retrospective analyses. Inference techniques that account explicitly for non-Gaussian statistics significantly reduce the bias, and increase the reliability and accuracy, of the forecast failure time in prospective mode.
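
    One widely used forecasting model for accelerating precursors of this kind is the inverse-rate (failure forecast) method, in which the reciprocal of the event rate decays roughly linearly to zero at the failure time. The sketch below is a generic illustration with synthetic data and a plain least-squares fit, not the project's streaming analysis code.

```python
# Minimal sketch of the inverse-rate failure forecast method on synthetic,
# noisy accelerating event-rate data.  All values are illustrative.
import numpy as np

rng = np.random.default_rng(5)
t_fail_true = 100.0
t = np.arange(10.0, 95.0, 5.0)
rate = 50.0 / (t_fail_true - t)                  # accelerating event rate
rate *= rng.lognormal(0.0, 0.15, t.size)         # multiplicative noise

inv_rate = 1.0 / rate
slope, intercept = np.polyfit(t, inv_rate, 1)    # least-squares line fit
t_fail_forecast = -intercept / slope             # where 1/rate extrapolates to zero

print(f"forecast failure time = {t_fail_forecast:.1f} (true value {t_fail_true})")
```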

  6. Methods for trend analysis: Examples with problem/failure data

    NASA Technical Reports Server (NTRS)

    Church, Curtis K.

    1989-01-01

    Statistics play an important role in quality control and reliability. Consequently, the NASA standard on Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some different techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
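
    Two of the techniques named above can be sketched in a few lines, applied here to hypothetical yearly problem-report counts rather than actual MSFC data: a straight-line least-squares fit and Kendall's rank correlation test.

```python
# Minimal sketch: straight-line trend fit and Kendall's rank correlation on
# illustrative yearly problem-report counts.
import numpy as np
from scipy import stats

years = np.arange(1983, 1990)
reports = np.array([14, 11, 12, 9, 10, 7, 6])    # hypothetical counts per year

slope, intercept, r, p_reg, se = stats.linregress(years, reports)
tau, p_tau = stats.kendalltau(years, reports)

print(f"straight-line trend: {slope:.2f} reports/year (p={p_reg:.3f})")
print(f"Kendall tau = {tau:.2f} (p={p_tau:.3f})")
```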

  7. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to the smaller cell size were eliminated on the mono-Si coupons with large cells, allowing highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels), and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by means of ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly performed by two Masters students, Sravanthi Boppana and Vidyashree Rajasekar. This thesis presents the indoor soiling study, whereas the other thesis presents the outdoor soiling study. Similarly, the statistical risk analyses of two power plants (model J and model JVA) were jointly performed by these two Masters students. Both power plants are located at the same cold-dry climate site, but one power plant carries framed modules and the other carries frameless modules. This thesis presents the results obtained on the frameless modules.
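
    The RPN ranking step in FMECA can be sketched as below: each failure or degradation mode receives severity, occurrence, and detection scores (typically 1-10), and modes are prioritized by RPN = S x O x D. The modes and scores here are illustrative, not the values derived from the 744-module field study.

```python
# Minimal sketch of FMECA-style RPN ranking with illustrative scores.
failure_modes = [
    # (mode, severity, occurrence, detection)
    ("encapsulant discoloration", 4, 8, 3),
    ("solder bond fatigue",       7, 5, 6),
    ("glass breakage",            9, 2, 2),
    ("bypass diode failure",      8, 3, 7),
]

ranked = sorted(
    ((mode, s * o * d) for mode, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for mode, rpn in ranked:
    print(f"RPN={rpn:4d}  {mode}")
```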

  8. An Analysis of Operational Suitability for Test and Evaluation of Highly Reliable Systems

    DTIC Science & Technology

    1994-03-04

    Exposition," Journal of the American Statistical A iation-59: 353-375 (June 1964). 17. SYS 229, Test and Evaluation Management Coursebook , School of Systems...in hours, 0 is 2-5 the desired MTBCF in hours, R is the number of critical failures, and a is the P[type-I error] of the X2 statistic with 2*R+2...design of experiments (DOE) tables and the use of Bayesian statistics to increase the confidence level of the test results that will be obtained from

  9. Analysis of the Factors Affecting Surgical Success of Implants Placed in Iranian Warfare Victims

    PubMed Central

    Jafarian, Mohammad; Bayat, Mohammad; Pakravan, Amir-Hossein; Emadi, Naghmeh

    2016-01-01

    Objective The aim was to evaluate the survival time and success rates of dental implants in warfare victims and the factors that affect implant success. Subjects and Methods This retrospective study involved 250 Iranian warfare victims who received dental implants from 2003 to 2013. Patients' demographic characteristics, as well as the brand, diameter, length, location, and failure rate of the implants, were retrieved from patients' dental records and radiographs. The associations between these data and the survival rate were analyzed. Statistical analysis was carried out with χ² and log-rank tests. Results Overall, out of the 1,533 dental implants, 61 (4%) failed. The maxillary canine area had the highest failure rate [9 of 132 implants (6.8%)], while the mandibular incisor region had the least number of failures [3 of 147 implants (2.0%)] and the longest survival time (approximately 3,182 days). Maxillary canine areas had the shortest survival (about 2,996 days). The longest survival time was observed in implants with 11 mm length (3,179.72 ± 30.139 days) and 3.75-4 mm diameter (3,131.161 ± 35.96 days), and the shortest survival was found in implants with 11.5 mm length (2,317.79 ± 18.71 days) and 6.5 mm diameter (2,241.45 ± 182.21 days). Moreover, implants with 10 mm length (10.7%) and 5.5-6 mm diameter (22.2%) had the highest failure rate; however, the least failure rate occurred when the implants were 11.5 mm in length (1.9%) and 3-3.5 mm in diameter (3.1%). Conclusions The brand, length, and diameter of implants affected the survival time, failure rate, and time to failure. The location of the implant was not statistically significant regarding the mentioned factors, although it has clinical significance. PMID:27322534

  10. Rotor fragment protection program: Statistics on aircraft gas turbine engine rotor failures that occurred in U.S. commercial aviation during 1978

    NASA Technical Reports Server (NTRS)

    Delucia, R. A.; Salvino, J. T.

    1981-01-01

    This report presents statistical information relating to the number of gas turbine engine rotor failures which occurred in commercial aviation service use. The predominant failure involved blade fragments, 82.4 percent of which were contained. Although fewer rotor rim, disk, and seal failures occurred, 33.3%, 100% and 50% respectively were uncontained. Sixty-five percent of the 166 rotor failures occurred during the takeoff and climb stages of flight.

  11. PCI fuel failure analysis: a report on a cooperative program undertaken by Pacific Northwest Laboratory and Chalk River Nuclear Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, C.L.; Pankaskie, P.J.; Heasler, P.G.

    Reactor fuel failure data sets in the form of initial power (P_i), final power (P_f), transient increase in power (ΔP), and burnup (Bu) were obtained for pressurized heavy water reactors (PHWRs), boiling water reactors (BWRs), and pressurized water reactors (PWRs). These data sets were evaluated and used as the basis for developing two predictive fuel failure models, a graphical concept called the PCI-OGRAM, and a nonlinear regression based model called PROFIT. The PCI-OGRAM is an extension of the FUELOGRAM developed by AECL. It is based on a critical threshold concept for stress dependent stress corrosion cracking. The PROFIT model, developed at Pacific Northwest Laboratory, is the result of applying standard statistical regression methods to the available PCI fuel failure data and an analysis of the environmental and strain rate dependent stress-strain properties of the Zircaloy cladding.

  12. Evaluation of a fault tolerant system for an integrated avionics sensor configuration with TSRV flight data

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Godiwala, P. M.

    1985-01-01

    The performance analysis results of a fault inferring nonlinear detection system (FINDS) using sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment are presented. First, a statistical analysis of the flight-recorded sensor data was made in order to determine the characteristics of the sensor inaccuracies. Next, modifications were made to the detection and decision functions in the FINDS algorithm in order to improve false alarm and failure detection performance under the real modelling errors present in the flight data. Finally, the failure detection and false alarm performance of the FINDS algorithm were analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five-minute flight data. In general, the detection speed, failure level estimation, and false alarm performance showed a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed was faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers.

  13. Gender-Related and Age-Related Differences in Implantable Defibrillator Recipients: Results From the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS").

    PubMed

    Feldman, Alyssa M; Kersten, Daniel J; Chung, Jessica A; Asheld, Wilbur J; Germano, Joseph; Islam, Shahidul; Cohen, Todd J

    2015-12-01

    The purpose of this study was to investigate the influences of gender and age on defibrillator lead failure and patient mortality. The specific influences of gender and age on defibrillator lead failure have not previously been investigated. This study analyzed the differences in gender and age in relation to defibrillator lead failure and mortality of patients in the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS"). PAIDLESS includes all patients at Winthrop University Hospital who underwent defibrillator lead implantation between February 1, 1996 and December 31, 2011. Male and female patients were compared within each age decile, beginning at 15 years old, to analyze lead failure and patient mortality. Statistical analyses were performed using Wilcoxon rank-sum test, Fisher's exact test, Kaplan-Meier analysis, and multivariable Cox regression models. P<.05 was considered statistically significant. No correction for multiple comparisons was performed for the subgroup analyses. A total of 3802 patients (2812 men and 990 women) were included in the analysis. The mean age was 70 ± 13 years (range, 15-94 years). Kaplan-Meier analysis found that between 45 and 54 years of age, leads implanted in women failed significantly faster than in men (P=.03). Multivariable Cox regression models were built to validate this finding, and they confirmed that male gender was an independent protective factor of lead failure in the 45 to 54 years group (for male gender: HR, 0.37; 95% confidence interval, 0.14-0.96; P=.04). Lead survival time for women in this age group was 13.4 years (standard error, 0.6), while leads implanted in men of this age group survived 14.7 years (standard error, 0.3). Although there were significant differences in lead failure, no differences in mortality between the genders were found for any ages or within each decile. This study is the first to compare defibrillator lead failure and patient mortality in relation to gender and age deciles at a single large implanting center. Within the 45 to 54 years group, leads implanted in women failed faster than in men. Male gender was found to be an independent protective factor in lead survival. This study emphasizes the complex interplay between gender and age with respect to implantable defibrillator lead failure and mortality.

  14. Arthrodesis following failed total knee arthroplasty: comprehensive review and meta-analysis of recent literature.

    PubMed

    Damron, T A; McBeath, A A

    1995-04-01

    With the increasing duration of follow up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram negative and mixed infections are found to be significant risk factors for failure of arthrodesis.

  15. Load fatigue performance of four implant-abutment interface designs: effect of torque level and implant system.

    PubMed

    Quek, H C; Tan, Keson B; Nicholls, Jack I

    2008-01-01

    Biomechanical load-fatigue performance data on single-tooth implant systems with different implant-abutment interface designs are lacking in the literature. This study evaluated the load fatigue performance of 4 implant-abutment interface designs (Brånemark-CeraOne; 3i Osseotite-STA abutment; Replace Select-Easy abutment; and Lifecore Stage-1-COC abutment system). The number of load cycles to fatigue failure of the 4 implant-abutment designs was tested with a custom rotational load fatigue machine. The effect of increasing and decreasing the tightening torque by 20%, respectively, on the load fatigue performance was also investigated. Three different tightening torque levels (recommended torque, -20% recommended torque, +20% recommended torque) were applied to the 4 implant systems. There were 12 test groups with 5 samples in each group. The rotational load fatigue machine subjected specimens to a sinusoidally applied 35 Ncm bending moment at a test frequency of 14 Hz. The number of cycles to failure was recorded. A cutoff of 5 × 10⁶ cycles was applied as an upper limit. There were 2 implant failures and 1 abutment screw failure in the Brånemark group. Five abutment screw failures and 4 implant failures were recorded for the 3i system. The Replace Select system had 1 implant failure. Five cone screw failures were noted for the Lifecore system. Analysis of variance revealed no statistically significant difference in load cycles to failure for the 4 different implant-abutment systems torqued at the recommended torque level. A statistically significant difference was found between the -20% torque group and the +20% torque group (P < .05) for the 3i system. Load fatigue performance and failure location are system specific and related to the design characteristics of the implant-abutment combination. It appeared that if the implant-abutment interface was maintained, load fatigue failure would occur at the weakest point of the implant. It is important to use the torque level recommended by the manufacturer.

  16. Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Xue, D.; Shi, Y.

    2013-01-01

    A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric reinforced composite structures, especially for brittle ceramic matrix material composites. A repeating unit cell concept of fabric reinforced composites was used to represent the global composite structure. The thermal and mechanical properties of the repeating unit cell were considered to be the same as those of the global composite structure. The three-phase micromechanics, shear-lag, and continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damage and fatigue life of the composite structures. The global structure failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The present methodology is demonstrated with analysis results evaluated against experimental tests performed on carbon fiber reinforced silicon carbide matrix plain-weave composite specimens.

  17. Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    NASA Technical Reports Server (NTRS)

    Gaebler, John A.; Tolson, Robert H.

    2010-01-01

    In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding which aims to reduce the computational cost of assessing the failure probability. Next a variance-based sensitivity analysis was studied for the ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
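
    The core of a Monte Carlo failure assessment like the one described above is an estimate of the failure probability together with its sampling uncertainty. The following Python sketch uses a toy entry-dispersion model; the input distributions, output model, and failure threshold are placeholders and do not come from the study.

        # Minimal sketch: Monte Carlo estimate of a failure probability and its
        # standard error, in the spirit of EDL dispersion analyses (toy model only).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        # Placeholder uncertain inputs (e.g., entry flight-path angle, density scale factor).
        entry_angle = rng.normal(-12.0, 0.3, n)      # degrees
        density_mult = rng.lognormal(0.0, 0.05, n)   # atmosphere multiplier

        # Toy output: landing miss distance (km); "failure" if above a placeholder limit.
        miss = np.abs(5.0 * (entry_angle + 12.0)) + 40.0 * np.abs(density_mult - 1.0)
        failed = miss > 5.0

        p_fail = failed.mean()
        se = np.sqrt(p_fail * (1.0 - p_fail) / n)    # binomial standard error
        print(f"P(fail) ~ {p_fail:.4f} +/- {1.96 * se:.4f} (95% CI)")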

  18. What does postradiotherapy PSA nadir tell us about freedom from PSA failure and progression-free survival in patients with low and intermediate-risk localized prostate cancer?

    PubMed

    DeWitt, K D; Sandler, H M; Weinberg, V; McLaughlin, P W; Roach, M

    2003-09-01

    To determine whether the post-external beam radiotherapy (RT) prostate-specific antigen nadir (nPSA) improves our ability to predict freedom from PSA failure, progression-free survival (PFS), and overall survival. Controversy regarding the importance of nPSA after external beam RT as a prognostic indicator for patients with localized prostate cancer has continued. This analysis was based on the data from 748 patients with low and intermediate-risk localized prostate cancer treated with external beam RT alone. Patients were categorized by nPSA quartile groups with cutpoints of less than 0.3, 0.3 to less than 0.6, 0.6 to less than 1.2, and 1.2 ng/mL or greater. Both univariate and multivariate analyses were used to determine the significance of nPSA on PSA failure (American Society for Therapeutic Radiology and Oncology consensus definition), PFS (death after PSA failure), and overall survival (death from any cause). Freedom from PSA failure was strongly associated with nadir quartile groups (P <0.0001). PFS was also statistically significantly different among nadir quartile groups (P = 0.02). No statistically significant difference was found in overall survival associated with nPSA at this point. nPSA is a strong independent predictor of freedom from PSA failure and PFS in patients with low and intermediate-risk localized prostate cancer treated with RT alone. Longer follow-up and larger patient numbers are required to confirm these observations.

  19. Score tests for independence in semiparametric competing risks models.

    PubMed

    Saïd, Mériem; Ghazzali, Nadia; Rivest, Louis-Paul

    2009-12-01

    A popular model for competing risks postulates the existence of a latent unobserved failure time for each risk. Assuming that these underlying failure times are independent is attractive since it allows standard statistical tools for right-censored lifetime data to be used in the analysis. This paper proposes simple independence score tests for the validity of this assumption when the individual risks are modeled using semiparametric proportional hazards regressions. It assumes that covariates are available, making the model identifiable. The score tests are derived for alternatives that specify that copulas are responsible for a possible dependency between the competing risks. The test statistics are constructed by adding to the partial likelihoods for the individual risks an explanatory variable for the dependency between the risks. A variance estimator is derived by writing the score function and the Fisher information matrix for the marginal models as stochastic integrals. Pitman efficiencies are used to compare test statistics. A simulation study and a numerical example illustrate the methodology proposed in this paper.

  20. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
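
    A minimal numerical reading of this procedure, under the simplifying assumptions that the acoustic load is a stationary zero-mean Gaussian process whose variance follows from integrating its PSD and that the mechanical load is a known deterministic transient, is sketched below in Python; the transient shape, PSD, and design level are placeholders, not values from the report.

        # Minimal sketch: probability that a combined load stays at or below a level,
        # assuming a deterministic mechanical transient m(t) plus a stationary,
        # zero-mean Gaussian acoustic load with variance from the integrated PSD.
        import numpy as np
        from scipy.stats import norm

        t = np.linspace(0.0, 2.0, 201)                       # time, s
        m_t = 800.0 * np.exp(-3.0 * t) * np.sin(8.0 * t)     # placeholder transient load (N)

        freq = np.linspace(20.0, 2000.0, 1000)               # Hz
        psd = 50.0 / (1.0 + (freq / 300.0) ** 2)             # placeholder acoustic load PSD (N^2/Hz)
        sigma_a = np.sqrt(np.trapz(psd, freq))               # RMS acoustic load (N)

        level = 1500.0                                       # candidate design level (N)
        p_below = norm.cdf((level - m_t) / sigma_a)          # P(combined load <= level) at each time
        print(f"min P(load <= level) over the transient: {p_below.min():.4f}")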

  1. Early Ambulation Among Hospitalized Heart Failure Patients Is Associated With Reduced Length of Stay and 30-Day Readmissions.

    PubMed

    Fleming, Lisa M; Zhao, Xin; DeVore, Adam D; Heidenreich, Paul A; Yancy, Clyde W; Fonarow, Gregg C; Hernandez, Adrian F; Kociol, Robb D

    2018-04-01

    Early ambulation (EA) is associated with improved outcomes for mechanically ventilated and stroke patients. Whether the same association exists for patients hospitalized with acute heart failure is unknown. We sought to determine whether EA among patients hospitalized with heart failure is associated with length of stay, discharge disposition, 30-day post-discharge readmissions, and mortality. The study population included 369 hospitals and 285 653 patients with heart failure enrolled in the Get With The Guidelines-Heart Failure registry. We used multivariate logistic regression with generalized estimating equations at the hospital level to identify predictors of EA and determine the association between EA and outcomes. Sixty-five percent of patients ambulated by day 2 of the hospital admission. Patient-level predictors of EA included younger age, male sex, and hospitalization outside of the Northeast (P<0.01 for all). Hospital size and academic status were not predictive. Hospital-level analysis revealed that those hospitals with EA rates in the top 25% were less likely to have a long length of stay (defined as >4 days) compared with those in the bottom 25% (odds ratio, 0.83; confidence interval, 0.73-0.94; P=0.004). Among a subgroup of fee-for-service Medicare beneficiaries, we found that hospitals in the highest quartile of rates of EA demonstrated a statistically significant 24% lower 30-day readmission rate (P<0.0001). Both end points demonstrated a dose-response association and a statistically significant test for trend. Multivariable-adjusted hospital-level analysis suggests an association between EA and both shorter length of stay and lower 30-day readmissions. Further prospective studies are needed to validate these findings. © 2018 American Heart Association, Inc.

  2. Factors that Affect Operational Reliability of Turbojet Engines

    NASA Technical Reports Server (NTRS)

    1956-01-01

    The problem of improving operational reliability of turbojet engines is studied in a series of papers. Failure statistics for this engine are presented, the theory and experimental evidence on how engine failures occur are described, and the methods available for avoiding failure in operation are discussed. The individual papers of the series are Objectives, Failure Statistics, Foreign-Object Damage, Compressor Blades, Combustor Assembly, Nozzle Diaphragms, Turbine Buckets, Turbine Disks, Rolling Contact Bearings, Engine Fuel Controls, and Summary Discussion.

  3. Single- versus double-row repair for full-thickness rotator cuff tears using suture anchors. A systematic review and meta-analysis of basic biomechanical studies.

    PubMed

    Hohmann, Erik; König, Anya; Kat, Cor-Jacques; Glatt, Vaida; Tetsworth, Kevin; Keough, Natalie

    2018-07-01

    The purpose of this study was to perform a systematic review and meta-analysis comparing single- and double-row biomechanical studies to evaluate load to failure, mode of failure and gap formation. A systematic review of MEDLINE, Embase, Scopus and Google Scholar was performed from 1990 through 2016. The inclusion criteria were: documentation of ultimate load to failure, failure modes and documentation of elongation or gap formation. Studies were excluded if the study protocol did not use human specimens. Publication bias was assessed by funnel plot and Egger's test. The risk of bias was established using the Cochrane Collaboration's risk of bias tool. Heterogeneity was assessed using the χ² and I² statistics. Eight studies were included. The funnel plot was asymmetric, suggesting publication bias, which was confirmed by Egger's test (p = 0.04). The pooled estimate for load to failure demonstrated significant differences (SMD 1.228, 95% CI: 0.55-5.226, p = 0.006, I² = 60.47%), favouring double-row repair. There were no differences for failure modes. The pooled estimate for elongation/gap formation demonstrated significant differences (SMD 0.783, 95% CI: 0.169-1.398, p = 0.012, I² = 58.8%), favouring double-row repair. The results of this systematic review and meta-analysis suggest that double-row repair is able to tolerate a significantly greater load to failure. Gap formation was also significantly lower in the double-row repair group, but both of these findings should be interpreted with caution because of the inherent interstudy heterogeneity. Systematic review and meta-analysis.
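
    The pooled estimates quoted above are the kind produced by a random-effects meta-analysis. The Python sketch below shows DerSimonian-Laird pooling of standardized mean differences with an I² estimate; the per-study values are placeholders, not the reviewed data.

        # Minimal sketch: DerSimonian-Laird random-effects pooling of standardized
        # mean differences (SMDs), with I^2. Study values below are placeholders.
        import numpy as np

        y = np.array([1.1, 0.6, 1.8, 0.4, 1.5, 0.9, 1.3, 0.7])                  # per-study SMD
        v = np.array([0.20, 0.15, 0.30, 0.12, 0.25, 0.18, 0.22, 0.16])          # SMD variances

        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - y_fixed) ** 2)                                      # Cochran's Q
        k = len(y)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)                                      # between-study variance

        w_re = 1.0 / (v + tau2)
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        i2 = max(0.0, (q - (k - 1)) / q) * 100.0

        print(f"pooled SMD = {pooled:.3f} "
              f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f}), I^2 = {i2:.1f}%")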

  4. Long- and short-time analysis of heartbeat sequences: correlation with mortality risk in congestive heart failure patients.

    PubMed

    Allegrini, P; Balocchi, R; Chillemi, S; Grigolini, P; Hamilton, P; Maestri, R; Palatella, L; Raffaelli, G

    2003-06-01

    We analyze RR heartbeat sequences with a dynamic model that satisfactorily reproduces both the long- and the short-time statistical properties of heart beating. These properties are expressed quantitatively by means of two significant parameters, the scaling delta concerning the asymptotic effects of long-range correlation, and the quantity 1-pi establishing the amount of uncorrelated fluctuations. We find a correlation between the position in the phase space (delta, pi) of patients with congestive heart failure and their mortality risk.

  5. Vitamin D status of severe COPD patients with chronic respiratory failure.

    PubMed

    Gawron, Grzegorz; Trzaska-Sobczak, Marzena; Sozańska, Ewa; Śnieżek, Piotr; Barczyk, Adam

    2018-01-01

    The aim of the study was to measure serum vitamin D concentrations in COPD patients with chronic respiratory failure in comparison with a healthy control group. An additional aim was to assess the correlation between serum vitamin D levels and selected clinical, spirometric and blood gas parameters. The study included 61 patients with COPD diagnosed at the stage of chronic respiratory failure (45 men and 16 women) and 37 healthy controls (19 men and 18 women). The following procedures were performed in all studied subjects: a detailed history (in particular daily activity, diet, tobacco and alcohol use), post-bronchodilator spirometry, measurement of serum 25(OH)D and, for the COPD group only, blood gas analysis. Recruitment for the study was conducted from November to April. Statistical analysis was performed using Student's t test, the Mann-Whitney U test, the Spearman correlation test and the chi-square test. There were no significant differences between the COPD and control groups in serum 25(OH)D levels; medians (lower; upper quartiles) were 24.75 nmol/l (16.9; 36.4) vs. 24.06 nmol/l (16.3; 37.2), respectively, p=0.69. Vitamin D deficiency was present in 60 COPD patients (98.3% of all patients) and in 36 control subjects (97.3% of all healthy volunteers); the difference was not statistically significant. Serum vitamin D levels did not correlate significantly with any of the studied parameters (spirometry, blood gases, age, level of activity, BMI, tobacco smoke exposure and others). However, the level of activity in the COPD group correlated positively with spirometry values and negatively with age and number of exacerbations. The results show that in the autumn-winter period in Poland vitamin D deficiency is very frequent, not only in COPD patients at the stage of respiratory failure but also in elderly healthy persons. However, contrary to expectations, the deficiency in COPD patients with respiratory failure was similar to that seen in healthy persons.

  6. Mitigation of Late Renal and Pulmonary Injury After Hematopoietic Stem Cell Transplantation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Eric P., E-mail: Eric.Cohen2@va.gov; Bedi, Manpreet; Irving, Amy A.

    Purpose: To update the results of a clinical trial that assessed whether the angiotensin-converting enzyme inhibitor captopril was effective in mitigating chronic renal failure and pulmonary-related mortality in subjects undergoing total body irradiation (TBI) in preparation for hematopoietic stem cell transplantation (HSCT). Methods and Materials: Updated records of the 55 subjects who were enrolled in this randomized controlled trial were analyzed. Twenty-eight patients received captopril, and 27 patients received placebo. Definitions of TBI-HSCT-related chronic renal failure (and relapse) were the same as those in the 2007 analysis. Pulmonary-related mortality was based on clinical or autopsy findings of pulmonary failure or infection as the primary cause of death. Follow-up data for overall and pulmonary-related mortality were supplemented by use of the National Death Index. Results: The risk of TBI-HSCT-related chronic renal failure was lower in the captopril group (11% at 4 years) than in the placebo group (17% at 4 years), but this was not statistically significant (p > 0.2). Analysis of mortality was greatly extended by use of the National Death Index, and no patients were lost to follow-up for reasons other than death prior to 67 months. Patient survival was higher in the captopril group than in the placebo group, but this was not statistically significant (p > 0.2). The improvement in survival was influenced more by a decrease in pulmonary mortality (11% risk at 4 years in the captopril group vs. 26% in the placebo group, p = 0.15) than by a decrease in chronic renal failure. There was no adverse effect on relapse risk (p = 0.4). Conclusions: Captopril therapy produces no detectable adverse effects when given after TBI. Captopril therapy reduces overall and pulmonary-related mortality after radiation-based HSCT, and there is a trend toward mitigation of chronic renal failure.

  7. Survival rates of short (6 mm) micro-rough surface implants: a review of literature and meta-analysis.

    PubMed

    Srinivasan, Murali; Vazquez, Lydia; Rieder, Philippe; Moraguez, Osvaldo; Bernard, Jean-Pierre; Belser, Urs C

    2014-05-01

    The aim of this review was to test the hypothesis that 6 mm micro-rough short Straumann® implants provide predictable survival rates and verify that most failures occurring are early failures. A PubMed and hand search was performed to identify studies involving micro-rough 6-mm-short implants published between January 1987 and August 2011. Studies were included that (i) involve Straumann® 6 mm implants placed in the human jaws, (ii) provide data on the survival rate, (iii) mention the time of failure, and (iv) report a minimum follow-up period of 12 months following placement. A meta-analysis was performed on the extracted data. From a total of 842 publications that were screened, 12 methodologically sound articles qualified to be included for the statistical evaluation based on our inclusion criteria. A total of 690 Straumann® 6-mm-short implants were evaluated in the reviewed studies (Total: placed-690, failed-25; maxilla: placed-266, failed-14; mandible: placed-364, failed-5; follow-up period: 1-8 years). A meta-analysis was performed on the calculated early cumulative survival rates (CSR%). The pooled early CSR% calculated in this meta-analysis was 93.7%, whereas the overall survival rates in the maxilla and mandible were 94.7% and 98.6%, respectively. Implant failures observed were predominantly early failures (76%). This meta-analysis provides robust evidence that micro-rough 6-mm-short dental implants are a predictable treatment option, providing favorable survival rates. The failures encountered with 6-mm-short implants were predominantly early and their survival in the mandible was slightly superior. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.

  8. A statistical-based material and process guidelines for design of carbon nanotube field-effect transistors in gigascale integrated circuits.

    PubMed

    Ghavami, Behnam; Raji, Mohsen; Pedram, Hossein

    2011-08-26

    Carbon nanotube field-effect transistors (CNFETs) show great promise as building blocks of future integrated circuits. However, synthesizing single-walled carbon nanotubes (CNTs) with accurate chirality and exact positioning control has been widely acknowledged as an exceedingly complex task. Indeed, density and chirality variations in CNT growth can compromise the reliability of CNFET-based circuits. In this paper, we present a novel statistical compact model to estimate the failure probability of CNFETs to provide some material and process guidelines for the design of CNFETs in gigascale integrated circuits. We use measured CNT spacing distributions within the framework of detailed failure analysis to demonstrate that both the CNT density and the ratio of metallic to semiconducting CNTs play dominant roles in defining the failure probability of CNFETs. Besides, it is argued that the large-scale integration of these devices within an integrated circuit will be feasible only if a specific range of CNT density with an acceptable ratio of semiconducting to metallic CNTs can be adjusted in a typical synthesis process.
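
    A simplified, illustrative version of such a failure-probability estimate can be written down if the CNT count under a gate is assumed to be Poisson and an imperfect metallic-CNT removal step is assumed. This is not the paper's compact model, and all numbers in the Python sketch below are placeholders.

        # Minimal sketch of a simplified CNFET failure model (not the paper's compact model):
        # the CNT count under the gate is Poisson with mean density*width; a device works
        # only if it has >= 1 semiconducting CNT and 0 metallic CNTs surviving an imperfect
        # metallic-CNT removal step. All parameter values are placeholders.
        import numpy as np

        density = 5.0            # CNTs per micrometre
        width = 1.0              # transistor width in micrometres
        p_metallic = 1.0 / 3.0   # fraction of grown CNTs that are metallic
        p_removed = 0.99         # probability a metallic CNT is removed

        mean_semi = density * width * (1.0 - p_metallic)
        mean_metal_left = density * width * p_metallic * (1.0 - p_removed)

        # Poisson thinning: semiconducting and surviving metallic counts are independent.
        p_open = np.exp(-mean_semi)                       # no semiconducting CNT at all
        p_functional = (1.0 - p_open) * np.exp(-mean_metal_left)
        p_fail = 1.0 - p_functional
        print(f"P(CNFET fails) ~ {p_fail:.4f}")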

  9. A meta-analysis of the association between diabetic patients and AVF failure in dialysis.

    PubMed

    Yan, Yan; Ye, Dan; Yang, Liu; Ye, Wen; Zhan, Dandan; Zhang, Li; Xiao, Jun; Zeng, Yan; Chen, Qinkai

    2018-11-01

    The most preferable vascular access for patients with end-stage renal failure needing hemodialysis is the native arteriovenous fistula (AVF), on account of its access longevity, lower patient morbidity, lower hospitalization costs, lower risk of infection and lower incidence of thrombotic complications. Meanwhile, according to National Kidney Foundation (NKF)/Dialysis Outcomes Quality Initiative (DOQI) guidelines, AVF is used more than before. However, a significant percentage of AVFs fail to support dialysis therapy due to lack of adequate maturation. Among all factors, the presence of diabetes mellitus has been identified by some authors as one of the risk factors for the development of vascular access failure. Therefore, this review evaluates the current evidence concerning the correlation of diabetes and AVF failure. A search was conducted using MEDLINE, SCIENCE DIRECT, SPRINGER, WILEY-BLACKWELL, KARGER, EMbase, CNKI and WanFang Data from the establishment of the databases to January 2016. The analysis involved studies that contained subgroups of diabetic patients and compared their outcomes with those of non-diabetic adults. In total, 23 articles were retrieved and included in the review. The meta-analysis revealed a statistically significantly higher rate of AVF failure in diabetic patients compared with non-diabetic patients (OR = 1.682; 95% CI, 1.429-1.981, Test of OR = 1: z = 6.25, p <.001). This review found an increased risk of AVF failure in diabetic patients. If confirmed by further prospective studies, preventive measures should be considered when planning AVF in diabetic patients.

  10. Performance analysis of a fault inferring nonlinear detection system algorithm with integrated avionics flight data

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Godiwala, P. M.; Morrell, F. R.

    1985-01-01

    This paper presents the performance analysis results of a fault inferring nonlinear detection system (FINDS) using integrated avionics sensor flight data for the NASA ATOPS B-737 aircraft in a Microwave Landing System (MLS) environment. First, an overview of the FINDS algorithm structure is given. Then, aircraft state estimate time histories and statistics for the flight data sensors are discussed. This is followed by an explanation of modifications made to the detection and decision functions in FINDS to improve false alarm and failure detection performance. Next, the failure detection and false alarm performance of the FINDS algorithm are analyzed by injecting bias failures into fourteen sensor outputs over six repetitive runs of the five minutes of flight data. Results indicate that the detection speed, failure level estimation, and false alarm performance show a marked improvement over the previously reported simulation runs. In agreement with earlier results, detection speed is faster for filter measurement sensors such as MLS than for filter input sensors such as flight control accelerometers. Finally, the progress in modifications of the FINDS algorithm design to accommodate flight computer constraints is discussed.

  11. Perspectives and realities of teaching statistics at a superior school of business administration

    NASA Astrophysics Data System (ADS)

    Nunes, Sandra

    2016-06-01

    This paper aims to describe the reality of teaching statistics in a superior school of business administration in Portugal. It draws on twenty years of experience teaching several disciplines within the scientific area of Mathematics, such as Statistics and Probability, Data Analysis, Calculus, Algebra and Numerical Analysis. This experience is not limited to the school of business administration but also includes engineering and health courses, and in all these schools there has been a substantial increase in failure in these disciplines. I intend to present the main difficulties that teachers encounter. These difficulties stem from a diversity of problems. A leading cause is undoubtedly the huge heterogeneity in the level of knowledge that students have. The large number of students in each class is also a major problem. I must point out that, in my opinion, the introduction of the Bologna process has aggravated this situation. The assumption of reduced classroom hours and an increase in self-study is extremely penalizing for such students. There are many challenges that teachers have to face: How to teach statistics to a class where more than half the students cannot interpret the basic concepts of mathematics? Is the approach of teaching statistics through software beneficial? Should the teaching of statistics be addressed in a more practical way? How can we instill critical thinking in students, enabling them to use the knowledge acquired to solve problems? How can we deal with and prevent the failure that is increasing each year? These are only a few of the questions to which all teachers need an answer.

  12. A study on risk factors and diagnostic efficiency of posthepatectomy liver failure in the nonobstructive jaundice.

    PubMed

    Wang, He; Lu, Shi-Chun; He, Lei; Dong, Jia-Hong

    2018-02-01

    Liver failure remains the most common complication and cause of death after hepatectomy, and continues to be a challenge for doctors. The t test and χ² test were used for single-factor analysis of the data-related variables, and significant variables were then entered into a multivariate logistic regression model. Pearson correlation analysis was performed for related postoperative indexes, and diagnostic performance was evaluated using receiver operating characteristic (ROC) curves of the postoperative indexes. Differences in age, body mass index (BMI), portal vein hypertension, bile duct cancer, total bilirubin, alkaline phosphatase (ALP), gamma-glutamyl transpeptidase (GGT), operation time, cumulative portal vein occlusion time, intraoperative blood volume, residual liver volume (RLV)/entire liver volume, ascites volume at postoperative day (POD) 3, supplemental albumin amount at POD3, hospitalization time after operation, and prothrombin activity (PTA) were statistically significant. Furthermore, there were significant differences in total bilirubin and the supplemental albumin amount at POD3. ROC analysis of the average PTA, albumin amount, ascites volume at POD3, and their combined diagnosis was performed; each had diagnostic value for postoperative liver failure (AUC 0.895, 0.798, 0.775, and 0.903, respectively). Preoperative total bilirubin level and the supplemental albumin amount at POD3 were independent risk factors. PTA can be used as an index of postoperative liver failure, and the combined diagnosis of these indexes can improve the early prediction of postoperative liver failure.

  13. Analysing recurrent hospitalizations in heart failure: a review of statistical methodology, with application to CHARM-Preserved.

    PubMed

    Rogers, Jennifer K; Pocock, Stuart J; McMurray, John J V; Granger, Christopher B; Michelson, Eric L; Östergren, Jan; Pfeffer, Marc A; Solomon, Scott D; Swedberg, Karl; Yusuf, Salim

    2014-01-01

    Heart failure is characterized by recurrent hospitalizations, but often only the first event is considered in clinical trial reports. In chronic diseases, such as heart failure, analysing all events gives a more complete picture of treatment benefit. We describe methods of analysing repeat hospitalizations, and illustrate their value in one major trial. The Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity (CHARM)-Preserved study compared candesartan with placebo in 3023 patients with heart failure and preserved systolic function. The heart failure hospitalization rates were 12.5 and 8.9 per 100 patient-years in the placebo and candesartan groups, respectively. The repeat hospitalizations were analysed using the Andersen-Gill, Poisson, and negative binomial methods. Death was incorporated into analyses by treating it as an additional event. The win ratio method and a method that jointly models hospitalizations and mortality were also considered. Using repeat events gave larger treatment benefits than time to first event analysis. The negative binomial method for the composite of recurrent heart failure hospitalizations and cardiovascular death gave a rate ratio of 0.75 [95% confidence interval (CI) 0.62-0.91, P = 0.003], whereas the hazard ratio for time to first heart failure hospitalization or cardiovascular death was 0.86 (95% CI 0.74-1.00, P = 0.050). In patients with preserved EF, candesartan reduces the rate of admissions for worsening heart failure, to a greater extent than apparent from analysing only first hospitalizations. Recurrent events should be routinely incorporated into the analysis of future clinical trials in heart failure. © 2013 The Authors. European Journal of Heart Failure © 2013 European Society of Cardiology.
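
    The negative binomial rate-ratio analysis described above can be sketched as follows in Python, assuming simulated placeholder data, patient follow-up time used as exposure, and a fixed dispersion parameter (a simplification of a full negative binomial fit).

        # Minimal sketch: negative binomial rate ratio for recurrent hospitalizations,
        # with follow-up time as exposure. Data are simulated placeholders and the
        # dispersion alpha is fixed, which simplifies a full NB estimation.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 1000
        treated = rng.integers(0, 2, n)                      # 0 = placebo, 1 = active
        followup = rng.uniform(1.0, 3.0, n)                  # years at risk
        true_rate = 0.125 * np.where(treated == 1, 0.75, 1.0)
        counts = rng.poisson(true_rate * followup)           # toy event counts

        X = sm.add_constant(treated.astype(float))
        model = sm.GLM(counts, X,
                       family=sm.families.NegativeBinomial(alpha=1.0),
                       exposure=followup)
        res = model.fit()
        rr = np.exp(res.params[1])
        lo, hi = np.exp(res.conf_int()[1])
        print(f"rate ratio = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")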

  14. Reliability of lead-calcium automotive batteries in practical operations

    NASA Astrophysics Data System (ADS)

    Burghoff, H.-G.; Richter, G.

    In order to reach a statistically sound conclusion on the suitability of maintenance-free, lead-calcium automotive batteries for practical operations, the failure behaviour of such batteries has been observed in a large-scale experiment carried out by Mercedes Benz AG and Robert Bosch GmbH in different climatic zones of North America. The results show that the average failure behaviour is not significantly different to that of batteries from other manufacturers using other grid alloy systems and operated under otherwise identical conditions; the cumulative failure probability after 30 months is 17%. The principal causes of failure are: (i) early failure: transport damage, filling errors, and short-circuits due to the outer plates being pushed up during plate-block assembly (manufacturing defect); (ii) statistical failure: short-circuits due to growth of positive plates caused by a reduction in the mechanical strength of the cast positive grid as a result of corrosion; (iii) late failure due to an increased occurrence of short-circuits, especially frequent in outer cell facing the engine of the vehicle (subjected to high temperature), and to defects caused by capacity decay. As expected, the batteries exhibit extremely low water loss in each cell. The poor cyclical performance of stationary batteries, caused by acid stratification and well-known from laboratory tests, has no detrimental effect on the batteries in use. After a thorough analysis of the corrosion process, the battery manufacturer changed the grid alloy and the method of its production, and thus limited the corrosion problem with cast lead-calcium grids and with it the possibility of plate growth. The mathematical methods used in this study, and in particular the characteristic factors derived from them, have proven useful for assessing the suitability of automotive batteries.

  15. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
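
    The closed-form Weibull result mentioned above reduces, for a two-parameter Weibull strength distribution under uniform uniaxial stress with size effects ignored, to a one-line failure probability. The Python sketch below uses placeholder values for the modulus and characteristic strength.

        # Minimal sketch: two-parameter Weibull failure probability for a brittle
        # component under uniform uniaxial stress (size effects ignored).
        import numpy as np

        m = 10.0          # Weibull modulus (scatter in strength) - placeholder
        sigma_0 = 400.0   # characteristic strength, MPa (63.2% failure probability) - placeholder

        def failure_probability(stress_mpa):
            """P_f = 1 - exp(-(sigma / sigma_0)**m)."""
            return 1.0 - np.exp(-(np.asarray(stress_mpa) / sigma_0) ** m)

        for s in (250.0, 300.0, 350.0, 400.0):
            print(f"stress {s:5.0f} MPa -> P_f = {failure_probability(s):.4f}")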

  16. Analysis of complications following augmentation with cancellous block allografts.

    PubMed

    Chaushu, Gavriel; Mardinger, Ofer; Peleg, Michael; Ghelfan, Oded; Nissan, Joseph

    2010-12-01

    Bone grafting may be associated with soft and hard tissue complications. Recipient site complications encountered using cancellous block allografts for ridge augmentation are analyzed. A total of 101 consecutive patients (62 females and 39 males; mean age 44 ± 17 years) were treated with implant-supported restoration of 137 severe atrophic alveolar ridges augmented with cancellous bone-block allografts. Alveolar ridge deficiency locations were classified as anterior maxilla (n = 58); posterior maxilla (n = 32 sinuses); posterior mandible (n = 32); and anterior mandible (n = 15). A total of 271 rough-surface implants were placed. Recipient site complications associated with block grafting (infection, membrane exposure, incision line opening, perforation of mucosa over the grafted bone, partial graft failure, total graft failure, and implant failure) were recorded. Partial and total bone-block graft failure occurred in 10 (7%) and 11 (8%) of 137 augmented sites, respectively. Implant failure rate was 12 (4.4%) of 271. Soft tissue complications included membrane exposure (42 [30.7%] of 137); incision line opening (41 [30%] of 137); and perforation of the mucosa over the grafted bone (19 [14%] of 137). Infection of the grafted site occurred in 18 (13%) of 137 bone blocks. Alveolar ridge deficiency location had a statistically significant effect on the outcome of recipient site complications. More complications were noted in the mandible compared to the maxilla. Age and gender had no statistically significant effect. Failures caused by complications were rarely noted in association with cancellous block grafting. The incidence of complications in the mandible was significantly higher. Soft tissue complications do not necessarily result in total loss of cancellous block allograft.

  17. Rotor burst protection program: Statistics on aircraft gas turbine engine rotor failures that occurred in US commercial aviation during 1975

    NASA Technical Reports Server (NTRS)

    Delucia, R. A.; Mangano, G. J.

    1977-01-01

    Statistics on gas turbine rotor failures that have occurred in U.S. commercial aviation during 1975 are presented. The compiled data were analyzed to establish: (1) The incidence of rotor failures and the number of contained and uncontained rotor bursts; (2) The distribution of rotor bursts with respect to engine rotor component; i.e., fan, compressor or turbine; (3) The type of rotor fragment (disk, rim or blade) typically generated at burst; (4) The cause of failure; (5) The type of engines involved; and (6) The flight condition at the time of failure.

  18. Reliability growth modeling analysis of the space shuttle main engines based upon the Weibull process

    NASA Technical Reports Server (NTRS)

    Wheeler, J. T.

    1990-01-01

    The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1 − α)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
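
    For the Weibull (Crow-AMSAA) process with intensity λβt^(β−1), the time-terminated maximum-likelihood estimates have a simple closed form, sketched below in Python on placeholder failure times rather than the engine data.

        # Minimal sketch: maximum-likelihood estimates for the Weibull (Crow-AMSAA)
        # process with intensity lambda*beta*t**(beta-1), time-terminated testing.
        # Failure times and total test time below are placeholders, not SSME data.
        import numpy as np

        t = np.array([35.0, 110.0, 240.0, 410.0, 700.0, 1050.0, 1600.0])  # cumulative failure times
        T = 2000.0                                                        # total test time
        n = len(t)

        beta_hat = n / np.sum(np.log(T / t))
        lam_hat = n / T ** beta_hat
        mtbf_inst = 1.0 / (lam_hat * beta_hat * T ** (beta_hat - 1.0))    # instantaneous MTBF at T

        print(f"beta = {beta_hat:.3f} (beta < 1 indicates reliability growth)")
        print(f"instantaneous MTBF at T = {mtbf_inst:.1f}")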

  19. Seismic precursory patterns before a cliff collapse and critical point phenomena

    USGS Publications Warehouse

    Amitrano, D.; Grasso, J.-R.; Senfaute, G.

    2005-01-01

    We analyse the statistical pattern of seismicity before a 1-2 × 10³ m³ chalk cliff collapse on the Normandie ocean shore, western France. We show that a power-law acceleration of seismicity rate and energy in both the 40 Hz-1.5 kHz and 2 Hz-10 kHz frequency ranges is defined over 3 orders of magnitude within 2 hours of the collapse time. Simultaneously, the average size of the seismic events increases toward the time to failure. These in situ results are derived from the only station located within one rupture length of the rock fall rupture plane. They mimic the "critical point"-like behavior recovered from physical and numerical experiments before brittle failures and tertiary creep failures. Our analysis of this first seismic monitoring dataset of a cliff collapse suggests that thermodynamic phase transition models for failure may apply to cliff collapse. Copyright 2005 by the American Geophysical Union.

  20. Relating design and environmental variables to reliability

    NASA Astrophysics Data System (ADS)

    Kolarik, William J.; Landers, Thomas L.

    The combination of space application and nuclear power source demands high-reliability hardware. The possibilities of failure, either an inability to provide power or a catastrophic accident, must be minimized. Nuclear power experiences on the ground have led to highly sophisticated probabilistic risk assessment procedures, most of which require quantitative information to adequately assess such risks. In the area of hardware risk analysis, reliability information plays a key role. One of the lessons learned from the Three Mile Island experience is that thorough analyses of critical components are essential. Nuclear grade equipment shows some reliability advantages over commercial equipment; however, no statistically significant difference has been found. A recent study pertaining to spacecraft electronics reliability examined some 2500 malfunctions on more than 300 aircraft. The study classified the equipment failures into seven general categories. Design deficiencies and lack of environmental protection accounted for about half of all failures. Within each class, limited reliability modeling was performed using a Weibull failure model.

  1. [Evaluation of the capacity of the APR-DRG classification system to predict hospital mortality].

    PubMed

    De Marco, Maria Francesca; Lorenzoni, Luca; Addari, Piero; Nante, Nicola

    2002-01-01

    Inpatient mortality has increasingly been used as a hospital outcome measure. Comparing mortality rates across hospitals requires adjustment for patient risk before making inferences about quality of care based on patient outcomes. It is therefore essential to have well-performing severity measures. The aim of this study is to evaluate the ability of the All Patient Refined DRG (APR-DRG) system to predict inpatient mortality for congestive heart failure, myocardial infarction, pneumonia and ischemic stroke. Administrative records were used in this analysis. We used two statistical methods to assess the ability of the APR-DRG to predict mortality: the area under the receiver operating characteristic curve (referred to as the c-statistic) and the Hosmer-Lemeshow test. The database for the study included 19,212 discharges for stroke, pneumonia, myocardial infarction and congestive heart failure from fifteen hospitals participating in the Italian APR-DRG Project. A multivariate analysis was performed to predict mortality for each condition under study using age, sex and APR-DRG risk-of-mortality subclass as independent variables. Inpatient mortality rates ranged from 9.7% (pneumonia) to 16.7% (stroke). Model discrimination, calculated using the c-statistic, was 0.91 for myocardial infarction, 0.68 for stroke, 0.78 for pneumonia and 0.71 for congestive heart failure. Model calibration, assessed using the Hosmer-Lemeshow test, was quite good. The performance of the APR-DRG scheme when used on Italian hospital activity records is similar to that reported in the literature, and it seems to improve when age and sex are added to the model. The APR-DRG system does not completely capture the effects of these variables. In some cases, the better performance might be due to the inclusion of specific complications in the risk-of-mortality subclass assignment.
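
    The two assessment tools named above, the c-statistic and the Hosmer-Lemeshow test, can be sketched in Python on simulated placeholder predictions and outcomes; the decile-based Hosmer-Lemeshow statistic shown here is one common variant, not necessarily the exact implementation used in the study.

        # Minimal sketch: c-statistic (ROC AUC) and a decile-based Hosmer-Lemeshow test
        # for a mortality model. Predictions and outcomes are simulated placeholders.
        import numpy as np
        from scipy.stats import chi2
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 5000
        p_hat = rng.beta(2, 12, n)                 # predicted mortality probabilities
        died = rng.binomial(1, p_hat)              # outcomes consistent with the model

        c_stat = roc_auc_score(died, p_hat)

        order = np.argsort(p_hat)
        groups = np.array_split(order, 10)         # risk deciles
        hl = 0.0
        for g in groups:
            obs = died[g].sum()
            exp = p_hat[g].sum()
            n_g = len(g)
            hl += (obs - exp) ** 2 / (exp * (1.0 - exp / n_g))
        p_value = chi2.sf(hl, df=len(groups) - 2)

        print(f"c-statistic = {c_stat:.3f}, Hosmer-Lemeshow chi2 = {hl:.2f}, p = {p_value:.3f}")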

  2. Local Failure in Resected N1 Lung Cancer: Implications for Adjuvant Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Higgins, Kristin A., E-mail: kristin.higgins@duke.edu; Chino, Junzo P.; Berry, Mark

    2012-06-01

    Purpose: To evaluate actuarial rates of local failure in patients with pathologic N1 non-small-cell lung cancer and to identify clinical and pathologic factors associated with an increased risk of local failure after resection. Methods and Materials: All patients who underwent surgery for non-small-cell lung cancer with pathologically confirmed N1 disease at Duke University Medical Center from 1995-2008 were identified. Patients receiving any preoperative therapy or postoperative radiotherapy or with positive surgical margins were excluded. Local failure was defined as disease recurrence within the ipsilateral hilum, mediastinum, or bronchial stump/staple line. Actuarial rates of local failure were calculated with the Kaplan-Meier method. A Cox multivariate analysis was used to identify factors independently associated with a higher risk of local recurrence. Results: Among 1,559 patients who underwent surgery during the time interval, 198 met the inclusion criteria. Of these patients, 50 (25%) received adjuvant chemotherapy. Actuarial (5-year) rates of local failure, distant failure, and overall survival were 40%, 55%, and 33%, respectively. On multivariate analysis, factors associated with an increased risk of local failure included a video-assisted thoracoscopic surgery approach (hazard ratio [HR], 2.5; p = 0.01), visceral pleural invasion (HR, 2.1; p = 0.04), and increasing number of positive N1 lymph nodes (HR, 1.3 per involved lymph node; p = 0.02). Chemotherapy was associated with a trend toward decreased risk of local failure that was not statistically significant (HR, 0.61; p = 0.2). Conclusions: Actuarial rates of local failure in pN1 disease are high. Further investigation of conformal postoperative radiotherapy may be warranted.

  3. Biomechanical Comparison of Parallel and Crossed Suture Repair for Longitudinal Meniscus Tears.

    PubMed

    Milchteim, Charles; Branch, Eric A; Maughon, Ty; Hughey, Jay; Anz, Adam W

    2016-04-01

    Longitudinal meniscus tears are commonly encountered in clinical practice. Meniscus repair devices have been previously tested and presented; however, prior studies have not evaluated repair construct designs head to head. This study compared a new-generation meniscus repair device, SpeedCinch, with a similar established device, Fast-Fix 360, and a parallel repair construct to a crossed construct. Both devices utilize self-adjusting No. 2-0 ultra-high molecular weight polyethylene (UHMWPE) and 2 polyether ether ketone (PEEK) anchors. Crossed suture repair constructs have higher failure loads and stiffness compared with simple parallel constructs. The newer repair device would exhibit similar performance to an established device. Controlled laboratory study. Sutures were placed in an open fashion into the body and posterior horn regions of the medial and lateral menisci in 16 cadaveric knees. Evaluation of 2 repair devices and 2 repair constructs created 4 groups: 2 parallel vertical sutures created with the Fast-Fix 360 (2PFF), 2 crossed vertical sutures created with the Fast-Fix 360 (2XFF), 2 parallel vertical sutures created with the SpeedCinch (2PSC), and 2 crossed vertical sutures created with the SpeedCinch (2XSC). After open placement of the repair construct, each meniscus was explanted and tested to failure on a uniaxial material testing machine. All data were checked for normality of distribution, and 1-way analysis of variance by ranks was chosen to evaluate for statistical significance of maximum failure load and stiffness between groups. Statistical significance was defined as P < .05. The mean maximum failure loads ± 95% CI (range) were 89.6 ± 16.3 N (125.7-47.8 N) (2PFF), 72.1 ± 11.7 N (103.4-47.6 N) (2XFF), 71.9 ± 15.5 N (109.4-41.3 N) (2PSC), and 79.5 ± 25.4 N (119.1-30.9 N) (2XSC). Interconstruct comparison revealed no statistical difference between all 4 constructs regarding maximum failure loads (P = .49). Stiffness values were also similar, with no statistical difference on comparison (P = .28). Both devices in the current study had similar failure load and stiffness when 2 vertical or 2 crossed sutures were tested in cadaveric human menisci. Simple parallel vertical sutures perform similarly to crossed suture patterns at the time of implantation.

  4. Prediction of Muscle Performance During Dynamic Repetitive Exercise

    NASA Technical Reports Server (NTRS)

    Byerly, D. L.; Byerly, K. A.; Sognier, M. A.; Squires, W. G.

    2002-01-01

    A method for predicting human muscle performance was developed. Eight test subjects performed a repetitive dynamic exercise to failure using a Lordex spinal machine. Electromyography (EMG) data was collected from the erector spinae. Evaluation of the EMG data using a 5th order Autoregressive (AR) model and statistical regression analysis revealed that an AR parameter, the mean average magnitude of AR poles, can predict performance to failure as early as the second repetition of the exercise. Potential applications to the space program include evaluating on-orbit countermeasure effectiveness, maximizing post-flight recovery, and future real-time monitoring capability during Extravehicular Activity.
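
    The AR-pole feature described above can be illustrated with a Yule-Walker AR(5) fit and the mean magnitude of the resulting poles. The Python sketch below runs on a synthetic signal, not EMG data, and is only a rough stand-in for the study's processing chain.

        # Minimal sketch: fit an AR(5) model via the Yule-Walker equations and report
        # the mean magnitude of the AR poles. The signal here is synthetic, not EMG.
        import numpy as np
        from scipy.linalg import toeplitz, solve

        rng = np.random.default_rng(4)
        n = 4096
        x = rng.standard_normal(n)
        for k in range(2, n):                       # build a synthetic AR(2)-like signal
            x[k] += 1.2 * x[k - 1] - 0.6 * x[k - 2]
        x -= x.mean()

        order = 5
        r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])  # autocovariances
        a = solve(toeplitz(r[:order]), r[1 : order + 1])                         # AR coefficients

        # Poles are the roots of z^p - a1*z^(p-1) - ... - ap.
        poles = np.roots(np.concatenate(([1.0], -a)))
        print(f"mean |pole| = {np.mean(np.abs(poles)):.3f}")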

  5. Expression of FOXP3, CD68, and CD20 at Diagnosis in the Microenvironment of Classical Hodgkin Lymphoma Is Predictive of Outcome

    PubMed Central

    Greaves, Paul; Clear, Andrew; Coutinho, Rita; Wilson, Andrew; Matthews, Janet; Owen, Andrew; Shanyinde, Milensu; Lister, T. Andrew; Calaminici, Maria; Gribben, John G.

    2013-01-01

    Purpose The immune microenvironment is key to the pathophysiology of classical Hodgkin lymphoma (CHL). Twenty percent of patients experience failure of their initial treatment, and others receive excessively toxic treatment. Prognostic scores and biomarkers have yet to influence outcomes significantly. Previous biomarker studies have been limited by the extent of tissue analyzed, statistical inconsistencies, and failure to validate findings. We aimed to overcome these limitations by validating recently identified microenvironment biomarkers (CD68, FOXP3, and CD20) in a new patient cohort with a greater extent of tissue and by using rigorous statistical methodology. Patients and Methods Diagnostic tissue from 122 patients with CHL was microarrayed and stained, and positive cells were counted across 10 to 20 high-powered fields per patient by using an automated system. Two statistical analyses were performed: a categorical analysis with test/validation set-defined cut points and Kaplan-Meier estimated outcome measures of 5-year overall survival (OS), disease-specific survival (DSS), and freedom from first-line treatment failure (FFTF) and an independent multivariate analysis of absolute uncategorized counts. Results Increased CD20 expression confers superior OS. Increased FOXP3 expression confers superior OS, and increased CD68 confers inferior FFTF and OS. FOXP3 varies independently of CD68 expression and retains significance when analyzed as a continuous variable in multivariate analysis. A simple score combining FOXP3 and CD68 discriminates three groups: FFTF 93%, 62%, and 47% (P < .001), DSS 93%, 82%, and 63% (P = .03), and OS 93%, 82%, and 59% (P = .002). Conclusion We have independently validated CD68, FOXP3, and CD20 as prognostic biomarkers in CHL, and we demonstrate, to the best of our knowledge for the first time, that combining FOXP3 and CD68 may further improve prognostic stratification. PMID:23045593

  6. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool in determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zauls, A. Jason, E-mail: zauls@musc.edu; Watkins, John M.; Wahlquist, Amy E.

    Purpose: The American Society for Radiation Oncology published a Consensus Statement for accelerated partial breast irradiation identifying three groups: Suitable, Cautionary, and Unsuitable. The objective of this study was to compare oncologic outcomes in women treated with MammoSite brachytherapy (MB) vs. whole breast irradiation (WBI) after stratification into Statement groups. Methods: Eligible women had invasive carcinoma or ductal carcinoma in situ (DCIS) ≤3 cm, and ≤3 lymph nodes positive. Women were stratified by radiation modality and Statement groups. Survival analysis methods including Kaplan-Meier estimation, Cox regression, and competing risks analysis were used to assess overall survival (OS), disease-free survival (DFS), time to local failure (TTLF), and tumor bed failure (TBF). Results: A total of 459 (183 MB and 276 WBI) patients were treated from 2002 to 2009. After a median follow-up of 45 months, we found no statistical differences by stratification group or radiation modality with regard to OS and DFS. At 4 years TTLF or TBF were not statistically different between the cohorts. Univariate analysis in the MB cohort revealed that nodal positivity (pN1 vs. pN0) was related to TTLF (hazard ratio 6.39, p = 0.02). There was a suggestion that DCIS histology had an increased risk of failure when compared with invasive ductal carcinoma (hazard ratio 3.57, p = 0.06). Conclusions: MB and WBI patients stratified by Statement groups seem to combine women who will have similar outcomes regardless of radiation modality. Although outcomes were similar, we remain guarded in overinterpretation of these preliminary results until further analysis and long-term follow-up data become available. Caution should be used in treating women with DCIS or pN1 disease with MB.

  8. Risk of renal failure with the non-vitamin K antagonist oral anticoagulants: systematic review and meta-analysis.

    PubMed

    Caldeira, Daniel; Gonçalves, Nilza; Pinto, Fausto J; Costa, João; Ferreira, Joaquim J

    2015-07-01

    Vitamin K antagonist (VKA)-related nephropathy is a novel entity characterized by acute kidney injury related to supratherapeutic International Normalized Ratio levels. Non-vitamin K antagonist oral anticoagulants (NOACs) have a predictable dose-response relationship and an improved safety profile. We hypothesized that these drugs do not carry an increased risk of incident renal failure, an outcome that would be detrimental to the use of NOACs. Systematic review and meta-analysis of phase III randomized controlled trials (RCTs). Trials were searched through Medline, the Cochrane Library and public assessment reports in August 2014. The primary outcome was renal failure. NOACs were evaluated against any comparator. Random-effects meta-analysis was performed by default, and pooled estimates were expressed as risk ratio (RR) and 95% CI. Heterogeneity was evaluated with the I² test. Ten RCTs fulfilled the inclusion criteria (one apixaban RCT, three dabigatran RCTs, and six rivaroxaban RCTs), enrolling 75 100 patients. Overall, NOACs did not increase the risk of renal failure (RR 0.96, 95% CI 0.88-1.05) compared with VKA or low-molecular-weight heparin (LMWH), without significant statistical heterogeneity (I² = 3.5%). Compared with VKA, NOACs did not increase the risk of renal failure (RR 0.96, 95% CI 0.87-1.07; I² = 17.8%; six RCTs). Rivaroxaban did not show differences in the incidence of renal failure compared with LMWH (RR 1.20, 95% CI 0.37-3.94; four trials), but there was an increased risk of creatinine elevation (RR 1.25, 95% CI 1.08-1.45; I² = 0%). NOACs had a similar risk of renal failure compared with VKA/LMWH in phase III RCTs. Post-marketing surveillance is warranted. Copyright © 2015 John Wiley & Sons, Ltd.

  9. Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.

    2012-12-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
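
    A minimal version of the classification step described above, assuming a simulated parameter ensemble in place of the POP2 runs, is sketched below in Python with an RBF-kernel support vector machine.

        # Minimal sketch: an RBF-kernel SVM that predicts simulation failure from
        # model-parameter values. The 18-parameter ensemble here is simulated, not POP2 data.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(5)
        n_runs, n_params = 2000, 18
        X = rng.uniform(0.0, 1.0, (n_runs, n_params))
        # Toy failure rule: crashes cluster where two mixing-like parameters are both low.
        y = ((X[:, 0] < 0.15) & (X[:, 1] < 0.30)).astype(int)

        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, probability=True))
        clf.fit(X_tr, y_tr)

        p_fail = clf.predict_proba(X_val)[:, 1]          # predicted failure probability per run
        accuracy = clf.score(X_val, y_val)
        print(f"validation accuracy = {accuracy:.3f}, mean P(fail) = {p_fail.mean():.3f}")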

  10. [Analysis of the failures of a cemented constrained liner model in patients with a high dislocation risk].

    PubMed

    Gallart, X; Gomez, J C; Fernández-Valencia, J A; Combalía, A; Bori, G; García, S; Rios, J; Riba, J

    2014-01-01

    To evaluate the short-term results of an ultra-high molecular weight polyethylene retentive cup in patients at high risk of dislocation, in either primary or revision surgery. A retrospective review of 38 cases was performed to determine the survival rate and analyze the failures of a constrained cemented cup, with a mean follow-up of 27 months. We studied demographic data and complications, especially re-dislocations of the prosthesis, and also analyzed the likely causes of system failure. Primary surgery accounted for 21.05% of cases (8 cases) and revision surgery for 78.95% (30 cases). The overall survival estimated by the Kaplan-Meier method was 70.7 months. During follow-up, 3 patients died of causes unrelated to surgery and 2 infections occurred. Twelve hips had undergone at least two previous surgeries. There was no case of aseptic loosening. Four patients presented dislocation, all with a 22 mm head (P=.008). Our statistical analysis did not find a relationship between cup abduction angle and implant failure (P=.22). The ultra-high molecular weight polyethylene retentive cup evaluated in this series has provided satisfactory short-term results in hip arthroplasty patients at high risk of dislocation. Copyright © 2014 SECOT. Published by Elsevier Espana. All rights reserved.

  11. Advanced Gear Alloys for Ultra High Strength Applications

    NASA Technical Reports Server (NTRS)

    Shen, Tony; Krantz, Timothy; Sebastian, Jason

    2011-01-01

    Single tooth bending fatigue (STBF) test data of UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data of conventional gear steels (9310 and Pyrowear 53) with comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing's statistical analysis of STBF test data for the four gear steels (C61, C64, 9310 and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly as compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based, micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter, and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the case of the higher hardness C64 material.

  12. Sequential experimental design based generalised ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-07-01

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  13. Sequential experimental design based generalised ANOVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  14. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to the Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture together with chipping was considered as failure, the fracture load of IPS e.max Ceram for total fracture (classical distribution: μ=1054, σ=110) did not differ significantly from those of the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than those of the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
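
    Below is a minimal sketch of the two summaries quoted above for one group of fracture loads: the classical normal summary (μ, σ) and a two-parameter Weibull fit giving the characteristic strength s and modulus m. The load vector is a synthetic placeholder, not the study data; SciPy's weibull_min is assumed.

    ```python
    # Sketch: classical (mu, sigma) vs. two-parameter Weibull (s, m) summary of fracture loads.
    # `loads` is a hypothetical vector of fracture loads in N (the study used n=30 per group).
    from scipy import stats

    loads = stats.weibull_min.rvs(c=7.2, scale=1043, size=30, random_state=1)  # placeholder data

    m, loc, s = stats.weibull_min.fit(loads, floc=0)   # m = Weibull modulus, s = characteristic strength
    mu, sigma = loads.mean(), loads.std(ddof=1)        # classical normal-distribution summary
    print(f"Weibull: s={s:.0f} N, m={m:.1f} | classical: mu={mu:.0f} N, sigma={sigma:.0f} N")
    ```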

  15. Application of a truncated normal failure distribution in reliability testing

    NASA Technical Reports Server (NTRS)

    Groves, C., Jr.

    1968-01-01

    Statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimations. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
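
    A minimal sketch of a truncated normal time-to-failure model in the spirit of the record above is given below. The mean life, spread, and truncation at t = 0 are illustrative assumptions; SciPy's truncnorm is assumed.

    ```python
    # Sketch: truncated normal time-to-failure distribution, reliability R(t), and hazard rate.
    # Mean life and spread (in hours) and the truncation at t = 0 are illustrative assumptions.
    import numpy as np
    from scipy import stats

    mu, sigma = 1000.0, 300.0                  # hypothetical mean life and spread (hours)
    a, b = (0 - mu) / sigma, np.inf            # truncate at t = 0 (no negative failure times)
    ttf = stats.truncnorm(a, b, loc=mu, scale=sigma)

    for t in (500.0, 1000.0, 1500.0):
        reliability = ttf.sf(t)                # R(t): probability of surviving beyond t
        hazard = ttf.pdf(t) / ttf.sf(t)        # age-dependent hazard rate
        print(f"t={t:.0f} h: R(t)={reliability:.3f}, hazard={hazard:.5f} per hour")
    ```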

  16. Online Graduate Teacher Education: Establishing an EKG for Student Success Intervention

    ERIC Educational Resources Information Center

    Shelton, Brett E.; Hung, Jui-Long; Baughman, Sarah

    2016-01-01

    Predicting which students enrolled in graduate online education are at-risk for failure is an arduous yet important task for teachers and administrators alike. This research reports on a statistical analysis technique using both static and dynamic variables to determine which students are at-risk and when an intervention could be most helpful…

  17. The effects of simulated bone loss on the implant-abutment assembly and likelihood of fracture: an in vitro study.

    PubMed

    Manzoor, Behzad; Suleiman, Mahmood; Palmer, Richard M

    2013-01-01

    The crestal bone level around a dental implant may influence its strength characteristics by offering protection against mechanical failures. Therefore, the present study investigated the effect of simulated bone loss on modes, loads, and cycles to failure in an in vitro model. Different amounts of bone loss were simulated: 0, 1.5, 3.0, and 4.5 mm from the implant head. Forty narrow-diameter (3.0-mm) implant-abutment assemblies were tested using compressive bending and cyclic fatigue testing. Weibull and accelerated life testing analyses were used to assess reliability and functional life. Statistical analyses were performed using the Fisher exact test and the Spearman rank correlation. Compressive bending tests showed that the level of bone loss influenced the load-bearing capacity of implant-abutment assemblies. Fatigue testing showed that the modes, loads, and cycles to failure had a statistically significant relationship with the level of bone loss. All 16 samples with bone loss of 3.0 mm or more experienced horizontal implant body fractures. In contrast, 14 of 16 samples with 0 and 1.5 mm of bone loss showed abutment and screw fractures. Weibull and accelerated life testing analyses indicated a two-group distribution: the 0- and 1.5-mm bone loss samples had better functional life and reliability than the 3.0- and 4.5-mm samples. Progressive bone loss had a significant effect on modes, loads, and cycles to failure. In addition, bone loss influenced the functional life and reliability of the implant-abutment assemblies. Maintaining crestal bone levels is important in ensuring biomechanical sustainability and predictable long-term function of dental implant assemblies.

  18. A Review of Statistical Failure Time Models with Application of a Discrete Hazard Based Model to 1Cr1Mo-0.25V Steel for Turbine Rotors and Shafts

    PubMed Central

    2017-01-01

    Producing predictions of the probabilistic risks of operating materials for given lengths of time at stated operating conditions requires the assimilation of existing deterministic creep life prediction models (that only predict the average failure time) with statistical models that capture the random component of creep. To date, these approaches have rarely been combined to achieve this objective. The first half of this paper therefore provides a summary review of some statistical models to help bridge the gap between these two approaches. The second half of the paper illustrates one possible assimilation using 1Cr1Mo-0.25V steel. The Wilshire equation for creep life prediction is integrated into a discrete hazard based statistical model; the former was chosen because of its novelty and proven capability in accurately predicting average failure times, and the latter because of its flexibility in modelling the failure time distribution. Using this model it was found that, for example, if this material had been in operation for around 15 years at 823 K and 130 MPa, the chance of failure in the next year is around 35%. However, if this material had been in operation for around 25 years, the chance of failure in the next year rises dramatically to around 80%. PMID:29039773
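
    A minimal sketch of the discrete hazard quantity quoted above, the probability of failing in the next year given survival to time t, is shown below. A generic Weibull lifetime with placeholder parameters stands in for the paper's Wilshire-based creep life model.

    ```python
    # Sketch: discrete (yearly) hazard P(failure in [t, t+1) | survival to t).
    # A generic Weibull lifetime with placeholder shape/scale stands in for the creep model.
    from scipy import stats

    life = stats.weibull_min(c=4.0, scale=25.0)   # placeholder lifetime distribution, time in years

    def discrete_hazard(t, dt=1.0):
        """Probability of failing in [t, t+dt) given survival to t."""
        return (life.sf(t) - life.sf(t + dt)) / life.sf(t)

    for t in (15, 25):
        print(f"t = {t} y: chance of failure in the next year = {discrete_hazard(t):.2f}")
    ```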

  19. The geomechanical strength of carbonate rock in Kinta valley, Ipoh, Perak Malaysia

    NASA Astrophysics Data System (ADS)

    Mazlan, Nur Amanina; Lai, Goh Thian; Razib, Ainul Mardhiyah Mohd; Rafek, Abdul Ghani; Serasa, Ailie Sofyiana; Simon, Norbert; Surip, Noraini; Ern, Lee Khai; Mohamed, Tuan Rusli

    2018-04-01

    The stability of both rock cuts and underground openings is influenced by the geomechanical strength of the rock materials, while the strength characteristics are influenced by both material characteristics and the condition of weathering. This paper presents a systematic approach to quantifying the rock material strength characteristics for material failure and for material & discontinuities failure, using the uniaxial compressive strength, point load strength index, and Brazilian tensile strength of carbonate rocks. Statistical analysis of the results at the 95 percent confidence level showed that the mean uniaxial compressive strengths for material failure and for material & discontinuities failure were 76.8 ± 4.5 MPa and 41.2 ± 4.1 MPa, with standard deviations of 15.2 and 6.5 MPa, respectively. The point load strength indices for material failure and material & discontinuities failure were 3.1 ± 0.2 MPa and 1.8 ± 0.3 MPa, with standard deviations of 0.9 and 0.6 MPa, respectively. The Brazilian tensile strengths for material failure and material & discontinuities failure were 7.1 ± 0.3 MPa and 4.1 ± 0.3 MPa, with standard deviations of 1.4 and 0.6 MPa, respectively. The results of this research reveal that the geomechanical strength of carbonate rock material for material & discontinuities failure deteriorates to approximately half of that for material failure.
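
    The "mean ± 95% confidence interval" summaries quoted above can be reproduced with a standard t-interval, as in the minimal sketch below. The strength sample is a hypothetical placeholder, not the laboratory data.

    ```python
    # Sketch: mean, standard deviation, and 95% t-interval half-width for a strength sample.
    # `ucs` is a hypothetical sample of uniaxial compressive strengths (MPa).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    ucs = rng.normal(76.8, 15.2, size=45)                       # placeholder strength measurements

    mean, sd, n = ucs.mean(), ucs.std(ddof=1), len(ucs)
    half_width = stats.t.ppf(0.975, n - 1) * sd / np.sqrt(n)    # 95% confidence half-width
    print(f"mean = {mean:.1f} ± {half_width:.1f} MPa, standard deviation = {sd:.1f} MPa")
    ```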

  20. Efficient selective screening for heart failure in elderly men and women from the community: A diagnostic individual participant data meta-analysis

    PubMed Central

    Kievit, Rogier F; Hoes, Arno W; Bots, Michiel L; van Riet, Evelien ES; van Mourik, Yvonne; Bertens, Loes CM; Boonman-de Winter, Leandra JM; den Ruijter, Hester M; Rutten, Frans H

    2018-01-01

    Background Prevalence of undetected heart failure in older individuals is high in the community, with patients being at increased risk of morbidity and mortality due to the chronic and progressive nature of this complex syndrome. An essential, yet currently unavailable, strategy to pre-select candidates eligible for echocardiography to confirm or exclude heart failure would identify patients earlier, enable targeted interventions and prevent disease progression. The aim of this study was therefore to develop and validate such a model that can be implemented clinically. Methods and results Individual patient data from four primary care screening studies were analysed. From 1941 participants >60 years old, 462 were diagnosed with heart failure, according to criteria of the European Society of Cardiology heart failure guidelines. Prediction models were developed in each cohort followed by cross-validation, omitting each of the four cohorts in turn. The model consisted of five independent predictors; age, history of ischaemic heart disease, exercise-related shortness of breath, body mass index and a laterally displaced/broadened apex beat, with no significant interaction with sex. The c-statistic ranged from 0.70 (95% confidence interval (CI) 0.64–0.76) to 0.82 (95% CI 0.78–0.87) at cross-validation and the calibration was reasonable with Observed/Expected ratios ranging from 0.86 to 1.15. The clinical model improved with the addition of N-terminal pro B-type natriuretic peptide with the c-statistic increasing from 0.76 (95% CI 0.70–0.81) to 0.89 (95% CI 0.86–0.92) at cross-validation. Conclusion Easily obtainable patient characteristics can select older men and women from the community who are candidates for echocardiography to confirm or refute heart failure. PMID:29327942
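
    A minimal sketch of the two validation metrics reported above, the c-statistic (equivalent to the ROC AUC for a binary outcome) and an overall observed/expected calibration ratio, follows. The predicted probabilities and outcome labels are synthetic placeholders, not the screening cohorts; scikit-learn is assumed.

    ```python
    # Sketch: c-statistic (ROC AUC) and observed/expected (O/E) calibration ratio for a
    # binary heart-failure prediction model. Predictions and labels are synthetic placeholders.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    y_true = rng.binomial(1, 0.24, size=1941)                  # 1 = heart failure confirmed
    y_prob = np.clip(0.24 + 0.3 * (y_true - 0.24) + rng.normal(0, 0.15, size=1941), 0.01, 0.99)

    c_statistic = roc_auc_score(y_true, y_prob)                # discrimination
    oe_ratio = y_true.mean() / y_prob.mean()                   # overall calibration (O/E)
    print(f"c-statistic = {c_statistic:.2f}, O/E ratio = {oe_ratio:.2f}")
    ```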

  1. Efficient selective screening for heart failure in elderly men and women from the community: A diagnostic individual participant data meta-analysis.

    PubMed

    Kievit, Rogier F; Gohar, Aisha; Hoes, Arno W; Bots, Michiel L; van Riet, Evelien Es; van Mourik, Yvonne; Bertens, Loes Cm; Boonman-de Winter, Leandra Jm; den Ruijter, Hester M; Rutten, Frans H

    2018-03-01

    Background Prevalence of undetected heart failure in older individuals is high in the community, with patients being at increased risk of morbidity and mortality due to the chronic and progressive nature of this complex syndrome. An essential, yet currently unavailable, strategy to pre-select candidates eligible for echocardiography to confirm or exclude heart failure would identify patients earlier, enable targeted interventions and prevent disease progression. The aim of this study was therefore to develop and validate such a model that can be implemented clinically. Methods and results Individual patient data from four primary care screening studies were analysed. From 1941 participants >60 years old, 462 were diagnosed with heart failure, according to criteria of the European Society of Cardiology heart failure guidelines. Prediction models were developed in each cohort followed by cross-validation, omitting each of the four cohorts in turn. The model consisted of five independent predictors; age, history of ischaemic heart disease, exercise-related shortness of breath, body mass index and a laterally displaced/broadened apex beat, with no significant interaction with sex. The c-statistic ranged from 0.70 (95% confidence interval (CI) 0.64-0.76) to 0.82 (95% CI 0.78-0.87) at cross-validation and the calibration was reasonable with Observed/Expected ratios ranging from 0.86 to 1.15. The clinical model improved with the addition of N-terminal pro B-type natriuretic peptide with the c-statistic increasing from 0.76 (95% CI 0.70-0.81) to 0.89 (95% CI 0.86-0.92) at cross-validation. Conclusion Easily obtainable patient characteristics can select older men and women from the community who are candidates for echocardiography to confirm or refute heart failure.

  2. Ultimate compression after impact load prediction in graphite/epoxy coupons using neural network and multivariate statistical analyses

    NASA Astrophysics Data System (ADS)

    Gregoire, Alexandre David

    2011-07-01

    The goal of this research was to accurately predict the ultimate compressive load of impact-damaged graphite/epoxy coupons using a Kohonen self-organizing map (SOM) neural network and multivariate statistical regression analysis (MSRA). An optimized use of these data treatment tools allowed the generation of a simple, physically understandable equation that predicts the ultimate failure load of an impact-damaged coupon based uniquely on the acoustic emissions it emits at low proof loads. Acoustic emission (AE) data were collected using two 150 kHz resonant transducers which detected and recorded the AE activity given off during compression to failure of thirty-four impacted 24-ply bidirectional woven cloth laminate graphite/epoxy coupons. The AE quantification parameters duration, energy and amplitude for each AE hit were input to the Kohonen self-organizing map (SOM) neural network to accurately classify the material failure mechanisms present in the low proof load data. The numbers of failure mechanisms from the first 30% of the loading for twenty-four coupons were used to generate a linear prediction equation which yielded a worst case ultimate load prediction error of 16.17%, just outside of the +/-15% B-basis allowables, which was the goal for this research. Particular emphasis was placed upon the noise removal process which was largely responsible for the accuracy of the results.

  3. Tensile bond strength of filled and unfilled adhesives to dentin.

    PubMed

    Braga, R R; Cesar, P F; Gonzaga, C C

    2000-04-01

    To determine the tensile bond strength of three filled and two unfilled adhesives applied to bovine dentin. Fragments of the labial dentin of bovine incisors were embedded in PVC cylinders with self-cure acrylic resin, and ground flat using 200 grit and 600 grit sandpaper. The following adhesive systems were tested (n=10): Prime & Bond NT, Prime & Bond NT dual cure, Prime & Bond 2.1, OptiBond Solo and Single Bond. A 3 mm-diameter bonding surface was delimited using a perforated adhesive tape. After etching with 37% phosphoric acid and adhesive application, a resin-based composite truncated cone (TPH, shade A3) was built. The tensile test was performed after 24 hrs of storage in distilled water at 37 degrees C. Failure mode was assessed using a 10× magnification stereomicroscope. Weibull statistical analysis revealed significant differences in the characteristic strength between Single Bond and Prime & Bond NT dual cure, and between Single Bond and Prime & Bond 2.1. The Weibull parameter (m) was statistically similar among the five groups. Single Bond and Prime & Bond NT showed areas of dentin cohesive failure in most of the specimens. For OptiBond Solo, Prime & Bond NT dual cure, and Prime & Bond 2.1, failure was predominantly adhesive.

  4. The Range Safety Debris Catalog Analysis in Preparation for the Pad Abort One Flight Test

    NASA Technical Reports Server (NTRS)

    Kutty, Prasad M.; Pratt, William D.

    2010-01-01

    The Pad Abort One flight test of the Orion Abort Flight Test Program is currently under development with the goal of demonstrating the capability of the Launch Abort System. In the event of a launch failure, this system will propel the Crew Exploration Vehicle to safety. An essential component of this flight test is range safety, which ensures the security of range assets and personnel. A debris catalog analysis was done as part of a range safety data package delivered to the White Sands Missile Range in New Mexico where the test will be conducted. The analysis discusses the consequences of an overpressurization of the Abort Motor. The resulting structural failure was assumed to create a debris field of vehicle fragments that could potentially pose a hazard to the range. A statistical model was used to assemble the debris catalog of potential propellant fragments. Then, a thermodynamic, energy balance model was applied to the system in order to determine the imparted velocity to these propellant fragments. This analysis was conducted at four points along the flight trajectory to better understand the failure consequences over the entire flight. The methods used to perform this analysis are outlined in detail and the corresponding results are presented and discussed.

  5. Methods for improved forewarning of critical events across multiple data channels

    DOEpatents

    Hively, Lee M [Philadelphia, TN]

    2007-04-24

    This disclosed invention concerns improvements in forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because the analysis of poor quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C_i) to provide a forewarning of failure and an indicator of failure onset.
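
    Below is a minimal sketch of the general idea of a time-delay phase-space reconstruction and a simple dissimilarity measure between a baseline window and a test window. It is not the patented algorithm; the delay, embedding dimension, bin count, and test signals are illustrative assumptions.

    ```python
    # Sketch: time-delay embedding plus a simple histogram-based dissimilarity between two
    # data windows. Parameters and signals are illustrative, not the patented method.
    import numpy as np

    def embed(x, dim=3, delay=5):
        """Return delay vectors [x(t), x(t+delay), ..., x(t+(dim-1)*delay)]."""
        n = len(x) - (dim - 1) * delay
        return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

    def dissimilarity(x_base, x_test, dim=3, delay=5, bins=8):
        """Chi-square-like distance between occupation histograms of the two phase spaces."""
        edges = [np.linspace(x_base.min(), x_base.max(), bins + 1)] * dim
        p, _ = np.histogramdd(embed(x_base, dim, delay), bins=edges, density=True)
        q, _ = np.histogramdd(embed(x_test, dim, delay), bins=edges, density=True)
        return np.sum((p - q) ** 2 / (p + q + 1e-12))

    t = np.linspace(0, 100, 5000)
    baseline = np.sin(t) + 0.05 * np.random.default_rng(4).normal(size=t.size)
    drifting = np.sin(1.2 * t) + 0.05 * np.random.default_rng(5).normal(size=t.size)
    print("dissimilarity vs. itself:", dissimilarity(baseline, baseline))
    print("dissimilarity vs. drifted signal:", dissimilarity(baseline, drifting))
    ```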

  6. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used, for example, to analyze the failure times of crashed aircraft. Another survival analysis tool is competing risks analysis, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing model inference using root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and the Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
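
    A minimal sketch of the cause-specific (Kalbfleisch-Prentice style) approach described above follows: one Cox model is fitted per failure cause, with all other causes treated as censored. The data frame is a synthetic placeholder and the lifelines package is assumed.

    ```python
    # Sketch: cause-specific Cox models, fitting each failure cause separately and treating
    # the other causes as censored. Durations, causes, and covariates are synthetic.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(6)
    n = 400
    df = pd.DataFrame({
        "time": rng.exponential(10, n),
        "cause": rng.choice([0, 1, 2], n, p=[0.3, 0.4, 0.3]),  # 0 = censored, 1/2 = failure causes
        "x1": rng.normal(size=n),
        "x2": rng.binomial(1, 0.5, n),
    })

    for cause in (1, 2):
        d = df.assign(event=(df["cause"] == cause).astype(int))[["time", "event", "x1", "x2"]]
        cph = CoxPHFitter().fit(d, duration_col="time", event_col="event")
        print(f"cause {cause} hazard ratios:\n", np.exp(cph.params_), "\n")
    ```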

  7. Does Bruxism Contribute to Dental Implant Failure? A Systematic Review and Meta-Analysis.

    PubMed

    Zhou, Yi; Gao, Jinxia; Luo, Le; Wang, Yining

    2016-04-01

    Bruxism is usually considered a contraindication for oral implant treatment. The causal relationship between bruxism and dental implant failure has remained controversial in the existing literature. This meta-analysis was performed to investigate the relationship between them. This review conducted an electronic systematic literature search in MEDLINE (PubMed) and EmBase in November 2013 without time and language restrictions. Meanwhile, a hand search of all the relevant references of the included studies was also conducted. Study information extraction and methodological quality assessments were accomplished by two reviewers independently. A discussion ensued if any disagreement occurred, and unresolved issues were solved by consulting a third reviewer. Methodological quality was assessed by using the Newcastle-Ottawa Scale tool. The odds ratio (OR) with 95% confidence interval (CI) was pooled to estimate the relative effect of bruxism on dental implant failures. A fixed effects model was used initially; if the heterogeneity was high, a random effects model was chosen for the meta-analysis. Statistical analyses were carried out by using Review Manager 5.1. In this meta-analysis review, extracted data were classified into two groups based on different units. Units were based on the number of prostheses (group A) and the number of patients (group B). In group A, the total pooled OR of bruxers versus nonbruxers for all subgroups was 4.72 (95% CI: 2.66-8.36, p = .07). In group B, the total pooled OR of bruxers versus nonbruxers for all subgroups was 3.83 (95% CI: 2.12-6.94, p = .22). This meta-analysis was performed to evaluate the relationship between bruxism and dental implant failure. In contrast to nonbruxers, prostheses in bruxers had a higher failure rate. This suggests that bruxism is a contributing factor to the occurrence of dental implant technical/biological complications and plays a role in dental implant failure. © 2015 Wiley Periodicals, Inc.
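
    The fixed effects pooling used in such reviews is typically inverse-variance weighting of log odds ratios, as in the minimal sketch below. The per-study ORs and confidence intervals here are invented placeholders, not the studies included in the review above.

    ```python
    # Sketch: inverse-variance (fixed effect) pooling of odds ratios on the log scale.
    # The per-study (OR, lower, upper) values are invented placeholders.
    import numpy as np

    or_ci = [(3.2, 1.4, 7.3), (5.1, 2.0, 13.0), (2.4, 0.9, 6.4)]

    log_or = np.log([o for o, _, _ in or_ci])
    se = (np.log([u for _, _, u in or_ci]) - np.log([l for _, l, _ in or_ci])) / (2 * 1.96)
    w = 1 / se**2                                   # inverse-variance weights

    pooled = np.sum(w * log_or) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    print(f"pooled OR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96*pooled_se):.2f}-{np.exp(pooled + 1.96*pooled_se):.2f})")
    ```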

  8. Emergent Irreversibility and Entanglement Spectrum Statistics

    NASA Astrophysics Data System (ADS)

    Chamon, Claudio; Hamma, Alioscia; Mucciolo, Eduardo R.

    2014-06-01

    We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than a Hamiltonian one, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wave-function level and offers an alternative route to study quantum chaos and quantum integrability.

  9. Of pacemakers and statistics: the actuarial method extended.

    PubMed

    Dussel, J; Wolbarst, A B; Scott-Millar, R N; Obel, I W

    1980-01-01

    Pacemakers cease functioning because of either natural battery exhaustion (nbe) or component failure (cf). A study of four series of pacemakers shows that a simple extension of the actuarial method, so as to incorporate Normal statistics, makes possible a quantitative differentiation between the two modes of failure. This involves the separation of the overall failure probability density function PDF(t) into constituent parts pdf_nbe(t) and pdf_cf(t). The approach should allow a meaningful comparison of the characteristics of different pacemaker types.

  10. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
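
    A minimal sketch of the p-chart analysis described above is given below, using the usual three-sigma control limits around the overall proportion. The monthly adverse-event counts and case counts are placeholders, not the study data.

    ```python
    # Sketch: p-chart for a monthly adverse-event proportion with 3-sigma control limits.
    # Monthly counts are placeholders; limits are p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n).
    import numpy as np

    events = np.array([190, 210, 175, 230, 205, 260, 185, 200])          # adverse events per month
    n_cases = np.array([1050, 1100, 980, 1150, 1080, 1120, 1010, 1090])  # anesthetics per month

    p = events / n_cases
    p_bar = events.sum() / n_cases.sum()
    sigma = np.sqrt(p_bar * (1 - p_bar) / n_cases)
    ucl, lcl = p_bar + 3 * sigma, np.clip(p_bar - 3 * sigma, 0, None)

    for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
        flag = "OUT OF CONTROL" if (pi > hi or pi < lo) else "in control"
        print(f"month {month}: p={pi:.3f}  limits=({lo:.3f}, {hi:.3f})  {flag}")
    ```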

  11. Comminuted olecranon fracture fixation with pre-contoured plate: Comparison of composite and cadaver bones

    PubMed Central

    Hamilton Jr, David A; Reilly, Danielle; Wipf, Felix; Kamineni, Srinath

    2015-01-01

    AIM: To determine whether use of a precontoured olecranon plate provides adequate fixation to withstand supraphysiologic force in a comminuted olecranon fracture model. METHODS: Five samples of fourth generation composite bones and five samples of fresh frozen human cadaveric left ulnae were utilized for this study. The cadaveric specimens underwent dual-energy X-ray absorptiometry (DEXA) scanning to quantify the bone quality. The composite and cadaveric bones were prepared by creating a comminuted olecranon fracture and fixed with a pre-contoured olecranon plate with locking screws. Construct stiffness and failure load were measured by subjecting specimens to cantilever bending moments until failure. Fracture site motion was measured with a differential variable resistance transducer spanning the fracture. Statistical analysis was performed with a two-tailed Mann-Whitney U test with the Monte Carlo exact test. RESULTS: There was a significant difference in fixation stiffness and strength between the composite bones and human cadaver bones. Failure modes differed in cadaveric and composite specimens. The loads to failure for the composite bone (n = 5) and human cadaver bone (n = 5) specimens were 10.67 N·m (range 9.40-11.91 N·m) and 13.05 N·m (range 12.59-15.38 N·m), respectively. This difference was statistically significant (P < 0.007, 97% power). Median stiffnesses for the composite bone and human cadaver bone specimens were 5.69 N·m/mm (range 4.69-6.80 N·m/mm) and 7.55 N·m/mm (range 6.31-7.72 N·m/mm). There was a significant difference for stiffness (P < 0.033, 79% power) between composite bones and cadaveric bones. No correlation was found between the DEXA results and stiffness. All cadaveric specimens withstood the physiologic load anticipated postoperatively. Catastrophic failure occurred in all composite specimens. All failures resulted from composite bone failure at the distal screw site and not hardware failure. There were no catastrophic fracture failures in the cadaveric specimens. Failure of 4/5 cadaveric specimens was defined when a fracture gap of 2 mm was observed, but 1/5 cadaveric specimens failed due to a failure of the triceps mechanism. All failures occurred at forces greater than those expected in the postoperative period prior to healing. CONCLUSION: The pre-contoured olecranon plate provides adequate fixation to withstand physiologic force in a composite bone and cadaveric comminuted olecranon fracture model. PMID:26495247

  12. Prognostic factors in pediatric sepsis study, from the Spanish Society of Pediatric Intensive Care.

    PubMed

    Vila Pérez, David; Jordan, Iolanda; Esteban, Elisabeth; García-Soler, Patricia; Murga, Vega; Bonil, Vanesa; Ortiz, Irene; Flores, Carlos; Bustinza, Amaya; Cambra, Francisco Jose

    2014-02-01

    Sepsis and septic shock represent up to 30% of admitted patients in pediatric intensive care units, with a mortality that can exceed 10%. The objective of this study is to determine the prognostic factors for mortality in sepsis. Multicenter prospective descriptive study with patients (aged 7 days to 18 years) admitted to the pediatric intensive care units for sepsis, between January 2011 and April 2012. Data from 136 patients were collected. Eighty-seven were male (63.9%). The median age was a year and a half (P25-75 0.3-5.5 years). In 41 cases (30.1%), there were underlying diseases. The most common etiology was Neisseria meningitidis (31 cases, 22.8%) followed by Streptococcus pneumoniae (16 patients, 11.8%). Seventeen cases were fatal (12.5%). In the statistical analysis, the factors associated with mortality were nosocomial infection (P = 0.004), hypotension (P <0.001) and heart and kidney failure (P < 0.001 and P = 0.004, respectively). The numbers of leukocytes, neutrophils and platelets on admission were statistically lower in the group that died (P was 0.006, 0.013 and <0.001, respectively). Multivariate analysis showed that multiple organ failure, neutropenia, purpura or coagulopathy and nosocomial infection were independent risk factors for increased mortality (odds ratio: 17, 4.9, 9 and 9.2, respectively). Patients with sepsis and multiorgan failure, especially those with nosocomial infection or the presence of neutropenia or purpura, have a worse prognosis and should be monitored and treated early.

  13. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

    NASA Astrophysics Data System (ADS)

    Faruk, Alfensi

    2018-03-01

    Survival analysis is a branch of statistics which is focussed on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular model for analyzing the effects of several covariates on the survival time. However, the proportional hazards assumption of the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. On the other hand, the accelerated failure time (AFT) models do not assume proportional hazards in the survival data as the PH model does. The AFT models, moreover, can be used as an alternative to the PH model if the proportional hazards assumption is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. The analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate model among the considered models. Results of the best fitted model (log-normal AFT model) showed that covariates such as women's educational level, husband's educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
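
    The workflow above can be sketched as follows: check the PH assumption with a Cox model, then compare parametric AFT fits by AIC. The data frame is a synthetic placeholder standing in for the first-birth-interval survey data, and the lifelines package (CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter) is assumed to be available.

    ```python
    # Sketch: PH assumption check followed by AIC comparison of Weibull and log-normal AFT fits.
    # Durations, events, and covariates are synthetic placeholders.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter

    rng = np.random.default_rng(7)
    n = 600
    df = pd.DataFrame({
        "duration": rng.lognormal(mean=3.0, sigma=0.6, size=n),   # e.g. months to first birth
        "event": rng.binomial(1, 0.85, n),                        # 1 = event observed
        "education": rng.integers(0, 4, n),
        "urban": rng.binomial(1, 0.5, n),
    })

    cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
    cph.check_assumptions(df, p_value_threshold=0.05, show_plots=False)   # flags PH violations

    fits = {
        "Weibull AFT": WeibullAFTFitter().fit(df, duration_col="duration", event_col="event"),
        "Log-normal AFT": LogNormalAFTFitter().fit(df, duration_col="duration", event_col="event"),
    }
    for name, f in fits.items():
        print(name, "AIC =", round(f.AIC_, 1))
    ```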

  14. High reliability and high performance of 9xx-nm single emitter laser diodes

    NASA Astrophysics Data System (ADS)

    Bao, L.; Leisher, P.; Wang, J.; Devito, M.; Xu, D.; Grimshaw, M.; Dong, W.; Guan, X.; Zhang, S.; Bai, C.; Bai, J. G.; Wise, D.; Martinsen, R.

    2011-03-01

    Improved performance and reliability of 9xx nm single emitter laser diodes are presented. To date, over 15,000 hours of accelerated multi-cell lifetest reliability data has been collected, with drive currents from 14A to 18A and junction temperatures ranging from 60°C to 110°C. Out of 208 devices, 14 failures have been observed so far. Using established accelerated lifetest analysis techniques, the effects of temperature and power acceleration are assessed. The Mean Time to Failure (MTTF) is determined to be >30 years, for use condition 10W and junction temperature 353K (80°C), with 90% statistical confidence.
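
    For a time-terminated life test with a constant failure rate assumption, a one-sided lower confidence bound on MTTF can be computed with the standard chi-square formula, as in the minimal sketch below. The total device-hours figure is an illustrative use-condition equivalent, and the temperature/power acceleration modelling used in the paper is not reproduced here.

    ```python
    # Sketch: chi-square lower confidence bound on MTTF for a time-terminated test, assuming
    # an exponential (constant failure rate) model. Device-hours are an illustrative assumption.
    from scipy.stats import chi2

    total_hours = 208 * 15_000        # illustrative cumulative device-hours (use-equivalent)
    failures = 14                     # observed failures
    confidence = 0.90

    mttf_point = total_hours / failures
    mttf_lower = 2 * total_hours / chi2.ppf(confidence, 2 * failures + 2)
    print(f"point MTTF ~ {mttf_point/8760:.1f} years, "
          f"{confidence:.0%} lower bound ~ {mttf_lower/8760:.1f} years")
    ```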

  15. SU-E-T-495: Neutron Induced Electronics Failure Rate Analysis for a Single Room Proton Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knutson, N; DeWees, T; Klein, E

    2014-06-01

    Purpose: To determine the failure rate as a function of neutron dose of the range modulator's servo motor controller system (SMCS) while shielded with borated polyethylene (BPE) and unshielded in a single room proton accelerator. Methods: Two experimental setups were constructed using two servo motor controllers and two motors. Each SMCS was then placed 30 cm from the end of the plugged proton accelerator applicator. The motor was then turned on and observed from outside of the vault while being irradiated to known neutron doses determined from bubble detector measurements. Any time the motor deviated from the programmed motion, a failure was recorded along with the delivered dose. The experiment was repeated using 9 cm of BPE shielding surrounding the SMCS. Results: Ten SMCS failures were recorded in each experiment. The dose per monitor unit was 0.0211 mSv/MU for the unshielded SMCS and 0.0144 mSv/MU for the shielded SMCS. The mean dose to produce a failure for the unshielded SMCS was 63.5 ± 58.3 mSv versus 17.0 ± 12.2 mSv for the shielded. The mean number of MUs between failures was 2297 ± 1891 MU for the unshielded SMCS and 2122 ± 1523 MU for the shielded. A Wilcoxon signed rank test showed that the doses between failures were significantly different (P value = 0.044) while the numbers of MUs between failures were not (P value = 1.000). Statistical analysis determined that an SMCS neutron dose of 5.3 mSv produces a 5% chance of failure. Depending on the workload and location of the SMCS, this failure rate could impede clinical workflow. Conclusion: BPE shielding was shown not to reduce the average failure rate of the SMCS, and relocation of the system outside of the accelerator vault was required to lower the failure rate enough to avoid impeding clinical workflow.

  16. Development of a GIS-based failure investigation system for highway soil slopes

    NASA Astrophysics Data System (ADS)

    Ramanathan, Raghav; Aydilek, Ahmet H.; Tanyu, Burak F.

    2015-06-01

    A framework for an early warning system was developed for Maryland, using a GIS database and a collective overlay of maps that highlight, through spatial and statistical analysis, highway slopes susceptible to soil slides or slope failures. Data for existing soil slope failures were collected from geotechnical reports and field visits. A total of 48 slope failures were recorded and analyzed. Six factors, namely event precipitation, geological formation, land cover, slope history, slope angle, and elevation, were considered to affect highway soil slope stability. The observed trends indicate that precipitation and poor surface or subsurface drainage conditions are principal factors causing slope failures. 96% of the failed slopes have an open drainage section. A majority of the failed slopes lie in regions with relatively high event precipitation (P > 200 mm). 90% of the existing failures are surficial erosion type failures, and only 1 out of the 42 analyzed slope failures is a deep rotational type failure. More than half of the analyzed slope failures occurred in regions having low density land cover. 46% of failures are on slopes with slope angles between 20° and 30°. An influx of more data relating to failed slopes should reveal further trends, and the developed slope management system will aid state highway engineers in prudent budget allocation and in prioritizing remediation projects, based on the literature reviewed on the principles, concepts, techniques, and methodology for slope instability evaluation (Leshchinsky et al., 2015).

  17. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis.

    PubMed

    Shrout, Patrick E; Rodgers, Joseph L

    2018-01-04

    Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.

  18. Electric propulsion reliability: Statistical analysis of on-orbit anomalies and comparative analysis of electric versus chemical propulsion failure rates

    NASA Astrophysics Data System (ADS)

    Saleh, Joseph Homer; Geng, Fan; Ku, Michelle; Walker, Mitchell L. R.

    2017-10-01

    With a few hundred spacecraft launched to date with electric propulsion (EP), it is possible to conduct an epidemiological study of EP's on orbit reliability. The first objective of the present work was to undertake such a study and analyze EP's track record of on orbit anomalies and failures by different covariates. The second objective was to provide a comparative analysis of EP's failure rates with those of chemical propulsion. Satellite operators, manufacturers, and insurers will make reliability- and risk-informed decisions regarding the adoption and promotion of EP on board spacecraft. This work provides evidence-based support for such decisions. After a thorough data collection, 162 EP-equipped satellites launched between January 1997 and December 2015 were included in our dataset for analysis. Several statistical analyses were conducted, at the aggregate level and then with the data stratified by severity of the anomaly, by orbit type, and by EP technology. Mean Time To Anomaly (MTTA) and the distribution of the time to (minor/major) anomaly were investigated, as well as anomaly rates. The important findings in this work include the following: (1) Post-2005, EP's reliability has outperformed that of chemical propulsion; (2) Hall thrusters have robustly outperformed chemical propulsion, and they maintain a small but shrinking reliability advantage over gridded ion engines. Other results were also provided, for example the differentials in MTTA of minor and major anomalies for gridded ion engines and Hall thrusters. It was shown that: (3) Hall thrusters exhibit minor anomalies very early on orbit, which might be indicative of infant anomalies, and thus would benefit from better ground testing and acceptance procedures; (4) Strong evidence exists that EP anomalies (onset and likelihood) and orbit type are dependent, a dependence likely mediated by either the space environment or differences in thrusters duty cycles; (5) Gridded ion thrusters exhibit both infant and wear-out failures, and thus would benefit from a reliability growth program that addresses both these types of problems.

  19. Evaluation program for secondary spacecraft cells: Cycle life test

    NASA Technical Reports Server (NTRS)

    Harkness, J. D.

    1979-01-01

    The service life and storage stability for several storage batteries were determined. The batteries included silver-zinc batteries, nickel-cadmium batteries, and silver-cadmium batteries. The cell performance characteristics and limitations are to be used by spacecraft power systems planners and designers. A statistical analysis of the life cycle prediction and cause of failure versus test conditions is presented.

  20. Significant Factors Related to Failed Pediatric Dental General Anesthesia Appointments at a Hospital-based Residency Program.

    PubMed

    Emhardt, John R; Yepes, Juan F; Vinson, LaQuia A; Jones, James E; Emhardt, John D; Kozlowski, Diana C; Eckert, George J; Maupome, Gerardo

    2017-05-15

    The purposes of this study were to: (1) evaluate the relationship between appointment failure and the factors of age, gender, race, insurance type, day of week, scheduled time of surgery, distance traveled, and weather; (2) investigate reasons for failure; and (3) explore the relationships between the factors and reasons for failure. Electronic medical records were accessed to obtain data for patients scheduled for dental care under general anesthesia from May 2012 to May 2015. Factors were analyzed for relation to appointment failure. Data from 3,513 appointments for 2,874 children were analyzed. Bivariate associations showed statistically significant (P<0.05) relationships between failed appointment and race, insurance type, scheduled time of surgery, distance traveled, snowfall, and temperature. Multinomial regression analysis showed the following associations between factors and the reason for failure (P<0.05): (1) decreased temperature and increased snowfall were associated with weather as reason for failure; (2) the African American population showed an association with family barriers; (3) Hispanic families were less likely to give advanced notice; and (4) the "additional races" category showed an association with fasting violation. Patients who have treatment under general anesthesia face specific barriers to care.

  1. [Development of Hospital Equipment Maintenance Information System].

    PubMed

    Zhou, Zhixin

    2015-11-01

    A hospital equipment maintenance information system plays an important role in improving medical treatment quality and efficiency. Based on a requirements analysis of hospital equipment maintenance, the system function diagram was drawn. From an analysis of the input and output data, tables, and reports connected with the equipment maintenance process, the relationships between entities and attributes were identified, an E-R diagram was drawn, and the relational database tables were established. The software development should meet the actual requirements of the maintenance process and provide a friendly user interface and flexible operation. The software can also analyze failure causes by statistical analysis.

  2. Medical cost analysis: application to colorectal cancer data from the SEER Medicare database.

    PubMed

    Bang, Heejung

    2005-10-01

    Incompleteness is a key feature of most survival data. Numerous well-established statistical methodologies and algorithms exist for analyzing life or failure time data. However, induced censoring invalidates the use of those standard analytic tools for some survival-type data such as medical costs. In this paper, some valid methods currently available for analyzing censored medical cost data are reviewed. Some cautionary findings under different assumptions are illustrated through application to medical costs from colorectal cancer patients. Cost analysis should be suitably planned and carefully interpreted under various meaningful scenarios even with judiciously selected statistical methods. This approach should be of great help to policy makers who seek to prioritize health care expenditures and to assess the elements of resource use.

  3. Impedance cardiography: a comparison of cardiac output vs waveform analysis for assessing left ventricular systolic dysfunction.

    PubMed

    DeMarzo, Arthur P; Kelly, Russell F; Calvin, James E

    2007-01-01

    Early detection of asymptomatic left ventricular systolic dysfunction (LVSD) is beneficial in managing heart failure. Recent studies have cast doubt on the usefulness of cardiac output as an indicator of LVSD. In impedance cardiography (ICG), the dZ/dt waveform has a systolic wave called the E wave. This study looked at measurements of the amplitude and area of the E wave compared with ICG-derived cardiac output, stroke volume, cardiac index, and stroke index as methods of assessing LVSD. ICG data were obtained from patients (n=26) admitted to a coronary care unit. Clinical LVSD severity was stratified into 4 groups (none, mild, moderate, and severe) based on echocardiography data and standard clinical assessment by a cardiologist blinded to ICG data. Statistical analysis showed that the E wave amplitude and area were better indicators of the level of LVSD than cardiac output, stroke volume, cardiac index, or stroke index. ICG waveform analysis has potential as a simple point-of-care test for detecting LVSD in asymptomatic patients at high risk for developing heart failure and for monitoring LVSD in patients being treated for heart failure.

  4. Failures to replicate blocking are surprising and informative-Reply to Soto (2018).

    PubMed

    Maes, Elisa; Krypotos, Angelos-Miltiadis; Boddez, Yannick; Alfei Palloni, Joaquín Matías; D'Hooge, Rudi; De Houwer, Jan; Beckers, Tom

    2018-04-01

    The blocking effect has inspired numerous associative learning theories and is widely cited in the literature. We recently reported a series of 15 experiments that failed to obtain a blocking effect in rodents. On the basis of those consistent failures, we claimed that there is a lack of insight into the boundary conditions for blocking. In his commentary, Soto (2018) argued that contemporary associative learning theory does provide a specific boundary condition for the occurrence of blocking, namely the use of same- versus different-modality stimuli. Given that in 10 of our 15 experiments same-modality stimuli were used, he claims that our failure to observe a blocking effect is unsurprising. We disagree with that claim, because of theoretical, empirical, and statistical problems with his analysis. We also address 2 other possible reasons for a lack of blocking that are referred to in Soto's (2018) analysis, related to generalization and salience, and dissect the potential importance of both. Although Soto's (2018) analyses raise a number of interesting points, we see more merit in an empirically guided analysis and call for empirical testing of boundary conditions on blocking. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Narrowing the scope of failure prediction using targeted fault load injection

    NASA Astrophysics Data System (ADS)

    Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.

    2018-05-01

    As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.

  6. Fear of failure, psychological stress, and burnout among adolescent athletes competing in high level sport.

    PubMed

    Gustafsson, H; Sagar, S S; Stenling, A

    2017-12-01

    The purpose of this study was to investigate fear of failure in highly competitive junior athletes and its association with psychological stress and burnout. In total, 258 athletes (152 males and 108 females), ranging in age from 15 to 19 years (M = 17.4 years, SD = 1.08), participated. Athletes competed in a variety of sports, including both team and individual sports. In a variable-oriented approach using regression analyses, one dimension, fear of experiencing shame and embarrassment, had a statistically significant effect on perceived psychological stress and on one dimension of burnout, reduced sense of accomplishment. However, adopting a person-oriented approach using latent class analysis, we found that athletes with high levels of fear of failure on all dimensions scored high on burnout. We also found another class with high scores on burnout. These athletes had high scores on the individual-oriented dimensions of fear of failure and low scores on the other-oriented fear of failure dimensions. The findings indicate that fear of failure is related to burnout and psychological stress in athletes and that this association is mainly driven by the individual-oriented dimensions of fear of failure. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. K-nearest neighbors based methods for identification of different gear crack levels under different motor speeds and loads: Revisited

    NASA Astrophysics Data System (ADS)

    Wang, Dong

    2016-03-01

    Gears are the most commonly used components in mechanical transmission systems. Their failures may cause transmission system breakdown and result in economic loss. Identification of different gear crack levels is important to prevent any unexpected gear failure because gear cracks lead to gear tooth breakage. Signal processing based methods mainly require expertise to explain gear fault signatures, which is usually not easily achieved by ordinary users. In order to automatically identify different gear crack levels, intelligent gear crack identification methods should be developed. The previous case studies experimentally proved that K-nearest neighbors based methods exhibit high prediction accuracies for identification of 3 different gear crack levels under different motor speeds and loads. In this short communication, to further enhance prediction accuracies of existing K-nearest neighbors based methods and extend identification of 3 different gear crack levels to identification of 5 different gear crack levels, redundant statistical features are constructed by using Daubechies 44 (db44) binary wavelet packet transform at different wavelet decomposition levels, prior to the use of a K-nearest neighbors method. The dimensionality of redundant statistical features is 620, which provides richer gear fault signatures. Since many of these statistical features are redundant and highly correlated with each other, dimensionality reduction of redundant statistical features is conducted to obtain new significant statistical features. Finally, the K-nearest neighbors method is used to identify 5 different gear crack levels under different motor speeds and loads. A case study including 3 experiments is investigated to demonstrate that the developed method provides higher prediction accuracies than the existing K-nearest neighbors based methods for recognizing different gear crack levels under different motor speeds and loads. Based on the new significant statistical features, some other popular statistical models, including linear discriminant analysis, quadratic discriminant analysis, classification and regression tree and naive Bayes classifier, are compared with the developed method. The results show that the developed method has the highest prediction accuracies among these statistical models. Additionally, selection of the number of new significant features and parameter selection of K-nearest neighbors are thoroughly investigated.
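
    A minimal sketch of the general pipeline above follows: extract statistical features from vibration segments, reduce their dimensionality, and classify the crack level with K-nearest neighbors. The db44 wavelet packet features of the paper are not reproduced; the signals, crack labels, and feature set are synthetic stand-ins, and scikit-learn is assumed.

    ```python
    # Sketch: statistical features from vibration segments, PCA dimensionality reduction,
    # then K-nearest neighbors classification of gear crack level. Data are synthetic.
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(8)

    def features(seg):
        # A few statistical features per segment (the paper used 620 redundant features).
        return [seg.mean(), seg.std(), stats.skew(seg), stats.kurtosis(seg),
                np.sqrt(np.mean(seg**2)), np.max(np.abs(seg))]

    levels = np.repeat(np.arange(5), 40)                               # 5 crack levels, 40 segments each
    signals = [rng.normal(0, 1 + 0.2 * lvl, 2048) for lvl in levels]   # crack level inflates vibration
    X = np.array([features(s) for s in signals])

    clf = make_pipeline(StandardScaler(), PCA(n_components=4), KNeighborsClassifier(n_neighbors=5))
    print("5-fold CV accuracy:", cross_val_score(clf, X, levels, cv=5).mean().round(2))
    ```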

  8. Effects of Long Term Thermal Exposure on Chemically Pure (CP) Titanium Grade 2 Room Temperature Tensile Properties and Microstructure

    NASA Technical Reports Server (NTRS)

    Ellis, David L.

    2007-01-01

    Room temperature tensile testing of Chemically Pure (CP) Titanium Grade 2 was conducted for as-received commercially produced sheet and following thermal exposure at 550 and 650 K for times up to 5,000 h. No significant changes in microstructure or failure mechanism were observed. A statistical analysis of the data was performed. Small statistical differences were found, but all properties were well above minimum values for CP Ti Grade 2 as defined by ASTM standards and likely would fall within normal variation of the material.

  9. ACCELERATED FAILURE TIME MODELS PROVIDE A USEFUL STATISTICAL FRAMEWORK FOR AGING RESEARCH

    PubMed Central

    Swindell, William R.

    2009-01-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model “deceleration factor”. AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data. PMID:19007875

  10. Accelerated failure time models provide a useful statistical framework for aging research.

    PubMed

    Swindell, William R

    2009-03-01

    Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.
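    As a rough illustration of the AFT approach described in these two records, the sketch below fits a Weibull AFT model to synthetic lifespan data, assuming the Python lifelines package; it is not the authors' code, and the exponentiated treatment coefficient plays the role of the "deceleration factor".

```python
# Minimal sketch of an accelerated failure time (AFT) analysis, assuming the
# `lifelines` package and synthetic mouse-lifespan data. In a Weibull AFT
# model, exp(coefficient) for the treatment term is a multiplicative stretch
# of survival time that is independent of age (the "deceleration factor").
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(1)
n = 400
treated = rng.integers(0, 2, size=n)                # 0 = control, 1 = long-lived genotype
# Simulate lifespans where the manipulation multiplies time-to-death by ~1.3.
baseline = rng.weibull(a=3.0, size=n) * 800.0       # days
lifespan = baseline * np.where(treated == 1, 1.3, 1.0)
observed = np.ones(n, dtype=int)                    # 1 = death observed (no censoring here)

df = pd.DataFrame({"lifespan": lifespan, "event": observed, "treated": treated})
aft = WeibullAFTFitter()
aft.fit(df, duration_col="lifespan", event_col="event")
aft.print_summary()  # the exponentiated 'treated' coefficient is the deceleration factor
```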

  11. AGARD Flight Test Techniques Series. Volume 14. Introduction to Flight Test Engineering (Introduction a la Technique d’essais en vol)

    DTIC Science & Technology

    1995-09-01

    path and aircraft attitude and other flight or aircraft parameters • Calculations in the frequency domain ( Fast Fourier Transform) • Data analysis...Signal filtering Image processing of video and radar data Parameter identification Statistical analysis Power spectral density Fast Fourier Transform...airspeeds both fast and slow, altitude, load factor both above and below 1g, centers of gravity (fore and aft), and with system/subsystem failures. Whether

  12. Computerized system for assessing heart rate variability.

    PubMed

    Frigy, A; Incze, A; Brânzaniuc, E; Cotoi, S

    1996-01-01

    The principal theoretical, methodological and clinical aspects of heart rate variability (HRV) analysis are reviewed. This method has been developed over the last 10 years as a useful noninvasive method of measuring the activity of the autonomic nervous system. The main components and the functioning of the computerized rhythm-analyzer system developed by our team are presented. The system is able to perform short-term (maximum 20 minutes) time domain HRV analysis and statistical analysis of the ventricular rate in any rhythm, particularly in atrial fibrillation. The performance of our system is demonstrated using the graphics (RR histograms, delta RR histograms, RR scattergrams) and the statistical parameters resulting from the processing of three ECG recordings. These recordings were obtained from a normal subject, from a patient with advanced heart failure, and from a patient with atrial fibrillation.

  13. Identifying factors that predict the choice and success rate of radial artery catheterisation in contemporary real world cardiology practice: a sub-analysis of the PREVAIL study data.

    PubMed

    Pristipino, Christian; Roncella, Adriana; Trani, Carlo; Nazzaro, Marco S; Berni, Andrea; Di Sciascio, Germano; Sciahbasi, Alessandro; Musarò, Salvatore Donato; Mazzarotto, Pietro; Gioffrè, Gaetano; Speciale, Giulio

    2010-06-01

    To assess: the reasons behind an operator choosing to perform radial artery catheterisation (RAC) as against femoral arterial catheterisation, and to explore why RAC may fail in the real world. A pre-determined analysis of PREVAIL study database was performed. Relevant data were collected in a prospective, observational survey of 1,052 consecutive patients undergoing invasive cardiovascular procedures at nine Italian hospitals over a one month observation period. By multivariate analysis, the independent predictors of RAC choice were having the procedure performed: (1) at a high procedural volume centre; and (2) by an operator who performs a high volume of radial procedures; clinical variables played no statistically significant role. RAC failure was predicted independently by (1) a lower operator propensity to use RAC; and (2) the presence of obstructive peripheral artery disease. A 10-fold lower rate of RAC failure was observed among operators who perform RAC for > 85% of their personal caseload than among those who use RAC < 25% of the time (3.8% vs. 33.0%, respectively); by receiver operator characteristic (ROC) analysis, no threshold value for operator RAC volume predicted RAC failure. A routine RAC in all-comers is superior to a selective strategy in terms of feasibility and success rate.

  14. Student Achievement in Undergraduate Statistics: The Potential Value of Allowing Failure

    ERIC Educational Resources Information Center

    Ferrandino, Joseph A.

    2016-01-01

    This article details what resulted when I re-designed my undergraduate statistics course to allow failure as a learning strategy and focused on achievement rather than performance. A variety of within and between sample t-tests are utilized to determine the impact of unlimited test and quiz opportunities on student learning on both quizzes and…

  15. Effect of Surface Treatment, Silane, and Universal Adhesive on Microshear Bond Strength of Nanofilled Composite Repairs.

    PubMed

    Fornazari, I A; Wille, I; Meda, E M; Brum, R T; Souza, E M

    The aim of this study was to evaluate the effect of surface treatment and universal adhesive on the microshear bond strength of nanoparticle composite repairs. One hundred and forty-four specimens were built with a nanofilled composite (Filtek Supreme Ultra, 3M ESPE). The surfaces of all the specimens were polished with SiC paper and stored in distilled water at 37°C for 14 days. Half of the specimens were then air abraded with Al2O3 particles and cleaned with phosphoric acid. Polished specimens (P) and polished and air-abraded specimens (A), respectively, were randomly divided into two sets of six groups (n=12) according to the following treatments: hydrophobic adhesive only (PH and AH, respectively), silane and hydrophobic adhesive (PCH, ACH), methacryloyloxydecyl dihydrogen phosphate (MDP)-containing silane and hydrophobic adhesive (PMH, AMH), universal adhesive only (PU, AU), silane and universal adhesive (PCU, ACU), and MDP-containing silane and universal adhesive (PMU, AMU). A cylinder of the same composite resin (1.1-mm diameter) was bonded to the treated surfaces to simulate the repair. After 48 hours, the specimens were subjected to microshear testing in a universal testing machine. The failure area was analyzed under an optical microscope at 50× magnification to identify the failure type, and the data were analyzed by three-way analysis of variance and the Games-Howell test (α=0.05). The variables "surface treatment" and "adhesive" showed statistically significant differences at p<0.05. The highest mean shear bond strength was found in the ACU group but was not statistically different from the means for the other air-abraded groups except AH. All the polished groups except PU showed statistically significant differences compared with the air-abraded groups. The PU group had the highest mean among the polished groups. Cohesive failure was the most frequent failure mode in the air-abraded specimens, while mixed failure was the most common mode in the polished specimens. While air abrasion with Al2O3 particles increased the repair bond strength of the nanoparticle composite, the use of MDP-containing silane did not lead to a statistically significant increase in bond strength. Silane-containing universal adhesive on its own was as effective as any combination of silane and adhesive, particularly when applied on air-abraded surfaces.

  16. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    NASA Astrophysics Data System (ADS)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because of the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can take a variety of engine failure modes into account, making it a better statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, so the precision of the mixed-distribution reliability model is improved greatly. All of this helps to popularize the Weibull distribution model in engineering applications.
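    A minimal sketch of the core idea, fitting a two-component mixed Weibull model to failure times by maximum likelihood with scipy, is shown below. It is generic and does not implement the paper's dynamic weight coefficient or correlation-coefficient optimization; the failure times are synthetic.

```python
# Illustrative two-component mixed Weibull fit by direct maximum likelihood
# (scipy only). The data are synthetic stand-ins for engine failure records
# and the procedure is a generic mixture fit, not the paper's method.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Two failure modes: an early wear-in population and a late wear-out population.
times = np.concatenate([
    weibull_min.rvs(1.2, scale=300, size=150, random_state=10),
    weibull_min.rvs(4.0, scale=1500, size=350, random_state=11),
])

def neg_log_lik(params):
    k1, s1, k2, s2, w = params
    pdf = w * weibull_min.pdf(times, k1, scale=s1) \
        + (1.0 - w) * weibull_min.pdf(times, k2, scale=s2)
    return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

x0 = [1.0, np.median(times) / 2, 3.0, np.median(times) * 2, 0.5]
bounds = [(0.1, 10), (1, 1e5), (0.1, 10), (1, 1e5), (0.01, 0.99)]
fit = minimize(neg_log_lik, x0, bounds=bounds, method="L-BFGS-B")
k1, s1, k2, s2, w = fit.x
print(f"mode 1: shape={k1:.2f} scale={s1:.0f} weight={w:.2f}")
print(f"mode 2: shape={k2:.2f} scale={s2:.0f} weight={1 - w:.2f}")
```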

  17. Improving the performance of a filling line based on simulation

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated against line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations formed the basis of a financial analysis: NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
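    The two quantitative steps described (sampling line failures from fitted distributions and discounting the resulting savings) can be illustrated with the toy Monte Carlo below. All parameters and cash flows are invented for illustration and are not taken from the study.

```python
# Toy sketch: (1) simulate a line whose failures follow fitted continuous
# distributions and estimate its availability, and (2) discount the resulting
# savings into an NPV. Failure/repair parameters, cash flows and the discount
# rate are all invented.
import numpy as np

rng = np.random.default_rng(3)

def simulate_availability(mttf_h, repair_shape, repair_scale, horizon_h=8760.0):
    """Alternate exponential up-times with Weibull repair times over one year."""
    t, uptime = 0.0, 0.0
    while t < horizon_h:
        up = rng.exponential(mttf_h)                      # time to next failure
        down = rng.weibull(repair_shape) * repair_scale   # repair duration
        uptime += min(up, horizon_h - t)
        t += up + down
    return uptime / horizon_h

base = np.mean([simulate_availability(120, 1.5, 4.0) for _ in range(200)])
improved = np.mean([simulate_availability(200, 1.5, 3.0) for _ in range(200)])
extra_output_value = (improved - base) * 1_000_000        # value of extra output per year

def npv(cash_flows, rate):
    return sum(cf / (1 + rate) ** i for i, cf in enumerate(cash_flows))

investment = -150_000
flows = [investment] + [extra_output_value] * 5           # 5-year horizon
print(f"availability {base:.3f} -> {improved:.3f}, NPV = {npv(flows, 0.08):,.0f}")
```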

  18. Comparative analysis of positive and negative attitudes toward statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested to know the perception of their students' attitudes toward statistics during the statistics course. In statistics course, positive attitude toward statistics is a vital because it will be encourage students to get interested in the statistics course and in order to master the core content of the subject matters under study. Although, students who have negative attitudes toward statistics they will feel depressed especially in the given group assignment, at risk for failure, are often highly emotional, and could not move forward. Therefore, this study investigates the students' attitude towards learning statistics. Six latent constructs have been the measurement of students' attitudes toward learning statistic such as affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validate instrument of Survey of Attitudes towards Statistics (SATS). This study is conducted among engineering undergraduate engineering students in the university Malaysia Pahang (UMP). The respondents consist of students who were taking the applied statistics course from different faculties. From the analysis, it is found that the questionnaire is acceptable and the relationships among the constructs has been proposed and investigated. In this case, students show full effort to master the statistics course, feel statistics course enjoyable, have confidence that they have intellectual capacity, and they have more positive attitudes then negative attitudes towards statistics learning. In conclusion in terms of affect, cognitive competence, value, interest and effort construct the positive attitude towards statistics was mostly exhibited. While negative attitudes mostly exhibited by difficulty construct.

  19. Economic impact of heart failure according to the effects of kidney failure.

    PubMed

    Sicras Mainar, Antoni; Navarro Artieda, Ruth; Ibáñez Nolla, Jordi

    2015-01-01

    To evaluate the use of health care resources and their cost according to the effects of kidney failure in heart failure patients during 2-year follow-up in a population setting. Observational retrospective study based on a review of medical records. The study included patients ≥ 45 years treated for heart failure from 2008 to 2010. The patients were divided into 2 groups according to the presence/absence of kidney failure. Main outcome variables were comorbidity, clinical status (functional class, etiology), metabolic syndrome, costs, and new cases of cardiovascular events and kidney failure. The cost model included direct and indirect health care costs. Statistical analysis included multiple regression models. The study recruited 1600 patients (prevalence, 4.0%; mean age 72.4 years; women, 59.7%). Of these patients, 70.1% had hypertension, 47.1% had dyslipidemia, and 36.2% had diabetes mellitus. We analyzed 433 patients (27.1%) with kidney failure and 1167 (72.9%) without kidney failure. Kidney failure was associated with functional class III-IV (54.1% vs 40.8%) and metabolic syndrome (65.3% vs 51.9%, P<.01). The average unit cost was €10,711.40. The corrected cost in the presence of kidney failure was €14,868.20 vs €9,364.50 (P=.001). During follow-up, 11.7% of patients developed ischemic heart disease, 18.8% developed kidney failure, and 36.1% developed heart failure exacerbation. Comorbidity associated with heart failure is high. The presence of kidney failure increases the use of health resources and leads to higher costs within the National Health System. Copyright © 2014 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  20. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability

    DTIC Science & Technology

    2015-07-01

    12th International Conference on Applications of Statistics and Probability in Civil Engineering, ICASP12 Vancouver, Canada, July 12-15, 2015...Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability Marwan M. Harajli Graduate Student, Dept. of Civil and Environ...criterion is usually the failure probability . In this paper, we examine the buffered failure probability as an attractive alternative to the failure

  1. The Number of Recalled Leads is Highly Predictive of Lead Failure: Results From the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS").

    PubMed

    Kersten, Daniel J; Yi, Jinju; Feldman, Alyssa M; Brahmbhatt, Kunal; Asheld, Wilbur J; Germano, Joseph; Islam, Shahidul; Cohen, Todd J

    2016-12-01

    The purpose of this study was to determine if implantation of multiple recalled defibrillator leads is associated with an increased risk of lead failure. The authors of the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS") have previously reported a relationship between recalled lead status, lead failure, and patient mortality. This substudy analyzes the relationship in a smaller subset of patients who received more than one recalled lead. The specific effects of having one or more recalled leads have not been previously examined. This study analyzed lead failure and mortality of 3802 patients in PAIDLESS and compared outcomes with respect to the number of recalled leads received. PAIDLESS includes all patients at Winthrop University Hospital who underwent defibrillator lead implantation between February 1, 1996 and December 31, 2011. Patients with no recalled ICD leads, one recalled ICD lead, and two recalled ICD leads were compared using the Kaplan-Meier method and log-rank test. Sidak adjustment method was used to correct for multiple comparisons. All calculations were performed using SAS 9.4. P-values <.05 were considered statistically significant. This study included 4078 total ICD leads implanted during the trial period. There were 2400 leads (59%) in the no recalled leads category, 1620 leads (40%) in the one recalled lead category, and 58 leads (1%) in the two recalled leads category. No patient received more than two recalled leads. Of the leads categorized in the two recalled leads group, 12 experienced lead failures (21%), which was significantly higher (P<.001) than in the no recalled leads group (60 failures, 2.5%) and one recalled lead group (81 failures; 5%). Multivariable Cox's regression analysis found a total of six significant predictive variables for lead failure including the number of recalled leads (P<.001 for one and two recalled leads group). The number of recalled leads is highly predictive of lead failure. Lead-based multivariable Cox's regression analysis produced a total of six predictive variable categories for lead failure, one of which was the number of recalled leads. Kaplan-Meier analysis showed that the leads in the two recalled leads category failed faster than both the no recalled lead and one recalled lead groups. The greater the number of recalled leads to which patients are exposed, the greater the risk of lead failure.
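    A sketch of the type of Kaplan-Meier / log-rank comparison reported here, assuming the Python lifelines package and synthetic data (the study itself used SAS 9.4, and no patient data are reproduced):

```python
# Compare time-to-lead-failure between two synthetic groups mimicking
# "no recalled lead" vs "two recalled leads" using Kaplan-Meier curves and a
# log-rank test. Event rates and follow-up are invented for illustration.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
t_none = rng.exponential(40, size=300)      # no recalled leads: long times to failure
t_two = rng.exponential(12, size=60)        # two recalled leads: shorter times
follow_up = 10.0                            # years
e_none = (t_none <= follow_up).astype(int)  # 1 = failure observed within follow-up
e_two = (t_two <= follow_up).astype(int)
t_none, t_two = np.minimum(t_none, follow_up), np.minimum(t_two, follow_up)

kmf_none = KaplanMeierFitter().fit(t_none, event_observed=e_none, label="no recalled leads")
kmf_two = KaplanMeierFitter().fit(t_two, event_observed=e_two, label="two recalled leads")
print(kmf_none.median_survival_time_, kmf_two.median_survival_time_)

result = logrank_test(t_none, t_two, event_observed_A=e_none, event_observed_B=e_two)
print(f"log-rank p-value: {result.p_value:.4f}")
```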

  2. Evaluating the best time to intervene acute liver failure in rat models induced by d-galactosamine.

    PubMed

    Éboli, Lígia Patrícia de Carvalho Batista; Netto, Alcides Augusto Salzedas; Azevedo, Ramiro Antero de; Lanzoni, Valéria Pereira; Paula, Tatiana Sugayama de; Goldenberg, Alberto; Gonzalez, Adriano Miziara

    2016-12-01

    To describe an animal model of acute liver failure induced by intraperitoneal d-galactosamine injections in rats and to define the best time to intervene, based on evaluation of the King's College and Clichy's criteria. Sixty-one Wistar female rats were distributed into three groups: group 1 (11 rats received 1.4 g/kg of d-galactosamine intraperitoneally and were observed until they died); group 2 (44 rats received a dose of 1.4 g/kg of d-galactosamine, and blood and histological samples were collected for analysis at 12, 24, 48, 72 and 120 hours after the injection); and a control group (6 rats). Twelve hours after applying d-galactosamine, AST/ALT, bilirubin, factor V, PT and INR were already altered. The peak was reached at 48 hours. INR > 6.5 was found 12 hours after the injection and factor V < 30% after 24 hours. All the laboratory variables presented statistical differences, except urea (p = 0.758). There were statistical differences among all the histological variables analyzed. The King's College and Clichy's criteria were fulfilled 12 hours after the d-galactosamine injection, and this time may represent the best time to intervene in this acute liver failure animal model.

  3. Bond strength comparison of 2 self-etching primers over a 3-month storage period.

    PubMed

    Trites, Brian; Foley, Timothy F; Banting, David

    2004-12-01

    The purpose of this in vitro study was to evaluate the shear-peel bond strength of 2 self-etching primer systems, Transbond Plus (3M/Unitek, Monrovia, Calif) and First Step (Reliance Orthodontic Products, Itasca, Ill), with their respective adhesives, and compare them with a control adhesive system (Transbond XT, 3M/Unitek) over a 3-month period. Two hundred seventy extracted human premolars were obtained and randomly divided into 9 groups of 30 teeth. Metal orthodontic brackets were bonded to the enamel, and each adhesive group was stored for 24 hours (T1), 30 days (T2), or 3 months (T3) in deionized water at 37 degrees C. All bonded specimens were thermocycled at 10 degrees C and 50 degrees C for 24 hours before debonding. Brackets were debonded by using a shear-peel load on a testing machine at a cross-head speed of 2 mm/min. Bond failure was also evaluated. The shear-peel bond strengths of the 3 bonding systems were clinically acceptable, with the possible exception of First Step at 30-day storage. Repeated measures analysis of variance showed a statistically significant (P < .0001) difference in mean bond strengths between the 3 adhesive systems. The shear-peel bond strength of the adhesives over the 3 time intervals showed statistically significant (P = .005) changes. In each group, there were statistically significant differences in shear-peel bond strength between time intervals T1-T2 and T2-T3 for Transbond Plus and T2-T3 for First Step. The change in mean shear-peel bond strength of the 3 adhesives demonstrated a consistent pattern of behavior over the 3 storage intervals. The lowest mean shear-peel bond strength values were noted at 30-day storage. Bond failure analysis (adhesive remnant index) demonstrated mainly cohesive bond failures.

  4. Getting the numbers right: statistical mischief and racial profiling in heart failure research.

    PubMed

    Kahn, Jonathan

    2003-01-01

    The claim that blacks die from heart failure at a rate twice that of whites is informing efforts to develop and market the drug BiDil, which is currently undergoing clinical trials to be approved by the FDA as the first drug ever specified to treat African Americans--and only African Americans--for heart failure. The drug and its companion statistic have since come to play prominent roles in debates about so-called "racial profiling" in medicine and the legitimacy of using social categories of race in biomedical research. Nonetheless, this statistic is wrong. The most current data available place the black:white mortality ratio for heart failure at approximately 1.1:1. The article tells the story of attempts to get to the source of the supposed 2:1 mortality ratio and explores some of the implications of the acceptance of these erroneous data, both for the allocation of resources to combat disease and for our broader understanding of the nature and meaning of race.

  5. Virological and immunological failure of HAART and associated risk factors among adults and adolescents in the Tigray region of Northern Ethiopia.

    PubMed

    Hailu, Genet Gebrehiwet; Hagos, Dawit Gebregziabher; Hagos, Amlsha Kahsay; Wasihun, Araya Gebreyesus; Dejene, Tsehaye Asmelash

    2018-01-01

    Human immunodeficiency virus/acquired immunodeficiency syndrome associated morbidity and mortality have declined significantly since the introduction of highly active antiretroviral therapy. As a result of increasing access to highly active antiretroviral therapy, the survival and quality of life of patients have improved significantly worldwide. Despite this promising result, regular monitoring of people on antiretroviral therapy is recommended to determine whether there is an effective treatment response. This study was designed to assess virological and immunological failure among adult and adolescent users of highly active antiretroviral therapy in the Tigray region of Northern Ethiopia, where scanty data are available. A retrospective follow-up study was conducted from September 1 to December 30, 2016 to assess the magnitude of, and factors associated with, virological and immunological failure among 260 adult and adolescent highly active antiretroviral therapy users who started first-line ART between January 1, 2008 and March 1, 2016. A standardized questionnaire was used to collect socio-demographic and clinical data. SPSS Version 21 statistical software was used for analysis. Bivariate and multivariate logistic regression analyses were conducted to identify factors associated with virological and immunological failure. Statistical association was declared significant if the p-value was ≤ 0.05. A total of 30 (11.5%) and 17 (6.5%) participants experienced virological and immunological failure, respectively, in a median time of 36 months of highly active antiretroviral therapy. Virological failure was associated with non-adherence to medications, age < 40 years, CD4+ T-cell count < 250 cells/μL and male gender. Similarly, immunological failure was associated with non-adherence, tuberculosis co-infection and human immunodeficiency virus RNA ≥ 1000 copies/mL. The current results show that immunological and virological failure is a problem in a setting where highly active antiretroviral therapy has been largely scaled up, and the problem is greater in patients with poor adherence. This will in turn affect the global target of 90% viral suppression by 2020, and may indicate the need for more investment and commitment to improving patient adherence in the study area.

  6. Virological and immunological failure of HAART and associated risk factors among adults and adolescents in the Tigray region of Northern Ethiopia

    PubMed Central

    Hailu, Genet Gebrehiwet; Hagos, Dawit Gebregziabher; Hagos, Amlsha Kahsay; Dejene, Tsehaye Asmelash

    2018-01-01

    Background Human immunodeficiency virus/acquired immunodeficiency syndrome associated morbidity and mortality have declined significantly since the introduction of highly active antiretroviral therapy. As a result of increasing access to highly active antiretroviral therapy, the survival and quality of life of patients have improved significantly worldwide. Despite this promising result, regular monitoring of people on antiretroviral therapy is recommended to determine whether there is an effective treatment response. This study was designed to assess virological and immunological failure among adult and adolescent users of highly active antiretroviral therapy in the Tigray region of Northern Ethiopia, where scanty data are available. Methods A retrospective follow-up study was conducted from September 1 to December 30, 2016 to assess the magnitude of, and factors associated with, virological and immunological failure among 260 adult and adolescent highly active antiretroviral therapy users who started first-line ART between January 1, 2008 and March 1, 2016. A standardized questionnaire was used to collect socio-demographic and clinical data. SPSS Version 21 statistical software was used for analysis. Bivariate and multivariate logistic regression analyses were conducted to identify factors associated with virological and immunological failure. Statistical association was declared significant if the p-value was ≤ 0.05. Results A total of 30 (11.5%) and 17 (6.5%) participants experienced virological and immunological failure, respectively, in a median time of 36 months of highly active antiretroviral therapy. Virological failure was associated with non-adherence to medications, age < 40 years, CD4+ T-cell count < 250 cells/μL and male gender. Similarly, immunological failure was associated with non-adherence, tuberculosis co-infection and human immunodeficiency virus RNA ≥ 1000 copies/mL. Conclusions The current results show that immunological and virological failure is a problem in a setting where highly active antiretroviral therapy has been largely scaled up, and the problem is greater in patients with poor adherence. This will in turn affect the global target of 90% viral suppression by 2020, and may indicate the need for more investment and commitment to improving patient adherence in the study area. PMID:29715323

  7. Satellite GN and C Anomaly Trends

    NASA Technical Reports Server (NTRS)

    Robertson, Brent; Stoneking, Eric

    2003-01-01

    On-orbit anomaly records for satellites launched from 1990 through 2001 are reviewed to determine recent trends of un-manned space mission critical failures. Anomalies categorized by subsystems show that Guidance, Navigation and Control (GN&C) subsystems have a high number of anomalies that result in a mission critical failure when compared to other subsystems. A mission critical failure is defined as a premature loss of a satellite or loss of its ability to perform its primary mission during its design life. The majority of anomalies are shown to occur early in the mission, usually within one year from launch. GN&C anomalies are categorized by cause and equipment type involved. A statistical analysis of the data is presented for all anomalies compared with the GN&C anomalies for various mission types, orbits and time periods. Conclusions and recommendations are presented for improving mission success and reliability.

  8. Multinational assessment of accuracy of equations for predicting risk of kidney failure: a meta-analysis

    PubMed Central

    Tangri, Navdeep; Grams, Morgan E.; Levey, Andrew S.; Coresh, Josef; Appel, Lawrence; Astor, Brad C.; Chodick, Gabriel; Collins, Allan J.; Djurdjev, Ognjenka; Elley, C. Raina; Evans, Marie; Garg, Amit X.; Hallan, Stein I.; Inker, Lesley; Ito, Sadayoshi; Jee, Sun Ha; Kovesdy, Csaba P.; Kronenberg, Florian; Lambers Heerspink, Hiddo J.; Marks, Angharad; Nadkarni, Girish N.; Navaneethan, Sankar D.; Nelson, Robert G.; Titze, Stephanie; Sarnak, Mark J.; Stengel, Benedicte; Woodward, Mark; Iseki, Kunitoshi

    2016-01-01

    Importance Identifying patients at risk of chronic kidney disease (CKD) progression may facilitate more optimal nephrology care. Kidney failure risk equations (KFREs) were previously developed and validated in two Canadian cohorts. Validation in other regions and in CKD populations not under the care of a nephrologist is needed. Objective To evaluate the accuracy of the KFREs across different geographic regions and patient populations through individual-participant data meta-analysis. Data Sources Thirty-one cohorts, including 721,357 participants with CKD Stages 3–5 in over 30 countries spanning 4 continents, were studied. These cohorts collected data from 1982 through 2014. Study Selection Cohorts participating in the CKD Prognosis Consortium with data on end-stage renal disease. Data Extraction and Synthesis Data were obtained and statistical analyses were performed between July 2012 and June 2015. Using the risk factors from the original KFREs, cohort-specific hazard ratios were estimated, and combined in meta-analysis to form new “pooled” KFREs. Original and pooled equation performance was compared, and the need for regional calibration factors was assessed. Main Outcome and Measure Kidney failure (treatment by dialysis or kidney transplantation). Results During a median follow-up of 4 years, 23,829 cases of kidney failure were observed. The original KFREs achieved excellent discrimination (ability to differentiate those who developed kidney failure from those who did not) across all cohorts (overall C statistic, 0.90 (95% CI 0.89–0.92) at 2 years and 0.88 (95% CI 0.86–0.90) at 5 years); discrimination in subgroups by age, race, and diabetes status was similar. There was no improvement with the pooled equations. Calibration (the difference between observed and predicted risk) was adequate in North American cohorts, but the original KFREs overestimated risk in some non-North American cohorts. Addition of a calibration factor that lowered the baseline risk by 32.9% at 2 years and 16.5% at 5 years improved the calibration in 12/15 and 10/13 non-North American cohorts at 2 and 5 years, respectively (p=0.04 and p=0.02). Conclusions and Relevance KFREs developed in a Canadian population showed high discrimination and adequate calibration when validated in 31 multinational cohorts. However, in some regions the addition of a calibration factor may be necessary. PMID:26757465
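    The two performance checks reported (discrimination via a C statistic and recalibration of the baseline risk) can be illustrated as follows, assuming the Python lifelines package and synthetic data; the actual KFRE coefficients and cohort data are not reproduced.

```python
# Sketch of discrimination and calibration checks for a kidney-failure risk
# score. Uses lifelines' concordance_index on synthetic data; the risk score
# is a stand-in for a KFRE prediction.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(5)
n = 2000
risk_score = rng.uniform(0, 1, size=n)                  # stand-in for a 2-year KFRE risk
# Higher risk -> shorter time to kidney failure, plus noise and censoring at 5 years.
time_to_kf = rng.exponential(10 * (1.2 - risk_score))
observed = (time_to_kf <= 5.0).astype(int)
time_to_kf = np.minimum(time_to_kf, 5.0)

# concordance_index expects scores that increase with survival time,
# so pass the negative of the risk score.
c_stat = concordance_index(time_to_kf, -risk_score, event_observed=observed)
print(f"C statistic: {c_stat:.3f}")

# A regional calibration factor of the kind described (e.g. lowering the
# 2-year baseline risk by ~33%) is a multiplicative rescaling of the risk:
calibrated_risk = np.clip(risk_score * (1 - 0.329), 0, 1)
```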

  9. Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Go, Susie

    2006-01-01

    A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer from several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exist. The approach is scalable, allowing inclusion of additional information as detailed data become available. The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
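    A minimal sketch of the Monte Carlo failure-evaluation idea, not the SAFE tool itself, with invented subsystem failure probabilities:

```python
# Sample subsystem failures from heritage-style per-mission probabilities and
# aggregate them into a loss-of-mission estimate. Probabilities are invented
# for illustration; the system is treated as a simple series configuration.
import numpy as np

rng = np.random.default_rng(6)
subsystems = {"engines": 0.010, "avionics": 0.004, "separation": 0.002, "GN&C": 0.003}

def one_mission():
    # The mission is lost if any subsystem fails (series reliability logic).
    return any(rng.random() < p for p in subsystems.values())

n_trials = 50_000
losses = sum(one_mission() for _ in range(n_trials))
p_hat = losses / n_trials
stderr = np.sqrt(p_hat * (1 - p_hat) / n_trials)
print(f"estimated loss-of-mission probability: {p_hat:.4f} +/- {1.96 * stderr:.4f}")

# Analytic check for the series system: 1 - prod(1 - p_i)
print("analytic:", 1 - np.prod([1 - p for p in subsystems.values()]))
```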

  10. Characteristics and intraoperative treatments associated with head and neck free tissue transfer complications and failures.

    PubMed

    Hand, William R; McSwain, Julie R; McEvoy, Matthew D; Wolf, Bethany; Algendy, Abdalrahman A; Parks, Matthew D; Murray, John L; Reeves, Scott T

    2015-03-01

    To investigate the association of perioperative patient characteristics and treatment modalities (eg, vasopressor use and volume of fluid administration) with complications and failure rates in patients undergoing head and neck free tissue transfer (FTT). A retrospective review of medical records. Perioperative hospitalization for head and neck FTT at 1 tertiary care medical center between January 1, 2009, and October 31, 2011. Consecutive patients (N=235) who underwent head and neck FTT. Demographic, patient characteristic, and intraoperative data were extracted from medical records. Complication and failure rates within the first 30 days were collected. In a multivariate analysis controlling for age, sex, ethnicity, reason for receiving flap, and type and volume of fluid given, perioperative complication was significantly associated with surgical blood loss (P=.019; 95% confidence interval [CI], 1.01-1.16), while the rate of intraoperative fluid administration did not reach statistical significance (P=.06; 95% CI, 0.99-1.28). In a univariate analysis, FTT failure was significantly associated with reason for surgery (odds ratio, 5.40; P=.03; 95% CI, 1.69-17.3) and preoperative diagnosis of coronary artery disease (odds ratio, 3.60; P=.03; 95% CI, 1.16-11.2). Intraoperative vasopressor administration was not associated with either FTT complication or failure rate. FTT complications were associated with surgical blood loss but not the use of vasoactive drugs. For patients undergoing FTT, judicious monitoring of blood loss may help stratify the risk of complication and failure. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2014.

  11. Visualizing collaborative electronic health record usage for hospitalized patients with heart failure.

    PubMed

    Soulakis, Nicholas D; Carson, Matthew B; Lee, Young Ji; Schneider, Daniel H; Skeehan, Connor T; Scholtens, Denise M

    2015-03-01

    To visualize and describe collaborative electronic health record (EHR) usage for hospitalized patients with heart failure. We identified records of patients with heart failure and all associated healthcare provider record usage through queries of the Northwestern Medicine Enterprise Data Warehouse. We constructed a network by equating access and updates of a patient's EHR to a provider-patient interaction. We then considered shared patient record access as the basis for a second network that we termed the provider collaboration network. We calculated network statistics, the modularity of provider interactions, and provider cliques. We identified 548 patient records accessed by 5113 healthcare providers in 2012. The provider collaboration network had 1504 nodes and 83 998 edges. We identified 7 major provider collaboration modules. Average clique size was 87.9 providers. We used a graph database to demonstrate an ad hoc query of our provider-patient network. Our analysis suggests a large number of healthcare providers across a wide variety of professions access records of patients with heart failure during their hospital stay. This shared record access tends to take place not only in a pairwise manner but also among large groups of providers. EHRs encode valuable interactions, implicitly or explicitly, between patients and providers. Network analysis provided strong evidence of multidisciplinary record access of patients with heart failure across teams of 100+ providers. Further investigation may lead to clearer understanding of how record access information can be used to strategically guide care coordination for patients hospitalized for heart failure. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  12. The Study of Cognitive Function and Related Factors in Patients With Heart Failure

    PubMed Central

    Ghanbari, Atefeh; Moaddab, Fatemeh; Salari, Arsalan; Kazemnezhad Leyli, Ehsan; Sedghi Sabet, Mitra; Paryad, Ezzat

    2013-01-01

    Background: Cognitive impairment is increasingly recognized as a common adverse consequence of heart failure. Both heart failure and cognitive impairment are associated with frequent hospitalization and increased mortality, particularly when they occur simultaneously. Objectives: To determine cognitive function and related factors in patients with heart failure. Materials and Methods: In this descriptive cross-sectional study, we assessed 239 patients with heart failure. Data were collected using the Mini-Mental Status Examination, the Charlson comorbidity index and the NYHA classification system. Data were analyzed using descriptive statistics, the Kolmogorov-Smirnov test, the chi-square test, the t-test and logistic regression analysis. Results: The mean score of cognitive function was 21.68 ± 4.51. In total, 155 patients (64.9%) had cognitive impairment. Significant associations were found between cognitive impairment status and gender (P < 0.002), education level (P < 0.000), living location (P < 0.000), marital status (P < 0.03), living arrangement (P < 0.001), employment status (P < 0.000), income (P < 0.02), being the head of the family (P < 0.03), family size (P < 0.02), having supplemental insurance (P < 0.003) and the patient's comorbidities (P < 0.02). However, in the logistic regression analysis, only education and supplementary insurance predicted cognitive status, which indicates that patients with supplementary insurance and higher education levels were more likely to maintain optimal cognitive function. Conclusions: More than half of the subjects had cognitive impairment. As the level of patients' cognitive functioning affects their behaviors and daily living activities, it is recommended that patients with heart failure be assessed for their cognitive functioning. PMID:25414874

  13. Characteristics and Intraoperative Treatments Associated with Head and Neck Free Tissue Transfer Complications and Failures

    PubMed Central

    Hand, William R.; McSwain, Julie R.; McEvoy, Matthew D.; Wolf, Bethany; Algendy, Abdalrahman A.; Parks, Matthew D.; Murray, John L.; Reeves, Scott T.

    2015-01-01

    Objective To investigate the association of perioperative patient characteristics and treatment modalities (eg, vasopressor use and volume of fluid administration) with complications and failure rates in patients undergoing head and neck free tissue transfer (FTT). Study Design A retrospective review of medical records. Setting Perioperative hospitalization for head and neck FTT at 1 tertiary care medical center between January 1, 2009, and October 31, 2011. Subjects and Methods Consecutive patients (N = 235) who underwent head and neck FTT. Demographic, patient characteristic, and intraoperative data were extracted from medical records. Complication and failure rates within the first 30 days were collected. Results In a multivariate analysis controlling for age, sex, ethnicity, reason for receiving flap, and type and volume of fluid given, perioperative complication was significantly associated with surgical blood loss (P = .019; 95% confidence interval [CI], 1.01-1.16), while the rate of intraoperative fluid administration did not reach statistical significance (P = .06; 95% CI, 0.99-1.28). In a univariate analysis, FTT failure was significantly associated with reason for surgery (odds ratio, 5.40; P = .03; 95% CI, 1.69-17.3) and preoperative diagnosis of coronary artery disease (odds ratio, 3.60; P = .03; 95% CI, 1.16-11.2). Intraoperative vasopressor administration was not associated with either FTT complication or failure rate. Conclusions FTT complications were associated with surgical blood loss but not the use of vasoactive drugs. For patients undergoing FTT, judicious monitoring of blood loss may help stratify the risk of complication and failure. PMID:25550221

  14. Comparison of the compressive strength of 3 different implant design systems.

    PubMed

    Pedroza, Jose E; Torrealba, Ysidora; Elias, Augusto; Psoter, Walter

    2007-01-01

    The aims of this study were twofold: to compare the static compressive strength at the implant-abutment interface of 3 design systems and to describe the implant abutment connection failure mode. A stainless steel holding device was designed to align the implants at 30 degrees with respect to the y-axis. Sixty-nine specimens were used, 23 for each system. A computer-controlled universal testing machine (MTS 810) applied static compression loading by a unidirectional vertical piston until failure. Specimens were evaluated macroscopically for longitudinal displacement, abutment looseness, and screw and implant fracture. Data were analyzed by analysis of variance (ANOVA). The mean compressive strength for the Unipost system was 392.5 psi (SD ±40.9), for the Spline system 342.8 psi (SD ±25.8), and for the Screw-Vent system 269.1 psi (SD ±30.7). The Unipost implant-abutment connection demonstrated a statistically significant superior mechanical stability (P ≤ .009) compared with the Spline implant system. The Spline implant system showed a statistically significant higher compressive strength than the Screw-Vent implant system (P ≤ .009). Regarding failure mode, the Unipost system consistently broke at the same site, while the other systems failed at different points of the connection. The Unipost system demonstrated excellent fracture resistance to compressive forces; this resistance may be attributed primarily to the diameter of the abutment screw and the 2.5 mm counter bore, representing the same and a unique piece of the implant. The Unipost implant system demonstrated a statistically significant superior compressive strength value compared with the Spline and Screw-Vent systems, at a 30 degrees angulation.

  15. Gaining power and precision by using model-based weights in the analysis of late stage cancer trials with substantial treatment switching.

    PubMed

    Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R

    2016-04-30

    In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and impact the assessment of a treatment's benefit on health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late stage cancer trial context given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
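    The general two-group weighted log-rank statistic that such weights plug into can be written out directly with numpy, as in the sketch below; the weight function here is a placeholder, not the switching-based weights derived in the paper.

```python
# Generic weighted log-rank statistic for two groups, written out so the role
# of the weights is explicit. Constant weights reduce to the ordinary
# log-rank test; the example weight function simply up-weights later events.
import numpy as np

def weighted_logrank(time, event, group, weight_fn=lambda t, n_at_risk: 1.0):
    """Z statistic for H0: equal hazards, with weight w_j at each event time."""
    time, event, group = map(np.asarray, (time, event, group))
    event_times = np.unique(time[event == 1])
    num, var = 0.0, 0.0
    for t in event_times:
        at_risk = time >= t
        n = at_risk.sum()                       # total at risk just before t
        n1 = (at_risk & (group == 1)).sum()     # at risk in group 1
        d = ((time == t) & (event == 1)).sum()  # events at t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        w = weight_fn(t, n)
        num += w * (d1 - d * n1 / n)
        if n > 1:
            var += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / np.sqrt(var)

# Synthetic trial with modest separation of the survival curves.
rng = np.random.default_rng(7)
t0, t1 = rng.exponential(10, 200), rng.exponential(13, 200)
time = np.concatenate([np.minimum(t0, 15), np.minimum(t1, 15)])
event = np.concatenate([(t0 <= 15), (t1 <= 15)]).astype(int)
group = np.concatenate([np.zeros(200), np.ones(200)]).astype(int)
print("unweighted Z:", round(weighted_logrank(time, event, group), 2))
print("late-weighted Z:", round(weighted_logrank(time, event, group,
      weight_fn=lambda t, n: t), 2))   # up-weight later event times
```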

  16. Feasibility and acceptability of a self-measurement using a portable bioelectrical impedance analysis, by the patient with chronic heart failure, in acute decompensated heart failure.

    PubMed

    Huguel, Benjamin; Vaugrenard, Thibaud; Saby, Ludivine; Benhamou, Lionel; Arméro, Sébastien; Camilleri, Élise; Langar, Aida; Alitta, Quentin; Grino, Michel; Retornaz, Frédérique

    2018-06-01

    Chronic heart failure (CHF) is a major public health issue. Mainly affecting the elderly, it is responsible for a high rate of hospitalization due to the frequency of acute decompensated heart failure (ADHF). It is a disabling pathology for the patient and very costly for the health care system. Our study was designed to assess a connected, portable bioelectrical impedance analysis (BIA) device that could reduce these hospitalizations by detecting ADHF early. This prospective study included patients hospitalized in cardiology for ADHF. Patients performed 3 self-measurements using the BIA device during their hospitalization and answered a questionnaire evaluating the acceptability of this self-measurement. The results of these measurements were compared with the patients' clinical, biological and echocardiographic parameters at the same time. Twenty-three patients were included; the self-measurement was conducted autonomously by more than 80% of the patients over the whole duration of the hospitalization. The acceptability (90%) of the portable BIA device was excellent. Some correlations were statistically significant, such as that between the change in total body water and the change in weight (p=0.001). There were common trends between the variation of the impedance measures and the other evaluation criteria. The feasibility and acceptability of self-measured bioelectrical impedance analysis by the patient in ADHF open up major prospects for the monitoring of patients with CHF. The value of this tool in preventing ADHF episodes that lead to hospitalization or re-hospitalization now needs to be demonstrated in further studies.

  17. Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.

    PubMed

    Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe

    2017-12-27

    The present study aimed to evaluate the characteristics and quality of statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 by using PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of quality statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of cluster-specific approach in data analysis. Owing to these concerns, statistical methodology was judged as inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain appropriate assessment of the efficacy of desensitizing agents.
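    For the second concern (missing P-value adjustment for multiple comparisons), a Holm correction is one standard remedy; a short sketch assuming the statsmodels package and invented p-values:

```python
# Holm adjustment of a family of p-values. The raw p-values below are
# invented placeholders (e.g. one comparison per desensitizing agent), not
# values from the reviewed trials.
from statsmodels.stats.multitest import multipletests

raw_p = [0.012, 0.034, 0.049, 0.003, 0.20]
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="holm")
for p, q, r in zip(raw_p, adj_p, reject):
    print(f"raw p={p:.3f} -> Holm-adjusted p={q:.3f}  significant={r}")
```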

  18. Effect of the infrastructure material on the failure behavior of prosthetic crowns.

    PubMed

    Sonza, Queli Nunes; Della Bona, Alvaro; Borba, Márcia

    2014-05-01

    To evaluate the effect of infrastructure (IS) material on the fracture behavior of prosthetic crowns. Restorations were fabricated using a metal die simulating a prepared tooth. Four groups were evaluated: YZ-C, Y-TZP (In-Ceram YZ, Vita) IS produced by CAD-CAM; IZ-C, In-Ceram Zirconia (Vita) IS produced by CAD-CAM; IZ-S, In-Ceram Zirconia (Vita) IS produced by slip-cast; MC, metal IS (control). The IS were veneered with porcelain and resin cemented to fiber-reinforced composite dies. Specimens were loaded in compression to failure using a universal testing machine. The 30° angle load was applied by a spherical piston, in 37°C distilled water. Fractography was performed using a stereomicroscope and SEM. Data were statistically analyzed with ANOVA and Student-Newman-Keuls tests (α=0.05). Significant differences were found between groups (p=0.022). MC showed the highest mean failure load, statistically similar to YZ-C. There was no statistical difference between YZ-C, IZ-C and IZ-S. MC and YZ-C showed no catastrophic failures. IZ-C and IZ-S showed chipping and catastrophic failures. The fracture behavior is similar to reported clinical failures. Considering the ceramic systems evaluated, YZ-C and MC crowns present greater fracture load and a more favorable failure mode than In-Ceram Zirconia crowns, regardless of the fabrication method (CAD-CAM or slip-cast). Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  19. Test Bus Evaluation

    DTIC Science & Technology

    1998-04-01

    selected is statistically based on the total number of faults and the failure rate distribution in the system under test. The fault set is also...implemented the BPM and system level emulation consolidation logic as well as statistics counters for cache misses and various bus transactions. These...instruction F22 Advanced Tactical Fighter FET Field Effect Transistor FF Flip-Flop FM Failures/Million hours C-3 FPGA Field Programmable Gate Array GET

  20. Analysis of risk factors for central venous port failure in cancer patients

    PubMed Central

    Hsieh, Ching-Chuan; Weng, Hsu-Huei; Huang, Wen-Shih; Wang, Wen-Ke; Kao, Chiung-Lun; Lu, Ming-Shian; Wang, Chia-Siu

    2009-01-01

    AIM: To analyze the risk factors for central port failure in cancer patients administered chemotherapy, using univariate and multivariate analyses. METHODS: A total of 1348 totally implantable venous access devices (TIVADs) were implanted into 1280 cancer patients in this cohort study. A Cox proportional hazard model was applied to analyze risk factors for failure of TIVADs. Log-rank test was used to compare actuarial survival rates. Infection, thrombosis, and surgical complication rates (χ2 test or Fisher’s exact test) were compared in relation to the risk factors. RESULTS: Increasing age, male gender and open-ended catheter use were significant risk factors reducing survival of TIVADs as determined by univariate and multivariate analyses. Hematogenous malignancy decreased the survival time of TIVADs; this reduction was not statistically significant by univariate analysis [hazard ratio (HR) = 1.336, 95% CI: 0.966-1.849, P = 0.080)]. However, it became a significant risk factor by multivariate analysis (HR = 1.499, 95% CI: 1.079-2.083, P = 0.016) when correlated with variables of age, sex and catheter type. Close-ended (Groshong) catheters had a lower thrombosis rate than open-ended catheters (2.5% vs 5%, P = 0.015). Hematogenous malignancy had higher infection rates than solid malignancy (10.5% vs 2.5%, P < 0.001). CONCLUSION: Increasing age, male gender, open-ended catheters and hematogenous malignancy were risk factors for TIVAD failure. Close-ended catheters had lower thrombosis rates and hematogenous malignancy had higher infection rates. PMID:19787834
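    A sketch of the kind of Cox proportional-hazards analysis described, assuming the Python lifelines package and synthetic covariates of the same type (age, sex, catheter design); no study data or effect sizes are reproduced.

```python
# Fit a Cox proportional-hazards model for time to device failure on synthetic
# data. The hazard ratios exp(coef) reported in the summary correspond to the
# HR values quoted in such analyses; all effect sizes here are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(60, 12, n),
    "male": rng.integers(0, 2, n),
    "open_ended": rng.integers(0, 2, n),
})
# Hazard increases with age, male sex and open-ended catheters (illustrative only).
lin_pred = 0.02 * (df["age"] - 60) + 0.3 * df["male"] + 0.4 * df["open_ended"]
t = rng.exponential(1000 * np.exp(-lin_pred))            # days until port failure
df["duration"] = np.minimum(t, 730)                      # 2-year follow-up
df["failed"] = (t <= 730).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="failed")
cph.print_summary()   # the exp(coef) column gives the hazard ratios
```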

  1. The influence of the compression interface on the failure behavior and size effect of concrete

    NASA Astrophysics Data System (ADS)

    Kampmann, Raphael

    The failure behavior of concrete materials is not completely understood because conventional test methods fail to assess the material response independent of the sample size and shape. To study the influence of strength- and strain-affecting test conditions, four typical concrete sample types were experimentally evaluated in uniaxial compression and analyzed for strength, deformational behavior, crack initiation/propagation, and fracture patterns under varying boundary conditions. Both low-friction and conventional compression interfaces were assessed. High-speed video technology was used to monitor macrocracking. Inferential data analysis showed reliably lower strength results for reduced surface friction at the compression interfaces, regardless of sample shape. Reciprocal comparisons revealed statistically significant strength differences between most sample shapes. Crack initiation and propagation were found to differ for dissimilar compression interfaces. The principal stress and strain distributions were analyzed, and the strain domain was found to resemble the experimental results, whereas the stress analysis failed to explain failure for reduced end confinement. Neither stresses nor strains indicated strength reductions due to reduced friction, and therefore buckling effects were considered. The high-speed video analysis revealed localized buckling phenomena, regardless of end confinement. Slender elements were the result of low friction, and stocky fragments developed under conventional confinement. The critical buckling load increased accordingly. The research showed that current test methods do not reflect the "true" compressive strength and that concrete failure is strain driven. Ultimate collapse results from buckling preceded by unstable cracking.

  2. Restorations in abrasion/erosion cervical lesions: 8-year results of a triple blind randomized controlled trial.

    PubMed

    Dall'Orologio, Giovanni Dondi; Lorenzi, Roberta

    2014-10-01

    An equivalence randomized controlled trial within the subject was organized to evaluate the long-term clinical success of a new 2-step etch & rinse adhesive and a new nano-filled ormocer. 50 subjects, 21 males and 29 females aged between 21 and 65, were randomized to receive 150 restorations, 100 with the new restorative material and 50 with the composite as control, placed in non-carious cervical lesions with the same bonding system. The main outcome measure was the cause of failure at 8 years. Randomization was number table-generated, with allocation concealment by opaque, sequentially numbered, sealed and stapled envelopes. Subjects, examiner, and analyst were blinded to group assignment. Two interim analyses were performed. Data were analyzed by ANOVA and Cox test (P < 0.05). After 8 years, 40 subjects and 120 teeth were included in the analysis of the primary outcome. There were eight failures in the experimental group and four failures in the control group. The cumulative loss rate was 7% for both restorative materials, with an annual failure rate lower than 1% and no statistically significant difference between materials. There were two key elements of failure: the presence of sclerotic dentin and the relationship between the lesion and the gingival margin.
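
    The relationship between the reported 8-year cumulative loss rate and the sub-1% annual failure rate follows from a constant-annual-risk approximation; a minimal check of that arithmetic is sketched below (the approximation is an assumption, not the trial's stated method).

    ```python
    # Converting an 8-year cumulative loss rate to an equivalent annual failure rate,
    # assuming constant annual risk. A 7% cumulative loss at 8 years corresponds to
    # roughly 0.9% per year, consistent with "annual failure rate lower than 1%".
    cumulative_loss = 0.07
    years = 8
    annual_failure_rate = 1 - (1 - cumulative_loss) ** (1 / years)
    print(f"{annual_failure_rate:.3%}")  # ~0.903%
    ```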

  3. Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending

    NASA Astrophysics Data System (ADS)

    Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus

    2013-06-01

    Mechanical failure resulting from subcritical crack growth in the SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered as an electrical probe to monitor the subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate crack propagation in the barrier layer. As a consequence of using two loading modes, the characteristic failure strain and failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. In this study, results from the tests in dynamic and static loading modes were linked by a power-law description to determine the critical failure over a range of conditions. The fatigue parameter n from the power law decreases markedly, from 70 to 31, upon correcting for internal strain. The testing method and analysis tool described in the paper can be used to understand the limits of thin-film barriers in terms of their mechanical properties.
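
    As a rough illustration of the Weibull description of failure strain mentioned above, the sketch below fits a two-parameter Weibull distribution by median-rank regression; the strain values are invented placeholders, not data from the paper.

    ```python
    # Median-rank regression on ln(-ln(1 - F)) vs. ln(strain): the slope is the
    # Weibull modulus and the intercept gives the characteristic (63.2%) strain.
    import numpy as np

    strains = np.sort(np.array([1.05, 1.12, 1.18, 1.21, 1.27, 1.30, 1.36, 1.41]))  # % strain at failure
    n = strains.size
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)              # Bernard's median-rank estimate of failure probability

    x = np.log(strains)
    y = np.log(-np.log(1.0 - F))
    shape_m, intercept = np.polyfit(x, y, 1)   # slope = Weibull modulus (shape)
    scale_eta = np.exp(-intercept / shape_m)   # characteristic failure strain

    print(f"Weibull modulus m = {shape_m:.2f}, characteristic strain = {scale_eta:.3f} %")
    ```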

  4. Failure to activate the in-hospital emergency team: causes and outcomes.

    PubMed

    Barbosa, Vera; Gomes, Ernestina; Vaz, Senio; Azevedo, Gustavo; Fernandes, Gonçalo; Ferreira, Amélia; Araujo, Rui

    2016-01-01

    To determine the incidence of afferent limb failure of the in-hospital Medical Emergency Team, characterizing it and comparing mortality between the population experiencing afferent limb failure and the population not experiencing it. A total of 478 activations of the Medical Emergency Team of Hospital Pedro Hispano occurred from January 2013 to July 2015. A sample of 285 activations was obtained after excluding incomplete records and activations for patients with less than 6 hours of hospitalization. The sample was divided into two groups: the group experiencing afferent limb failure and the group not experiencing afferent limb failure of the Medical Emergency Team. Both populations were characterized and compared. Statistical significance was set at p ≤ 0.05. Afferent limb failure was observed in 22.1% of activations. The causal analysis revealed significant differences in Medical Emergency Team activation criteria (p = 0.003) in the group experiencing afferent limb failure, with higher rates of activation for cardiac arrest and cardiovascular dysfunction. Regarding patient outcomes, the group experiencing afferent limb failure had higher immediate mortality rates and higher mortality rates at hospital discharge, although these differences were not statistically significant. No significant differences were found for the other parameters. The incidence of cardiac arrest and the mortality rate were higher in patients experiencing failure of the afferent limb of the Medical Emergency Team. This study highlights the need for health units to invest in the training of all healthcare professionals regarding the Medical Emergency Team activation criteria and emergency medical response system operations.

  5. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.

  6. The Effect of Fiber Strength Stochastics and Local Fiber Volume Fraction on Multiscale Progressive Failure of Composites

    NASA Technical Reports Server (NTRS)

    Ricks, Trenton M.; Lacy, Jr., Thomas E.; Bednarcyk, Brett A.; Arnold, Steven M.

    2013-01-01

    Continuous fiber unidirectional polymer matrix composites (PMCs) can exhibit significant local variations in fiber volume fraction as a result of processing conditions that can lead to further local differences in material properties and failure behavior. In this work, the coupled effects of both local variations in fiber volume fraction and the empirically-based statistical distribution of fiber strengths on the predicted longitudinal modulus and local tensile strength of a unidirectional AS4 carbon fiber/Hercules 3502 epoxy composite were investigated using the special purpose NASA Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC); local effective composite properties were obtained by homogenizing the material behavior over repeating unit cells (RUCs). The predicted effective longitudinal modulus was relatively insensitive to small (8%) variations in local fiber volume fraction. The composite tensile strength, however, was highly dependent on the local distribution of fiber strengths. The RUC-averaged constitutive response can be used to characterize lower length scale material behavior within a multiscale analysis framework that couples the NASA code FEAMAC and the ABAQUS finite element solver. Such an approach can be effectively used to analyze the progressive failure of PMC structures whose failure initiates at the RUC level. Consideration of the effect of local variations in constituent properties and morphologies on progressive failure of PMCs is a central aspect of the application of Integrated Computational Materials Engineering (ICME) principles to composite materials.

  7. Right ventricular functional analysis utilizing first pass radionuclide angiography for pre-operative ventricular assist device planning: a multi-modality comparison.

    PubMed

    Avery, Ryan; Day, Kevin; Jokerst, Clinton; Kazui, Toshinobu; Krupinski, Elizabeth; Khalpey, Zain

    2017-10-10

    Advanced heart failure treated with a left ventricular assist device is associated with a higher risk of right heart failure. Many advanced heart failure patients are treated with an ICD, a relative contraindication to MRI, prior to assist device placement. Given this limitation, left and right ventricular function for patients with an ICD is calculated using radionuclide angiography utilizing planar multigated acquisition (MUGA) and first pass radionuclide angiography (FPRNA), respectively. Given the availability of MRI protocols that can accommodate patients with ICDs, we correlated ventricular functional analysis by radionuclide angiography with cardiac MRI, the reference standard for ventricular function calculation, to directly compare calculated ejection fractions between these modalities and to assess agreement with available echocardiographic and hemodynamic parameters of right ventricular function. A retrospective review from January 2012 through May 2014 was performed to identify advanced heart failure patients who underwent both cardiac MRI and radionuclide angiography for ventricular functional analysis. Nine heart failure patients (8 men, 1 woman; mean age of 57.0 years) were identified. The average time between the cardiac MRI and radionuclide angiography exams was 38.9 days (range: 1 - 119 days). All patients undergoing cardiac MRI were scanned using an institutionally approved protocol for ICDs, with no device-related complications identified. A retrospective chart review of each patient for cardiomyopathy diagnosis, clinical follow-up, and echocardiogram and right heart catheterization performed during evaluation was also performed. The 9 patients demonstrated a mean left ventricular ejection fraction (LVEF) using cardiac MRI of 20.7% (12 - 40%). Mean LVEF using MUGA was 22.6% (12 - 49%). The mean right ventricular ejection fraction (RVEF) utilizing cardiac MRI was 28.3% (16 - 43%), and the mean RVEF calculated by FPRNA was 32.6% (9 - 56%). The mean discrepancy for LVEF between cardiac MRI and MUGA was 4.1% (0 - 9%), and correlation of calculated LVEF using cardiac MRI and MUGA demonstrated an R of 0.9. The mean discrepancy for RVEF between cardiac MRI and FPRNA was 12.0% (range: 2 - 24%) with a moderate correlation (R = 0.5). The increased discrepancies for RV analysis were statistically significant using an unpaired t-test (t = 3.19, p = 0.0061). Echocardiogram parameters of RV function, including TAPSE and FAC, were available for all 9 patients; agreement with cardiac MRI demonstrated a kappa statistic for TAPSE of 0.39 (95% CI of 0.06 - 0.72) and for FAC of 0.64 (95% CI of 0.21 - 1.00). Heart failure patients are increasingly requiring left ventricular assist device placement; however, definitive evaluation of biventricular function is required due to the increased mortality rate associated with right heart failure after assist device placement. Our results suggest that FPRNA has only a moderate correlation with reference standard RVEFs calculated using cardiac MRI, which was similar to the calculated agreement between cardiac MRI and echocardiographic parameters of right ventricular function. Given the need to identify patients at risk for right heart failure, further studies are warranted to determine a more accurate estimate of RVEF for heart failure patients during pre-operative ventricular assist device planning.
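
    The agreement statistics reported above (correlation R between modalities and an unpaired t-test on modality discrepancies) can be reproduced with standard SciPy calls; a minimal sketch follows, with invented placeholder values rather than the study's data.

    ```python
    # Illustrative agreement metrics: Pearson correlation of MRI- vs. FPRNA-derived RVEF,
    # and an unpaired t-test comparing LV vs. RV absolute modality discrepancies.
    import numpy as np
    from scipy import stats

    mri_rvef = np.array([28, 16, 43, 30, 25, 35, 22, 27, 29], dtype=float)
    fprna_rvef = np.array([35, 9, 56, 40, 20, 45, 30, 26, 33], dtype=float)

    r, p = stats.pearsonr(mri_rvef, fprna_rvef)
    print(f"RVEF correlation R = {r:.2f} (p = {p:.3f})")

    lv_discrepancy = np.array([4, 1, 9, 3, 5, 2, 6, 4, 3], dtype=float)
    rv_discrepancy = np.abs(mri_rvef - fprna_rvef)
    t, p = stats.ttest_ind(lv_discrepancy, rv_discrepancy)
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```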

  8. Nosocomial pneumonia caused by methicillin-resistant Staphylococcus aureus treated with linezolid or vancomycin: A secondary economic analysis of resource use from a Spanish perspective.

    PubMed

    Rello, J; Nieto, M; Solé-Violán, J; Wan, Y; Gao, X; Solem, C T; De Salas-Cansado, M; Mesa, F; Charbonneau, C; Chastre, J

    2016-11-01

    Adopting a unique Spanish perspective, this study aims to assess healthcare resource utilization (HCRU) and the costs of treating nosocomial pneumonia (NP) produced by methicillin-resistant Staphylococcus aureus (MRSA) in hospitalized adults using linezolid or vancomycin. An evaluation is also made of the renal failure rate and related economic outcomes between study groups. An economic post hoc evaluation of a randomized, double-blind, multicenter phase 4 study was carried out. Nosocomial pneumonia due to MRSA in hospitalized adults. The modified intent to treat (mITT) population comprised 224 linezolid- and 224 vancomycin-treated patients. Costs and HCRU were evaluated between patients administered either linezolid or vancomycin, and between patients who developed renal failure and those who did not. Analysis of HCRU outcomes and costs. Total costs were similar between the linezolid- (€17,782±€9,615) and vancomycin-treated patients (€17,423±€9,460) (P=.69). The renal failure rate was significantly lower in the linezolid-treated patients (4% vs. 15%; P<.001). The total costs tended to be higher in patients who developed renal failure (€19,626±€10,840 vs. €17,388±€9,369; P=.14). Among the patients who developed renal failure, HCRU (days on mechanical ventilation: 13.2±10.7 vs. 7.6±3.6 days; P=.21; ICU stay: 14.4±10.5 vs. 9.9±6.6 days; P=.30; hospital stay: 19.5±9.5 vs. 16.1±11.0 days; P=.26) and cost (€17,219±€8,792 vs. €20,263±€11,350; P=.51) tended to be lower in the linezolid- vs. vancomycin-treated patients. There were no statistically significant differences in costs per patient-day between cohorts after correcting for mortality (€1000 vs. €1,010; P=.98). From a Spanish perspective, there were no statistically significant differences in total costs between the linezolid and vancomycin pneumonia cohorts. The drug cost corresponding to linezolid was partially offset by fewer renal failure adverse events. Copyright © 2016 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  9. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. A process control and quality improvement design was used, with routine prospectively acquired data collection starting in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
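
    As a rough illustration of the control-charting idea above, the sketch below builds a Shewhart p-chart for the proportion of epidural patients in severe pain per month, with 3-sigma limits; the monthly counts are invented placeholders, not the audit data.

    ```python
    # p-chart: centre line is the pooled proportion; limits vary with subgroup size.
    import numpy as np

    severe_pain = np.array([6, 8, 5, 9, 7, 4, 6, 3, 5, 2, 3, 2])                # patients in severe pain per month
    audited     = np.array([24, 26, 22, 30, 25, 23, 27, 24, 26, 25, 24, 23])    # epidurals audited per month

    p = severe_pain / audited
    p_bar = severe_pain.sum() / audited.sum()
    sigma = np.sqrt(p_bar * (1 - p_bar) / audited)
    ucl = p_bar + 3 * sigma
    lcl = np.clip(p_bar - 3 * sigma, 0, None)

    for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
        flag = "SPECIAL CAUSE" if (pi > hi or pi < lo) else ""
        print(f"month {month:2d}: p = {pi:.2f}  limits = [{lo:.2f}, {hi:.2f}] {flag}")
    ```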

  10. Development of failure model for nickel cadmium cells

    NASA Technical Reports Server (NTRS)

    Gupta, A.

    1980-01-01

    The development of a method for the life prediction of nickel cadmium cells is discussed. The approach described involves acquiring an understanding of the mechanisms of degradation and failure and at the same time developing nondestructive evaluation techniques for the nickel cadmium cells. The development of a statistical failure model which will describe the mechanisms of degradation and failure is outlined.

  11. Enhanced Component Performance Study: Motor-Driven Pumps 1998–2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2016-02-01

    This report presents an enhanced performance evaluation of motor-driven pumps at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience failure reports from fiscal year 1998 through 2014 for the component reliability as reported in the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The motor-driven pump failure modes considered for standby systems are failure to start, failure to run less than or equal to one hour, and failure to run more than one hour; for normally running systems, the failure modes considered are failure to start and failure to run. An eight-hour unreliability estimate is also calculated and trended. The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. Statistically significant increasing trends were identified in pump run hours per reactor year. Statistically significant decreasing trends were identified for standby systems in the industry-wide frequency of start demands and in run hours per reactor year for runs of less than or equal to one hour.
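
    A minimal sketch of the kind of failure-rate estimate such studies trend is given below: a maximum-likelihood rate for a "failure to run" mode with an exact (chi-square) confidence interval. The counts and run hours are invented placeholders, not values from the report.

    ```python
    # Exact 95% confidence interval for a Poisson failure rate (failures per run hour).
    from scipy.stats import chi2

    failures = 12          # observed failures to run (assumed)
    run_hours = 350_000.0  # cumulative run hours (assumed)

    rate = failures / run_hours
    lower = chi2.ppf(0.025, 2 * failures) / (2 * run_hours)
    upper = chi2.ppf(0.975, 2 * (failures + 1)) / (2 * run_hours)
    print(f"lambda = {rate:.2e}/h  95% CI = ({lower:.2e}, {upper:.2e})")
    ```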

  12. Failures to further developing orphan medicinal products after designation granted in Europe: an analysis of marketing authorisation failures and abandoned drugs.

    PubMed

    Giannuzzi, Viviana; Landi, Annalisa; Bosone, Enrico; Giannuzzi, Floriana; Nicotri, Stefano; Torrent-Farnell, Josep; Bonifazi, Fedele; Felisi, Mariagrazia; Bonifazi, Donato; Ceci, Adriana

    2017-09-11

    The research and development process in the field of rare diseases is characterised by many well-known difficulties, and a large percentage of orphan medicinal products do not reach marketing approval. This work aims at identifying orphan medicinal products that failed the developmental process and investigating reasons for, and possible factors influencing, failures. Drugs designated in Europe under Regulation (European Commission) 141/2000 in the period 2000-2012 were investigated in terms of the following failures: (1) marketing authorisation failures (refused or withdrawn) and (2) drugs abandoned by sponsors during development. Possible risk factors for failure were analysed using statistically validated methods. This study points out that 437 out of 788 designations are still under development, while 219 failed the developmental process. Among the latter, 34 failed the marketing authorisation process and 185 were abandoned during development. In the first group of drugs (marketing authorisation failures), 50% reached phase II, 47% reached phase III and 3% reached phase I, while in the second group (abandoned drugs), the majority of orphan medicinal products apparently never started the development process, since no data were published for 48.1% of them and 3.2% did not progress beyond the non-clinical stage. The reasons for failures of marketing authorisation were: efficacy/safety issues (26), insufficient data (12), quality issues (7), regulatory issues on trials (4) and commercial reasons (1). The main causes for abandoned drugs were efficacy/safety issues (reported in 54 cases), inactive companies (25.4%), change of company strategy (8.1%) and drug competition (10.8%). No information concerning reasons for failure was available for 23.2% of the analysed products. This analysis shows that failures occurred in 27.8% of all designations granted in Europe, the main reasons being safety and efficacy issues. Moreover, the stage of development reached by drugs represents a specific risk factor for failure. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Assessment of Various Risk Factors for Success of Delayed and Immediate Loaded Dental Implants: A Retrospective Analysis.

    PubMed

    Prasant, M C; Thukral, Rishi; Kumar, Sachin; Sadrani, Sannishth M; Baxi, Harsh; Shah, Aditi

    2016-10-01

    Ever since its introduction in 1977, a minimum of a few months has been required for osseointegration to take place after dental implant surgery. With advancements in the field of dental implants, this healing period has become progressively shorter, and immediate loading of dental implants has become a popular procedure in recent years. Hence, we retrospectively analyzed the various risk factors for the failure of delayed and immediate loaded dental implants. In the present study, a retrospective analysis was done of all patients who underwent dental implant surgery by either an immediate loading procedure or a delayed loading procedure. Patients were divided broadly into two groups: one group contained patients in whom delayed loaded dental implants were placed, while the other consisted of patients in whom immediate loaded dental implants were placed. All patients whose follow-up records were missing or who had a past medical history of any systemic disease were excluded from the present study. Evaluation of associated possible risk factors was done by classifying the predictive factors as primary and secondary factors. All results were analyzed with Statistical Package for the Social Sciences (SPSS) software. Kaplan-Meier survival analyses and the chi-square test were used to assess the level of significance. In the delayed and immediate groups of dental implants, the mean age of the patients was 54.2 and 54.8 years, respectively. Statistically significant results were obtained when comparing the clinical parameters of the dental implants in both groups, while demographic parameters showed no significant correlation. A significantly higher risk of dental implant failure is associated with immediate loaded dental implants. Tobacco smoking, shorter implant size, and other risk factors play a significant role in predicting the success and failure of dental implants. Delayed loaded dental implant placement should be preferred, as it is associated with a decreased risk of implant failure.
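
    A minimal sketch of the Kaplan-Meier and chi-square comparison mentioned above is given below, using the lifelines and SciPy libraries; the file name and column names are assumed placeholders, not the study's records.

    ```python
    # Kaplan-Meier survival curves per loading group plus a chi-square test on failure counts.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from scipy.stats import chi2_contingency

    df = pd.read_csv("implants.csv")  # hypothetical: one row per implant (months, failed, immediate)
    delayed = df[df["immediate"] == 0]
    immediate = df[df["immediate"] == 1]

    kmf = KaplanMeierFitter()
    kmf.fit(delayed["months"], event_observed=delayed["failed"], label="delayed loading")
    print(kmf.survival_function_.tail())
    kmf.fit(immediate["months"], event_observed=immediate["failed"], label="immediate loading")
    print(kmf.survival_function_.tail())

    # Chi-square test comparing failure proportions between the two groups.
    table = pd.crosstab(df["immediate"], df["failed"])
    chi2_stat, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2_stat:.2f}, p = {p:.4f}")
    ```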

  14. Evaluation program for secondary spacecraft cells

    NASA Technical Reports Server (NTRS)

    Christy, D. E.; Harkness, J. D.

    1973-01-01

    A life cycle test of secondary electric batteries for spacecraft applications was conducted. A sample number of nickel cadmium batteries were subjected to general performance tests to determine the limit of their actual capabilities. Weaknesses discovered in cell design are reported and aid in research and development efforts toward improving the reliability of spacecraft batteries. A statistical analysis of the life cycle prediction and cause of failure versus test conditions is provided.

  15. Review of Literature on Probability of Detection for Liquid Penetrant Nondestructive Testing

    DTIC Science & Technology

    2011-11-01

    increased maintenance costs, or catastrophic failure of safety-critical structure. Knowledge of the reliability achieved by NDT methods, including...representative components to gather data for statistical analysis, which can be prohibitively expensive. To account for sampling variability inherent in any...Sioux City and Pensacola. (Those recommendations were discussed in Section 3.4.) Drury et al. report on a factorial experiment aimed at identifying the

  16. Sensitivity of goodness-of-fit statistics to rainfall data rounding off

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Puliga, Michelangelo

    An analysis based on the L-moments theory suggests adopting the generalized Pareto distribution to interpret daily rainfall depths recorded by the rain-gauge network of the Hydrological Survey of the Sardinia Region. Nevertheless, a significant and not yet completely resolved problem arises in the estimation of a left-censoring threshold able to assure a good fit of the rainfall data to the generalized Pareto distribution. In order to detect an optimal threshold while keeping the largest possible number of data, we chose to apply a “failure-to-reject” method based on goodness-of-fit tests, as proposed by Choulakian and Stephens [Choulakian, V., Stephens, M.A., 2001. Goodness-of-fit tests for the generalized Pareto distribution. Technometrics 43, 478-484]. Unfortunately, the application of the test, using percentage points provided by Choulakian and Stephens (2001), did not succeed in detecting a useful threshold value in most of the analyzed time series. A deeper analysis revealed that these failures are mainly due to the presence of large quantities of rounded-off values among the sample data, affecting the distribution of the goodness-of-fit statistics and leading to significant departures from the percentage points expected for continuous random variables. A procedure based on Monte Carlo simulations is thus proposed to overcome these problems.
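
    The threshold-selection idea can be sketched as follows: fit a generalized Pareto distribution (GPD) to excesses over a candidate threshold and judge the fit with a goodness-of-fit statistic whose percentage points are generated by Monte Carlo rather than read from tables for continuous data. The rainfall series, threshold, and the use of an Anderson-Darling statistic here are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(0)
    rain = rng.gamma(shape=0.6, scale=12.0, size=5000)   # synthetic daily depths (mm)
    rain = np.round(rain, 1)                             # mimic rounding-off in gauge records

    def anderson_darling(sample, dist, *params):
        x = np.sort(sample)
        n = x.size
        u = np.clip(dist.cdf(x, *params), 1e-12, 1 - 1e-12)
        i = np.arange(1, n + 1)
        return -n - np.mean((2 * i - 1) * (np.log(u) + np.log(1 - u[::-1])))

    threshold = 20.0                                     # candidate left-censoring threshold (mm)
    excess = rain[rain > threshold] - threshold
    c, loc, scale = genpareto.fit(excess, floc=0.0)
    a2_obs = anderson_darling(excess, genpareto, c, loc, scale)

    # Monte Carlo percentage points under the fitted GPD, so rounding effects on the
    # statistic's distribution are reflected in the simulated samples if added there.
    a2_sim = []
    for _ in range(500):
        sim = genpareto.rvs(c, loc=loc, scale=scale, size=excess.size, random_state=rng)
        cs, ls, ss = genpareto.fit(sim, floc=0.0)
        a2_sim.append(anderson_darling(sim, genpareto, cs, ls, ss))
    p_value = np.mean(np.array(a2_sim) >= a2_obs)
    print(f"A2 = {a2_obs:.3f}, Monte Carlo p = {p_value:.2f}")
    ```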

  17. Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    This paper describes a novel approach based on the use of coherence functions and statistical theory for sensor validation in a harsh environment. Using aligned and unaligned coherence functions together with statistical theory, one can test for sensor degradation, total sensor failure, or changes in the signal. The advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, which demonstrates the robustness of the technique.
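
    As a rough illustration of the aligned/unaligned coherence idea, the sketch below compares the coherence of two synthetic sensor records that share a broadband component; shifting one record (the "unaligned" case) destroys the shared content, so the estimate drops toward the statistical bias floor that a healthy, aligned pair should sit well above. The signals and parameters are assumptions, not the engine data.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 2048.0
    n = 16384
    rng = np.random.default_rng(1)
    common = rng.standard_normal(n)                 # shared broadband signal
    x = common + 0.5 * rng.standard_normal(n)       # sensor under test
    y = common + 0.5 * rng.standard_normal(n)       # reference sensor

    f, c_aligned = coherence(x, y, fs=fs, nperseg=1024)
    f, c_unaligned = coherence(np.roll(x, 4096), y, fs=fs, nperseg=1024)

    print(f"mean aligned coherence:   {c_aligned.mean():.2f}")   # well above the bias floor
    print(f"mean unaligned coherence: {c_unaligned.mean():.2f}") # near the bias floor (~1/number of averaged segments)
    ```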

  18. Computing Reliabilities Of Ceramic Components Subject To Fracture

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.

    1992-01-01

    CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. The program uses results from a commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate the reliability of a component in the presence of inherent surface- and/or volume-type flaws. It computes a measure of reliability by use of a finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of the statistical characterizations of many ceramic materials. The reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
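
    A minimal sketch of the volume-flaw, weakest-link calculation this kind of program performs is shown below: a two-parameter Weibull model evaluated over element stresses and volumes from a finite element run. The Weibull parameters and element values are invented placeholders, not a real material database or CARES output.

    ```python
    # Weakest-link survival: Ps = exp(-sum_i V_i * (sigma_i / sigma_0)^m), tensile elements only.
    import numpy as np

    m = 10.0          # Weibull modulus (shape), assumed
    sigma_0 = 450.0   # scale parameter referenced to unit volume, MPa, assumed

    elem_stress = np.array([180.0, 240.0, 310.0, 150.0, 275.0])  # max principal stress per element, MPa
    elem_volume = np.array([2e-9, 3e-9, 1e-9, 4e-9, 2e-9])       # element volumes, m^3

    tensile = elem_stress > 0
    risk_of_rupture = np.sum(elem_volume[tensile] * (elem_stress[tensile] / sigma_0) ** m)
    failure_probability = 1.0 - np.exp(-risk_of_rupture)
    print(f"fast-fracture failure probability = {failure_probability:.3e}")
    ```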

  19. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  20. Application of ICH Q9 Quality Risk Management Tools for Advanced Development of Hot Melt Coated Multiparticulate Systems.

    PubMed

    Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh

    2017-01-01

    This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation composed of tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard analysis and critical control points approach was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes of potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes product development sustainability and supports the regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
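
    As a rough illustration of the failure-mode prioritization step used above, the sketch below computes risk priority numbers (RPN = severity x occurrence x detectability) and ranks them Pareto-style; the failure modes and scores are invented placeholders, not the study's actual analysis.

    ```python
    failure_modes = [
        # (failure mode, severity, occurrence, detectability) on 1-10 scales (assumed)
        ("spray rate drift",            7, 6, 4),
        ("atomization pressure loss",   8, 3, 5),
        ("air flow rate fluctuation",   6, 5, 3),
        ("coating amount out of spec",  9, 4, 6),
        ("polysorbate 65 mis-weighing", 8, 2, 7),
    ]

    ranked = sorted(((name, s * o * d) for name, s, o, d in failure_modes),
                    key=lambda item: item[1], reverse=True)

    total = sum(rpn for _, rpn in ranked)
    cumulative = 0
    for name, rpn in ranked:
        cumulative += rpn
        print(f"{name:28s} RPN = {rpn:3d}  cumulative share = {cumulative / total:.0%}")
    ```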

  1. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. They also suggest that regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.

  2. Prediction of Hip Failure Load: In Vitro Study of 80 Femurs Using Three Imaging Methods and Finite Element Models-The European Fracture Study (EFFECT).

    PubMed

    Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie

    2016-09-01

    Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur, and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable prediction of failure load in both configurations. Results Statistics reported are for the stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r² = 0.66, P < .001; r² = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r² = 0.73, P < .001) and trochanteric BMD (r² = 0.80, P < .001). For quantitative CT, in both configurations, the best model combined volumetric BMD and a moment of inertia (r² = 0.78, P < .001; r² = 0.85, P < .001). FEM explained 87% (P < .001) and 83% (P < .001) of bone strength, respectively. By combining (a) radiography and DXA and (b) quantitative CT and DXA, correlations with mechanical failure load increased to 0.82 (P < .001) and 0.84 (P < .001), respectively, for radiography and DXA, and to 0.80 (P < .001) and 0.86 (P < .001), respectively, for quantitative CT and DXA. Conclusion Quantitative CT-based FEM was the best method with which to predict the experimental failure load; however, combining quantitative CT and DXA yielded a performance as good as that attained with FEM. The quantitative CT-DXA combination may be easier to use in fracture prediction, provided standardized software is developed. These findings also highlight the major influence on femoral failure load, particularly in the trochanteric region, of a densitometric parameter combined with a geometric parameter. © RSNA, 2016. Online supplemental material is available for this article.

  3. The diagnostic accuracy of the natriuretic peptides in heart failure: systematic review and diagnostic meta-analysis in the acute care setting.

    PubMed

    Roberts, Emmert; Ludman, Andrew J; Dworzynski, Katharina; Al-Mohammad, Abdallah; Cowie, Martin R; McMurray, John J V; Mant, Jonathan

    2015-03-04

    To determine and compare the diagnostic accuracy of serum natriuretic peptide levels (B type natriuretic peptide, N terminal probrain natriuretic peptide (NTproBNP), and mid-regional proatrial natriuretic peptide (MRproANP)) in people presenting with acute heart failure to acute care settings using thresholds recommended in the 2012 European Society of Cardiology guidelines for heart failure. Systematic review and diagnostic meta-analysis. Medline, Embase, Cochrane central register of controlled trials, Cochrane database of systematic reviews, database of abstracts of reviews of effects, NHS economic evaluation database, and Health Technology Assessment up to 28 January 2014, using combinations of subject headings and terms relating to heart failure and natriuretic peptides. Eligible studies evaluated one or more natriuretic peptides (B type natriuretic peptide, NTproBNP, or MRproANP) in the diagnosis of acute heart failure against an acceptable reference standard in consecutive or randomly selected adults in an acute care setting. Studies were excluded if they did not present sufficient data to extract or calculate true positives, false positives, false negatives, and true negatives, or report age independent natriuretic peptide thresholds. Studies not available in English were also excluded. 37 unique study cohorts described in 42 study reports were included, with a total of 48 test evaluations reporting 15 263 test results. At the lower recommended thresholds of 100 ng/L for B type natriuretic peptide and 300 ng/L for NTproBNP, the natriuretic peptides have sensitivities of 0.95 (95% confidence interval 0.93 to 0.96) and 0.99 (0.97 to 1.00) and negative predictive values of 0.94 (0.90 to 0.96) and 0.98 (0.89 to 1.0), respectively, for a diagnosis of acute heart failure. At the lower recommended threshold of 120 pmol/L, MRproANP has a sensitivity ranging from 0.95 (range 0.90-0.98) to 0.97 (0.95-0.98) and a negative predictive value ranging from 0.90 (0.80-0.96) to 0.97 (0.96-0.98). At higher thresholds the sensitivity declined progressively and specificity remained variable across the range of values. There was no statistically significant difference in diagnostic accuracy between plasma B type natriuretic peptide and NTproBNP. At the rule-out thresholds recommended in the 2012 European Society of Cardiology guidelines for heart failure, plasma B type natriuretic peptide, NTproBNP, and MRproANP have excellent ability to exclude acute heart failure. Specificity is variable, and so imaging to confirm a diagnosis of heart failure is required. There is no statistical difference between the diagnostic accuracy of plasma B type natriuretic peptide and NTproBNP. Introduction of natriuretic peptide measurement in the investigation of patients with suspected acute heart failure has the potential to allow rapid and accurate exclusion of the diagnosis. © Roberts et al 2015.
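
    The accuracy measures pooled in this review can be computed from a single 2x2 table as sketched below, with Wilson 95% confidence intervals; the counts are invented placeholders, not data from any included study.

    ```python
    # Sensitivity, specificity, and negative predictive value with Wilson 95% CIs.
    from math import sqrt

    def wilson_ci(k, n, z=1.96):
        p = k / n
        centre = (p + z * z / (2 * n)) / (1 + z * z / n)
        half = (z / (1 + z * z / n)) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return centre - half, centre + half

    tp, fp, fn, tn = 190, 60, 10, 240   # hypothetical counts at a rule-out threshold

    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    npv = tn / (tn + fn)
    print(f"sensitivity = {sens:.2f}, 95% CI = {wilson_ci(tp, tp + fn)}")
    print(f"specificity = {spec:.2f}, 95% CI = {wilson_ci(tn, tn + fp)}")
    print(f"NPV         = {npv:.2f}, 95% CI = {wilson_ci(tn, tn + fn)}")
    ```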

  4. The Problem of Auto-Correlation in Parasitology

    PubMed Central

    Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick

    2012-01-01

    Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
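
    A minimal sketch of the repeated-measures approach advocated here is a linear mixed-effects model with random effects per host, so repeated parasitaemia measurements from the same infection are not treated as independent; the dataframe and column names below are illustrative placeholders.

    ```python
    # Random intercept and slope per host via statsmodels MixedLM.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("infection_timecourses.csv")  # hypothetical: host, day, treatment, log_parasitaemia

    model = smf.mixedlm("log_parasitaemia ~ day + treatment",
                        data=df, groups="host",
                        re_formula="~day")
    result = model.fit()
    print(result.summary())
    ```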

  5. Development of a clinically validated bulk failure test for ceramic crowns.

    PubMed

    Kelly, J Robert; Rungruanganunt, Patchnee; Hunter, Ben; Vailati, Francesca

    2010-10-01

    Traditional testing of ceramic crowns creates a stress state and damage modes that differ greatly from those seen clinically. There is a need to develop and communicate an in vitro testing protocol that is clinically valid. The purpose of this study was to develop an in vitro failure test for ceramic single-unit prostheses that duplicates the failure mechanism and stress state observed in clinically failed prostheses. This article first compares characteristics of traditional load-to-failure tests of ceramic crowns with the growing body of evidence regarding failure origins and stress states at failure from the examination of clinically failed crowns, finite element analysis (FEA), and data from clinical studies. Based on this analysis, an experimental technique was systematically developed and test materials were identified to recreate key aspects of clinical failure in vitro. One potential dentin analog material (an epoxy filled with woven glass fibers; NEMA grade G10) was evaluated for elastic modulus in blunt contact and for bond strength to resin cement as compared to hydrated dentin. Two bases with different elastic moduli (nickel chrome and resin-based composite) were tested for influence on failure loads. The influence of water during storage and loading (both monotonic and cyclic) was examined. Loading piston materials (G10, aluminum, stainless steel) and piston designs were varied to eliminate Hertzian cracking and to improve performance. Testing was extended from a monolayer ceramic (leucite-filled glass) to a bilayer ceramic system (glass-infiltrated alumina). The influence of cyclic rate on mean failure loads was examined (2 Hz, 10 Hz, 20 Hz) with the extremes compared statistically (t test; α=.05). Failure loads were highly influenced by base elastic modulus (t test; P<.001). Cyclic loading while in water significantly decreased mean failure loads (1-way ANOVA; P=.003) versus wet storage/dry cycling (350 N vs. 1270 N). G10 was not significantly different from hydrated dentin in terms of blunt contact elastic behavior or resin cement bond strength. Testing was successful with the bilayered ceramic, and the cycling rate altered mean failure loads only slightly (approximately 5%). Test methods and materials were developed to validly simulate many aspects of clinical failure. Copyright © 2010 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  6. Quantifying effectiveness of failure prediction and response in HPC systems : methodology and example.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Chen, Frank Xiaoxiao; Pebay, Philippe Pierre

    2010-06-01

    Effective failure prediction and mitigation strategies in high-performance computing systems could provide huge gains in resilience of tightly coupled large-scale scientific codes. These gains would come from prediction-directed process migration and resource servicing, intelligent resource allocation, and checkpointing driven by failure predictors rather than at regular intervals based on nominal mean time to failure. Given probabilistic associations of outlier behavior in hardware-related metrics with eventual failure in hardware, system software, and/or applications, this paper explores approaches for quantifying the effects of prediction and mitigation strategies and demonstrates these using actual production system data. We describe context-relevant methodologies for determining the accuracy and cost-benefit of predictors. While many research studies have quantified the expected impact of growing system size, and the associated shortened mean time to failure (MTTF), on application performance in large-scale high-performance computing (HPC) platforms, there has been little if any work to quantify the possible gains from predicting system resource failures with significant but imperfect accuracy. This possibly stems from HPC system complexity and the fact that, to date, no one has established any good predictors of failure in these systems. Our work in the OVIS project aims to discover these predictors via a variety of data collection techniques and statistical analysis methods that yield probabilistic predictions. The question then is, 'How good or useful are these predictions?' We investigate methods for answering this question in a general setting, and illustrate them using a specific failure predictor discovered on a production system at Sandia.

  7. Significant Pre-Accession Factors Predicting Success or Failure During a Marine Corps Officer’s Initial Service Obligation

    DTIC Science & Technology

    2015-12-01

    Indexed front-matter excerpt only: an appendix on waivers, an appendix of descriptive statistics, and a list of tables giving summary statistics of the dependent, academics, and application variables used in the analysis.

  8. Increased hospital admissions associated with extreme-heat exposure in King County, Washington, 1990-2010.

    PubMed

    Isaksen, Tania Busch; Yost, Michael G; Hom, Elizabeth K; Ren, You; Lyons, Hilary; Fenske, Richard A

    2015-01-01

    Increased morbidity and mortality have been associated with extreme heat events, particularly in temperate climates. Few epidemiologic studies have considered the impact of extreme heat events on hospitalization rates in the Pacific Northwest region. This study quantifies the historic (May to September 1990-2010) heat-morbidity relationship in the most populous Pacific Northwest County, King County, Washington. A relative risk (RR) analysis was used to explore the association between heat and all non-traumatic hospitalizations on 99th percentile heat days, whereas a time series analysis using a piecewise linear model approximation was used to estimate the effect of heat intensity on hospitalizations, adjusted for temporal trends and day of the week. A non-statistically significant 2% [95% CI: 1.02 (0.98, 1.05)] increase in hospitalization risk, on a heat day vs. a non-heat day, was noted for all-ages and all non-traumatic causes. When considering the effect of heat intensity on admissions, we found a statistically significant 1.59% (95% CI: 0.9%, 2.29%) increase in admissions per degree increase in humidex above 37.4°C. Admissions stratified by cause and age produced statistically significant results with both relative risk and time series analyses for nephritis and nephrotic syndromes, acute renal failure, and natural heat exposure hospitalizations. This study demonstrates that heat, expressed as humidex, is associated with increased hospital admissions. When stratified by age and cause of admission, the non-elderly age groups (<85 years) experience significant risk for nephritis and nephrotic syndromes, acute renal failure, natural heat exposure, chronic obstructive pulmonary disease, and asthma hospitalizations.
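
    As a rough illustration of the relative-risk calculation reported above, the sketch below computes an RR for admissions on heat days versus non-heat days with a 95% confidence interval from the log relative risk; the counts and person-days are invented placeholders, not the King County data.

    ```python
    import math

    heat_admissions, heat_persondays = 410, 190_000.0        # events, person-days on heat days (assumed)
    ref_admissions, ref_persondays = 38_000, 18_000_000.0    # events, person-days on non-heat days (assumed)

    rr = (heat_admissions / heat_persondays) / (ref_admissions / ref_persondays)
    se_log_rr = math.sqrt(1 / heat_admissions + 1 / ref_admissions)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
    ```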

  9. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    PubMed

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce the incidence of complications by introduction of the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most frequently recurring complications. A paired z-sample test using Minitab Statistical Inference and Fisher's exact test were used to statistically analyse the data obtained; a p-value <0.05 was considered significant. In total, 54 systemic and 62 local complications occurred during the three-month analyse and measure phases. Syncope, failure of anaesthesia, trismus, auto mordeduras (self-inflicted bites) and pain at the injection site were found to be the most frequently recurring complications. The cumulative defective percentage was 7.99 in the pre-improvement data and decreased to 4.58 in the control phase. The estimate for the difference was 0.0341228 and the 95% lower bound for the difference was 0.0193966; the p-value was highly significant (p < 0.001). The application of the Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals and results in better patient compliance as well as satisfaction.

  10. Impact of different variables on the outcome of patients with clinically confined prostate carcinoma: prediction of pathologic stage and biochemical failure using an artificial neural network.

    PubMed

    Ziada, A M; Lisle, T C; Snow, P B; Levine, R F; Miller, G; Crawford, E D

    2001-04-15

    The advent of advanced computing techniques has provided the opportunity to analyze clinical data using artificial intelligence techniques. This study was designed to determine whether a neural network could be developed using preoperative prognostic indicators to predict the pathologic stage and time of biochemical failure for patients who undergo radical prostatectomy. The preoperative information included TNM stage, prostate size, prostate specific antigen (PSA) level, biopsy results (Gleason score and percentage of positive biopsy), as well as patient age. All 309 patients underwent radical prostatectomy at the University of Colorado Health Sciences Center. The data from all patients were used to train a multilayer perceptron artificial neural network. Biochemical failure was defined as a rise in the PSA level > 0.2 ng/mL. The biochemical failure rate in the database used was 14.2%. Univariate and multivariate analyses were performed to validate the results. The neural network statistics for the validation set showed a sensitivity and specificity of 79% and 81%, respectively, for the prediction of pathologic stage, with an overall accuracy of 80% compared with an overall accuracy of 67% using multivariate regression analysis. The sensitivity and specificity for the prediction of failure were 67% and 85%, respectively, demonstrating a high confidence in predicting failure. The overall accuracy rates for the artificial neural network and the multivariate analysis were similar. Neural networks can offer a convenient vehicle for clinicians to assess the preoperative risk of disease progression for patients who are about to undergo radical prostatectomy. Continued investigation of this approach with larger data sets seems warranted. Copyright 2001 American Cancer Society.
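
    A minimal sketch of the kind of multilayer perceptron used for this prediction task is given below; the feature names, network size, and train/validation split are illustrative assumptions, not the authors' actual architecture or data.

    ```python
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import classification_report

    df = pd.read_csv("prostatectomy.csv")  # hypothetical preoperative dataset
    X = df[["psa", "clinical_stage", "gleason", "pct_positive_biopsy", "prostate_size", "age"]]
    y = df["organ_confined"]               # 1 = organ-confined at pathology, 0 = not (assumed label)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                        stratify=y, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
    net.fit(X_train, y_train)
    print(classification_report(y_test, net.predict(X_test)))  # per-class sensitivity/specificity
    ```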

  11. [Psycho-social factors of sexual failure among newly married Uyghur young males].

    PubMed

    Erkin, Ashim; Hamrajan, Memtili; Kadirjan, Mijit; Adil, Eli; Elijan, Abdureshit; Ibrahim, Ubul; Abdulla, Tursun; Hasanjan, Abdurehim; Turgun, Hekim; Eli, Ablet; Eset, Metmusa

    2016-08-01

    To study the psycho-social risk factors of sexual failure among newly married young males in the Uyghur population. We conducted a paired case-control study of 186 newly married Uyghur young males (aged 17-30 [23.4±2.9] yr) with sexual failure and another 186 (aged 18-34 [24.0±3.1] yr) with no such problem as controls. We performed a logistic regression analysis of the possible psycho-social risk factors for this condition. Logistic regression analysis showed that the risk factors of sexual failure among the newly married men included personality (OR=0.271, 95% CI 0.176-0.420), income (OR=0.391, 95% CI 0.264-0.580), history of masturbation (OR=0.824, 95% CI 0.710-0.956), premarital sex (OR=0.757, 95% CI 0.677-0.847), sense of obligation (OR=1.756, 95% CI 1.157-2.693), equality of social status (OR=0.574, 95% CI 0.435-0.756), degree of mutual care (OR=1.605, 95% CI 1.268-2.032), the female partner's psychological obstacles (OR=2.832, 95% CI 1.221-6.569), and religion (OR=0.643, 95% CI 0.472-0.967). The correlations between these factors and sexual failure in the newly married males were statistically significant (all P<0.05). Sexual failure among newly married Uyghur young males is associated with many psycho-social factors, which necessitates sexual education among young males, and particularly pre-marriage sexual education and psychological guidance for both males and females.
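
    The odds ratios with confidence intervals reported above come from exponentiating logistic regression coefficients; a minimal sketch of that step follows. The variable names are illustrative, and a true matched case-control design would more properly use conditional logistic regression.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("case_control.csv")  # hypothetical: case (0/1) plus covariates
    X = sm.add_constant(df[["income", "premarital_sex", "sense_of_obligation", "mutual_care"]])
    fit = sm.Logit(df["case"], X).fit(disp=False)

    odds_ratios = np.exp(fit.params)
    conf_int = np.exp(fit.conf_int())
    print(pd.concat([odds_ratios.rename("OR"),
                     conf_int.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
    ```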

  12. Arthroscopic suture anchor repair of the lateral ligament ankle complex: a cadaveric study.

    PubMed

    Giza, Eric; Shin, Edward C; Wong, Stephanie E; Acevedo, Jorge I; Mangone, Peter G; Olson, Kirstina; Anderson, Matthew J

    2013-11-01

    Operative treatment of mechanical ankle instability is indicated for patients with multiple sprains and continued episodes of instability. Open repair of the lateral ankle ligaments involves exposure of the attenuated ligaments and advancement back to their anatomic insertions on the fibula using bone tunnels or suture implants. Open and arthroscopic fixation are equal in strength to failure for anatomic Broström repair. Controlled laboratory study. Seven matched pairs of human cadaveric ankle specimens were randomized into 2 groups of anatomic Broström repair: open or arthroscopic. The calcaneofibular ligament and anterior talofibular ligament were excised from their origin on the fibula. In the open repair group, 2 suture anchors were used to reattach the ligaments to their anatomic origins. In the arthroscopic repair group, identical suture anchors were used for repair via an arthroscopic technique. The ligaments were cyclically loaded 20 times and then tested to failure. Torque to failure, degrees to failure, initial stiffness, and working stiffness were measured. A matched-pair analysis was performed. Power analysis of 0.8 demonstrated that 7 pairs needed to show a difference of 30%, with a 15% standard error at a significance level of α = .05. There was no difference in the degrees to failure, torque to failure, or stiffness for the repaired ligament complex. Nine of 14 specimens failed at the suture anchor. There is no statistical difference in strength or stiffness of a traditional open repair as compared with an arthroscopic anatomic repair of the lateral ligaments of the ankle. An arthroscopic technique can be considered for lateral ligament stabilization in patients with mild to moderate mechanical instability.

  13. Emotions and encounters with healthcare professionals as predictors for the self-estimated ability to return to work: a cross-sectional study of people with heart failure.

    PubMed

    Nordgren, Lena; Söderlund, Anne

    2016-11-09

    To live with heart failure means that life is delimited. Still, people with heart failure can have a desire to stay active in working life as long as possible. Although a number of factors affect sick leave and rehabilitation processes, little is known about sick leave and vocational rehabilitation concerning people with heart failure. This study aimed to identify emotions and encounters with healthcare professionals as possible predictors for the self-estimated ability to return to work in people on sick leave due to heart failure. A population-based cross-sectional study design was used. The study was conducted in Sweden. Data were collected in 2012 from 3 different sources: 2 official registries and 1 postal questionnaire. A total of 590 individuals were included. Descriptive statistics, correlation analysis and linear multiple regression analysis were used. 3 variables, feeling strengthened in the situation (β=-0.21, p=0.02), feeling happy (β=-0.24, p=0.02) and receiving encouragement about work (β=-0.32, p≤0.001), were identified as possible predictive factors for the self-estimated ability to return to work. To feel strengthened, happy and to receive encouragement about work can affect the return to work process for people on sick leave due to heart failure. In order to develop and implement rehabilitation programmes to meet these needs, more research is needed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. Repeat Urethroplasty After Failed Urethral Reconstruction: Outcome Analysis of 130 Patients

    PubMed Central

    Blaschko, Sarah D.; McAninch, Jack W.; Myers, Jeremy B.; Schlomer, Bruce J.; Breyer, Benjamin N.

    2013-01-01

    Purpose Male urethral stricture disease accounts for a significant number of hospital admissions and health care expenditures. Although much research has been completed on treatment for urethral strictures, fewer studies have addressed the treatment of strictures in men with recurrent stricture disease after failed prior urethroplasty. We examined outcome results for repeat urethroplasty. Materials and Methods A prospectively collected, single surgeon urethroplasty database was queried from 1977 to 2011 for patients treated with repeat urethroplasty after failed prior urethral reconstruction. Stricture length and location, and repeat urethroplasty intervention and failure were evaluated with descriptive statistics, and univariate and multivariate logistic regression. Results Of 1,156 cases 168 patients underwent repeat urethroplasty after at least 1 failed prior urethroplasty. Of these patients 130 had a followup of 6 months or more and were included in analysis. Median patient age was 44 years (range 11 to 75). Median followup was 55 months (range 6 months to 20.75 years). Overall, 102 of 130 patients (78%) were successfully treated. For patients with failure median time to failure was 17 months (range 7 months to 16.8 years). Two or more failed prior urethroplasties and comorbidities associated with urethral stricture disease were associated with an increased risk of repeat urethroplasty failure. Conclusions Repeat urethroplasty is a successful treatment option. Patients in whom treatment failed had longer strictures and more complex repairs. PMID:23083654

  15. An analytical and experimental investigation of the response of the curved, composite frame/skin specimens

    NASA Technical Reports Server (NTRS)

    Moas, Eduardo; Boitnott, Richard L.; Griffin, O. Hayden, Jr.

    1994-01-01

    Six-foot diameter, semicircular graphite/epoxy specimens representative of generic aircraft frames were loaded quasi-statically to determine their load response and failure mechanisms for the large deflections that occur in airplane crashes. These frame/skin specimens consisted of a cylindrical skin section co-cured with a semicircular I-frame. The skin provided the necessary lateral stiffness to keep deformations in the plane of the frame in order to realistically represent deformations as they occur in actual fuselage structures. Various frame laminate stacking sequences and geometries were evaluated by statically loading the specimens until multiple failures occurred. Two analytical methods were compared for modeling the frame/skin specimens: a two-dimensional shell finite element analysis and a one-dimensional, closed-form, curved beam solution derived using an energy method. Flange effectivities were included in the beam analysis to account for the curling phenomenon that occurs in the thin flanges of curved beams. Good correlation was obtained between experimental results and the analytical predictions of the linear response of the frames prior to the initial failure. The specimens were found to be useful for evaluating composite frame designs.

  16. Photoresist and stochastic modeling

    NASA Astrophysics Data System (ADS)

    Hansen, Steven G.

    2018-01-01

    Analysis of physical modeling results can provide unique insights into extreme ultraviolet stochastic variation, which augment, and sometimes refute, conclusions based on physical intuition and even wafer experiments. Simulations verify the primacy of "imaging critical" counting statistics (photons, electrons, and net acids) and the image/blur-dependent dose sensitivity in describing the local edge or critical dimension variation. But the failure of simple counting when resist thickness is varied highlights a limitation of this exact analytical approach, so a calibratable empirical model offers useful simplicity and convenience. Results presented here show that a wide range of physical simulation results can be well matched by an empirical two-parameter model based on blurred image log-slope (ILS) for lines/spaces and normalized ILS for holes. These results are largely consistent with a wide range of published experimental results; however, there is some disagreement with the recently published dataset of De Bisschop. The present analysis suggests that the origin of this model failure is an unexpected breakdown of the blurred ILS:dose-sensitivity relationship in that resist process. It is shown that a photoresist mechanism based on high photodecomposable quencher loading and high quencher diffusivity can give rise to pitch-dependent blur, which may explain the discrepancy.

  17. Reliability-based management of buried pipelines considering external corrosion defects

    NASA Astrophysics Data System (ADS)

    Miran, Seyedeh Azadeh

    Corrosion is one of the main deteriorating mechanisms that degrade energy pipeline integrity, owing to the transport of corrosive fluids or gases and interaction with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. In order to ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed study is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The unknown parameters of the growth models are estimated from the ILI data through Bayesian updating with the Markov chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and they can account for defects newly generated since the last inspection. Results of this part of the study show that both the depth and length growth models predict damage quantities reasonably well and that a strong correlation between defect depth and length exists. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models, considering the prevailing uncertainties; three failure modes, namely small leak, large leak and rupture, are considered. Performance of the pipeline is evaluated through the failure probability per kilometre (referred to as a sub-system), where each sub-system is treated as a series system of the detected and newly generated defects within that sub-system. A sensitivity analysis is also performed to determine the growth-model parameters to which the reliability of the studied pipeline is most sensitive. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for prediction of the long-term performance of the pipeline, and that the impact of statistical uncertainty in the model parameters is significant and should be considered in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspection, repair, and failure. A repair is conducted when the failure probability for any of the described failure modes exceeds a pre-defined probability threshold after an inspection. Moreover, this study also investigates the impact of repair threshold values and of the unit costs of inspection and failure on the expected total life-cycle cost and the optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but can lower the failure cost, and that the repair cost is less significant than the inspection and failure costs.
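
    The following Python sketch illustrates, very roughly, the kind of calculation the abstract describes: a power-law defect depth growth model with uncertain parameters, propagated by Monte Carlo simulation to a time-dependent probability of a small leak. The growth parameters, wall thickness, and failure criterion are illustrative assumptions, not values from the study.

    ```python
    # A rough Monte Carlo sketch of time-dependent failure probability for a corroding
    # pipeline, assuming a power-law depth growth model d(t) = a * (t - t0)^b.
    # All parameter values and the leak criterion (depth > 80% of wall thickness)
    # are illustrative assumptions, not values from the study.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sim = 100_000
    wall_thickness = 10.0        # mm, assumed
    critical_fraction = 0.8      # small-leak criterion: depth exceeds 80% of the wall

    # Uncertain growth-model parameters (purely illustrative distributions)
    a = rng.lognormal(mean=np.log(0.3), sigma=0.3, size=n_sim)   # mm per year^b
    b = rng.normal(loc=0.9, scale=0.1, size=n_sim)               # growth exponent
    t0 = 2.0                                                     # assumed corrosion initiation time, years

    def failure_probability(t):
        """Fraction of samples whose defect depth exceeds the critical depth at time t."""
        depth = a * np.maximum(t - t0, 0.0) ** b
        return np.mean(depth > critical_fraction * wall_thickness)

    for year in (10, 20, 30):
        print(f"P(small leak) at {year} yr: {failure_probability(year):.4f}")
    ```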

  18. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the minimum value of the goodness-of-fit statistic. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
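
    A minimal sketch of the estimation idea described above, under stated assumptions: synthetic failure data are generated, the Kolmogorov-Smirnov distance between the empirical distribution function and a three-parameter Weibull CDF is formed, and Powell's method is used to minimize it. The data, starting values, and handling of infeasible parameters are illustrative and not the paper's implementation.

    ```python
    # A minimal sketch (not the paper's implementation) of estimating three-parameter
    # Weibull parameters by minimizing the Kolmogorov-Smirnov distance between the
    # empirical distribution function (EDF) and the fitted CDF.
    import numpy as np
    from scipy.stats import weibull_min
    from scipy.optimize import minimize

    # Synthetic failure data standing in for real strength measurements (assumed values)
    data = np.sort(weibull_min.rvs(c=2.5, loc=100.0, scale=300.0, size=50,
                                   random_state=np.random.default_rng(0)))
    n = len(data)

    def ks_statistic(params):
        """Kolmogorov-Smirnov distance between the EDF and a 3-parameter Weibull CDF."""
        shape, loc, scale = params
        if shape <= 0 or scale <= 0 or loc >= data[0]:
            return 1.0                   # worst possible K-S distance for infeasible parameters
        cdf = weibull_min.cdf(data, c=shape, loc=loc, scale=scale)
        i = np.arange(1, n + 1)
        return max(np.max(i / n - cdf), np.max(cdf - (i - 1) / n))

    # Powell's method, as in the abstract; the starting values are rough guesses
    result = minimize(ks_statistic, x0=[1.5, 50.0, 200.0], method="Powell")
    print("shape, location, scale:", result.x, " K-S distance:", result.fun)
    ```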

  19. Effect of Airborne Particle Abrasion on Microtensile Bond Strength of Total-Etch Adhesives to Human Dentin

    PubMed Central

    Piccioni, Chiara; Di Carlo, Stefano; Capogreco, Mario

    2017-01-01

    The aim of this study was to investigate a specific airborne particle abrasion pretreatment on dentin and its effects on the microtensile bond strengths of four commercial total-etch adhesives. Midcoronal occlusal dentin of extracted human molars was used. Teeth were randomly assigned to 4 groups according to the adhesive system used: OptiBond FL (FL), OptiBond Solo Plus (SO), Prime & Bond (PB), and Riva Bond LC (RB). Specimens from each group were further divided into two subgroups: control specimens were treated with adhesive procedures only; abraded specimens were pretreated with airborne particle abrasion using 50 μm Al2O3 before adhesion. After the bonding procedures, composite crowns were incrementally built up. Specimens were sectioned perpendicular to the adhesive interface to produce multiple beams, which were tested under tension until failure. Data were statistically analysed, and a failure mode analysis was performed. The overall comparison showed a significant increase in bond strength (p < 0.001) for abraded compared with non-abraded specimens, independent of brand. The intrabrand comparison showed a statistically significant increase for abraded specimens compared with non-abraded ones, with the exception of PB, which did not show such a difference. The distribution of failure modes was relatively uniform among all subgroups. Surface treatment by airborne particle abrasion with Al2O3 particles can increase the bond strength of total-etch adhesives. PMID:29392128

  20. Tissue angiotensin-converting enzyme inhibitors for the prevention of cardiovascular disease in patients with diabetes mellitus without left ventricular systolic dysfunction or clinical evidence of heart failure: a pooled meta-analysis of randomized placebo-controlled clinical trials.

    PubMed

    Saha, S A; Molnar, J; Arora, R R

    2008-01-01

    The aim of this study was to determine the role of tissue angiotensin-converting enzyme (ACE) inhibitors in the prevention of cardiovascular disease in patients with diabetes mellitus without left ventricular systolic dysfunction or clinical evidence of heart failure in randomized placebo-controlled clinical trials using pooled meta-analysis techniques. Randomized placebo-controlled clinical trials of at least 12 months duration in patients with diabetes mellitus without left ventricular systolic dysfunction or heart failure who had experienced a prior cardiovascular event or were at high cardiovascular risk were selected. A total of 10 328 patients (43 517 patient-years) from four selected trials were used for meta-analysis. Relative risk estimations were made using data pooled from the selected trials and statistical significance was determined using the Chi-squared test (two-sided alpha error <0.05). The number of patients needed to treat was also calculated. Tissue ACE inhibitors significantly reduced the risk of cardiovascular mortality by 14.9% (p = 0.022), myocardial infarction by 20.8% (p = 0.002) and the need for invasive coronary revascularization by 14% (p = 0.015) when compared to placebo. The risk of all-cause mortality also tended to be lower among patients randomized to tissue ACE inhibitors, whereas the risks of stroke and hospitalization for heart failure were not significantly affected. Treating about 65 patients with tissue ACE inhibitors for about 4.2 years would prevent one myocardial infarction, whereas treating about 85 patients would prevent one cardiovascular death. Pooled meta-analysis of randomized placebo-controlled trials suggests that tissue ACE inhibitors modestly reduce the risk of myocardial infarction and cardiovascular death and tend to reduce overall mortality in diabetic patients without left ventricular systolic dysfunction or heart failure.

  1. Expert system for online surveillance of nuclear reactor coolant pumps

    DOEpatents

    Gross, Kenny C.; Singer, Ralph M.; Humenik, Keith E.

    1993-01-01

    An expert system for online surveillance of nuclear reactor coolant pumps. This system provides a means for early detection of pump or sensor degradation. Degradation is determined through the use of a statistical analysis technique, the sequential probability ratio test, applied to information from several sensors which are responsive to differing physical parameters. The results of sequential testing of the data provide the operator with an early warning of possible sensor or pump failure.
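
    As a rough illustration of the surveillance idea, the sketch below applies a Gaussian mean-shift sequential probability ratio test to a stream of sensor residuals and decides between a "normal" and a "degraded" hypothesis. The means, variance, and error rates are assumptions for illustration, not the patented system's settings.

    ```python
    # A minimal sketch of a sequential probability ratio test (SPRT) applied to a
    # stream of sensor residuals, in the spirit of the surveillance system described
    # above. The Gaussian means, variance, and error rates are illustrative assumptions.
    import numpy as np

    def sprt(readings, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
        """Return ('H0'|'H1'|'undecided', samples used) for a Gaussian mean-shift SPRT."""
        upper = np.log((1 - beta) / alpha)     # accept H1 (degradation) above this
        lower = np.log(beta / (1 - alpha))     # accept H0 (normal) below this
        llr = 0.0
        for n, x in enumerate(readings, start=1):
            # log-likelihood ratio increment for N(mu1, sigma^2) versus N(mu0, sigma^2)
            llr += (x - mu0) ** 2 / (2 * sigma**2) - (x - mu1) ** 2 / (2 * sigma**2)
            if llr >= upper:
                return "H1", n
            if llr <= lower:
                return "H0", n
        return "undecided", len(readings)

    rng = np.random.default_rng(1)
    print(sprt(rng.normal(0.0, 1.0, 200)))   # healthy sensor: expect 'H0'
    print(sprt(rng.normal(1.0, 1.0, 200)))   # drifting sensor: expect 'H1'
    ```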

  2. Quantifying the Relationship between AMC Resources and U.S. Army Materiel Readiness

    DTIC Science & Technology

    1989-08-25

    Resource Management report 984 for the same period. Insufficient data precluded analysis of the OMA PEs Total Package Fielding and Life Cycle Software...procurement, had the greatest failure rates when subjected to the statistical tests merely because of the reduced number of data pairs. Analyses of...ENGINEERING DEVELOPMENT 6.5 - MANAGEMENT AND SUPPORT 6.7 - OPERATIONAL SYSTEM DEVELOPMENT P2 - GENERAL PURPOSE FORCES P3 - INTELIGENCE AND COMMUNICATIONS P7

  3. Design of Critical Components

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Zaretsky, Erwin V.

    2001-01-01

    Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures, where components are removed for cause or for operating time in the system. Issues of liability and cost of component removal become of paramount importance. Deterministic design with factors of safety and probabilistic design each address, but lack, the essential characteristics needed for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost: many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are kept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific element of that class cannot be predicted.

  4. Enhanced Component Performance Study: Turbine-Driven Pumps 1998–2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2015-11-01

    This report presents an enhanced performance evaluation of turbine-driven pumps (TDPs) at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience failure reports from fiscal year 1998 through 2014 for component reliability as reported in the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The TDP failure modes considered are failure to start (FTS), failure to run less than or equal to one hour (FTR≤1H), failure to run more than one hour (FTR>1H), and, for normally running systems, FTS and failure to run (FTR). The component reliability estimates and the reliability data are trended for the most recent 10-year period, while yearly estimates for reliability are provided for the entire active period. Statistically significant increasing trends were identified for TDP unavailability, for the frequency of start demands for standby TDPs, and for run hours in the first hour after start. Statistically significant decreasing trends were identified for start demands for normally running TDPs, and for run hours per reactor critical year for normally running TDPs.

  5. Turned versus anodised dental implants: a meta-analysis.

    PubMed

    Chrcanovic, B R; Albrektsson, T; Wennerberg, A

    2016-09-01

    The aim of this meta-analysis was to test the null hypothesis of no difference in the implant failure rates, marginal bone loss (MBL) and post-operative infection for patients being rehabilitated with turned versus anodised-surface implants, against the alternative hypothesis of a difference. An electronic search without time or language restrictions was undertaken in November 2015. Eligibility criteria included clinical human studies, either randomised or not. Thirty-eight publications were included. The results suggest a risk ratio of 2·82 (95% CI 1·95-4·06, P < 0·00001) for failure of turned implants, when compared to anodised-surface implants. Sensitivity analyses showed similar results when only the studies inserting implants in maxillae or mandibles were pooled. There were no statistically significant effects of turned implants on the MBL (mean difference (MD) 0·02, 95% CI -0·16-0·20; P = 0·82) in comparison to anodised implants. The results of a meta-regression considering the follow-up period as a covariate suggested an increase of the MD with increasing follow-up time (MD increase 0·012 mm per year), although without statistical significance (P = 0·813). Due to a lack of satisfactory information, a meta-analysis for the outcome 'post-operative infection' was not performed. The results have to be interpreted with caution due to the presence of several confounding factors in the included studies. © 2016 John Wiley & Sons Ltd.
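
    For readers unfamiliar with how a pooled risk ratio of this kind is formed, the sketch below performs a simple inverse-variance fixed-effect pooling on hypothetical per-study failure counts; the counts are invented for illustration and the result is not intended to reproduce the review's estimate.

    ```python
    # A minimal inverse-variance fixed-effect pooling sketch for implant failure risk
    # ratios. The per-study event counts are hypothetical, not the review's data.
    import numpy as np

    # (failures_turned, total_turned, failures_anodised, total_anodised) per study (assumed)
    studies = [(12, 300, 5, 310), (8, 150, 3, 160), (20, 500, 9, 520)]

    log_rr, weights = [], []
    for a, n1, c, n2 in studies:
        rr = (a / n1) / (c / n2)
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2      # approximate variance of log(RR)
        log_rr.append(np.log(rr))
        weights.append(1 / var)

    pooled = np.average(log_rr, weights=weights)
    se = np.sqrt(1 / np.sum(weights))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    print(f"Pooled RR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
    ```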

  6. Outcomes and Pharmacoeconomic Analysis of a Home Intravenous Antibiotic Infusion Program in Veterans.

    PubMed

    Ruh, Christine A; Parameswaran, Ganapathi I; Wojciechowski, Amy L; Mergenhagen, Kari A

    2015-11-01

    The use of outpatient parenteral antibiotic therapy (OPAT) programs has become more frequent because of benefits in costs with equivalent clinical outcomes compared with inpatient care. The purpose of this study was to evaluate the outcomes of our program. A modified pharmacoeconomic analysis was performed to compare the costs of our program with hospital or rehabilitation facility care. This was a retrospective chart review of 96 courses of OPAT between April 1, 2011, and July 31, 2013. Clinical failures were defined as readmission or death due to worsening infection, or readmission secondary to an adverse drug event (ADE) from antibiotic therapy. This does not include those patients readmitted for reasons not associated with OPAT therapy, including comorbidities or elective procedures. Baseline characteristics and program-specific data were analyzed. Statistically significant variables were built into a multivariate logistic regression model to determine predictors of failure. A pharmacoeconomic analysis was performed with the use of billing records. Of the total episodes evaluated, 17 (17.71%) clinically failed therapy, and 79 (82.29%) were considered a success. In the multivariate analysis, the number of laboratory draws (P = 0.02) and the occurrence of a drug reaction were significant in the final model, P = 0.02 and P = 0.001, respectively. The presence of an adverse drug reaction increased the odds of failure (OR = 10.10; 95% CI, 2.69-44.90). Compared with inpatient or rehabilitation care, the cost savings was $6,932,552.03 or $2,649,870.68, respectively. In our study, patients tolerated OPAT well, with a low number of failures due to ADEs. The clinical outcomes and cost savings of our program indicate that OPAT can be a viable alternative to long-term inpatient antimicrobial therapy. Published by Elsevier Inc.

  7. Observation of the efficacy of radiofrequency catheter ablation on patients with different forms of atrial fibrillation.

    PubMed

    Zhao, R-C; Han, W; Han, J; Yu, J; Guo, J; Fu, J-L; Li, Z; Zhao, R-Z

    2016-10-01

    To study the efficacy and safety of radiofrequency catheter ablation (RFCA) in patients with different forms of atrial fibrillation. By retrospective analysis, we summarize 720 cases in which patients diagnosed with atrial fibrillation in our hospital were treated with RFCA from February 2010 to October 2014. Among the cases, 425 were diagnosed with paroxysmal atrial fibrillation and 295 with non-paroxysmal atrial fibrillation (including persistent atrial fibrillation and permanent atrial fibrillation). All patients were followed up until June 2015 to compare and analyze the differences in operation success rates, complications and recurrence rates. 395 cases (92.9%) of paroxysmal atrial fibrillation and 253 cases (85.8%) of non-paroxysmal atrial fibrillation underwent surgery and were followed up. The age of onset, disease course, underlying diseases, left atrial diameter and combined anti-arrhythmics of patients with paroxysmal atrial fibrillation were lower than those of patients with non-paroxysmal atrial fibrillation, and the differences were statistically significant (p < 0.05). The success rate of the first ablation was higher for paroxysmal than for non-paroxysmal atrial fibrillation. Procedure time, procedure method, complications and recurrence rate of patients with paroxysmal atrial fibrillation were lower than those of the non-paroxysmal atrial fibrillation group, and the differences were statistically significant (p < 0.05). When we compared apoplexy and heart failure caused by atrial fibrillation in the two groups, the difference was not statistically significant (apoplexy: p = 0.186; heart failure: p = 0.170). The individual ablation success rate was higher for paroxysmal atrial fibrillation, and long-term follow-up showed that the occurrence of apoplexy and heart failure was not different from the non-paroxysmal atrial fibrillation group.

  8. DESENSITIZING BIOACTIVE AGENTS IMPROVES BOND STRENGTH OF INDIRECT RESIN-CEMENTED RESTORATIONS: PRELIMINARY RESULTS

    PubMed Central

    Pires-De-Souza, Fernanda de Carvalho Panzeri; de Marco, Fabíola Fiorezi; Casemiro, Luciana Assirati; Panzeri, Heitor

    2007-01-01

    Objective: The aim of this study was to assess the bond strength of indirect composite restorations cemented with a resin-based cement associated with etch-and-rinse and self-etching primer adhesive systems to dentin treated or not with a bioactive material. Materials and Method: Twenty bovine incisor crowns had the buccal enamel removed and the dentin ground flat. The teeth were assigned to 4 groups (n=5): Group I: acid etching + Prime & Bond NT (Dentsply); Group II: application of a bioactive glass (Biosilicato®) + acid etching + Prime & Bond NT; Group III: One-up Bond F (J Morita); Group IV: Biosilicato® + One-up Bond F. Indirect composite resin (Artglass, Kulzer) cylinders (6x10 mm) were fabricated and cemented to the teeth with a dual-cure resin-based cement (Enforce, Dentsply). After cementation, the specimens were stored in artificial saliva at 37°C for 30 days and thereafter tested in tensile strength in a universal testing machine (EMIC) with a 50 kgf load cell at a crosshead speed of 1 mm/min. Failure modes were assessed under scanning electron microscopy. Data were analyzed statistically by ANOVA and Tukey's test (95% level of confidence). Results: Groups I, II and III had statistically similar results (p>0.05). Group IV had statistically significantly higher bond strength means (p<0.05) than the other groups. The analysis of the debonded surfaces showed a predominance of the adhesive failure mode for Group III and the mixed failure mode for the other groups. Conclusion: The use of the desensitizing agent did not negatively affect the bonding of the indirect composite restorations to dentin, independently of the tested adhesive systems. PMID:19089114

  9. Adhesive retention of experimental fiber-reinforced composite, orthodontic acrylic resin, and aliphatic urethane acrylate to silicone elastomer for maxillofacial prostheses.

    PubMed

    Kosor, Begüm Yerci; Artunç, Celal; Şahan, Heval

    2015-07-01

    A key factor of an implant-retained facial prosthesis is the success of the bonding between the substructure and the silicone elastomer. Little has been reported on the bonding of fiber reinforced composite (FRC) to silicone elastomers. Experimental FRC could be a solution for facial prostheses supported by light-activated aliphatic urethane acrylate, orthodontic acrylic resin, or commercially available FRCs. The purpose of this study was to evaluate the bonding of the experimental FRC, orthodontic acrylic resin, and light-activated aliphatic urethane acrylate to a commercially available high-temperature vulcanizing silicone elastomer. Shear and 180-degree peel bond strengths of 3 different substructures (experimental FRC, orthodontic acrylic resin, light-activated aliphatic urethane acrylate) (n=15) to a high-temperature vulcanizing maxillofacial silicone elastomer (M511) with a primer (G611) were assessed after 200 hours of accelerated artificial light-aging. The specimens were tested in a universal testing machine at a cross-head speed of 10 mm/min. Data were collected and statistically analyzed by 1-way ANOVA, followed by the Bonferroni correction and the Dunnett post hoc test (α=.05). Modes of failure were visually determined and categorized as adhesive, cohesive, or mixed and were statistically analyzed with the chi-squared goodness-of-fit test (α=.05). As the mean shear bond strength values were evaluated statistically, no difference was found among the experimental FRC, aliphatic urethane acrylate, and orthodontic acrylic resin subgroups (P>.05). The mean peel bond strengths of experimental fiber reinforced composite and aliphatic urethane acrylate were not found to be statistically different (P>.05). The mean value of the orthodontic acrylic resin subgroup peel bond strength was found to be statistically lower (P<.05). Shear test failure types were found to be statistically different (P<.05), whereas 180-degree peel test failure types were not found to be statistically significant (P>.05). Shear forces predominantly exhibited cohesive failure (64.4%), whereas peel forces predominantly exhibited adhesive failure (93.3%). The mean shear bond strengths of the experimental FRC and aliphatic urethane acrylate groups were not found to be statistically different (P>.05). The mean value of the 180-degree peel strength of the orthodontic acrylic resin group was found to be lower (P<.05). Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  10. Data and Statistics: Heart Failure

    MedlinePlus


  11. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the deprecation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
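
    A minimal sketch of the Box-Cox step mentioned above: a maximum-likelihood estimate of the transformation parameter lambda is obtained for a simulated response whose variance shrinks at high-effect concentrations. The simulated data are assumptions, not observations from the cited test species.

    ```python
    # A minimal sketch of the Box-Cox transformation, used here only to illustrate
    # how a variance-stabilizing lambda can be estimated before a regression fit.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # Simulated responses whose variance shrinks at high-effect concentrations
    concentration = np.repeat([0.1, 1.0, 10.0, 100.0], 10)
    response = rng.poisson(lam=100.0 / (1.0 + concentration), size=concentration.size) + 1

    transformed, lam = stats.boxcox(response)      # maximum-likelihood estimate of lambda
    print(f"Estimated Box-Cox lambda: {lam:.2f}")

    # The transformed response (or, alternatively, a Poisson error model) can then be
    # used in a nonlinear dose-response fit where the raw responses violate homoscedasticity.
    ```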

  12. In-situ microscale through-silicon via strain measurements by synchrotron x-ray microdiffraction exploring the physics behind data interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xi; School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332; Thadesar, Paragkumar A.

    2014-09-15

    In-situ microscale thermomechanical strain measurements have been performed in combination with synchrotron x-ray microdiffraction to understand the fundamental cause of failures in microelectronics devices with through-silicon vias. The physics behind the raster scan and data analysis of the measured strain distribution maps is explored utilizing the energies of indexed reflections from the measured data and applying them for beam intensity analysis and effective penetration depth determination. Moreover, a statistical analysis is performed for the beam intensity and strain distributions along the beam penetration path to account for the factors affecting peak search and strain refinement procedure.

  13. Analysis of the progressive failure of brittle matrix composites

    NASA Technical Reports Server (NTRS)

    Thomas, David J.

    1995-01-01

    This report investigates two of the most common modes of localized failures, namely, periodic fiber-bridged matrix cracks and transverse matrix cracks. A modification of Daniels' bundle theory is combined with Weibull's weakest link theory to model the statistical distribution of the periodic matrix cracking strength for an individual layer. Results of the model predictions are compared with experimental data from the open literature. Extensions to the model are made to account for possible imperfections within the layer (i.e., nonuniform fiber lengths, irregular crack spacing, and degraded in-situ fiber properties), and the results of these studies are presented. A generalized shear-lag analysis is derived which is capable of modeling the development of transverse matrix cracks in material systems having a general multilayer configuration and under states of full in-plane load. A method for computing the effective elastic properties for the damaged layer at the global level is detailed based upon the solution for the effects of the damage at the local level. This methodology is general in nature and is therefore also applicable to (0_m/90_n)_s systems. The characteristic stress-strain response for more general cases is shown to be qualitatively correct (experimental data is not available for a quantitative evaluation), and the damage evolution is recorded in terms of the matrix crack density as a function of the applied strain. Probabilistic effects are introduced to account for the statistical nature of the material strengths, thus allowing cumulative distribution curves for the probability of failure to be generated for each of the example laminates. Additionally, Oh and Finney's classic work on fracture location in brittle materials is extended and combined with the shear-lag analysis. The result is an analytical form for predicting the probability density function for the location of the next transverse crack occurrence within a crack bounded region. The results of this study verified qualitatively the validity of assuming a uniform crack spacing (as was done in the shear-lag model).
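
    As a small aside on the Weibull weakest-link ingredient of such models, the sketch below evaluates the classical size-dependent failure probability P_f = 1 - exp[-(V/V0)(σ/σ0)^m]; the Weibull modulus, reference strength, and volumes are illustrative assumptions rather than values from the report.

    ```python
    # A minimal sketch of the Weibull weakest-link failure probability used in
    # statistical strength models: P_f = 1 - exp[-(V/V0) * (sigma/sigma0)^m].
    # The Weibull modulus, reference strength, and volumes are assumed for illustration.
    import numpy as np

    def weakest_link_pf(stress, volume, m=8.0, sigma0=300.0, v0=1.0):
        """Failure probability of a volume under uniform stress (stress in MPa)."""
        return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

    stresses = np.array([200.0, 250.0, 300.0, 350.0])   # MPa, assumed
    for v in (0.5, 1.0, 2.0):                           # volumes relative to the reference volume
        print(f"V/V0 = {v}:", np.round(weakest_link_pf(stresses, v), 3))
    ```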

  14. Continuous fiber ceramic matrix composites for heat engine components

    NASA Technical Reports Server (NTRS)

    Tripp, David E.

    1988-01-01

    High strength at elevated temperatures, low density, resistance to wear, and abundance of nonstrategic raw materials make structural ceramics attractive for advanced heat engine applications. Unfortunately, ceramics have a low fracture toughness and fail catastrophically because of overload, impact, and contact stresses. Ceramic matrix composites provide the means to achieve improved fracture toughness while retaining desirable characteristics, such as high strength and low density. Materials scientists and engineers are trying to develop the ideal fibers and matrices to achieve the optimum ceramic matrix composite properties. A need exists for the development of failure models for the design of ceramic matrix composite heat engine components. Phenomenological failure models are currently the most frequently used in industry, but they are deterministic and do not adequately describe ceramic matrix composite behavior. Semi-empirical models were proposed, which relate the failure of notched composite laminates to the stress a characteristic distance away from the notch. Shear lag models describe composite failure modes at the micromechanics level. The enhanced matrix cracking stress occurs at the same applied stress level predicted by the two models of steady state cracking. Finally, statistical models take into consideration the distribution in composite failure strength. The intent is to develop these models into computer algorithms for the failure analysis of ceramic matrix composites under monotonically increasing loads. The algorithms will be included in a postprocessor to general purpose finite element programs.

  15. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies

    PubMed Central

    Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.

    2016-01-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis. PMID:27274911
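
    A minimal sketch of one common multicollinearity diagnostic, the variance inflation factor, computed here with statsmodels on simulated predictors deliberately constructed to be collinear; the variable names and data are assumptions for illustration.

    ```python
    # A minimal sketch of a multicollinearity check using variance inflation factors
    # (VIFs) on simulated predictors constructed to be nearly collinear.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(3)
    n = 500
    age = rng.normal(50, 10, n)
    bmi = rng.normal(28, 4, n)
    weight = 2.5 * bmi + rng.normal(0, 1, n)          # nearly collinear with BMI

    X = sm.add_constant(pd.DataFrame({"age": age, "bmi": bmi, "weight": weight}))
    for i, name in enumerate(X.columns):
        if name == "const":
            continue
        print(f"VIF({name}) = {variance_inflation_factor(X.values, i):.1f}")
    # VIFs well above roughly 10 for bmi and weight flag collinearity worth addressing.
    ```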

  16. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    PubMed

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis.

  17. [Pharmacoeconomic assessment of daptomycin as first-line therapy for bacteraemia and complicated skin and skin structure infections caused by gram-positive pathogens in Spain].

    PubMed

    Grau, S; Rebollo, P; Cuervo, J; Gil-Parrado, S

    2011-09-01

    To assess the efficiency of daptomycin as first-line therapy (D) versus daptomycin as salvage therapy after vancomycin (V→D) or linezolid (L→D) failure in gram-positive bacteraemia and complicated skin and skin-structure infections (cSSTIs). A cost-effectiveness analysis of 161 bacteraemia and 84 cSSTI patients comparing the above-mentioned therapeutic alternatives was performed using data from 27 Spanish hospitals involved in the EUCORE study. Direct medical costs were considered. Patients were observed from the first antibiotic dose for the infection until either the end of daptomycin therapy or death. A multivariate Monte Carlo probabilistic sensitivity analysis was applied for costs (lognormal distribution) and effectiveness (normal distribution). In terms of effectiveness there were no statistical differences between groups, but for total costs per patient there were significant differences. Sensitivity analysis confirmed that D dominates over L→D in 44.2%-62.1% of simulations in bacteraemia and in 48.2%-67.5% in cSSTIs. In comparison to V→D, D dominance was detected in 29.2%-33.2% of simulations in bacteraemia and in 48.2%-59.3% in cSSTIs. Daptomycin as first-line therapy dominates over daptomycin as salvage therapy after linezolid failure both in bacteraemia and cSSTIs. Comparing daptomycin as first-line therapy with its use after vancomycin failure, in cSSTIs the former is dominant. In bacteraemia, daptomycin as first-line therapy is as effective as daptomycin as salvage therapy after vancomycin failure and implies lower costs.
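
    The sketch below illustrates the general shape of such a Monte Carlo probabilistic sensitivity analysis: costs drawn from lognormal distributions, effectiveness from normal distributions, and the share of simulations in which one strategy dominates (cheaper and at least as effective). All distribution parameters are hypothetical, not the EUCORE estimates.

    ```python
    # A minimal sketch of a Monte Carlo probabilistic sensitivity analysis comparing
    # two treatment strategies. All distribution parameters are hypothetical.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 10_000

    cost_first_line = rng.lognormal(mean=np.log(9000), sigma=0.25, size=n)
    cost_salvage    = rng.lognormal(mean=np.log(11000), sigma=0.30, size=n)
    eff_first_line  = rng.normal(loc=0.80, scale=0.05, size=n)   # clinical success probability
    eff_salvage     = rng.normal(loc=0.78, scale=0.05, size=n)

    # "Dominance": first-line strategy is cheaper and at least as effective
    dominates = (cost_first_line < cost_salvage) & (eff_first_line >= eff_salvage)
    print(f"First-line strategy dominates in {dominates.mean():.1%} of simulations")
    ```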

  18. Motivational Interviewing Tailored Interventions for Heart Failure (MITI-HF): study design and methods.

    PubMed

    Masterson Creber, Ruth; Patey, Megan; Dickson, Victoria Vaughan; DeCesaris, Marissa; Riegel, Barbara

    2015-03-01

    Lack of engagement in self-care is common among patients needing to follow a complex treatment regimen, especially patients with heart failure, who are affected by comorbidity, disability and the side effects of polypharmacy. The purpose of Motivational Interviewing Tailored Interventions for Heart Failure (MITI-HF) is to test the feasibility and comparative efficacy of an MI intervention on self-care, acute heart failure physical symptoms and quality of life. We are conducting a brief, nurse-led motivational interviewing randomized controlled trial to address behavioral and motivational issues related to heart failure self-care. Participants in the intervention group receive home- and phone-based motivational interviewing sessions over 90 days, and those in the control group receive care as usual. Participants in both groups receive patient education materials. The primary study outcome is change in self-care maintenance from baseline to 90 days. This article presents the study design, methods, plans for statistical analysis and descriptive characteristics of the study sample for MITI-HF. Study findings will contribute to the literature on the efficacy of motivational interviewing to promote heart failure self-care. We anticipate that using an MI approach can help patients with heart failure focus on their internal motivation to change in a non-confrontational, patient-centered and collaborative way. It also affirms their ability to practice competent self-care relevant to their personal health goals. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Analysis of warning letters issued by the US Food and Drug Administration to clinical investigators, institutional review boards and sponsors: a retrospective study.

    PubMed

    Shetty, Yashashri C; Saiyed, Aafreen A

    2015-05-01

    The US Food and Drug Administration (FDA) issues warning letters to all research stakeholders if unacceptable deficiencies are found during site visits. Warning letters issued by the FDA between January 2011 and December 2012 to clinical investigators and institutional review boards (IRBs) were reviewed for various violation themes and compared to similar studies in the past. Warning letters issued to sponsors between January 2005 and December 2012 were analysed for the first time for a specific set of violations using descriptive statistics. Failure to protect subject safety and to report adverse events to IRBs was found to be significant compared to prior studies for clinical investigators, while failure to follow standard operating procedures and maintain documentation was noted as significant in warning letters to IRBs. Failure to maintain minutes of meeting and to follow written procedures for continuing review were new substantial violations in warning letters issued to IRBs. Forty-six warning letters were issued to sponsors, the most common violations being failure to follow a monitoring schedule (58.69%), failure to obtain investigator agreement (34.78%), failure to secure investigators' compliance (30.43%), and failure to maintain data records and ship documents to investigators (30.43%). Appropriate methods for handling clinical trial procedural violations should be developed and implemented worldwide. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  20. Influence of enamel preservation on failure rates of porcelain laminate veneers.

    PubMed

    Gurel, Galip; Sesma, Newton; Calamita, Marcelo A; Coachman, Christian; Morimoto, Susana

    2013-01-01

    The purpose of this study was to evaluate the failure rates of porcelain laminate veneers (PLVs) and the influence of clinical parameters on these rates in a retrospective survey of up to 12 years. Five hundred eighty laminate veneers were bonded in 66 patients. The following parameters were analyzed: type of preparation (depth and margin), crown lengthening, presence of restoration, diastema, crowding, discoloration, abrasion, and attrition. Survival was analyzed using the Kaplan-Meier method. Cox regression modeling was used to determine which factors would predict PLV failure. Forty-two veneers (7.2%) failed in 23 patients, and an overall cumulative survival rate of 86% was observed. A statistically significant association was noted between failure and the limits of the prepared tooth surface (margin and depth). The most frequent failure type was fracture (n = 20). The results revealed no significant influence of crown lengthening apically, presence of restoration, diastema, discoloration, abrasion, or attrition on failure rates. Multivariable analysis (Cox regression model) also showed that PLVs bonded to dentin and teeth with preparation margins in dentin were approximately 10 times more likely to fail than PLVs bonded to enamel. Moreover, coronal crown lengthening increased the risk of PLV failure by 2.3 times. A survival rate of 99% was observed for veneers with preparations confined to enamel and 94% for veneers with enamel only at the margins. Laminate veneers have high survival rates when bonded to enamel and provide a safe and predictable treatment option that preserves tooth structure.
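
    For illustration of the survival estimate used here, the following sketch computes a Kaplan-Meier product-limit curve for a small hypothetical set of veneer failure times and censoring indicators; the numbers are invented and do not correspond to the study's records.

    ```python
    # A minimal Kaplan-Meier product-limit sketch on hypothetical failure times (years)
    # and censoring indicators (1 = failure, 0 = censored). Illustrative data only.
    import numpy as np

    times  = np.array([1.0, 2.5, 3.0, 3.0, 5.0, 6.5, 8.0, 10.0, 12.0, 12.0])
    failed = np.array([1,   0,   1,   1,   0,   1,   0,   0,    1,    0])

    order = np.argsort(times)
    times, failed = times[order], failed[order]

    survival = 1.0
    for t in np.unique(times[failed == 1]):
        at_risk = np.sum(times >= t)                      # still under observation at t
        events = np.sum((times == t) & (failed == 1))     # failures occurring at t
        survival *= 1.0 - events / at_risk
        print(f"S({t:g} yr) = {survival:.3f}")
    ```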

  1. Load to failure of different zirconia implant abutments with titanium components.

    PubMed

    Mascarenhas, Faye; Yilmaz, Burak; McGlumphy, Edwin; Clelland, Nancy; Seidt, Jeremy

    2017-06-01

    Abutments with a zirconia superstructure and a titanium insert have recently become popular. Although they have been tested under static load, their performance under simulated mastication is not well known. The purpose of this in vitro study was to compare the cyclic load to failure of 3 types of zirconia abutments with different mechanisms of retention at the zirconia-titanium interface. Fifteen implants (n=5 per system) and abutments (3 groups: 5 friction fit [Frft]; 5 bonded; and 5 titanium ring friction fit [Ringfrft]) were used. Abutments were thermocycled in water between 5°C and 55°C for 15000 cycles and then cyclically loaded for 20000 cycles or until failure at a frequency of 2 Hz by using a sequentially increased loading protocol up to a maximum of 720 N. The load to failure for each group was recorded, and a 1-way analysis of variance was performed. The mean load-to-failure value was 526 N for the Frft group, 605 N for the bonded group, and 288 N for the Ringfrft group. A statistically significant difference was found among all abutments tested (P<.05). Abutments with the bonded connection showed the highest load-to-failure value, and the abutment with the titanium ring friction fit connection showed the lowest load-to-failure value. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
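
    As a small illustration of the statistical comparison reported above, the sketch below runs a one-way ANOVA on hypothetical load-to-failure values for three connection designs; the numbers are invented for illustration only.

    ```python
    # A minimal one-way ANOVA sketch comparing load-to-failure values (N) for three
    # hypothetical abutment connection groups; the values are not the study's data.
    from scipy import stats

    friction_fit = [510, 535, 520, 540, 525]
    bonded       = [600, 615, 590, 610, 605]
    ring_fit     = [280, 295, 300, 285, 290]

    f_stat, p_value = stats.f_oneway(friction_fit, bonded, ring_fit)
    print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
    # A significant p-value would normally be followed by pairwise post hoc comparisons.
    ```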

  2. Design and Rationale of the Cognitive Intervention to Improve Memory in Heart Failure Patients Study.

    PubMed

    Pressler, Susan J; Giordani, Bruno; Titler, Marita; Gradus-Pizlo, Irmina; Smith, Dean; Dorsey, Susan G; Gao, Sujuan; Jung, Miyeon

    Memory loss is an independent predictor of mortality among heart failure patients. Twenty-three percent to 50% of heart failure patients have comorbid memory loss, but few interventions are available to treat the memory loss. The aims of this 3-arm randomized controlled trial were to (1) evaluate efficacy of computerized cognitive training intervention using BrainHQ to improve primary outcomes of memory and serum brain-derived neurotrophic factor levels and secondary outcomes of working memory, instrumental activities of daily living, and health-related quality of life among heart failure patients; (2) evaluate incremental cost-effectiveness of BrainHQ; and (3) examine depressive symptoms and genomic moderators of BrainHQ effect. A sample of 264 heart failure patients within 4 equal-sized blocks (normal/low baseline cognitive function and gender) will be randomly assigned to (1) BrainHQ, (2) active control computer-based crossword puzzles, and (3) usual care control groups. BrainHQ is an 8-week, 40-hour program individualized to each patient's performance. Data collection will be completed at baseline and at 10 weeks and 4 and 8 months. Descriptive statistics, mixed model analyses, and cost-utility analysis using intent-to-treat approach will be computed. This research will provide new knowledge about the efficacy of BrainHQ to improve memory and increase serum brain-derived neurotrophic factor levels in heart failure. If efficacious, the intervention will provide a new therapeutic approach that is easy to disseminate to treat a serious comorbid condition of heart failure.

  3. The effect of heart failure nurse consultations on heart failure patients' illness beliefs, mood and quality of life over a six-month period.

    PubMed

    Lucas, Rebecca; Riley, Jillian P; Mehta, Paresh A; Goodman, Helen; Banya, Winston; Mulligan, Kathleen; Newman, Stanton; Cowie, Martin R

    2015-01-01

    To explore the effect contact with a heart failure nurse can have on patients' illness beliefs, mood and quality of life. There is growing interest in patients' illness beliefs and the part they play in a patient's understanding of chronic disease. Secondary analysis was performed on two independent datasets. Patients were recruited from five UK hospitals, four in London and one in Sussex. Patients were recruited in both inpatient and outpatient settings. The first dataset recruited 174 patients with newly diagnosed heart failure, whilst the second dataset recruited 88 patients with an existing diagnosis of heart failure. Patients completed the Minnesota Living with Heart Failure Questionnaire, Hospital Anxiety and Depression Scale, Illness Perception Questionnaire and the Treatment Representations Inventory at baseline and six months. We used a linear regression model to assess the association that contact with a heart failure nurse had with mood, illness beliefs and quality of life over a six-month period. Patients who had contact with a heart failure nurse were more satisfied with their treatment and more likely to believe that their heart failure was treatable. Contact with a heart failure nurse did not make a statistically significant difference to mood or quality of life. This study has shown that contact with a heart failure nurse can improve patient satisfaction with treatment decisions but has less influence on a patient's beliefs about their personal control, treatment control and treatment concerns. With appropriate support, skills and training, heart failure nurses could play an important role in addressing individual patients' beliefs. There is a need to investigate this further. Exploring patients' illness beliefs and mood could help to enhance person-centred care. Heart failure nurses would need additional training in the techniques used. © 2014 John Wiley & Sons Ltd.

  4. Polymorphisms of Il-10 (-1082) and RANKL (-438) Genes and the Failure of Dental Implants

    PubMed Central

    Ribeiro, Rodrigo; Melo, Rayanne; Tortamano Neto, Pedro; Vajgel, André

    2017-01-01

    Background. Genetic polymorphisms in certain cytokines and chemokines have been investigated to understand why some individuals display implant flaws despite having few risk factors at the time of implant. Purpose. To investigate the association of genetic polymorphisms in interleukin- (IL-) 10 [-1082 region (A/G)] and RANKL [-438 region (A/G)] with the failure of dental implants. Materials and Methods. This study included 90 partially edentulous male and female patients who were rehabilitated with a total of 245 Straumann dental implants. An implant was considered a failure if any of the following occurred: mobility, persistent subjective complaint, recurrent peri-implant infection with suppuration, continuous radiolucency around the implant, probing depth ≥ 5 mm, and bleeding on probing. Buccal mucosal cells were collected for analysis of RANKL438 and IL-10. Results. The implant success rate in this population was 34.4%. The mutant allele (G) in RANKL had an incidence of 52.3% and mutant allele (A) in IL-10 was observed in 37.8%. No statistically significant difference was detected between the failure of the implant and the genotypes and allelic frequencies. Conclusion. No association was detected between the genetic polymorphisms of RANKL (-438) and IL-10 (-1082) and the failure of dental implants in the population studied. PMID:28348592

  5. Polymorphisms of Il-10 (-1082) and RANKL (-438) Genes and the Failure of Dental Implants.

    PubMed

    Ribeiro, Rodrigo; Melo, Rayanne; Tortamano Neto, Pedro; Vajgel, André; Souza, Paulo Roberto Eleutério; Cimões, Renata

    2017-01-01

    Background. Genetic polymorphisms in certain cytokines and chemokines have been investigated to understand why some individuals display implant flaws despite having few risk factors at the time of implant. Purpose. To investigate the association of genetic polymorphisms in interleukin- (IL-) 10 [-1082 region (A/G)] and RANKL [-438 region (A/G)] with the failure of dental implants. Materials and Methods. This study included 90 partially edentulous male and female patients who were rehabilitated with a total of 245 Straumann dental implants. An implant was considered a failure if any of the following occurred: mobility, persistent subjective complaint, recurrent peri-implant infection with suppuration, continuous radiolucency around the implant, probing depth ≥ 5 mm, and bleeding on probing. Buccal mucosal cells were collected for analysis of RANKL438 and IL-10. Results. The implant success rate in this population was 34.4%. The mutant allele (G) in RANKL had an incidence of 52.3% and mutant allele (A) in IL-10 was observed in 37.8%. No statistically significant difference was detected between the failure of the implant and the genotypes and allelic frequencies. Conclusion. No association was detected between the genetic polymorphisms of RANKL (-438) and IL-10 (-1082) and the failure of dental implants in the population studied.

  6. Loading-rate-independent delay of catastrophic avalanches in a bulk metallic glass

    DOE PAGES

    Chen, S. H.; Chan, K. C.; Wang, G.; ...

    2016-02-25

    The plastic flow of bulk metallic glasses (BMGs) is characterized by intermittent bursts of avalanches, and this behavior results in disastrous failures of BMGs. In the present work, a double-side-notched BMG specimen is designed, which exhibits chaotic plastic flow consisting of several catastrophic avalanches under the applied loading. The disastrous shear avalanches have then been delayed by forming a stable plastic-flow stage in specimens with tailored distances between the bottoms of the notches, where a complex stress field distribution is acquired. Differing from conventional compressive testing results, such a delaying process is independent of loading rate. The statistical analysis shows that in the specimens with delayed catastrophic failures, the plastic flow can evolve to a critical dynamics, making the catastrophic failure more predictable than in the ones with chaotic plastic flows. Lastly, the findings are of significance in understanding the plastic-flow mechanisms in BMGs and controlling avalanches in related solids.

  7. Enhanced Component Performance Study. Emergency Diesel Generators 1998–2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroeder, John Alton

    2014-11-01

    This report presents an enhanced performance evaluation of emergency diesel generators (EDGs) at U.S. commercial nuclear power plants. This report evaluates component performance over time using Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES) data from 1998 through 2013 and maintenance unavailability (UA) performance data using Mitigating Systems Performance Index (MSPI) Basis Document data from 2002 through 2013. The objective is to present an analysis of factors that could influence the system and component trends, in addition to annual performance trends of failure rates and probabilities. The factors analyzed for the EDG component are the differences in failures between all demands and actual unplanned engineered safety feature (ESF) demands, differences among manufacturers, and differences among EDG ratings. Statistical analyses of these differences are performed, and the results show whether pooling is acceptable across these factors. In addition, engineering analyses were performed with respect to time period and failure mode. The factors analyzed are: sub-component, failure cause, detection method, recovery, manufacturer, and EDG rating.

  8. Voltage stress effects on microcircuit accelerated life test failure rates

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.

    1976-01-01

    The applicability of Arrhenius and Eyring reaction rate models for describing microcircuit aging characteristics as a function of junction temperature and applied voltage was evaluated. The results of a matrix of accelerated life tests with a single metal oxide semiconductor microcircuit operated at six different combinations of temperature and voltage were used to evaluate the models. A total of 450 devices from two different lots were tested at ambient temperatures between 200 C and 250 C and applied voltages between 5 Vdc and 15 Vdc. A statistical analysis of the surface-related failure data resulted in bimodal failure distributions comprising two lognormal distributions: a 'freak' distribution observed early in time, and a 'main' distribution observed later in time. The Arrhenius model was shown to provide a good description of device aging as a function of temperature at a fixed voltage. The Eyring model also appeared to provide a reasonable description of main-distribution device aging as a function of temperature and voltage. Circuit diagrams are shown.
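
    A minimal sketch of the Arrhenius acceleration idea referenced above: time to failure is assumed to scale as exp(Ea/kT), so an acceleration factor between a stress temperature and a use temperature follows directly. The activation energy and temperatures are assumed values, not the study's estimates.

    ```python
    # A minimal sketch of an Arrhenius acceleration factor between a life-test
    # temperature and a use temperature; the activation energy is an assumed value.
    import numpy as np

    K_BOLTZMANN_EV = 8.617e-5      # Boltzmann constant, eV/K

    def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
        """AF = exp[(Ea/k) * (1/T_use - 1/T_stress)], with temperatures given in deg C."""
        t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
        return np.exp(ea_ev / K_BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_stress))

    # Accelerated test at 250 C versus assumed 85 C use conditions
    print(f"Acceleration factor: {arrhenius_af(85.0, 250.0):.0f}x")
    ```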

  9. A retrospective study of a modified 1-minute formocresol pulpotomy technique part 1: clinical and radiographic findings.

    PubMed

    Kurji, Zahra A; Sigal, Michael J; Andrews, Paul; Titley, Keith

    2011-01-01

    The purpose of this study was to assess the clinical and radiographic outcomes of a 1-minute application of full-strength Buckley's formocresol with concurrent hemostasis using the medicated cotton pledget in human primary teeth. Using a retrospective chart review, clinical and radiographic data were available for 557 primary molars in 320 patients. Descriptive statistics and survival analysis were used to assess outcomes. Overall clinical success, radiographic success, and cumulative 5-year survival rates were approximately 99%, 90%, and 87%, respectively. Internal root resorption (∼5%) and pulp canal obliteration (∼2%) were the most frequently observed radiographic failures. Thirty-nine teeth were extracted due to clinical and or radiographic failure. Mandibular molars were 6 times more prone to radiographic failure than maxillary molars. Success rates for the modified technique are comparable to techniques that use the 5-minute diluted or full-strength solutions reported in the literature. This 1-minute full-strength formocresol technique is an acceptable alternative to published traditional techniques.

  10. An application of artificial intelligence theory to reconfigurable flight control

    NASA Technical Reports Server (NTRS)

    Handelman, David A.

    1987-01-01

    Artificial intelligence techniques were used, along with statistical hypothesis testing and modern control theory, to help the pilot cope with the issues of information, knowledge, and capability in the event of a failure. An intelligent flight control system is being developed which utilizes knowledge of cause and effect relationships between all aircraft components. It will screen the information available to the pilot, supplement his knowledge, and, most importantly, utilize the remaining flight capability of the aircraft following a failure. The list of failure types the control system will accommodate includes sensor failures, actuator failures, and structural failures.

  11. An unjustified benefit: immortal time bias in the analysis of time-dependent events.

    PubMed

    Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg

    2018-02-01

    Immortal time bias is a problem arising from methodologically flawed analyses of time-dependent events in survival analyses. We illustrate the problem by the analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such a naive analysis assumes that patients were assigned to the two groups at the time of transplantation, whereas group membership is actually a consequence of a time-dependent event occurring later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if occurring after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss strengths and limitations of landmark analysis and other methods to address immortal time bias, such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
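
    The landmark procedure described above is straightforward to sketch in code. The following fragment (using the lifelines package) is a minimal, hedged illustration rather than the authors' implementation; the column names, the tiny synthetic demo data, and the 30-month landmark are assumptions for illustration only.

      import pandas as pd
      from lifelines.statistics import logrank_test

      # Assumed columns:
      #   followup_months  months from transplantation to death or censoring
      #   died             1 if death observed, 0 if censored
      #   gf_months        months to graft failure (NaN if none observed)
      LANDMARK = 30  # months, echoing one landmark used in the study

      def landmark_logrank(df, landmark=LANDMARK):
          # Only patients still alive and under follow-up at the landmark are analysed.
          at_risk = df[df["followup_months"] > landmark].copy()
          # Group membership is fixed by events that occurred *before* the landmark ...
          at_risk["graft_failed"] = at_risk["gf_months"].notna() & (at_risk["gf_months"] <= landmark)
          # ... and survival time is counted *after* the landmark, avoiding immortal time.
          at_risk["t_after"] = at_risk["followup_months"] - landmark
          gf = at_risk[at_risk["graft_failed"]]
          no_gf = at_risk[~at_risk["graft_failed"]]
          return logrank_test(gf["t_after"], no_gf["t_after"],
                              event_observed_A=gf["died"], event_observed_B=no_gf["died"])

      # Tiny synthetic demo (not study data).
      demo = pd.DataFrame({
          "followup_months": [12, 45, 80, 60, 35, 90],
          "died":            [1,  1,  0,  1,  0,  1],
          "gf_months":       [None, 20, None, 10, None, 50],
      })
      print(landmark_logrank(demo).p_value)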

  12. Probabilistic finite elements for fatigue and fracture analysis

    NASA Astrophysics Data System (ADS)

    Belytschko, Ted; Liu, Wing Kam

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variance, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.

  13. Probabilistic finite elements for fatigue and fracture analysis

    NASA Technical Reports Server (NTRS)

    Belytschko, Ted; Liu, Wing Kam

    1992-01-01

    Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variance, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
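
    The record names two outputs, response statistics and a limit-state failure probability, but not a specific algorithm. Purely as a loose illustration, the sketch below estimates both by plain Monte Carlo sampling, with a closed-form beam deflection standing in for a finite element response; every distribution and the allowable limit are assumed values, and this is not the PFEM formulation itself.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Assumed random inputs (illustrative only).
      E = rng.normal(200e9, 10e9, n)              # Young's modulus, Pa
      P = rng.lognormal(np.log(10e3), 0.2, n)     # tip load, N
      L, I = 2.0, 8.0e-6                          # deterministic length (m) and inertia (m^4)

      # Closed-form cantilever tip deflection stands in for an FE "response".
      deflection = P * L**3 / (3 * E * I)

      limit = 0.02                                # assumed allowable deflection, m
      g = limit - deflection                      # limit state: g <= 0 means failure

      print("mean response :", deflection.mean())
      print("response std  :", deflection.std(ddof=1))
      print("P(failure)    :", np.mean(g <= 0))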

  14. Preventing distal pullout of posterior spine instrumentation in thoracic hyperkyphosis: a biomechanical analysis.

    PubMed

    Sun, Edward; Alkalay, Ron; Vader, David; Snyder, Brian D

    2009-06-01

    An in vitro biomechanical study. Compare the mechanical behavior of 5 different constructs used to terminate dual-rod posterior spinal instrumentation in resisting forward flexion moment. Failure of the distal fixation construct can be a significant problem for patients undergoing surgical treatment for thoracic hyperkyphosis. We hypothesize that augmenting distal pedicle screws with infralaminar hooks or sublaminar cables significantly increases the strength and stiffness of these constructs. Thirty-seven thoracolumbar (T12 to L2) calf spines were implanted with 5 configurations of distal constructs: (1) infralaminar hooks, (2) sublaminar cables, (3) pedicle screws, (4) pedicle screws + infralaminar hooks, and (5) pedicle screws + sublaminar cables. A progressive bending moment was applied to each construct until failure. The mode of failure was noted, and the construct's stiffness and failure load were determined from the load-displacement curves. Bone density and vertebral dimensions were equivalent among the groups (F = 0.1 to 0.9, P > 0.05). One-way analysis of covariance (adjusted for differences in density and vertebral dimension) demonstrated that all of the screw constructs (screw, screw + hook, and screw + cable) exhibited significantly higher stiffness and ultimate failure loads compared with either sublaminar hook or cable alone (P < 0.05). The screw + hook constructs (109 ± 11 Nm/mm) were significantly stiffer than either screws alone (88 ± 17 Nm/mm) or screw + cable constructs (98 ± 13 Nm/mm), P < 0.05. The screw + cable construct exhibited a significantly higher failure load (1336 ± 328 N) compared with the screw constructs (1102 ± 256 N, P < 0.05), but was not statistically different from the screw + hook construct (1220 ± 75 N). The cable and hook constructs failed by laminar fracture, the screw construct failed in uniaxial shear (pullout), whereas the screws + (hooks or wires) constructs failed by fracture of the caudal vertebral body. Posterior dual-rod constructs fixed distally using pedicle screws were stiffer and stronger in resisting forward flexion compared with cables or hooks alone. Augmenting these screws with either infralaminar hooks or sublaminar cables provided additional resistance to failure.

  15. Radiographic failure and rates of re-operation after acromioclavicular joint reconstruction: a comparison of surgical techniques.

    PubMed

    Spencer, H T; Hsu, L; Sodl, J; Arianjam, A; Yian, E H

    2016-04-01

    To compare radiographic failure and re-operation rates of anatomical coracoclavicular (CC) ligament reconstruction techniques with non-anatomical techniques after chronic high-grade acromioclavicular (AC) joint injuries. We reviewed chronic AC joint reconstructions within a region-wide healthcare system to identify surgical technique, complications, radiographic failure and re-operations. Procedures fell into four categories: (1) modified Weaver-Dunn, (2) allograft fixed through coracoid and clavicular tunnels, (3) allograft loop coracoclavicular fixation, and (4) combined allograft loop and synthetic cortical button fixation. Among 167 patients (mean age 38.1 years, standard deviation (sd) 14.7) treated at least four weeks after injury, 154 had post-operative radiographs available for analysis. Radiographic failure occurred in 33/154 cases (21.4%), with the lowest rate in Technique 4 (2/42, 4.8%; p = 0.001). Half the failures occurred by six weeks, and the Kaplan-Meier survivorship at 24 months was 94.4% (95% confidence interval (CI) 79.6 to 98.6) for Technique 4 and 69.9% (95% CI 59.4 to 78.3) for the other techniques combined. In multivariable survival analysis, Technique 4 had better survival than the other techniques (Hazard Ratio 0.162, 95% CI 0.039 to 0.068, p = 0.013). Among 155 patients with a minimum of six months of post-operative insurance coverage, re-operation occurred in 9.7% (15 patients). However, in multivariable logistic regression, Technique 4 did not reach a statistically significantly lower risk for re-operation (odds ratio 0.254, 95% CI 0.05 to 1.3, p = 0.11). In this retrospective series, anatomical CC ligament reconstruction using combined synthetic cortical button and allograft loop fixation had the lowest rate of radiographic failure. ©2016 The British Editorial Society of Bone & Joint Surgery.

  16. Time-related patterns of ventricular shunt failure.

    PubMed

    Kast, J; Duong, D; Nowzari, F; Chadduck, W M; Schiff, S J

    1994-11-01

    Proximal obstruction is reported to be the most common cause of ventriculoperitoneal (VP) shunt failure, suggesting that imperfect ventricular catheter placement and inadequate valve mechanisms are major causes. This study retrospectively examined patterns of shunt failure in 128 consecutive patients with symptoms of shunt malfunction over a 2-year period. Factors analyzed included site of failure, time from shunt placement or last revision to failure, age of patient at time of failure, infections, and primary etiology of the hydrocephalus. One hundred of these patients required revisions; 14 revisions were due to infections. In this series there was a higher incidence of distal (43%) than of proximal (35%) failure. The difference was not statistically significant when the overall series was considered; however, when factoring time to failure as a variable, marked differences were noted regardless of the underlying cause of hydrocephalus or the age of the patient. Of the 49 patients needing a shunt revision or replacement within 2 years of the previous operation, 50% had proximal malfunction, 14% distal, and 10% had malfunctions attributable directly to the valve itself. Also, 12 of the 14 infections occurred during this time interval. In sharp contrast, of the 51 patients having shunt failure from 2 to more than 12 years after the previous procedure, 72% had distal malfunction, 21% proximal, and only 6% had a faulty valve or infection. This difference between time to failure for proximal versus distal failures was statistically significant (P < 0.00001 for both Student's t-test and non-parametric Mann-Whitney U-test).(ABSTRACT TRUNCATED AT 250 WORDS)

  17. Impact of Tricuspid Regurgitation on the Success of Atrioventricular Node Ablation for Rate Control in Patients With Atrial Fibrillation: The Node Blast Study.

    PubMed

    Reddy, Yeruva Madhu; Gunda, Sampath; Vallakati, Ajay; Kanmanthareddy, Arun; Pillarisetti, Jayasree; Atkins, Donita; Bommana, Sudharani; Emert, Martin P; Pimentel, Rhea; Dendi, Raghuveer; Berenbom, Loren D; Lakkireddy, Dhanunjaya

    2015-09-15

    Atrioventricular node (AVN) ablation is an effective treatment for symptomatic patients with atrial arrhythmias who are refractory to rhythm and rate control strategies and in whom optimal ventricular rate control is desired. There are limited data on the predictors of failure of AVN ablation. Our objective was to identify the predictors of failure of AVN ablation. This is an observational single-center study of consecutive patients who underwent AVN ablation in a large academic center. Baseline characteristics, procedural variables, and outcomes of AVN ablation were collected. AVN "ablation failure" was defined as resumption of AVN conduction resulting in recurrence of either rapid ventricular response or suboptimal biventricular pacing. A total of 247 patients with drug-refractory AF who underwent AVN ablation at our center, with a mean age of 71 ± 12 years and 46% males, were included. Ablation failure was seen in 11 (4.5%) patients. There were no statistical differences between patients with "ablation failure" and those with "ablation success" in any of the baseline clinical variables. Patients with moderate-to-severe tricuspid regurgitation (TR) were much more likely to have ablation failure than those with ablation success (8 [73%] vs 65 [27%]; p = 0.003). All 11 patients with ablation failure had a successful redo procedure, 9 with a right-sided and 2 with a left-sided approach. On multivariate analysis, presence of moderate-to-severe TR was found to be the only predictor of failure of AVN ablation (odds ratio 9.1, confidence interval 1.99 to 42.22, p = 0.004). In conclusion, moderate-to-severe TR is a strong and independent predictor of failure of AVN ablation. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Incidence and Determinants of Dental Implant Failure: A Review of Electronic Health Records in a U.S. Dental School.

    PubMed

    Hickin, Matthew Parker; Shariff, Jaffer A; Jennette, Philip J; Finkelstein, Joseph; Papapanou, Panos N

    2017-10-01

    The aim of this study was to use electronic health care records (EHRs) to examine retrospectively the incidence of and attributes associated with dental implant failures necessitating implant removal in a large cohort of patients treated in the student clinics of a U.S. dental school over three and a half years. EHRs were searched for all patients who received dental implants between July 1, 2011, and December 31, 2014. Characteristics of patients and implants that were actively removed due to irrevocable failure of any etiology ("failure cohort") during this period were compared to those of all other patients who received dental implants during the same time frame ("reference cohort"). Differences in the frequency distribution of various characteristics between the failure and reference cohorts were compared. Of a total 6,129 implants placed in 2,127 patients during the study period, 179 implants (2.9%) in 120 patients (5.6%) were removed. In the multivariate analysis, presence of a removable (OR=2.86) or fixed temporary prosthesis (OR=3.71) was statistically significantly associated with increased risk for implant failure. In contrast, antibiotic coverage (pre- and post-surgery OR=0.16; post-surgery only OR=0.38) and implants of certain manufacturers were associated with lower risk of implant failure. In this sizeable cohort of patients receiving care in dental student clinics, the review of EHRs facilitated identification of multiple variables associated with implant failure resulting in removal; however, these findings do not suggest causative relationships. The adopted analytical approach can enhance quality assurance measures and may contribute to the identification of true risk factors for dental implant failure.

  19. Risk Factors for Failure of Male Slings and Artificial Urinary Sphincters: Results from a Large Middle European Cohort Study.

    PubMed

    Hüsch, Tanja; Kretschmer, Alexander; Thomsen, Frauke; Kronlachner, Dominik; Kurosch, Martin; Obaje, Alice; Anding, Ralf; Pottek, Tobias; Rose, Achim; Olianas, Roberto; Friedl, Alexander; Hübner, Wilhelm; Homberg, Roland; Pfitzenmaier, Jesco; Grein, Ulrich; Queissert, Fabian; Naumann, Carsten Maik; Schweiger, Josef; Wotzka, Carola; Nyarangi-Dix, Joanne; Hofmann, Torben; Ulm, Kurt; Bauer, Ricarda M; Haferkamp, Axel

    2017-01-01

    We analysed the impact of predefined risk factors (age, diabetes, history of pelvic irradiation, prior surgery for stress urinary incontinence (SUI), prior urethral stricture, additional procedures during SUI surgery, duration of incontinence, ASA classification and cause of incontinence) on failure and complications in male SUI surgery. We retrospectively identified 506 patients with an artificial urinary sphincter (AUS) and 513 patients with a male sling (MS) in a multicenter cohort study. Complication rates were correlated to the risk factors in univariate analysis. Subsequently, a multivariate logistic regression adjusted for the risk factors was performed. A p value <0.05 was considered statistically significant. A history of pelvic irradiation was an independent risk factor for explantation in AUS (p < 0.001) and MS (p = 0.018). Moreover, prior urethral stricture (p = 0.036) and higher ASA classification (p = 0.039) were positively correlated with explantation in univariate analysis for AUS. Urethral erosion was correlated with prior urethral stricture (p < 0.001) and a history of pelvic irradiation (p < 0.001) in AUS. Furthermore, infection was correlated with additional procedures during SUI surgery in univariate analysis (p = 0.037) in MS. This is the first report of a correlation between higher ASA classification and explantation in AUS. Nevertheless, only a few novel risk factors had a significant influence on the failure of MS or AUS. © 2016 S. Karger AG, Basel.

  20. Revision of Environmental Factors for Mil-HDBK-217B

    DTIC Science & Technology

    1980-09-01

    and directions for application have been included. Appropriate revision sheets to MIL-HDBK-217 have been provided as an appendix to the final report. ...of this study. In anticipation of insufficient data for direct statistical analysis in all categories of the study, it was determined that expert...reliability was already known. The resulting mode failure rates were too low by a factor of about 8 to 1. This method was more accurate than the average part

  1. Universal avalanche statistics and triggering close to failure in a mean-field model of rheological fracture

    NASA Astrophysics Data System (ADS)

    Baró, Jordi; Davidsen, Jörn

    2018-03-01

    The hypothesis of critical failure relates the presence of an ultimate stability point in the structural constitutive equation of materials to a divergence of characteristic scales in the microscopic dynamics responsible for deformation. Avalanche models involving critical failure have determined common universality classes for stick-slip processes and fracture. However, not all empirical failure processes exhibit the trademarks of criticality. The rheological properties of materials introduce dissipation, usually reproduced in conceptual models as a hardening of the coarse grained elements of the system. Here, we investigate the effects of transient hardening on (i) the activity rate and (ii) the statistical properties of avalanches. We find the explicit representation of transient hardening in the presence of generalized viscoelasticity and solve the corresponding mean-field model of fracture. In the quasistatic limit, the accelerated energy release is invariant with respect to rheology and the avalanche propagation can be reinterpreted in terms of a stochastic counting process. A single universality class can be defined from such analogy, and all statistical properties depend only on the distance to criticality. We also prove that interevent correlations emerge due to the hardening—even in the quasistatic limit—that can be interpreted as "aftershocks" and "foreshocks."

  2. Reliability and mode of failure of bonded monolithic and multilayer ceramics.

    PubMed

    Alessandretti, Rodrigo; Borba, Marcia; Benetti, Paula; Corazza, Pedro Henrique; Ribeiro, Raissa; Della Bona, Alvaro

    2017-02-01

    To evaluate the reliability of monolithic and multilayer ceramic structures used in the CAD-on technique (Ivoclar), and the mode of failure produced in ceramic structures bonded to a dentin analog material (NEMA-G10). Ceramic specimens were fabricated as follows (n=30): CAD-on, trilayer structure (IPS e.max ZirCAD/IPS e.max Crystall./Connect/IPS e.max CAD); YLD, bilayer structure (IPS e.max ZirCAD/IPS e.max Ceram); LDC, monolithic structure (IPS e.max CAD); and YZW, monolithic structure (Zenostar Zr Translucent). All ceramic specimens were bonded to G10 and subjected to compressive load in 37°C distilled water until the sound of the first crack, monitored acoustically. Failure load (Lf) values were recorded (N) and statistically analyzed using Weibull distribution, Kruskal-Wallis test, and Student-Newman-Keuls test (α=0.05). Lf values of CAD-on and YZW structures were statistically similar (p=0.917), but higher than YLD and LDC (p<0.01). Weibull modulus (m) values were statistically similar for all experimental groups. Monolithic structures (LDC and YZW) failed from radial cracks. Failures in the CAD-on and YLD groups showed, predominantly, both radial and cone cracks. Monolithic zirconia (YZW) and CAD-on structures showed similar failure resistance and reliability, but a different fracture behavior. Copyright © 2016 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
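
    A minimal sketch of the kind of two-parameter Weibull fit of failure loads mentioned above, using scipy. The failure-load values are synthetic placeholders, not the study's measurements, and fixing the location parameter at zero is an assumption.

      import numpy as np
      from scipy import stats

      # Synthetic failure loads (N) for illustration only.
      failure_loads = stats.weibull_min.rvs(c=8.0, scale=900.0, size=30, random_state=1)

      # With the location fixed at zero, the shape is the Weibull modulus (m) and the
      # scale is the characteristic failure load.
      m, loc, eta = stats.weibull_min.fit(failure_loads, floc=0)
      print(f"Weibull modulus m = {m:.2f}, characteristic load = {eta:.0f} N")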

  3. Vertical ridge augmentation with autogenous bone grafts 3 years after loading: resorbable barriers versus titanium-reinforced barriers. A randomized controlled clinical trial.

    PubMed

    Merli, Mauro; Lombardini, Francesco; Esposito, Marco

    2010-01-01

    To compare the efficacy of two different techniques for vertical bone regeneration at implant placement with particulated autogenous bone at 3 years after loading: resorbable collagen barriers supported by osteosynthesis plates and nonresorbable titanium-reinforced expanded polytetrafluoroethylene barriers. Twenty-two partially edentulous patients requiring vertical bone augmentation were randomly allocated to two treatment groups, each composed of 11 patients. Prosthetic and implant failures, complications, the amount of vertically regenerated bone, and peri-implant marginal bone levels were recorded by independent and blinded assessors. The implant site requiring the most vertical bone regeneration was selected in each patient for bone level assessment. The follow-up time ranged from provisional loading to 3 years after loading. Analysis of covariance and paired t tests were conducted to compare means at the .05 level of significance. No patient dropped out or was excluded at the 3-year follow-up. No prosthetic failures and no implant failures or complications occurred after loading. There was no statistically significant difference in bone loss between the two groups at either 1 year or 3 years. Both groups had gradually lost a statistically significant amount of peri-implant bone at 1 and 3 years (P < .05). After 3 years, patients treated with resorbable barriers had lost a mean of 0.55 mm of bone; patients who had received nonresorbable barriers showed a mean of 0.53 mm of bone loss. Up to 3 years after implant loading, no failures or complications occurred and peri-implant marginal bone loss was minimal. Vertically regenerated bone can be successfully maintained after functional loading.

  4. The Effect of Scale Dependent Discretization on the Progressive Failure of Composite Materials Using Multiscale Analyses

    NASA Technical Reports Server (NTRS)

    Ricks, Trenton M.; Lacy, Thomas E., Jr.; Pineda, Evan J.; Bednarcyk, Brett A.; Arnold, Steven M.

    2013-01-01

    A multiscale modeling methodology, which incorporates a statistical distribution of fiber strengths into coupled micromechanics/finite element analyses, is applied to unidirectional polymer matrix composites (PMCs) to analyze the effect of mesh discretization at both the micro- and macroscales on the predicted ultimate tensile strength (UTS) and failure behavior. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a PMC tensile specimen that initiates at the repeating unit cell (RUC) level. Three different finite element mesh densities were employed, each coupled with an appropriate RUC. Multiple simulations were performed in order to assess the effect of a statistical distribution of fiber strengths on the bulk composite failure and predicted strength. The coupled effects of the micro- and macroscale discretizations were found to have a noticeable effect on the predicted UTS and the computational efficiency of the simulations.

  5. Axial Length Measurement Failure Rates With Biometers Using Swept-Source Optical Coherence Tomography Compared to Partial-Coherence Interferometry and Optical Low-Coherence Interferometry.

    PubMed

    McAlinden, Colm; Wang, Qinmei; Gao, Rongrong; Zhao, Weiqi; Yu, Ayong; Li, Yu; Guo, Yan; Huang, Jinhai

    2017-01-01

    To compare a new swept-source optical coherence tomography (SSOCT)-based biometer (OA-2000) with the IOLMaster v5.4 (partial-coherence interferometry) and Aladdin (optical low-coherence interferometry) biometers in terms of axial length measurement and failure rate in eyes with cataract. Reliability study. A total of 377 eyes of 210 patients were scanned with the 3 biometers in a random order. For each biometer, the number of unobtainable axial length measurements was recorded and grouped as per the type and severity of cataract based on the Lens Opacities Classification System III by the same experienced ophthalmologist. The Bland-Altman limits-of-agreement (LoA) method was used to assess the agreement in axial length measurements between the 3 biometers. The failure rate was 0 eyes (0%) with the OA-2000, 136 eyes (36.07%) with the IOLMaster, and 51 eyes (13.53%) with the Aladdin. χ² analyses indicated a significant difference in failure rate between all 3 devices (P < .001). Logistic regression analysis highlighted a statistically significant trend of higher failure rates with increasing severity of nuclear, cortical, and posterior subcapsular cataracts. Bland-Altman statistics indicated small mean differences and narrow LoA (OA-2000 vs IOLMaster -0.09 to 0.08 mm; OA-2000 vs Aladdin -0.10 to 0.07 mm; IOLMaster vs Aladdin -0.05 to 0.04 mm). The OA-2000, a new SSOCT-based biometer, outperformed both the IOLMaster and Aladdin biometers in very advanced cataracts of various morphologies. The use of SSOCT technology may be the reason for the improved performance of the OA-2000 and may lead to this technology becoming the gold standard for the measurement of axial length. Copyright © 2016 Elsevier Inc. All rights reserved.
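
    Bland-Altman limits of agreement, as used above, reduce to the mean of the paired differences plus or minus 1.96 standard deviations. The sketch below shows the computation on made-up axial-length pairs; none of the numbers come from the study.

      import numpy as np

      # Paired axial-length readings (mm) from two devices; placeholder values only.
      a = np.array([23.10, 24.52, 22.98, 25.40, 23.75, 24.10])
      b = np.array([23.14, 24.49, 23.05, 25.33, 23.70, 24.18])

      diff = a - b
      mean_diff = diff.mean()
      sd_diff = diff.std(ddof=1)
      loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)
      print(f"mean difference = {mean_diff:+.3f} mm, 95% LoA = ({loa[0]:+.3f}, {loa[1]:+.3f}) mm")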

  6. Comparison of fracture strength and failure mode of different ceramic implant abutments.

    PubMed

    Elsayed, Adham; Wille, Sebastian; Al-Akhali, Majed; Kern, Matthias

    2017-04-01

    The whitish color of zirconia (ZrO2) abutments offers favorable esthetics compared with the grayish color of titanium (Ti) abutments. Nonetheless, ZrO2 has greater opacity, making it difficult to achieve a natural tooth color. Therefore, lithium disilicate (LaT) abutments have been suggested to replace metal abutments. The purpose of this in vitro study was to evaluate the fracture strength and failure mode of single-tooth implant restorations using ZrO2 and LaT abutments, and to compare them with titanium (Ti) abutments. Five different types of abutments, Ti; ZrO2 with no metal base; ZrO2 with a metal base (ZrT); LaT; and a LaT combination abutment and crown (LcT), were assembled on 40 Ti implants and restored with LaT crowns. Specimens were subjected to quasistatic loading using a universal testing machine until the implant-abutment connection failed. As bending of the metal would be considered a clinical failure, the values of force (N) at which plastic deformation of the metal occurred were calculated, and the rate of deformation was analyzed. Statistical analysis was done using the Mann-Whitney U test (α=.05). Group ZrO2 revealed the lowest resistance to failure, with a mean of 202 ± 33 N. Groups ZrT, LaT, and LcT withstood higher forces without fracture or debonding of the ceramic suprastructure, and failure was due to deformation of the metal bases, with no statistically significant differences between these groups regarding the bending behavior. Within the limitations of this in vitro study, it was concluded that LaT abutments have the potential to withstand the physiological occlusal forces that occur in the anterior region and that ZrO2 abutments combined with Ti inserts have much higher fracture strength than pure ZrO2 abutments. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  7. Biomechanical analysis of tension band fixation for olecranon fracture treatment.

    PubMed

    Kozin, S H; Berglund, L J; Cooney, W P; Morrey, B F; An, K N

    1996-01-01

    This study assessed the strength of various tension band fixation methods with wire and cable applied to simulated olecranon fractures, to compare stability and potential failure or complications between the two. Transverse olecranon fractures were simulated by osteotomy. The fracture was anatomically reduced, and various tension band fixation techniques were applied with monofilament wire or multifilament cable. With a material testing machine, load-displacement curves were obtained, and statistical significance was determined by analysis of variance. Two loading modes were tested: loading on the posterior surface of the olecranon to simulate triceps pull, and loading on the anterior olecranon tip to recreate a potential compressive loading on the fragment during resistive flexion. All fixation methods were more resistant to posterior loading than to an anterior load. Individual comparative analysis for the various loading conditions concluded that tension band fixation is more resilient to tensile forces exerted by the triceps than to compressive forces on the anterior olecranon tip. Neither wire passage anterior to the K-wires nor the multifilament cable provided a statistically significant increase in stability.

  8. Defect design of insulation systems for photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.

    1981-01-01

    A defect-design approach to sizing electrical insulation systems for terrestrial photovoltaic modules is presented. It consists of gathering voltage-breakdown statistics on various thicknesses of candidate insulation films where, for a designated voltage, module failure probabilities for enumerated thickness and number-of-layer film combinations are calculated. Cost analysis then selects the most economical insulation system. A manufacturing yield problem is solved to exemplify the technique. Results for unaged Mylar suggest using fewer layers of thicker films. Defect design incorporates effects of flaws in optimal insulation system selection, and obviates choosing a tolerable failure rate, since the optimization process accomplishes that. Exposure to weathering and voltage stress reduces the voltage-withstanding capability of module insulation films. Defect design, applied to aged polyester films, promises to yield reliable, cost-optimal insulation systems.
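
    The defect-design idea above can be caricatured in a few lines: estimate a per-layer breakdown probability at the design voltage, combine layers, and choose the cheapest stack that meets a failure-probability target. The sketch below is only a toy under stated assumptions (a Weibull-type breakdown law, independence across layers, made-up thicknesses, costs and targets); it is not the report's procedure, and it reproduces the "fewer layers of thicker film" outcome only by construction of the example numbers.

      import math

      DESIGN_VOLTAGE = 3000.0   # volts (assumed)
      TARGET_PFAIL = 2e-4       # allowed module failure probability (assumed)

      def layer_breakdown_prob(thickness_um, v, v0_per_um=100.0, beta=6.0):
          """Weibull-type probability that one layer of given thickness breaks down at v."""
          v63 = v0_per_um * thickness_um      # characteristic breakdown voltage (assumed scaling)
          return 1.0 - math.exp(-((v / v63) ** beta))

      # Candidate designs: (thickness per layer in um, number of layers, relative cost).
      candidates = [(75, 1, 1.0), (75, 2, 1.9), (125, 1, 1.4), (125, 2, 2.7)]

      feasible = []
      for t, n_layers, cost in candidates:
          p_layer = layer_breakdown_prob(t, DESIGN_VOLTAGE)
          p_module = p_layer ** n_layers      # all layers must fail at the same defect site
          if p_module <= TARGET_PFAIL:
              feasible.append((cost, t, n_layers, p_module))

      # Cheapest design that meets the reliability target.
      print(min(feasible) if feasible else "no candidate meets the target")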

  9. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire-failure RTOs, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTOs in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics, with emphasis on failed-tire RTOs. This background information could enhance the split-second decision making that is required prior to initiating an RTO.

  10. Analysis of First-Time Unsuccessful Attempts on the Certified Nurse Educator Examination.

    PubMed

    Lundeen, John D

    This retrospective analysis examined first-time unsuccessful attempts on the Certified Nurse Educator (CNE) examination from September 2005 through September 2011 (n = 390). There are few studies examining certification within the academic nurse educator role. There is also a lack of evidence to assist nurse educators in understanding the factors that best support success on the CNE exam. A nonexperimental, descriptive, retrospective correlational design was used, with chi-square tests of independence and factorial analyses of variance. A statistically significant relationship was found between first-time failure on the CNE exam and both highest degree obtained and institutional affiliation. There was no statistically significant effect on mean scores in any of the six content areas measured by the CNE exam as related to highest degree or institutional affiliation. The findings from this study support a previous recommendation for faculty development, experience in the role, and doctoral preparation prior to seeking certification.

  11. Impact of Different Surgeons on Dental Implant Failure.

    PubMed

    Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    To assess the influence of several factors on the prevalence of dental implant failure, with special consideration of the placement of implants by different dental surgeons. This retrospective study is based on 2,670 patients who received 10,096 implants at one specialist clinic. Only the data of patients and implants treated by surgeons who had inserted a minimum of 200 implants at the clinic were included. Kaplan-Meier curves were stratified with respect to the individual surgeon. A generalized estimating equation (GEE) method was used to account for the fact that repeated observations (several implants) were placed in a single patient. The factors bone quantity, bone quality, implant location, implant surface, and implant system were analyzed with descriptive statistics separately for each individual surgeon. A total of 10 surgeons were eligible. The differences between the survival curves of each individual were statistically significant. The multivariate GEE model showed the following variables to be statistically significant: surgeon, bruxism, intake of antidepressants, location, implant length, and implant system. The surgeon with the highest absolute number of failures was also the one who inserted the most implants in sites of poor bone and used turned implants in most cases, whereas the surgeon with the lowest absolute number of failures used mainly modern implants. Separate survival analyses of turned and modern implants stratified for the individual surgeon showed statistically significant differences in cumulative survival. Different levels of failure incidence could be observed between the surgeons, occasionally reaching significant levels. Although a direct causal relationship could not be ascertained, the results of the present study suggest that the surgeons' technique, skills, and/or judgment may negatively influence implant survival rates.

  12. Therapeutic effect comparison of hepatocyte-like cells and bone marrow mesenchymal stem cells in acute liver failure of rats.

    PubMed

    Li, Dongliang; Fan, Jingjing; He, Xiuhua; Zhang, Xia; Zhang, Zhiqiang; Zeng, Zhiyu; Ruan, Mei; Cai, Lirong

    2015-01-01

    To evaluate the therapeutic efficacy of rat bone marrow mesenchymal stem cells (BMSCs) induced into hepatocyte-like cells and of un-induced BMSCs in rats with acute liver failure. Highly homogeneous passage-3 BMSCs were cultured using the whole bone marrow adherent culture method. Hepatic-related characteristics were confirmed by morphology, RT-PCR analysis, glycogen staining and albumin (ALB) immunofluorescence assay. Carbon tetrachloride (CCl4) was injected intraperitoneally to establish an acute rat liver failure model. Hepatocyte-like cells or un-induced BMSCs were then injected into the model animals, and the rats' appearance, liver function and liver tissue pathology were examined. Hepatocyte-like morphology, higher expression of cytokeratin 18 (CK18) mRNA and ALB protein, and glycogen accumulation were confirmed in the induced BMSCs. The transplanted DAPI-labeled BMSCs were localized in the liver tissue 3-14 days after transplantation. The levels of liver function indicators (AST, ALT, ALP, and TBIL) in transplanted rats were significantly decreased and the pathology was improved, indicating recovery of liver function. However, the differences between the two cell types were not statistically significant. Both hepatocyte-like cells and un-induced BMSCs had a similarly positive therapeutic effect on liver regeneration in the rat liver failure model.

  13. Advanced chronic kidney disease in non-valvular atrial fibrillation: extending the utility of R2CHADS2 to patients with advanced renal failure.

    PubMed

    Bautista, Josef; Bella, Archie; Chaudhari, Ashok; Pekler, Gerald; Sapra, Katherine J; Carbajal, Roger; Baumstein, Donald

    2015-04-01

    The R2CHADS2 is a new prediction rule for stroke risk in atrial fibrillation (AF) patients, wherein R stands for renal risk. However, it was created from a cohort that excluded patients with advanced renal failure (defined as a glomerular filtration rate of <30 mL/min). Our study extends the use of R2CHADS2 to patients with advanced renal failure and aims to compare its predictive power against the currently used CHADS2 and CHA2DS2VaSc. This retrospective cohort study analyzed the 1-year risk for stroke of 524 patients with AF at Metropolitan Hospital Center. AUC and C-statistics were calculated using three groups: (i) the entire cohort including patients with advanced renal failure, (ii) a cohort excluding patients with advanced renal failure and (iii) patients with GFR < 30 mL/min only. R2CHADS2, as a predictor of stroke risk, consistently performed better than CHADS2 and CHA2DS2VaSc in groups 1 and 2. The C-statistic was higher for R2CHADS2 than for CHADS2 or CHA2DS2VaSc in group 1 (0.718 versus 0.605 versus 0.602) and in group 2 (0.724 versus 0.584 versus 0.579). However, there was no statistically significant difference in group 3 (0.631 versus 0.629 versus 0.623). Our study supports the utility of R2CHADS2 as a clinical prediction rule for stroke risk in patients with advanced renal failure.
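
    The C-statistic reported above is the area under the ROC curve for a risk score against the observed outcome. The sketch below shows that computation with scikit-learn on synthetic scores; both the data and the apparent ranking of the scores are illustrative assumptions, not study results.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      stroke = rng.integers(0, 2, size=200)                   # 1 = stroke within 1 year (synthetic)
      score_r2chads2 = stroke * 1.2 + rng.normal(0, 1, 200)   # better-separating score (assumed)
      score_chads2 = stroke * 0.5 + rng.normal(0, 1, 200)     # weaker score (assumed)

      print("C-statistic, R2CHADS2-like score:", round(roc_auc_score(stroke, score_r2chads2), 3))
      print("C-statistic, CHADS2-like score  :", round(roc_auc_score(stroke, score_chads2), 3))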

  14. Increased hospital admissions associated with extreme-heat exposure in King County, Washington, 1990-2010

    PubMed Central

    Isaksen, Tania Busch; Yost, Michael G.; Hom, Elizabeth K.; Ren, You; Lyons, Hilary; Fenske, Richard A.

    2016-01-01

    Increased morbidity and mortality have been associated with extreme heat events, particularly in temperate climates. Few epidemiologic studies have considered the impact of extreme heat events on hospitalization rates in the Pacific Northwest region. This study quantifies the historical (May to September, 1990-2010) heat-morbidity relationship in the most populous Pacific Northwest county: King County, Washington. A relative risk (RR) analysis was used to explore the association between heat and all non-traumatic hospitalizations on 99th-percentile heat days, while a time series analysis using a piecewise linear model approximation was used to estimate the effect of heat intensity on hospitalizations, adjusted for temporal trends and day of the week. A non-statistically significant 2% [95% CI: 1.02 (0.98, 1.05)] increase in hospitalization risk, on a heat day versus a non-heat day, was noted for all ages and all non-traumatic causes. When considering the effect of heat intensity on admissions, we found a statistically significant 1.59% (95% CI: 0.9%, 2.29%) increase in admissions per degree increase in humidex above 37.4 °C. Admissions stratified by cause and age produced statistically significant results with both relative risk and time series analyses for nephritis and nephrotic syndromes, acute renal failure and natural heat exposure hospitalizations. This study demonstrates that heat, expressed as humidex, is associated with increased hospital admissions. When stratified by age and cause of admission, the non-elderly (less than 85 years) age groups experienced significant risk for nephritis and nephrotic syndromes, acute renal failure, natural heat exposure, COPD and asthma hospitalizations. PMID:25719287
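
    A piecewise linear ("hockey-stick") effect of heat intensity, as described above, can be approximated with a Poisson regression on the excess of humidex above a threshold. The sketch below uses statsmodels on simulated daily counts; the threshold mirrors the 37.4 °C figure, but all data and coefficients are made up and this is not the study's model.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      days = 500
      humidex = rng.normal(28, 6, days)
      excess = np.clip(humidex - 37.4, 0, None)              # heat intensity above the threshold
      trend = np.arange(days) / days                         # crude stand-in for temporal trend
      true_rate = np.exp(np.log(40) + 0.016 * excess + 0.05 * trend)
      admissions = rng.poisson(true_rate)                    # simulated daily counts

      X = sm.add_constant(np.column_stack([excess, trend]))
      fit = sm.GLM(admissions, X, family=sm.families.Poisson()).fit()
      pct_per_degree = 100 * (np.exp(fit.params[1]) - 1)     # % increase per degree above threshold
      print(f"Estimated increase per degree of humidex above 37.4: {pct_per_degree:.2f}%")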

  15. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique

    PubMed Central

    Khatoon, Farheen

    2015-01-01

    Background: Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. Aim: The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce the incidence of complications by introduction of the Six Sigma methodology. Materials and Methods: The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most frequently recurring complications. A paired z-test (in Minitab) and Fisher’s exact test were used to statistically analyse the data, with p<0.05 considered significant. Results: A total of 54 systemic and 62 local complications occurred during the three-month Analyse and Measure phases. Syncope, failure of anaesthesia, trismus, self-inflicted bite injuries (auto-mordeduras) and pain at the injection site were the most frequently recurring complications. The cumulative defective percentage was 7.99 for the pre-improvement data and decreased to 4.58 in the control phase. The estimate for the difference was 0.0341228 and the 95% lower bound for the difference was 0.0193966. The p-value was highly significant (p = 0.000). Conclusion: The application of the Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals and results in better patient compliance and satisfaction. PMID:26816989
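
    A Pareto analysis of complication counts, as used in the Analyse phase above, simply ranks the categories and accumulates their share of the total. The sketch below uses pandas; the categories echo those named above, but the counts are invented for illustration.

      import pandas as pd

      # Illustrative complication counts (not the study's data).
      counts = pd.Series({
          "syncope": 28, "failure of anaesthesia": 24, "trismus": 18,
          "self-inflicted bite injury": 14, "pain at injection site": 12, "haematoma": 6,
      }).sort_values(ascending=False)

      pareto = pd.DataFrame({
          "count": counts,
          "cumulative_%": 100 * counts.cumsum() / counts.sum(),
      })
      print(pareto)
      # The "vital few" are the top categories accounting for roughly 80% of complications.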

  16. Analysis of in vitro fertilization data with multiple outcomes using discrete time-to-event analysis

    PubMed Central

    Maity, Arnab; Williams, Paige; Ryan, Louise; Missmer, Stacey; Coull, Brent; Hauser, Russ

    2014-01-01

    In vitro fertilization (IVF) is an increasingly common method of assisted reproductive technology. Because of the careful observation and follow-up required as part of the procedure, IVF studies provide an ideal opportunity to identify and assess clinical and demographic factors, along with environmental exposures, that may impact successful reproduction. A major challenge in analyzing data from IVF studies is handling the complexity and multiplicity of outcomes, which result from multiple opportunities for pregnancy loss within a single IVF cycle in addition to multiple IVF cycles. To date, most evaluations of IVF studies do not make use of the full data because of this complex structure. In this paper, we develop statistical methodology for the analysis of IVF data with multiple cycles and possibly multiple failure types observed for each individual. We develop a general analysis framework based on a generalized linear modeling formulation that allows implementation of various types of models, including shared frailty models, failure-specific frailty models, and transitional models, using standard software. We apply our methodology to data from an IVF study conducted at the Brigham and Women’s Hospital, Massachusetts. We also summarize the performance of our proposed methods based on a simulation study. PMID:24317880
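
    One standard way to cast multiple failure opportunities per cycle as a generalized linear model is a discrete-time hazard (person-period) logistic regression. The sketch below is a basic version of that idea, not the authors' shared-frailty formulation; the stages, probabilities and covariate are synthetic assumptions.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      rows = []
      for woman in range(300):
          age35 = rng.integers(0, 2)          # assumed binary covariate (age >= 35)
          for stage, base_p in [("implantation", 0.35), ("clinical", 0.20), ("live_birth", 0.10)]:
              p = min(base_p * (1.4 if age35 else 1.0), 0.95)
              failed = rng.random() < p
              rows.append({"woman": woman, "stage": stage, "age35": age35, "failed": int(failed)})
              if failed:
                  break                       # once the cycle fails, later stages are not at risk
      person_period = pd.DataFrame(rows)      # one row per (cycle, stage) at risk

      fit = smf.logit("failed ~ C(stage) + age35", data=person_period).fit(disp=0)
      print(np.exp(fit.params))               # stage-specific baseline odds and covariate odds ratio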

  17. Antibiotics for exacerbations of chronic obstructive pulmonary disease.

    PubMed

    Vollenweider, Daniela J; Jarrett, Harish; Steurer-Stey, Claudia A; Garcia-Aymerich, Judith; Puhan, Milo A

    2012-12-12

    Many patients with an exacerbation of chronic obstructive pulmonary disease (COPD) are treated with antibiotics. However, the value of antibiotics remains uncertain as systematic reviews and clinical trials have shown conflicting results. To assess the effects of antibiotics in the management of acute COPD exacerbations on treatment failure as observed between seven days and one month after treatment initiation (primary outcome) and on other patient-important outcomes (mortality, adverse events, length of hospital stay). We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and other electronically available databases up to September 2012. Randomised controlled trials (RCTs) in people with acute COPD exacerbations comparing antibiotic therapy and placebo with a follow-up of at least seven days. Two review authors independently screened references and extracted data from trial reports. We kept the three groups of outpatients, inpatients and patients admitted to the intensive care unit (ICU) separate for benefit outcomes and mortality because we considered them to be clinically too different to be summarised in one group. We considered outpatients to have a mild to moderate exacerbation, inpatients to have a severe exacerbation and ICU patients to have a very severe exacerbation. Where outcomes or study details were not reported we requested missing data from the authors of the primary studies. We calculated pooled risk ratios (RR) for treatment failure, Peto odds ratios (OR) for rare events (mortality and adverse events) and weighted mean differences (MD) for continuous outcomes using fixed-effect models. We used GRADE to assess the quality of the evidence. Sixteen trials with 2068 participants were included. In outpatients (mild to moderate exacerbations), there was evidence of low quality that antibiotics did statistically significantly reduce the risk for treatment failure between seven days and one month after treatment initiation (RR 0.75; 95% CI 0.60 to 0.94; I(2) = 35%) but they did not significantly reduce the risk when the meta-analysis was restricted to currently available drugs (RR 0.80; 95% CI 0.63 to 1.01; I(2) = 33%). Evidence of high quality showed that antibiotics statistically significantly reduced the risk of treatment failure in inpatients with severe exacerbations (ICU not included) (RR 0.77; 95% CI 0.65 to 0.91; I(2) = 47%) regardless of whether restricted to current drugs. The only trial with 93 patients admitted to the ICU showed a large and statistically significant effect on treatment failure (RR 0.19; 95% CI 0.08 to 0.45; high-quality evidence).Evidence of low-quality from four trials in inpatients showed no effect of antibiotics on mortality (Peto OR 1.02; 95% CI 0.37 to 2.79). High-quality evidence from one trial showed a statistically significant effect on mortality in ICU patients (Peto OR 0.21; 95% CI 0.06 to 0.72). Length of hospital stay (in days) was similar in the antibiotics and placebo groups except for the ICU study where antibiotics statistically significantly reduced length of hospital stay (mean difference -9.60 days; 95% CI -12.84 to -6.36 days). One trial showed no effect of antibiotics on re-exacerbations between two and six weeks after treatment initiation. 
Only one trial (N = 35) reported health-related quality of life but did not show a statistically significant difference between the treatment and control group.Evidence of moderate quality showed that the overall incidence of adverse events was higher in the antibiotics groups (Peto OR 1.53; 95% CI 1.03 to 2.27). Patients treated with antibiotics experienced statistically significantly more diarrhoea based on three trials (Peto OR 2.62; 95% CI 1.11 to 6.17; high-quality evidence). Antibiotics for COPD exacerbations showed large and consistent beneficial effects across outcomes of patients admitted to an ICU. However, for outpatients and inpatients the results were inconsistent. The risk for treatment failure was significantly reduced in both inpatients and outpatients when all trials (1957 to 2012) were included but not when the analysis for outpatients was restricted to currently used antibiotics. Also, antibiotics had no statistically significant effect on mortality and length of hospital stay in inpatients and almost no data on patient-reported outcomes exist. These inconsistent effects call for research into clinical signs and biomarkers that help identify patients who benefit from antibiotics and patients who experience no effect, and in whom downsides of antibiotics (side effects, costs and multi-resistance) could be avoided.
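
    The pooled risk ratios above come from fixed-effect, inverse-variance meta-analysis. The sketch below shows the generic computation on invented 2x2 trial counts; it does not reproduce any figure from the review.

      import numpy as np

      # Each row: (events_treated, n_treated, events_control, n_control); placeholder trials.
      trials = np.array([
          [12, 100, 20, 100],
          [30, 250, 45, 245],
          [ 8,  60, 14,  62],
      ])

      a, n1, c, n2 = trials.T.astype(float)
      log_rr = np.log((a / n1) / (c / n2))
      var_log_rr = 1/a - 1/n1 + 1/c - 1/n2    # standard delta-method variance of log RR
      w = 1.0 / var_log_rr                    # inverse-variance (fixed-effect) weights

      pooled = np.sum(w * log_rr) / np.sum(w)
      se = np.sqrt(1.0 / np.sum(w))
      ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
      print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")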

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    PubMed

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing values increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
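
    The article supplies its own R code; purely as a rough analogue, the sketch below fits a Weibull accelerated failure time model with the lifelines package to synthetic right-censored data. It does not reproduce the paper's treatment of left-censored peak intensities, and all column names and values are assumptions.

      import numpy as np
      import pandas as pd
      from lifelines import WeibullAFTFitter

      rng = np.random.default_rng(5)
      n = 200
      group = rng.integers(0, 2, n)                       # e.g. control vs. treatment (assumed)
      t = rng.weibull(1.5, n) * np.exp(0.6 * group) * 10  # group shifts the "time" scale
      censor = rng.uniform(5, 40, n)
      df = pd.DataFrame({
          "duration": np.minimum(t, censor),
          "observed": (t <= censor).astype(int),
          "group": group,
      })

      aft = WeibullAFTFitter()
      aft.fit(df, duration_col="duration", event_col="observed")
      print(aft.params_)   # includes the log-acceleration effect of "group"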

  19. Daily remote monitoring of implantable cardioverter-defibrillators: insights from the pooled patient-level data from three randomized controlled trials (IN-TIME, ECOST, TRUST).

    PubMed

    Hindricks, Gerhard; Varma, Niraj; Kacet, Salem; Lewalter, Thorsten; Søgaard, Peter; Guédon-Moreau, Laurence; Proff, Jochen; Gerds, Thomas A; Anker, Stefan D; Torp-Pedersen, Christian

    2017-06-07

    Remote monitoring of implantable cardioverter-defibrillators may improve clinical outcome. A recent meta-analysis of three randomized controlled trials (TRUST, ECOST, IN-TIME) using a specific remote monitoring system with daily transmissions [Biotronik Home Monitoring (HM)] demonstrated improved survival. We performed a patient-level analysis to verify this result with appropriate time-to-event statistics and to investigate further clinical endpoints. Individual data of the TRUST, ECOST, and IN-TIME patients were pooled to calculate absolute risks of endpoints at 1-year follow-up for HM vs. conventional follow-up. All-cause mortality analysis involved all three trials (2405 patients). Other endpoints involved two trials, ECOST and IN-TIME (1078 patients), in which an independent blinded endpoint committee adjudicated the underlying causes of hospitalizations and deaths. The absolute risk of death at 1 year was reduced by 1.9% in the HM group (95% CI: 0.1-3.8%; P = 0.037), equivalent to a risk ratio of 0.62. Also the combined endpoint of all-cause mortality or hospitalization for worsening heart failure (WHF) was significantly reduced (by 5.6%; P = 0.007; risk ratio 0.64). The composite endpoint of all-cause mortality or cardiovascular (CV) hospitalization tended to be reduced by a similar degree (4.1%; P = 0.13; risk ratio 0.85) but without statistical significance. In a pooled analysis of the three trials, HM reduced all-cause mortality and the composite endpoint of all-cause mortality or WHF hospitalization. The similar magnitudes of absolute risk reductions for WHF and CV endpoints suggest that the benefit of HM is driven by the prevention of heart failure exacerbation.

  20. Step-stress analysis for predicting dental ceramic reliability

    PubMed Central

    Borba, Márcia; Cesar, Paulo F.; Griggs, Jason A.; Bona, Álvaro Della

    2013-01-01

    Objective To test the hypothesis that step-stress analysis is effective to predict the reliability of an alumina-based dental ceramic (VITA In-Ceram AL blocks) subjected to a mechanical aging test. Methods Bar-shaped ceramic specimens were fabricated, polished to 1µm finish and divided into 3 groups (n=10): (1) step-stress accelerating test; (2) flexural strength- control; (3) flexural strength- mechanical aging. Specimens from group 1 were tested in an electromagnetic actuator (MTS Evolution) using a three-point flexure fixture (frequency: 2Hz; R=0.1) in 37°C water bath. Each specimen was subjected to an individual stress profile, and the number of cycles to failure was recorded. A cumulative damage model with an inverse power law lifetime-stress relation and Weibull lifetime distribution were used to fit the fatigue data. The data were used to predict the stress level and number of cycles for mechanical aging (group 3). Groups 2 and 3 were tested for three-point flexural strength (σ) in a universal testing machine at a 1.0 MPa/s stress rate in 37°C water. Data were statistically analyzed using Mann-Whitney Rank Sum test. Results Step-stress data analysis showed that the profile most likely to weaken the specimens without causing fracture during aging (95% CI: 0–14% failures) was: 80 MPa stress amplitude and 10^5 cycles. The median σ values (MPa) for groups 2 (493±54) and 3 (423±103) were statistically different (p=0.009). Significance The aging profile determined by step-stress analysis was effective to reduce alumina ceramic strength as predicted by the reliability estimate, confirming the study hypothesis. PMID:23827018

  1. Step-stress analysis for predicting dental ceramic reliability.

    PubMed

    Borba, Márcia; Cesar, Paulo F; Griggs, Jason A; Della Bona, Alvaro

    2013-08-01

    To test the hypothesis that step-stress analysis is effective to predict the reliability of an alumina-based dental ceramic (VITA In-Ceram AL blocks) subjected to a mechanical aging test. Bar-shaped ceramic specimens were fabricated, polished to 1μm finish and divided into 3 groups (n=10): (1) step-stress accelerating test; (2) flexural strength-control; (3) flexural strength-mechanical aging. Specimens from group 1 were tested in an electromagnetic actuator (MTS Evolution) using a three-point flexure fixture (frequency: 2Hz; R=0.1) in 37°C water bath. Each specimen was subjected to an individual stress profile, and the number of cycles to failure was recorded. A cumulative damage model with an inverse power law lifetime-stress relation and Weibull lifetime distribution were used to fit the fatigue data. The data were used to predict the stress level and number of cycles for mechanical aging (group 3). Groups 2 and 3 were tested for three-point flexural strength (σ) in a universal testing machine with 1.0MPa/s stress rate, in 37°C water. Data were statistically analyzed using Mann-Whitney Rank Sum test. Step-stress data analysis showed that the profile most likely to weaken the specimens without causing fracture during aging (95% CI: 0-14% failures) was: 80MPa stress amplitude and 10(5) cycles. The median σ values (MPa) for groups 2 (493±54) and 3 (423±103) were statistically different (p=0.009). The aging profile determined by step-stress analysis was effective to reduce alumina ceramic strength as predicted by the reliability estimate, confirming the study hypothesis. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
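
    The study fits a cumulative damage model with an inverse power law lifetime-stress relation; the sketch below illustrates only that underlying relation, fitting N = A * S^(-n) to synthetic constant-amplitude data and solving for the stress expected to give a target lifetime. It is not the step-stress fit used in the study, and the stress levels and cycle counts are made up.

      import numpy as np

      stress = np.array([90, 100, 110, 120, 130], dtype=float)   # MPa (assumed levels)
      cycles = np.array([2.1e6, 6.0e5, 2.2e5, 8.5e4, 3.6e4])     # cycles to failure (synthetic)

      # log N = log A - n * log S, fitted by ordinary least squares.
      slope, log_a = np.polyfit(np.log(stress), np.log(cycles), 1)
      n_exp = -slope
      a = np.exp(log_a)

      # Stress amplitude expected to give a target lifetime of 1e5 cycles.
      target = 1e5
      s_target = (a / target) ** (1.0 / n_exp)
      print(f"fitted exponent n = {n_exp:.1f}, stress for {target:.0e} cycles = {s_target:.0f} MPa")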

  2. A 12 month clinical study of bond failures of recycled versus new stainless steel orthodontic brackets.

    PubMed

    Cacciafesta, Vittorio; Sfondrini, Maria Francesca; Melsen, Birte; Scribante, Andrea

    2004-08-01

    The purpose of this prospective longitudinal randomized study was to compare the clinical performance of recycled brackets with that of new stainless steel brackets (Orthos). Twenty patients treated with fixed appliances were included in the investigation. Using a 'split-mouth' design, the dentition of each patient was divided into four quadrants. In 11 randomly selected patients, the maxillary left and mandibular right quadrants were bonded with recycled brackets, and the remaining quadrants with new stainless steel brackets. In the other nine patients the quadrants were inverted. Three hundred and ten stainless steel brackets were examined: 156 were recycled and the remaining 154 were new. All the brackets were bonded with a self-cured resin-modified glass ionomer (GC Fuji Ortho). The number, cause, and date of bracket failures were recorded over 12 months. Statistical analysis was performed by means of a paired t-test, Kaplan-Meier survival estimates, and the log-rank test. No statistically significant differences were found between: (a) the total bond failure rate of recycled and new stainless steel brackets; (b) the upper and lower arches; (c) the anterior and posterior segments. These findings demonstrate that recycling metallic orthodontic brackets can be of benefit to the profession, both economically and ecologically, as long as the orthodontist is aware of the various aspects of the recycling methods, and that patients are informed about the type of bracket that will be used for their treatment.
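
    The survival machinery cited in this record (Kaplan-Meier estimates with right censoring, used alongside the log-rank test) can be illustrated with a hand-rolled estimator. The follow-up times below are invented; a failure is a debonded bracket and a censored observation is a bracket still bonded at the end of observation.

```python
import numpy as np

def kaplan_meier(time, event):
    """Return (event times, survival estimates) for right-censored data.

    `event` is 1 if a bond failure occurred at `time`, 0 if censored."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, s = [], 1.0
    failure_times = np.unique(time[event == 1])
    for t in failure_times:
        at_risk = np.sum(time >= t)                   # brackets still under observation
        d = np.sum((time == t) & (event == 1))        # failures at this time
        s *= 1.0 - d / at_risk
        surv.append(s)
    return failure_times, np.array(surv)

# Hypothetical follow-up in months; 0 = still bonded when observation ended.
recycled_time  = [2, 5, 7, 12, 12, 12, 12, 12]
recycled_event = [1, 1, 1,  0,  0,  0,  0,  0]

for t, s in zip(*kaplan_meier(recycled_time, recycled_event)):
    print(f"month {t:>4.0f}: estimated bond survival {s:.3f}")
```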

  3. Emergent irreversibility and entanglement spectrum statistics

    NASA Astrophysics Data System (ADS)

    Mucciolo, Eduardo; Chamon, Claudio; Hamma, Alioscia

    2014-03-01

We study the problem of irreversibility when the dynamical evolution of a many-body system is described by a stochastic quantum circuit. Such evolution is more general than Hamiltonian evolution, and since energy levels are not well defined, the well-established connection between the statistical fluctuations of the energy spectrum and irreversibility cannot be made. We show that the entanglement spectrum provides a more general connection. Irreversibility is marked by a failure of a disentangling algorithm and is preceded by the appearance of Wigner-Dyson statistical fluctuations in the entanglement spectrum. This analysis can be done at the wavefunction level and offers a new route to study quantum chaos and quantum integrability. We acknowledge financial support from the U.S. National Science Foundation through grants CCF 1116590 and CCF 1117241, from the National Basic Research Program of China through grants 2011CBA00300 and 2011CBA00301, and from the National Natural Science Fo.
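
    A standard, model-independent way to detect the Wigner-Dyson fluctuations mentioned above is the adjacent-gap ratio of a spectrum, which requires no unfolding. The sketch below contrasts an uncorrelated, Poisson-like spectrum with the eigenvalues of a random GOE matrix; it is a generic diagnostic under those assumptions, not the authors' disentangling algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_gap_ratio(levels):
    """Mean adjacent-gap ratio r = mean(min(s_i, s_{i+1}) / max(s_i, s_{i+1}))."""
    s = np.diff(np.sort(levels))
    s = s[s > 0]
    return np.mean(np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:]))

# Poisson-like spectrum (uncorrelated level spacings): expect <r> near 0.39.
poisson_levels = np.cumsum(rng.exponential(size=4000))

# Wigner-Dyson (GOE) spectrum from a random symmetric matrix: expect <r> near 0.53.
m = rng.normal(size=(1000, 1000))
goe_levels = np.linalg.eigvalsh((m + m.T) / 2.0)

print(f"Poisson-like  <r> = {mean_gap_ratio(poisson_levels):.3f}")
print(f"Wigner-Dyson  <r> = {mean_gap_ratio(goe_levels):.3f}")
```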

  4. Project #153M: Guidance for Assessing the Remaining Strength of Corroded Pipelines

    DOT National Transportation Integrated Search

    2010-04-01

    Incident statistics have consistently shown that corrosion is the primary cause of pipeline failures in liquid pipelines, and is the second largest cause of failures in natural gas transmission pipelines and distribution piping. Corrosion can cause m...

  5. High-sensitivity cardiac troponin I and risk of heart failure in patients with suspected acute coronary syndrome: a cohort study.

    PubMed

    Stelzle, Dominik; Shah, Anoop S V; Anand, Atul; Strachan, Fiona E; Chapman, Andrew R; Denvir, Martin A; Mills, Nicholas L; McAllister, David A

    2018-01-01

    Heart failure may occur following acute myocardial infarction, but with the use of high-sensitivity cardiac troponin assays we increasingly diagnose patients with minor myocardial injury. Whether troponin concentrations remain a useful predictor of heart failure in patients with acute coronary syndrome is uncertain. We identified all consecutive patients (n = 4748) with suspected acute coronary syndrome (61 ± 16 years, 57% male) presenting to three secondary and tertiary care hospitals. Cox-regression models were used to evaluate the association between high-sensitivity cardiac troponin I concentration and subsequent heart failure hospitalization. C-statistics were estimated to evaluate the predictive value of troponin for heart failure hospitalization. Over 2071 years of follow-up there were 83 heart failure hospitalizations. Patients with troponin concentrations above the upper reference limit (URL) were more likely to be hospitalized with heart failure than patients below the URL (118/1000 vs. 17/1000 person years, adjusted hazard ratio: 7.0). Among patients with troponin concentrations

  6. Trifactorial classification system for osteotome sinus floor elevation based on an observational retrospective analysis of 926 implants followed up to 10 years.

    PubMed

    French, David; Nadji, Nabil; Liu, Shawn X; Larjava, Hannu

    2015-06-01

A novel osteotome trifactorial classification system is proposed for transcrestal osteotome-mediated sinus floor elevation (OSFE) sites that includes residual bone height (RBH), sinus floor anatomy (contour), and multiple versus single sites OSFE (tenting). An analysis of RBH, contour, and tenting was retrospectively applied to a cohort of 926 implants placed using OSFE without added bone graft and followed up to 10 years. RBH was divided into three groups: high (RBH > 6 mm), mid (RBH = 4.1 to 6 mm), and low (RBH = 2 to 4 mm). The sinus "contour" was divided into four groups: flat, concave, angle, and septa. For "tenting", single versus multiple adjacent OSFE sites were compared. The prevalence of flat sinus floors increased as RBH decreased. RBH was a significant predictor of failure with rates as follows: low-RBH = 5.1%, mid-RBH = 1.5%, and high-RBH = 0.4%. Flat sinus floors and single sites as compared to multiple sites had higher observed failure rates but neither achieved statistical significance; however, the power of the study was limited by low numbers of failures. The osteotome trifactorial classification system as proposed can assist planning OSFE cases and may allow better comparison of future OSFE studies.

  7. Long-term effectiveness of telephone-based health coaching for heart failure patients: A post-only randomised controlled trial.

    PubMed

    Tiede, Michel; Dwinger, Sarah; Herbarth, Lutz; Härter, Martin; Dirmaier, Jörg

    2017-09-01

Introduction The health status of heart failure patients can be improved to some extent by disease self-management. One method of developing such skills is telephone-based health coaching. However, the effects of telephone-based health coaching remain inconclusive. The aim of this study was to evaluate the effects of telephone-based health coaching for people with heart failure. Methods A total sample of 7186 patients with various chronic diseases was randomly assigned to either the coaching or the control group. Then 184 patients with heart failure were selected by International Classification of Diseases (ICD)-10 code for subgroup analysis. Data were collected at 24 and 48 months after the beginning of the coaching. The primary outcome was change in quality of life. Secondary outcomes were changes in depression and anxiety, health-related control beliefs, control preference, health risk behaviour and health-related behaviours. Statistical analyses included a per-protocol evaluation, employing analysis of variance and analysis of covariance (ANCOVA) as well as Mann-Whitney U tests. Results Participants' average age was 73 years (standard deviation (SD) = 9) and the majority were women (52.8%). In ANCOVA analyses there were no significant differences between groups for the change in quality of life (QoL). However, the coaching group reported a significantly higher level of physical activity (p = 0.03), lower intake of non-prescribed drugs (p = 0.04) and lower levels of stress (p = 0.02) than the control group. Mann-Whitney U tests showed a different external locus of control (p = 0.014), and higher reduction in unhealthy nutrition (p = 0.019), physical inactivity (p = 0.004) and stress (p = 0.028). Discussion Our results suggest that telephone-based health coaching has no effect on QoL, anxiety and depression of heart failure patients, but helps in improving certain risk behaviours and changes the locus of control to be more externalised.

  8. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
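
    The Bayesian refinement of failure-rate estimates mentioned for ISS hardware is commonly done with a conjugate gamma prior on the rate of a Poisson failure process. The prior parameters and the observed failure count and exposure below are invented for illustration; this is a generic sketch, not the ISS ECLS data or NASA's actual update procedure.

```python
from scipy import stats

# Gamma prior on the failure rate lambda (failures per operating year),
# e.g. encoding a similarity-based estimate of about 0.5/yr with wide uncertainty.
prior_shape, prior_rate = 2.0, 4.0          # prior mean = shape/rate = 0.5 per year

# Observed operating experience (invented): 3 failures in 10 unit-years.
failures, exposure_years = 3, 10.0

# Conjugate update for a Poisson process: shape += failures, rate += exposure.
post_shape = prior_shape + failures
post_rate = prior_rate + exposure_years

posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)
print(f"posterior mean = {posterior.mean():.3f} failures/yr")
print(f"90% interval   = {posterior.ppf(0.05):.3f} to {posterior.ppf(0.95):.3f}")
```

    The epistemic uncertainty in the rate is the spread of this posterior; more operating experience (larger exposure) tightens it, while a design change would argue for widening the prior again.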

  9. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.

  10. High-Flow Nasal Cannula Oxygenation in Immunocompromised Patients With Acute Hypoxemic Respiratory Failure: A Groupe de Recherche Respiratoire en Réanimation Onco-Hématologique Study.

    PubMed

    Lemiale, Virginie; Resche-Rigon, Matthieu; Mokart, Djamel; Pène, Frédéric; Argaud, Laurent; Mayaux, Julien; Guitton, Christophe; Rabbat, Antoine; Girault, Christophe; Kouatchet, Achille; Vincent, François; Bruneel, Fabrice; Nyunga, Martine; Seguin, Amélie; Klouche, Kada; Colin, Gwenahel; Kontar, Loay; Perez, Pierre; Meert, Anne-Pascale; Benoit, Dominique D; Papazian, Laurent; Demoule, Alexandre; Chevret, Sylvie; Azoulay, Elie

    2017-03-01

    In immunocompromised patients with acute respiratory failure, invasive mechanical ventilation remains associated with high mortality. Choosing the adequate oxygenation strategy is of the utmost importance in that setting. High-flow nasal oxygen has recently shown survival benefits in unselected patients with acute respiratory failure. The objective was to assess outcomes of immunocompromised patients with hypoxemic acute respiratory failure treated with high-flow nasal oxygen. We performed a post hoc analysis of a randomized controlled trial of noninvasive ventilation in critically ill immunocompromised patients with hypoxemic acute respiratory failure. Twenty-nine ICUs in France and Belgium. Critically ill immunocompromised patients with hypoxemic acute respiratory failure. A propensity score-based approach was used to assess the impact of high-flow nasal oxygen compared with standard oxygen on day 28 mortality. Among 374 patients included in the study, 353 met inclusion criteria. Underlying disease included mostly malignancies (n = 296; 84%). Acute respiratory failure etiologies were mostly pneumonia (n = 157; 44.4%) or opportunistic infection (n = 76; 21.5%). Noninvasive ventilation was administered to 180 patients (51%). Invasive mechanical ventilation was ultimately needed in 142 patients (40.2%). Day 28 mortality was 22.6% (80 deaths). Throughout the ICU stay, 127 patients (36%) received high-flow nasal oxygen whereas 226 patients received standard oxygen. Ninety patients in each group (high-flow nasal oxygen or standard oxygen) were matched according to the propensity score, including 91 of 180 (51%) who received noninvasive ventilation. High-flow nasal oxygen was neither associated with a lower intubation rate (hazard ratio, 0.42; 95% CI, 0.11-1.61; p = 0.2) nor day 28 mortality (hazard ratio, 0.80; 95% CI, 0.45-1.42; p = 0.45). In immunocompromised patients with hypoxemic acute respiratory failure, high-flow nasal oxygen when compared with standard oxygen did not reduce intubation or survival rates. However, these results could be due to low statistical power or unknown confounders associated with the subgroup analysis. A randomized trial is needed.

  11. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE PAGES

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    2017-12-20

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.

  12. Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.

    In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
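
    The two traditional methods named in these records, logistic regression and Weibull failure-time analysis, can both be cast as Bernoulli likelihoods for the probability that a unit fails its test at age t. The sketch below fits both forms to the same synthetic pass/fail data; the data, starting values and model details are assumptions for illustration and are unrelated to the RADAR method or the paper's data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(7)

# Invented pass/fail surveillance data: units tested at various ages (years);
# fail = True means the unit failed its test.
age = rng.uniform(0.0, 20.0, size=300)
true_p = 1.0 - np.exp(-(age / 25.0) ** 2.5)        # hidden "true" ageing curve
fail = rng.random(300) < true_p

def bernoulli_nll(p):
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(np.where(fail, np.log(p), np.log(1.0 - p)))

# Logistic model: P(fail at age t) = expit(a + b*t)
logit_fit = minimize(lambda th: bernoulli_nll(expit(th[0] + th[1] * age)),
                     x0=[-2.0, 0.1], method="Nelder-Mead")

# Weibull model: P(fail by age t) = 1 - exp(-(t/eta)^beta), eta and beta positive
weib_fit = minimize(lambda th: bernoulli_nll(
                        1.0 - np.exp(-(age / np.exp(th[0])) ** np.exp(th[1]))),
                    x0=[np.log(15.0), np.log(1.5)], method="Nelder-Mead")

for name, fit in [("logistic", logit_fit), ("Weibull", weib_fit)]:
    print(f"{name:8s}  NLL = {fit.fun:7.2f}   AIC = {2 * 2 + 2 * fit.fun:7.2f}")
```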

  13. Conversion of Questionnaire Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
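
    A minimal sketch of the kind of conversion described above: adjectival performance ratings mapped to basic-event failure probabilities spaced evenly on a log10 scale. The rating labels and numeric values are illustrative assumptions only; they are not the values used in the MC&A questionnaire, nor taken from NUREG/CR-1278.

```python
import math

# Illustrative mapping: each step down in adjectival performance multiplies the
# assumed failure probability by a constant factor, i.e. the values are evenly
# spaced in log10 space (lognormal-style scaling).  Values are NOT from the survey.
RATING_TO_FAILURE_PROB = {
    "perfect":            1e-4,   # near zero risk of failure
    "well":               1e-3,
    "adequate":           1e-2,
    "needs improvement":  1e-1,
    "not performed":      1.0,    # task in a state of failure
}

def basic_event_probability(rating: str) -> float:
    """Convert one questionnaire response to a basic-event failure probability."""
    try:
        return RATING_TO_FAILURE_PROB[rating.lower()]
    except KeyError as exc:
        raise ValueError(f"unknown rating: {rating!r}") from exc

for rating, prob in RATING_TO_FAILURE_PROB.items():
    print(f"{rating:18s} -> P(failure) = {prob:7.4f}  (log10 = {math.log10(prob):+.1f})")
```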

  14. Treatment of esophageal anastomotic leakage with self-expanding metal stents: analysis of risk factors for treatment failure

    PubMed Central

    Persson, Saga; Rouvelas, Ioannis; Kumagai, Koshi; Song, Huan; Lindblad, Mats; Lundell, Lars; Nilsson, Magnus; Tsai, Jon A.

    2016-01-01

    Background and study aim: The endoscopic placement of self-expandable metallic esophageal stents (SEMS) has become the preferred primary treatment for esophageal anastomotic leakage in many institutions. The aim of this study was to investigate possible risk factors for failure of SEMS-based therapy in patients with esophageal anastomotic leakage. Patients and methods: Beginning in 2003, all patients with an esophageal leak were initially approached and assessed for temporary closure with a SEMS. Until 2014, all patients at the Karolinska University Hospital with a leak from an esophagogastric or esophagojejunal anastomosis were identified. Data regarding the characteristics of the patients and leaks and the treatment outcomes were compiled. Failure of the SEMS treatment strategy was defined as death due to the leak or a major change in management strategy. The risk factors for treatment failure were analyzed with simple and multivariable logistic regression statistics. Results: A total of 447 patients with an esophagogastric or esophagojejunal anastomosis were identified. Of these patients, 80 (18 %) had an anastomotic leak, of whom 46 (58 %) received a stent as first-line treatment. In 29 of these 46 patients, the leak healed without any major change in treatment strategy. Continuous leakage after the application of a stent, decreased physical performance preoperatively, and concomitant esophagotracheal fistula were identified as independent risk factors for failure with multivariable logistic regression analysis. Conclusion: Stent treatment for esophageal anastomotic leakage is successful in the majority of cases. Continuous leakage after initial stent insertion, decreased physical performance preoperatively, and the development of an esophagotracheal fistula decrease the probability of successful treatment. PMID:27092321

  15. Meta-analysis of Timing for Microsurgical Free-Flap Reconstruction for Lower Limb Injury: Evaluation of the Godina Principles.

    PubMed

    Haykal, Siba; Roy, Mélissa; Patel, Ashit

    2018-05-01

In 1986, Marko Godina published his seminal work regarding the timing of free-flap reconstruction for traumatic extremity defects. Early reconstruction, compared with delayed and late reconstruction, resulted in significant decreases in free-flap failure rate, post-operative infections, hospitalization time, bone healing time, and number of additional anesthesias. The objective of this manuscript was to evaluate whether these principles continue to apply. A meta-analysis was performed analyzing articles from Medline, Embase, and Pubmed. Four hundred and ninety-two articles were screened, and 134 articles were assessed for eligibility. Following full-text review, 43 articles were included in this study. The exact timing for free-flap reconstruction, free-flap failure rate, infection rate, and follow-up was defined in all 43 articles. Early free-flap reconstruction was found to have significantly lower rates of free-flap failure and infection in comparison to delayed reconstruction (p = 0.008; p = 0.0004). Compared with late reconstruction, early reconstruction was found to have significantly lower infection rates only (p = 0.01), with no difference in free-flap failure rates. Early reconstruction was found to lead to fewer additional procedures (p = 0.03). No statistical significance was found for bone healing time or hospitalization time. Early free-flap reconstruction performed within the first 72 hours resulted in a decreased rate of free-flap failures, infection, and additional procedures, with no difference in other parameters. The majority of free flaps continue to be performed in a delayed time frame.

  16. Risk of heart failure and edema associated with the use of pregabalin: a systematic review.

    PubMed

    Ho, Joanne M; Tricco, Andrea C; Perrier, Laure; Chen, Maggie; Juurlink, David N; Straus, Sharon E

    2013-05-04

Pregabalin is used in the treatment of postherpetic neuralgia, diabetic neuropathic pain, partial seizures, anxiety disorders and fibromyalgia. Recognized adverse effects associated with its use include cognitive impairment, somnolence and dizziness. Heart failure associated with pregabalin has been described, however the strength of this association has not been well characterized. To examine this further, we will conduct a systematic review of the risk of heart failure and edema associated with use of pregabalin. We will include all studies (experimental, quasi-experimental, observational, case series/reports, drug regulatory reports) that examine the use of pregabalin compared to placebo, gabapentin or conventional care. Our primary outcome is heart failure and the secondary outcomes include edema and weight gain. We will search electronic databases (MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials), and grey literature sources (trial registries, conference abstracts) to identify relevant studies. To ensure literature saturation, we will contact drug manufacturers, conduct forward citation searching, and scan the reference lists of key articles and included studies. We will not restrict inclusion by language or publication status. Two reviewers will screen citations (titles and abstracts) and full-text articles, conduct data abstraction, and appraise risk of bias. Random-effects meta-analysis will be conducted if the studies are deemed heterogeneous in terms of clinical, statistical and methodological factors but still suitable for meta-analysis. The results of this review will assist physicians to better appreciate pregabalin's risk for edema or congestive heart failure and will be pertinent to the thousands of patients worldwide who are administered this medication. Our protocol was registered in the PROSPERO database (CRD42012002948).

  17. Coherent changes of multifractal properties of continuous acoustic emission at failure of heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Panteleev, Ivan; Bayandin, Yuriy; Naimark, Oleg

    2017-12-01

    This work performs a correlation analysis of the statistical properties of continuous acoustic emission recorded in different parts of marble and fiberglass laminate samples under quasi-static deformation. A spectral coherent measure of time series, which is a generalization of the squared coherence spectrum on a multidimensional series, was chosen. The spectral coherent measure was estimated in a sliding time window for two parameters of the acoustic emission multifractal singularity spectrum: the spectrum width and the generalized Hurst exponent realizing the maximum of the singularity spectrum. It is shown that the preparation of the macrofracture focus is accompanied by the synchronization (coherent behavior) of the statistical properties of acoustic emission in allocated frequency intervals.
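
    The building block of the analysis, a squared-coherence estimate between two acoustic-emission-derived time series, can be illustrated with scipy.signal.coherence. The signals, sampling rate and shared 120 Hz component below are invented, and this is the ordinary pairwise magnitude-squared coherence rather than the multidimensional generalization used in the paper.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(3)
fs = 1000.0                                 # sampling frequency, Hz (illustrative)
t = np.arange(0.0, 30.0, 1.0 / fs)

# Two AE-like channels sharing a 120 Hz component plus independent noise.
shared = np.sin(2.0 * np.pi * 120.0 * t)
x = shared + rng.normal(scale=1.0, size=t.size)
y = 0.7 * shared + rng.normal(scale=1.0, size=t.size)

f, cxy = coherence(x, y, fs=fs, nperseg=2048)   # Welch-based squared coherence
band = (f > 100.0) & (f < 140.0)
print(f"peak coherence near 120 Hz : {cxy[band].max():.2f}")
print(f"median coherence elsewhere : {np.median(cxy[~band]):.2f}")
```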

  18. Tumor Necrosis Factor Inhibitor Primary Failure Predicts Decreased Ustekinumab Efficacy in Psoriasis Patients.

    PubMed

    Sorensen, Eric P; Fanucci, Kristina A; Saraiya, Ami; Volf, Eva; Au, Shiu-chung; Argobi, Yahya; Mansfield, Ryan; Gottlieb, Alice B

    2015-08-01

    Additional studies are needed to examine the efficacy of ustekinumab in psoriasis patients who have previously been exposed to tumor necrosis factor inhibitors (TNFi). To examine the predictive effect of TNFi primary failure and the number of TNFi exposures on the efficacy of ustekinumab in psoriasis treatment. This retrospective study examined 44 psoriasis patients treated at the Tufts Medical Center Department of Dermatology between January 2008 and July 2014. Patients were selected if they were treated with ustekinumab and had ≥ 1 previous TNFi exposure. The following subgroups were compared: patients with vs without a previous TNFi primary failure, and patients with one vs multiple previous TNFi exposures. The efficacy measure used was the previously validated Simple Measure for Assessing Psoriasis Activity (S-MAPA), which is calculated by the product of the body surface area and physician global assessment. The primary outcome was the percentage improvement S-MAPA from course baseline at week 12 of ustekinumab treatment. Secondary outcomes were the psoriasis clearance, primary failure, and secondary failure rates with ustekinumab treatment. Patients with a previous TNFi primary failure had a significantly lower percentage improvement in S-MAPA score at week 12 of ustekinumab treatment compared with patients without TNFi primary failure (36.2% vs 61.1%, P=.027). Multivariate analysis demonstrated that this relationship was independent of patient demographics and medical comorbidities. Patients with multiple TNFi exposures had a non-statistically significant lower percentage S-MAPA improvement at week 12 (40.5% vs 52.9%, P=.294) of ustekinumab treatment compared with patients with a single TNFi exposure. Among psoriasis patients previously exposed to TNFi, a history of a previous TNFi primary failure predicts a decreased response to ustekinumab independent of patient demographics and medical comorbidities.

  19. A statistical approach to nuclear fuel design and performance

    NASA Astrophysics Data System (ADS)

    Cunning, Travis Andrew

As CANDU fuel failures can have significant economic and operational consequences on the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10⁵ independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimensional reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance, with an average sensitivity index of 48.93% on key output quantities. Pellet grain size and dish depth are also significant contributors, at 31.53% and 13.46%, respectively. A traditional limit of operating envelope case is also evaluated. This case produces output values that exceed the maximum values observed during the 10⁵ Monte Carlo trials for all output quantities of interest. In many cases the difference between the predictions of the two methods is very prominent, and the highly conservative nature of the deterministic approach is demonstrated. A reliability analysis of CANDU fuel manufacturing parametric data, specifically pertaining to the quantification of fuel performance margins, has not been conducted previously. Key Words: CANDU, nuclear fuel, Cameco, fuel manufacturing, fuel modelling, fuel performance, fuel reliability, ELESTRES, ELOCA, dimensional reduction methods, global sensitivity analysis, deterministic safety analysis, probabilistic safety analysis.
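
    The probabilistic workflow described here (fit distributions to manufacturing data, sample inputs by Monte Carlo, run the performance model, and fit distributions to the outputs) can be sketched generically. The input distributions, the toy response surface standing in for a fuel-performance code, and the acceptance limit below are all invented; the actual analysis used ELESTRES and ELOCA.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_trials = 100_000

# Input distributions standing in for fitted manufacturing data (illustrative units).
pellet_density = rng.normal(10.6, 0.05, n_trials)              # g/cm^3
grain_size     = rng.lognormal(np.log(10.0), 0.15, n_trials)   # microns
dish_depth     = rng.normal(0.30, 0.02, n_trials)              # mm

# Toy surrogate for a fuel-performance output quantity; NOT a physical model,
# just a placeholder response so the propagation step can be demonstrated.
output = (3.0 * pellet_density - 0.05 * grain_size + 2.0 * dish_depth
          + rng.normal(0.0, 0.1, n_trials))

limit = 35.0                                    # illustrative acceptance criterion
mu, sigma = stats.norm.fit(output)              # fit a distribution to the output
print(f"output mean = {mu:.2f}, sd = {sigma:.3f}")
print(f"P(output > limit) ~ {np.mean(output > limit):.2e}")
print(f"margin to limit   = {(limit - mu) / sigma:.1f} standard deviations")
```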

  20. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  1. Dose-response effects for disease management programs on hospital utilization in Illinois Medicaid.

    PubMed

    Berg, Gregory D; Donnelly, Shawn; Miller, Mary; Medina, Wendie; Warnick, Kathleen

    2012-12-01

    The objective of this study is to estimate a dose-response impact of disease management contacts on inpatient admissions. Multivariate regression analysis of panel data was used to test the hypothesis that increased disease management contacts lower the odds of an inpatient admission. Subjects were 40,452 members of Illinois' noninstitutionalized Medicaid-only aged, blind, or disabled population diagnosed with asthma, coronary artery disease, chronic obstructive pulmonary disease, diabetes, and/or heart failure. All members are also in the state's Illinois Health Connect program, a medical home strategy in place for most of the 2.4 million Illinois Medicaid beneficiaries. The statistical measure is the odds ratio, which is a measure of association between the monthly inpatient admission indicator and the number of contacts (doses) a member has had for each particular disease management intervention. Statistically significant contacts are between 8 and 12 for heart failure, between 4 and 12 contacts for diabetes, and between 8 and 13 contacts for asthma. Total inpatient savings during the study period is estimated to be $12.4 million. This study shows the dose-response pattern of inpatient utilization improvements through the number of disease management contacts.
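
    The odds ratio used here as the measure of association can be computed directly from a 2x2 table of admissions by contact level, together with a Wald confidence interval on the log scale. The counts below are hypothetical and are not the Illinois Medicaid data.

```python
import math

# Hypothetical 2x2 table: monthly inpatient admission (yes/no) by whether a
# member received at least 8 disease management contacts that month.
a, b = 40, 960     # >= 8 contacts: admitted, not admitted
c, d = 75, 925     # <  8 contacts: admitted, not admitted

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR), Wald method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}  (95% CI {lo:.2f} to {hi:.2f})")
# An OR below 1 with a CI excluding 1 would indicate lower odds of admission
# in months with more disease management contacts.
```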

  2. Use of Statistical Analysis of Acoustic Emission Data on Carbon-Epoxy COPV Materials-of-Construction for Enhanced Felicity Ratio Onset Determination

    NASA Technical Reports Server (NTRS)

    Abraham, Arick Reed A.; Johnson, Kenneth L.; Nichols, Charles T.; Saulsberry, Regor L.; Waller, Jess M.

    2012-01-01

Broadband modal acoustic emission (AE) data were acquired during intermittent load hold tensile test profiles on Toray T1000G carbon fiber-reinforced epoxy (C/Ep) single tow specimens. A novel trend seeking statistical method to determine the onset of significant AE was developed, resulting in more linear decreases in the Felicity ratio (FR) with load, potentially leading to more accurate failure prediction. The method developed uses an exponentially weighted moving average (EWMA) control chart. Comparison of the EWMA with previously used FR onset methods, namely the discrete (n), mean (n̄), normalized (n%) and normalized mean (n̄%) methods, revealed the EWMA method yields more consistently linear FR versus load relationships between specimens. Other findings include a correlation between AE data richness and FR linearity based on the FR methods discussed in this paper, and evidence of premature failure at lower than expected loads. Application of the EWMA method should be extended to other composite materials and, eventually, composite components such as composite overwrapped pressure vessels. Furthermore, future experiments should attempt to uncover the factors responsible for infant mortality in C/Ep strands.
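
    An EWMA control chart of the sort used to flag the onset of significant AE can be written in a few lines. The smoothing constant, control-limit width and synthetic AE-energy stream below are assumptions for illustration; they are not the parameters or data from the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic per-increment AE energy: quiet baseline, then steadily rising emission.
baseline = rng.normal(1.0, 0.2, 60)
active = rng.normal(1.0, 0.2, 40) + np.linspace(0.1, 2.0, 40)
x = np.concatenate([baseline, active])

lam, L = 0.2, 3.0                                    # EWMA weight and limit width (assumed)
mu0 = baseline[:30].mean()                           # in-control mean estimate
sigma0 = baseline[:30].std(ddof=1)                   # in-control standard deviation
ucl = mu0 + L * sigma0 * np.sqrt(lam / (2.0 - lam))  # steady-state upper control limit

z, onset = mu0, None
for i, xi in enumerate(x):
    z = lam * xi + (1.0 - lam) * z                   # EWMA update
    if z > ucl:
        onset = i
        break

print(f"EWMA signalled significant AE onset at observation {onset}")
```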

  3. Reducing error and improving efficiency during vascular interventional radiology: implementation of a preprocedural team rehearsal.

    PubMed

    Morbi, Abigail H M; Hamady, Mohamad S; Riga, Celia V; Kashef, Elika; Pearch, Ben J; Vincent, Charles; Moorthy, Krishna; Vats, Amit; Cheshire, Nicholas J W; Bicknell, Colin D

    2012-08-01

    To determine the type and frequency of errors during vascular interventional radiology (VIR) and design and implement an intervention to reduce error and improve efficiency in this setting. Ethical guidance was sought from the Research Services Department at Imperial College London. Informed consent was not obtained. Field notes were recorded during 55 VIR procedures by a single observer. Two blinded assessors identified failures from field notes and categorized them into one or more errors by using a 22-part classification system. The potential to cause harm, disruption to procedural flow, and preventability of each failure was determined. A preprocedural team rehearsal (PPTR) was then designed and implemented to target frequent preventable potential failures. Thirty-three procedures were observed subsequently to determine the efficacy of the PPTR. Nonparametric statistical analysis was used to determine the effect of intervention on potential failure rates, potential to cause harm and procedural flow disruption scores (Mann-Whitney U test), and number of preventable failures (Fisher exact test). Before intervention, 1197 potential failures were recorded, of which 54.6% were preventable. A total of 2040 errors were deemed to have occurred to produce these failures. Planning error (19.7%), staff absence (16.2%), equipment unavailability (12.2%), communication error (11.2%), and lack of safety consciousness (6.1%) were the most frequent errors, accounting for 65.4% of the total. After intervention, 352 potential failures were recorded. Classification resulted in 477 errors. Preventable failures decreased from 54.6% to 27.3% (P < .001) with implementation of PPTR. Potential failure rates per hour decreased from 18.8 to 9.2 (P < .001), with no increase in potential to cause harm or procedural flow disruption per failure. Failures during VIR procedures are largely because of ineffective planning, communication error, and equipment difficulties, rather than a result of technical or patient-related issues. Many of these potential failures are preventable. A PPTR is an effective means of targeting frequent preventable failures, reducing procedural delays and improving patient safety.

  4. Detection of Failure in Asynchronous Motor Using Soft Computing Method

    NASA Astrophysics Data System (ADS)

    Vinoth Kumar, K.; Sony, Kevin; Achenkunju John, Alan; Kuriakose, Anto; John, Ano P.

    2018-04-01

This paper investigates stator short-winding failure of the asynchronous motor and its effects on the motor current spectrum. A fuzzy logic approach, i.e., a model-based technique, may help to detect asynchronous motor failure. Fuzzy logic resembles human reasoning in that it enables inferences to be drawn in linguistic terms from vague data. A dynamic model of the asynchronous motor is developed with a fuzzy logic classifier to investigate stator inter-turn failure and open-phase failure. A hardware implementation was carried out with LabVIEW for the online monitoring of faults.

  5. The Impact of Preradiation Residual Disease Volume on Time to Locoregional Failure in Cutaneous Merkel Cell Carcinoma—A TROG Substudy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finnigan, Renee; Hruby, George; Wratten, Chris

    2013-05-01

Purpose: This study evaluated the impact of margin status and gross residual disease in patients treated with chemoradiation therapy for high-risk stage I and II Merkel cell cancer (MCC). Methods and Materials: Data were pooled from 3 prospective trials in which patients were treated with 50 Gy in 25 fractions to the primary lesion and draining lymph nodes and 2 schedules of carboplatin based chemotherapy. Time to locoregional failure was analyzed according to the burden of disease at the time of radiation therapy, comparing patients with negative margins, involved margins, or macroscopic disease. Results: Analysis was performed on 88 patients, of whom 9 had microscopically positive resection margins and 26 had macroscopic residual disease. The majority of gross disease was confined to nodal regions. The 5-year time to locoregional failure, time to distant failure, time to progression, and disease-specific survival rates for the whole group were 73%, 69%, 62%, and 66% respectively. The hazard ratio for macroscopic disease at the primary site or the nodes was 1.25 (95% confidence interval 0.57-2.77), P=.58. Conclusions: No statistically significant differences in time to locoregional failure were identified between patients with negative margins and those with microscopic or gross residual disease. These results must, however, be interpreted with caution because of the limited sample size.

  6. Pisces did not have increased heart failure: data-driven comparisons of binary proportions between levels of a categorical variable can result in incorrect statistical significance levels.

    PubMed

    Austin, Peter C; Goldwasser, Meredith A

    2008-03-01

We examined the impact on statistical inference when a χ² test is used to compare the proportion of successes in the level of a categorical variable that has the highest observed proportion of successes with the proportion of successes in all other levels of the categorical variable combined. Monte Carlo simulations and a case study examining the association between astrological sign and hospitalization for heart failure. A standard χ² test results in an inflation of the type I error rate, with the type I error rate increasing as the number of levels of the categorical variable increases. Using a standard χ² test, the hospitalization rate for Pisces was statistically significantly different from that of the other 11 astrological signs combined (P=0.026). After accounting for the fact that the selection of Pisces was based on it having the highest observed proportion of heart failure hospitalizations, subjects born under the sign of Pisces no longer had a significantly higher rate of heart failure hospitalization compared to the other residents of Ontario (P=0.152). Post hoc comparisons of the proportions of successes across different levels of a categorical variable can result in incorrect inferences.
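
    The inflation described in this record is easy to reproduce by simulation: generate data with no true association across k categories, always test the category with the highest observed proportion against the rest, and count how often p < 0.05. The category count, sample sizes and event rate below are arbitrary.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2024)
k, n_per_group, p_true = 12, 500, 0.05        # e.g. 12 "signs", identical true rate
n_sims, false_positives = 2000, 0

for _ in range(n_sims):
    successes = rng.binomial(n_per_group, p_true, size=k)
    i = np.argmax(successes)                   # data-driven choice of the "extreme" level
    table = np.array([
        [successes[i], n_per_group - successes[i]],
        [successes.sum() - successes[i],
         (k - 1) * n_per_group - (successes.sum() - successes[i])],
    ])
    _, p, _, _ = chi2_contingency(table)       # standard chi-square test on the 2x2 table
    false_positives += p < 0.05

print(f"empirical type I error rate: {false_positives / n_sims:.2f} (nominal 0.05)")
```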

  7. Interpretation of correlations in clinical research.

    PubMed

    Hung, Man; Bounsanga, Jerry; Voss, Maren Wright

    2017-11-01

    Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
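
    A compact illustration of the over-reliance on significance discussed above: with a large enough sample, a correlation far too small to be clinically meaningful is still highly statistically significant. The data are simulated.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
n = 100_000
x = rng.normal(size=n)
y = 0.02 * x + rng.normal(size=n)    # true correlation about 0.02: clinically trivial

r, p = pearsonr(x, y)
print(f"r = {r:.3f} (explains {100 * r**2:.2f}% of variance), p = {p:.1e}")
```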

  8. Subscale Test Methods for Combustion Devices

    NASA Technical Reports Server (NTRS)

    Anderson, W. E.; Sisco, J. C.; Long, M. R.; Sung, I.-K.

    2005-01-01

Stated goals for long-life LREs (liquid rocket engines) have been between 100 and 500 cycles, goals complicated by: 1) Inherent technical difficulty of accurately defining the transient and steady state thermochemical environments and structural response (strain); 2) Limited statistical basis on failure mechanisms and effects of design and operational variability; and 3) Very high test costs and budget-driven need to protect test hardware (aversion to test-to-failure). Ambitious goals will require development of new databases: a) Advanced materials, e.g., tailored composites with virtually unlimited property variations; b) Innovative functional designs to exploit full capabilities of advanced materials; and c) Different cycles/operations. Subscale testing is one way to address technical and budget challenges: 1) Prototype subscale combustors exposed to controlled simulated conditions; 2) Complementary to conventional laboratory specimen database development; 3) Instrumented with sensors to measure thermostructural response; and 4) Coupled with analysis

  9. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  10. Analysis and design of randomised clinical trials involving competing risks endpoints.

    PubMed

    Tai, Bee-Choo; Wee, Joseph; Machin, David

    2011-05-19

    In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
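
    The cumulative incidence function referred to above can be estimated nonparametrically: at each event time, the cause-specific increment is the all-cause survival just before that time multiplied by the fraction of at-risk subjects failing from that cause. The toy data below (cause 1 standing in for distant metastasis, cause 2 for loco-regional recurrence, 0 for censored) are invented and unrelated to the SQNP01 trial.

```python
import numpy as np

def cumulative_incidence(time, cause, which):
    """Nonparametric cumulative incidence for event `which`; cause 0 = censored."""
    time, cause = np.asarray(time, float), np.asarray(cause, int)
    order = np.argsort(time)
    time, cause = time[order], cause[order]
    event_times = np.unique(time[cause > 0])
    surv, cif, out = 1.0, 0.0, []
    for t in event_times:
        n_risk = np.sum(time >= t)                    # subjects still at risk just before t
        d_any = np.sum((time == t) & (cause > 0))     # failures from any cause at t
        d_this = np.sum((time == t) & (cause == which))
        cif += surv * d_this / n_risk                 # increment uses S(t-) and this cause only
        surv *= 1.0 - d_any / n_risk                  # all-cause survival update
        out.append((t, cif))
    return out

time  = [3, 5, 5, 7, 9, 12, 14, 15, 15, 18, 20, 24]   # months, illustrative
cause = [1, 2, 0, 1, 1,  0,  2,  1,  0,  2,  0,  0]
for t, c in cumulative_incidence(time, cause, which=1):
    print(f"t = {t:4.0f}  CIF(cause 1) = {c:.3f}")
```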

  11. Retrospective Analysis of Outcome Differences in Preoperative Concurrent Chemoradiation With or Without Elective Nodal Irradiation for Esophageal Squamous Cell Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsu, Feng-Ming; Cancer Research Center, National Taiwan University College of Medicine, Taipei, Taiwan; Graduate Institute of Biomedical Electronics and Bioinformatics, National Taiwan University, Taipei, Taiwan

    2011-11-15

    Purpose: To evaluate the efficacy and patterns of failure of elective nodal irradiation (ENI) in patients with esophageal squamous cell carcinoma (SCC) undergoing preoperative concurrent chemoradiation (CCRT) followed by radical surgery. Methods and Materials: We retrospectively studied 118 patients with AJCC Stage II to III esophageal SCC undergoing preoperative CCRT (median, 36 Gy), followed by radical esophagectomy. Of them, 73 patients (62%) had ENI and 45 patients (38%) had no ENI. Patients with ENI received radiotherapy to either supraclavicular (n = 54) or celiac (n = 19) lymphatics. Fifty-six patients (57%) received chemotherapy with paclitaxel plus cisplatin. The 3-year progression-freemore » survival, overall survival, and patterns of failure were analyzed. Distant nodal recurrence was classified into M1a and M1b regions. A separate analysis using matched cases was conducted. Results: The median follow-up was 38 months. There were no differences in pathological complete response rate (p = 0.12), perioperative mortality rate (p = 0.48), or delayed Grade 3 or greater cardiopulmonary toxicities (p = 0.44), between the groups. More patients in the non-ENI group had M1a failure than in the ENI group, with 3-year rates of 11% and 3%, respectively (p = 0.05). However, the 3-year isolated distant nodal (M1a + M1b) failure rates were not different (ENI, 10%; non-ENI, 14%; p = 0.29). In multivariate analysis, pathological nodal status was the only independent prognostic factor associated with overall survival (hazard ratio = 1.78, p = 0.045). The 3-year overall survival and progression-free survival were 45% and 45%, respectively, in the ENI group, and 52% and 43%, respectively, in the non-ENI group (p = 0.31 and 0.89, respectively). Matched cases analysis did not show a statistical difference in outcomes between the groups. Conclusions: ENI reduced the M1a failure rate but was not associated with improved outcomes in patients undergoing preoperative CCRT for esophageal SCC. Pathological nodal metastasis predicted poor outcome.« less

  12. Free versus perforator-pedicled propeller flaps in lower extremity reconstruction: What is the safest coverage? A meta-analysis.

    PubMed

    Bekara, Farid; Herlin, Christian; Somda, Serge; de Runz, Antoine; Grolleau, Jean Louis; Chaput, Benoit

    2018-01-01

Increasingly, reconstructive surgeons consider the failure rates of perforator propeller flaps, especially in the distal third of the lower leg, to be too high and prefer to return to free flaps as the first-line option, with failure rates frequently lower than 5%. We therefore performed a systematic review with meta-analysis comparing free flaps (perforator-based or not) and pedicled-propeller flaps to address the question "What is the safest coverage for the distal third of the lower limb?" This review was conducted according to PRISMA criteria. From 1991 to 2015, MEDLINE®, PubMed Central, Embase and the Cochrane Library were searched. The pooled estimations were performed by meta-analysis. The homogeneity Q statistic and the I² index were computed. We included 36 articles for free flaps (1,226 flaps) and 19 articles for pedicled-propeller flaps (302 flaps). The overall failure rate was 3.9% [95%CI: 2.6-5.3] for free flaps and 2.77% [95%CI: 0.0-5.6] for pedicled-propeller flaps (P = 0.36). The complication rates were 19.0% for free flaps and 21.4% for pedicled-propeller flaps (P = 0.37). In more detail, we noted for free flaps versus pedicled-propeller flaps: partial necrosis (2.70 vs. 6.88%, P = 0.001), wound dehiscence (2.38 vs. 0.26%, P = 0.018), and infection (4.45 vs. 1.22%, P = 0.009). The coverage failure rate was 5.24% [95%CI: 3.68-6.81] versus 2.99% [95%CI: 0.38-5.60], without significant difference (P = 0.016). In the lower limb, complications are not rare and many teams consider free flaps to be safer. In this meta-analysis we provide evidence that the failure and overall complication rates of perforator propeller flaps are comparable with those of free flaps. Although partial necrosis is significantly higher for pedicled-propeller flaps than for free flaps, the success of coverage appears similar. © 2016 Wiley Periodicals, Inc. Microsurgery, 38:109-119, 2018.
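
    The heterogeneity statistics quoted in this record, Cochran's Q and the I² index, follow directly from the per-study estimates and their variances. The sketch below pools hypothetical flap failure proportions by inverse-variance weighting; the study counts are invented.

```python
import numpy as np

# Hypothetical per-study flap failure data: (failures, flaps).
studies = [(2, 60), (1, 45), (4, 80), (0, 30), (3, 55)]

p = np.array([f / n for f, n in studies])
# Approximate variance of each proportion (0.5 added to avoid zero variance for 0 events).
var = np.array([((f + 0.5) / (n + 1)) * (1 - (f + 0.5) / (n + 1)) / n
                for f, n in studies])

w = 1.0 / var                                   # inverse-variance (fixed-effect) weights
pooled = np.sum(w * p) / np.sum(w)
q = np.sum(w * (p - pooled) ** 2)               # Cochran's Q
dof = len(studies) - 1
i2 = 0.0 if q == 0 else max(0.0, (q - dof) / q) * 100.0   # I^2 index, in percent

print(f"pooled failure rate = {pooled:.3f}")
print(f"Q = {q:.2f} on {dof} df, I^2 = {i2:.0f}%")
```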

  13. Retrospective analysis of outcome differences in preoperative concurrent chemoradiation with or without elective nodal irradiation for esophageal squamous cell carcinoma.

    PubMed

    Hsu, Feng-Ming; Lee, Jang-Ming; Huang, Pei-Ming; Lin, Chia-Chi; Hsu, Chih-Hung; Tsai, Yu-Chieh; Lee, Yung-Chie; Chia-Hsien Cheng, Jason

    2011-11-15

    To evaluate the efficacy and patterns of failure of elective nodal irradiation (ENI) in patients with esophageal squamous cell carcinoma (SCC) undergoing preoperative concurrent chemoradiation (CCRT) followed by radical surgery. We retrospectively studied 118 patients with AJCC Stage II to III esophageal SCC undergoing preoperative CCRT (median, 36 Gy), followed by radical esophagectomy. Of them, 73 patients (62%) had ENI and 45 patients (38%) had no ENI. Patients with ENI received radiotherapy to either supraclavicular (n = 54) or celiac (n = 19) lymphatics. Fifty-six patients (57%) received chemotherapy with paclitaxel plus cisplatin. The 3-year progression-free survival, overall survival, and patterns of failure were analyzed. Distant nodal recurrence was classified into M1a and M1b regions. A separate analysis using matched cases was conducted. The median follow-up was 38 months. There were no differences in pathological complete response rate (p = 0.12), perioperative mortality rate (p = 0.48), or delayed Grade 3 or greater cardiopulmonary toxicities (p = 0.44), between the groups. More patients in the non-ENI group had M1a failure than in the ENI group, with 3-year rates of 11% and 3%, respectively (p = 0.05). However, the 3-year isolated distant nodal (M1a + M1b) failure rates were not different (ENI, 10%; non-ENI, 14%; p = 0.29). In multivariate analysis, pathological nodal status was the only independent prognostic factor associated with overall survival (hazard ratio = 1.78, p = 0.045). The 3-year overall survival and progression-free survival were 45% and 45%, respectively, in the ENI group, and 52% and 43%, respectively, in the non-ENI group (p = 0.31 and 0.89, respectively). Matched cases analysis did not show a statistical difference in outcomes between the groups. ENI reduced the M1a failure rate but was not associated with improved outcomes in patients undergoing preoperative CCRT for esophageal SCC. Pathological nodal metastasis predicted poor outcome. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Baseline Hemodynamics and Response to Contrast Media During Diagnostic Cardiac Catheterization Predict Adverse Events in Heart Failure Patients.

    PubMed

    Denardo, Scott J; Vock, David M; Schmalfuss, Carsten M; Young, Gregory D; Tcheng, James E; O'Connor, Christopher M

    2016-07-01

    Contrast media administered during cardiac catheterization can affect hemodynamic variables. However, little is documented about the effects of contrast on hemodynamics in heart failure patients or the prognostic value of baseline and changes in hemodynamics for predicting subsequent adverse events. In this prospective study of 150 heart failure patients, we measured hemodynamics at baseline and after administration of iodixanol or iopamidol contrast. One-year Kaplan-Meier estimates of adverse event-free survival (death, heart failure hospitalization, and rehospitalization) were generated, grouping patients by baseline measures of pulmonary capillary wedge pressure (PCWP) and cardiac index (CI), and by changes in those measures after contrast administration. We used Cox proportional hazards modeling to assess sequentially adding baseline PCWP and change in CI to 5 validated risk models (Seattle Heart Failure Score, ESCAPE [Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness], CHARM [Candesartan in Heart Failure: Assessment of Reduction in Mortality and Morbidity], CORONA [Controlled Rosuvastatin Multinational Trial in Heart Failure], and MAGGIC [Meta-Analysis Global Group in Chronic Heart Failure]). Median contrast volume was 109 mL. Both contrast media caused similarly small but statistically significant changes in most hemodynamic variables. There were 39 adverse events (26.0%). Adverse event rates increased using the composite metric of baseline PCWP and change in CI (P<0.01); elevated baseline PCWP and decreased CI after contrast correlated with the poorest prognosis. Adding both baseline PCWP and change in CI to the 5 risk models universally improved their predictive value (P≤0.02). In heart failure patients, the administration of contrast causes small but significant changes in hemodynamics. Calculating baseline PCWP with change in CI after contrast predicts adverse events and increases the predictive value of existing models. Patients with elevated baseline PCWP and decreased CI after contrast merit greatest concern. © 2016 American Heart Association, Inc.

  15. Single, double or multiple-injection techniques for non-ultrasound guided axillary brachial plexus block in adults undergoing surgery of the lower arm.

    PubMed

    Chin, Ki Jinn; Alakkad, Husni; Cubillos, Javier E

    2013-08-08

    Regional anaesthesia comprising axillary block of the brachial plexus is a common anaesthetic technique for distal upper limb surgery. This is an update of a review first published in 2006 and updated in 2011. To compare the relative effects (benefits and harms) of three injection techniques (single, double and multiple) of axillary block of the brachial plexus for distal upper extremity surgery. We considered these effects primarily in terms of anaesthetic effectiveness; the complication rate (neurological and vascular); and pain and discomfort caused by performance of the block. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE, EMBASE and reference lists of trials. We contacted trial authors. The date of the last search was March 2013 (updated from March 2011). We included randomized controlled trials that compared double with single-injection techniques, multiple with single-injection techniques, or multiple with double-injection techniques for axillary block in adults undergoing surgery of the distal upper limb. We excluded trials using ultrasound-guided techniques. Independent study selection, risk of bias assessment and data extraction were performed by at least two investigators. We undertook meta-analysis. The 21 included trials involved a total of 2148 participants who received regional anaesthesia for hand, wrist, forearm or elbow surgery. Risk of bias assessment indicated that trial design and conduct were generally adequate; the most common areas of weakness were in blinding and allocation concealment. Eight trials comparing double versus single injections showed a statistically significant decrease in primary anaesthesia failure (risk ratio (RR) 0.51, 95% confidence interval (CI) 0.30 to 0.85). Subgroup analysis by method of nerve location showed that the effect size was greater when neurostimulation was used rather than the transarterial technique. Eight trials comparing multiple with single injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.25, 95% CI 0.14 to 0.44) and in incomplete motor block (RR 0.61, 95% CI 0.39 to 0.96) in the multiple injection group. Eleven trials comparing multiple with double injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.28, 95% CI 0.20 to 0.40) and in incomplete motor block (RR 0.55, 95% CI 0.36 to 0.85) in the multiple injection group. Tourniquet pain was significantly reduced with multiple injections compared with double injections (RR 0.53, 95% CI 0.33 to 0.84). Otherwise there were no statistically significant differences between groups in any of the three comparisons on secondary analgesia failure, complications and patient discomfort. The time for block performance was significantly shorter for single and double injections compared with multiple injections. This review provides evidence that multiple-injection techniques using nerve stimulation for axillary plexus block produce more effective anaesthesia than either double or single-injection techniques. However, there was insufficient evidence for a significant difference in other outcomes, including safety.

  16. Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults

    NASA Technical Reports Server (NTRS)

    Hamill, Maggie; Goseva-Popstojanova, Katerina

    2016-01-01

    Context: Software developers spend a significant amount of time fixing faults. However, not many papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of failures. (2) Both safety critical failures and post-release failures required three times more effort to fix compared to non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort. The spread across artifacts was more costly than spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high level effort, from 31% to around 85%. Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
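
    The prediction step described in this record lends itself to a brief illustration. The following Python sketch is not the authors' pipeline; it is a minimal, hypothetical example of classifying a fix-effort level from change-request features with a supervised learner, using oversampling to address class imbalance. It assumes the scikit-learn and imbalanced-learn packages and substitutes synthetic data for the NASA change-tracking records.

    # Minimal sketch: predicting a hypothetical fix-effort level from change-request
    # features, with oversampling to counter class imbalance. Synthetic data only.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report
    from imblearn.over_sampling import SMOTE

    rng = np.random.default_rng(0)
    n = 1200
    X = rng.normal(size=(n, 6))                             # stand-in change-request features
    y = rng.choice([0, 1, 2], size=n, p=[0.6, 0.3, 0.1])    # effort level; class 2 is rare

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    # Oversample the minority classes in the training split only
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_res, y_res)
    print(classification_report(y_te, clf.predict(X_te)))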

  17. The Studies on the Gastrin Levels in the Patients with Renal Failure

    PubMed Central

    Kim, Myung Hwan; Kim, Han Su; Rim, Kyu Sung; Bang, Ik Soo; Kim, Myung Jae; Chang, Rin; Min, Young II

    1986-01-01

    Fasting and postprandial gastrin levels were measured by radioimmunoassay in serum from 15 patients with renal failure and compared with those in 15 healthy controls. Pre- and posthemodialysis gastrin levels were also measured. The fasting serum gastrin levels and serum gastrin response to a standard meal in the patients with renal failure were significantly higher than those in normal controls. Fasting and meal stimulated gastrin levels were not significantly different in renal failure patients with peptic ulcer when compared with those in renal failure patients without peptic ulcer. There were no statistically significant differences in the serum gastrin levels before and after hemodialysis in patients with renal failure. PMID:15759375

  18. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failure. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since these may cause failure mechanisms such as debonding or delamination.

  19. A Review of Arteriovenous Fistulae Creation in Octogenarians.

    PubMed

    Diandra, Jennifer Clarissa; Lo, Zhiwen Joseph; Ang, Wei-Wen; Feng, Jue Fei; Narayanan, Sriram; Tan, Glenn Wei Leong; Chandrasekar, Sadhana

    2018-01-01

    To analyze the outcomes of arteriovenous fistulae (AVFs) creation in octogenarians. A retrospective study of 47 AVFs created in patients aged 80 years and above from 2008 to 2014. Patient and AVF characteristics and outcomes were evaluated. Predictors of patency were analyzed with multivariate analysis and Kaplan-Meier patency, and survival analysis was performed. Forty-seven of 1,259 AVFs created were for octogenarians (4%). Mean age was 83 years old (range: 80-91 years), with 27 male (57%) and 35 with tunneled dialysis catheters in situ (75%). There were a total of 15 (32%) radiocephalic AVFs, 30 (64%) brachial-cephalic AVFs, and 2 (4%) brachial-basilic transposition AVFs. At 12 months, assisted primary patency rate was 28% (13 patients) while primary failure rate was 72% (34 patients). Subset analysis showed brachial-cephalic AVFs to have the highest assisted primary patency rate at 33%. Within 24 months, tunneled dialysis catheter-related sepsis rate was 31% (11 patients). Multivariate analysis did not reveal any factor to be statistically significant in predicting AVF patency. Kaplan-Meier survival curve showed a 50% survival rate at 63 months after AVF creation. In view of high AVF primary failure rate and relatively low tunneled dialysis catheter bacteremia rate, long-term tunneled dialysis catheters as the main form of hemodialysis renal access may be a viable option. However, with 50% of end-stage renal failure patients surviving up to 63 months after AVF creation, the risks and benefits of long-term tunneled dialysis catheters must be balanced against those of AVF creation. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Predicting Failure Under Laboratory Conditions: Learning the Physics of Slow Frictional Slip and Dynamic Failure

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, B.; Hulbert, C.; Riviere, J.; Lubbers, N.; Barros, K.; Marone, C.; Johnson, P. A.

    2016-12-01

    Forecasting failure is a primary goal in diverse domains that include earthquake physics, materials science, nondestructive evaluation of materials and other engineering applications. Due to the highly complex physics of material failure and limitations on gathering data in the failure nucleation zone, this goal has often appeared out of reach; however, recent advances in instrumentation sensitivity, instrument density and data analysis show promise toward forecasting failure times. Here, we show that we can predict frictional failure times of both slow and fast stick slip failure events in the laboratory. This advance is made possible by applying a machine learning approach known as Random Forests (RF) [1] to the continuous acoustic emission (AE) time series recorded by detectors located on the fault blocks. The RF is trained using a large number of statistical features derived from the AE time series signal. The model is then applied to data not previously analyzed. Remarkably, we find that the RF method predicts upcoming failure time far in advance of a stick slip event, based only on a short time window of data. Further, the algorithm accurately predicts the time of the beginning and end of the next slip event. The predicted time improves as failure is approached, as other data features add to prediction. Our results show robust predictions of slow and dynamic failure based on acoustic emissions from the fault zone throughout the laboratory seismic cycle. The predictions are based on previously unidentified tremor-like acoustic signals that occur during stress build up and the onset of macroscopic frictional weakening. We suggest that the tremor-like signals carry information about fault zone processes and allow precise predictions of failure at any time in the slow slip or stick slip cycle [2]. If the laboratory experiments represent Earth frictional conditions, it could well be that signals are being missed that contain highly useful predictive information. [1] Breiman, L. Random forests. Machine Learning 45, 5-32 (2001). [2] Rouet-Leduc, B., C. Hulbert, N. Lubbers, K. Barros and P. A. Johnson, Learning the physics of failure, in review (2016).
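
    The general recipe (statistical features over a moving window of the AE signal, regressed against the time remaining before failure) can be sketched as follows. This is a toy Python illustration on a synthetic signal, not the authors' code; the feature set, window length, and saw-tooth failure cycle are all assumptions.

    # Toy sketch: windowed statistical features from a synthetic "acoustic emission"
    # signal, regressed against remaining time to failure with a Random Forest.
    import numpy as np
    from scipy import stats
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    fs, cycle = 1000, 30.0                        # sample rate (Hz); a "failure" every 30 s
    t = np.arange(0, 10 * cycle, 1 / fs)
    time_to_failure = cycle - (t % cycle)         # saw-tooth target: resets at each event
    signal = rng.normal(scale=1 + 5 * np.exp(-time_to_failure), size=t.size)

    win = fs                                      # one-second feature windows
    feats, targets = [], []
    for start in range(0, t.size - win, win):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), stats.skew(w), stats.kurtosis(w),
                      np.percentile(w, 95), np.abs(w).max()])
        targets.append(time_to_failure[start + win - 1])

    X, y = np.array(feats), np.array(targets)
    split = int(0.7 * len(X))                     # train on early cycles, test on later ones
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[:split], y[:split])
    print("R^2 on held-out windows:", rf.score(X[split:], y[split:]))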

  1. Elastic and failure response of imperfect three-dimensional metallic lattices: the role of geometric defects induced by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Kamm, Paul; García-Moreno, Francisco; Banhart, John; Pasini, Damiano

    2017-10-01

    This paper examines three-dimensional metallic lattices with regular octet and rhombicuboctahedron units fabricated with geometric imperfections via Selective Laser Melting. We use X-ray computed tomography to capture the morphology, location, and distribution of process-induced defects with the aim of studying their role in the elastic response, damage initiation, and failure evolution under quasi-static compression. Testing results from in-situ compression tomography show that each lattice exhibits a distinct failure mechanism that is governed not only by cell topology but also by geometric defects induced by additive manufacturing. Extracted from X-ray tomography images, the statistical distributions of three sets of defects, namely strut waviness, strut thickness variation, and strut oversizing, are used to develop numerical models of statistically representative lattices with imperfect geometry. Elastic and failure responses are predicted within 10% of the experimental data. In addition, a computational study is presented to shed light on the relationship between the amplitude of selected defects and the reduction of elastic properties compared to their nominal values. The evolution of failure mechanisms is also explained with respect to strut oversizing, a parameter that can critically cause failure mode transitions that are not visible in defect-free lattices.

  2. Monitoring the quality of total hip replacement in a tertiary care department using a cumulative summation statistical method (CUSUM).

    PubMed

    Biau, D J; Meziane, M; Bhumbra, R S; Dumaine, V; Babinet, A; Anract, P

    2011-09-01

    The purpose of this study was to define immediate post-operative 'quality' in total hip replacements and to study prospectively the occurrence of failure based on these definitions of quality. The evaluation and assessment of failure were based on ten radiological and clinical criteria. The cumulative summation (CUSUM) test was used to study 200 procedures over a one-year period. Technical criteria defined failure in 17 cases (8.5%), those related to the femoral component in nine (4.5%), the acetabular component in 32 (16%) and those relating to discharge from hospital in five (2.5%). Overall, the procedure was considered to have failed in 57 of the 200 total hip replacements (28.5%). The use of a new design of acetabular component was associated with more failures. For the CUSUM test, the level of adequate performance was set at a rate of failure of 20% and the level of inadequate performance set at a failure rate of 40%; no alarm was raised by the test, indicating that there was no evidence of inadequate performance. The use of a continuous monitoring statistical method is useful to ensure that the quality of total hip replacement is maintained, especially as newer implants are introduced.
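
    The Bernoulli CUSUM used for this kind of monitoring can be sketched in a few lines of Python. The log-likelihood-ratio weights below use the acceptable failure rate p0 = 0.20 and the inadequate rate p1 = 0.40 quoted in the abstract; the decision limit h and the simulated sequence of procedures are illustrative assumptions rather than values from the study.

    # Sketch of a Bernoulli CUSUM chart: acceptable failure rate p0, inadequate p1.
    # The decision limit h and the outcome sequence are illustrative assumptions.
    import numpy as np

    p0, p1, h = 0.20, 0.40, 4.0                   # acceptable / inadequate rates, limit
    w_fail = np.log(p1 / p0)                      # weight added after a failed procedure
    w_ok = np.log((1 - p1) / (1 - p0))            # (negative) weight after a success

    rng = np.random.default_rng(2)
    outcomes = rng.random(200) < 0.20             # simulated procedures, true rate 20%

    s, alarms = 0.0, []
    for i, failed in enumerate(outcomes, start=1):
        s = max(0.0, s + (w_fail if failed else w_ok))
        if s >= h:                                # evidence of inadequate performance
            alarms.append(i)
            s = 0.0                               # restart monitoring after an alarm
    print("alarms raised at procedures:", alarms or "none")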

  3. Estimation of lifetime distributions on 1550-nm DFB laser diodes using Monte-Carlo statistic computations

    NASA Astrophysics Data System (ADS)

    Deshayes, Yannick; Verdier, Frederic; Bechou, Laurent; Tregon, Bernard; Danto, Yves; Laffitte, Dominique; Goudard, Jean Luc

    2004-09-01

    High performance and high reliability are two of the most important goals driving the penetration of optical transmission into telecommunication systems ranging from 880 nm to 1550 nm. Lifetime prediction, defined as the time at which a parameter reaches its maximum acceptable shift, remains the main result in terms of reliability estimation for a technology. For optoelectronic emissive components, selection tests and life testing are specifically used for reliability evaluation according to Telcordia GR-468 CORE requirements. This approach is based on extrapolation of degradation laws, based on physics of failure and electrical or optical parameters, allowing both strong test time reduction and long-term reliability prediction. Unfortunately, for mature technologies it becomes increasingly difficult to calculate average lifetimes and failure rates (FITs) from ageing tests, in particular because failure rates are extremely low. For present laser diode technologies, times to failure tend to be on the order of 10^6 hours under typical conditions (Popt = 10 mW and T = 80°C). These ageing tests must be performed on more than 100 components aged for 10,000 hours under a mix of temperature and drive-current conditions, leading to acceleration factors above 300-400. These conditions are high-cost and time-consuming and cannot give a complete distribution of times to failure. A new approach consists in using statistical computations to extrapolate the lifetime distribution and failure rates under operating conditions from the physical parameters of experimental degradation laws. In this paper, distributed feedback single-mode laser diodes (DFB-LD) used in 1550 nm telecommunication networks at a 2.5 Gbit/s transfer rate are studied. Electrical and optical parameters were measured before and after ageing tests, performed at constant current, according to Telcordia GR-468 requirements. Cumulative failure rates and lifetime distributions are computed using statistical calculations and equations for drift mechanisms versus time fitted from experimental measurements.
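
    A toy version of the statistical-computation idea can be written in a few lines of Python: draw degradation-law parameters from distributions representing the fitted experimental drift laws, propagate each draw to a time-to-failure, and summarize the resulting lifetime distribution and cumulative failure fraction. The power-law drift form, the parameter distributions, and the 20% failure criterion below are illustrative assumptions, not the measured values of the paper.

    # Toy Monte Carlo: propagate uncertainty in a fitted drift law dP(t) = A * t**m
    # to a distribution of times-to-failure (failure when the drift reaches 20%).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    A = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n)   # assumed drift amplitude
    m = rng.normal(loc=0.45, scale=0.05, size=n)              # assumed drift exponent
    crit = 0.20                                               # 20% parameter drift = failure

    t_fail = (crit / A) ** (1.0 / m)                          # time to failure per draw, hours

    print("median lifetime: %.3g h" % np.median(t_fail))
    for tg in np.logspace(4, 7, 4):                           # 1e4 ... 1e7 hours
        frac = (t_fail <= tg).mean()
        print("cumulative failures by %.0e h: %.4f (~%.0f FIT on average)"
              % (tg, frac, frac / tg * 1e9))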

  4. Ultrasound-guided axillary brachial plexus block versus local infiltration anesthesia for arteriovenous fistula creation at the forearm for hemodialysis in patients with chronic renal failure.

    PubMed

    Nofal, W H; El Fawal, S M; Shoukry, A A; Sabek, Eas; Malak, Wfa

    2017-01-01

    The primary failure rate for arteriovenous fistula (AVF) creation under local anesthesia for hemodialysis is about 30%. Axillary brachial plexus block (BPB) may improve blood flow through the blood vessels used in fistula creation; it may improve the AVF blood flow and thus may reduce the primary failure rate after 3 months. One hundred and forty patients with chronic renal failure scheduled for AVF creation for hemodialysis were divided into two equal groups; Group 1 (AxBP-G) received ultrasound (US) guided axillary BPB, and Group 2 (LI-G) received local infiltration. We recorded the measurements of the brachial and radial arteries before and after anesthesia and the AVF blood flow in both groups at three different time points. Furthermore, the primary failure rate was recorded in each group and compared. After anesthesia, the mean radial artery blood flow in the AxBP-group was 3.52 ml/min more than in the LI-group, and the brachial artery diameter was also 0.68 mm more than in the LI-group; both differences were statistically significant (P < 0.05). There were significant increases (P < 0.05) in the AVF blood flow in the AxBP-group compared with the LI-group, with mean differences of 29.6, 69.8, and 27.2 ml/min at 4 h, 1 week, and 3 months, respectively. The overall mean AVF blood flow was 42.21 ml/min more in the AxBP-group than in the LI-group, a difference that was statistically significant (P < 0.001). The primary failure rate was 17% in the AxBP-group versus 30% in the LI-group; however, this difference was not statistically significant (P = 0.110). US-guided axillary block increases AVF blood flow significantly more than local infiltration and nonsignificantly decreases the primary failure rate of the AVF after 3 months.

  5. Evaluation of shear bond strength of porcelain bonded to laser welded titanium surface and determination of mode of bond failure.

    PubMed

    Patil, Narendra P; Dandekar, Minal; Nadiger, Ramesh K; Guttal, Satyabodh S

    2010-09-01

    The aim of this study was to evaluate the shear bond strength of porcelain to laser welded titanium surfaces and to determine the mode of bond failure through scanning electron microscopy (SEM) and energy dispersive spectrophotometry (EDS). Forty-five cast rectangular titanium specimens with dimensions of 10 mm x 8 mm x 1 mm were tested. Thirty specimens had a perforation of 2 mm diameter in the centre. These were randomly divided into Groups A and B. The perforations in the Group B specimens were repaired by laser welding using Cp Grade II titanium wire. The remaining 15 specimens were taken as the control group. All the test specimens were layered with low fusing porcelain and tested for shear bond strength. The debonded specimens were subjected to SEM and EDS. Data were analysed with one-way analysis of variance and Student's t-test for comparison among the different groups. One-way analysis of variance (ANOVA) showed no statistically significant difference in shear bond strength values at a 5% level of confidence. The mean shear bond strength values for the control group, Group A, and Group B were 8.4 +/- 0.5 MPa, 8.1 +/- 0.4 MPa, and 8.3 +/- 0.3 MPa, respectively. SEM/EDS analysis of the specimens showed mixed and cohesive types of bond failure. Within the limitations of the study, laser welding did not have any effect on the shear bond strength of porcelain bonded to titanium.

  6. Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

    1997-01-01

    A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.

  7. Diagnostics of psychophysiological states and motivation in elite athletes.

    PubMed

    Korobeynikov, G; Mazmanian, K; Korobeynikova, L; Jagiello, W

    2011-01-01

    The concepts explored in our study concerned the identification of various types of motivation and their connection to psychophysiological states in elite judo and Greco-Roman wrestlers. We tried to determine how these different types of motivation interact to shape the psychophysiological state of qualified wrestlers. Neuropsychological evaluation methods such as simple (SRT) and choice reaction-time (CRT) tests, HRV measurements, and psychological questionnaires were used, and the obtained data were explored with methods of statistical analysis. The data show that different combinations of the levels of motivation to achieve success and motivation to avoid failure provoke different psychophysiological states. The experiment revealed that the combination of high levels of both motivation to achieve success and motivation to avoid failure provides a better psychophysiological state in elite wrestlers compared with other groups with different combinations of motivational variables. The experiment also revealed that motivation to avoid failure had formed as a personality trait that compensates for the excessive tension caused by a high level of achievement motivation and regulates the psychophysiological state. This can be viewed as an effect of training in athletes (Tab. 3, Fig. 1, Ref. 38).

  8. Predictors of exercise capacity following exercise-based rehabilitation in patients with coronary heart disease and heart failure: A meta-regression analysis.

    PubMed

    Uddin, Jamal; Zwisler, Ann-Dorthe; Lewinter, Christian; Moniruzzaman, Mohammad; Lund, Ken; Tang, Lars H; Taylor, Rod S

    2016-05-01

    The aim of this study was to undertake a comprehensive assessment of the patient, intervention and trial-level factors that may predict exercise capacity following exercise-based rehabilitation in patients with coronary heart disease and heart failure. Meta-analysis and meta-regression analysis. Randomized controlled trials of exercise-based rehabilitation were identified from three published systematic reviews. Exercise capacity was pooled across trials using random effects meta-analysis, and meta-regression used to examine the association between exercise capacity and a range of patient (e.g. age), intervention (e.g. exercise frequency) and trial (e.g. risk of bias) factors. 55 trials (61 exercise-control comparisons, 7553 patients) were included. Following exercise-based rehabilitation compared to control, overall exercise capacity was on average 0.95 (95% CI: 0.76-1.41) standard deviation units higher, and in trials reporting maximum oxygen uptake (VO2max) was 3.3 ml/kg.min(-1) (95% CI: 2.6-4.0) higher. There was evidence of a high level of statistical heterogeneity across trials (I(2) statistic > 50%). In multivariable meta-regression analysis, only exercise intervention intensity was found to be significantly associated with VO2max (P = 0.04); those trials with the highest average exercise intensity had the largest mean post-rehabilitation VO2max compared to control. We found considerable heterogeneity across randomized controlled trials in the magnitude of improvement in exercise capacity following exercise-based rehabilitation compared to control among patients with coronary heart disease or heart failure. Whilst higher exercise intensities were associated with a greater level of post-rehabilitation exercise capacity, there was no strong evidence to support other intervention, patient or trial factors to be predictive. © The European Society of Cardiology 2015.
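
    The pooling step behind such reviews can be illustrated with a generic DerSimonian-Laird random-effects model. The Python sketch below uses made-up per-trial effect sizes and standard errors, not the review's data, and reports the pooled estimate together with Cochran's Q and the I2 heterogeneity index.

    # Generic DerSimonian-Laird random-effects pooling with Q and I^2, applied to
    # made-up per-trial standardized mean differences and standard errors.
    import numpy as np

    yi = np.array([0.6, 1.1, 0.4, 1.5, 0.9, 0.7])       # hypothetical trial effects (SMD)
    sei = np.array([0.20, 0.30, 0.25, 0.40, 0.15, 0.35])

    wi = 1.0 / sei**2                                   # fixed-effect weights
    y_fe = np.sum(wi * yi) / np.sum(wi)
    Q = np.sum(wi * (yi - y_fe) ** 2)                   # Cochran's Q
    df = len(yi) - 1
    C = np.sum(wi) - np.sum(wi**2) / np.sum(wi)
    tau2 = max(0.0, (Q - df) / C)                       # between-trial variance
    I2 = max(0.0, (Q - df) / Q) * 100                   # heterogeneity index, %

    wr = 1.0 / (sei**2 + tau2)                          # random-effects weights
    y_re = np.sum(wr * yi) / np.sum(wr)
    se_re = np.sqrt(1.0 / np.sum(wr))
    print("pooled SMD = %.2f (95%% CI %.2f to %.2f), I^2 = %.0f%%"
          % (y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re, I2))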

  9. The fluoroscopy time, door to balloon time, contrast volume use and prevalence of vascular access site failure with transradial versus transfemoral approach in ST segment elevation myocardial infarction: A systematic review & meta-analysis.

    PubMed

    Singh, Sukhchain; Singh, Mukesh; Grewal, Navsheen; Khosla, Sandeep

    2015-12-01

    The authors aimed to conduct the first systematic review and meta-analysis in STEMI patients evaluating vascular access site failure rate, fluoroscopy time, door to balloon time and contrast volume used with the transradial versus transfemoral approach (TRA vs. TFA) for PCI. The PubMed, CINAHL, clinicaltrials.gov, Embase and CENTRAL databases were searched for randomized trials comparing TRA versus TFA. Random effect models were used to conduct this meta-analysis. Fourteen randomized trials comprising 3758 patients met inclusion criteria. The access site failure rate was significantly higher with TRA compared to TFA (RR 3.30, CI 2.16-5.03; P=0.000). Random effect inverse variance weighted prevalence rate meta-analysis showed that the access site failure rate was predicted to be 4% (95% CI 3.0-6.0%) with TRA versus 1% (95% CI 0.0-1.0%) with TFA. Door to balloon time (standardized mean difference [SMD] 0.30 min, 95% CI 0.23-0.37 min; P=0.000) and fluoroscopy time (SMD 0.14 min, 95% CI 0.06-0.23 min; P=0.001) were also significantly higher with TRA. There was no difference in the amount of contrast volume used with TRA versus TFA (SMD -0.05 ml, 95% CI -0.14 to 0.04 ml; P=0.275). Statistical heterogeneity was low in the cross-over rate and contrast volume comparisons, moderate in fluoroscopy time, but high in the door to balloon time comparison. Operators need to consider the higher cross-over rate with TRA compared to TFA in STEMI patients while attempting PCI. Fluoroscopy and door to balloon times are negligibly higher with TRA, but there is no difference in terms of contrast volume use. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the result of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
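
    The screening-then-ANOVA workflow described here can be sketched generically as follows. The Python example uses synthetic overall-vibration readings for three hypothetical grinding setups; the group names, sample sizes, and the fallback to a Kruskal-Wallis test are illustrative assumptions rather than the paper's data or exact procedure.

    # Generic sketch of the screening-then-ANOVA workflow on synthetic overall
    # vibration readings for three hypothetical grinding-process setups.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    setups = {"setup_A": rng.normal(2.0, 0.3, 30),      # overall vibration, mm/s (made up)
              "setup_B": rng.normal(2.4, 0.3, 30),
              "setup_C": rng.normal(2.1, 0.3, 30)}

    for name, x in setups.items():                      # normality screen per group
        _, p = stats.shapiro(x)
        print("%s: Shapiro-Wilk p = %.3f" % (name, p))

    _, p_var = stats.bartlett(*setups.values())         # homoscedasticity screen
    print("Bartlett p = %.3f" % p_var)

    f, p_anova = stats.f_oneway(*setups.values())       # one-way ANOVA across setups
    print("ANOVA: F = %.2f, p = %.4f" % (f, p_anova))

    h, p_kw = stats.kruskal(*setups.values())           # non-parametric fallback
    print("Kruskal-Wallis: H = %.2f, p = %.4f" % (h, p_kw))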

  11. Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    2008-01-01

    Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
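
    The ply-discounting idea can be conveyed with a drastically simplified example: check each ply of a laminate against a failure criterion as the applied strain is ramped and, when a ply fails, knock down its stiffness before continuing the load ramp. The Python sketch below uses a maximum-stress check on a four-ply stack with placeholder material numbers; it is a schematic of the approach only, not the user-defined material model described above.

    # Schematic ply-discount progressive failure under a ramped uniaxial strain:
    # each ply's axial stress is checked against a maximum-stress allowable and the
    # ply modulus is knocked down on failure. All numbers are placeholders.
    import numpy as np

    E = np.array([140e3, 10e3, 10e3, 140e3])          # ply moduli, MPa (0/90/90/0-like)
    allow = np.array([1500.0, 40.0, 40.0, 1500.0])    # axial strength per ply, MPa
    t = np.full(4, 0.125)                             # ply thicknesses, mm
    degrade = 1e-3                                    # stiffness knock-down factor
    failed = np.zeros(4, dtype=bool)

    for strain in np.linspace(0.0, 0.02, 400):        # ramp the applied strain
        stress = E * strain                           # ply-level axial stress
        new_fail = (~failed) & (stress >= allow)      # maximum-stress criterion
        if new_fail.any():
            load = np.sum(stress * t)                 # line load carried, N/mm
            print("strain %.4f (load %.1f N/mm): plies %s fail"
                  % (strain, load, np.where(new_fail)[0].tolist()))
            E[new_fail] *= degrade                    # ply discounting
            failed |= new_fail
        if failed.all():
            print("all plies discounted -> final failure")
            break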

  12. Electromigration model for the prediction of lifetime based on the failure unit statistics in aluminum metallization

    NASA Astrophysics Data System (ADS)

    Park, Jong Ho; Ahn, Byung Tae

    2003-01-01

    A failure model for electromigration based on the "failure unit model" was presented for the prediction of lifetime in metal lines. The failure unit model, which consists of failure units in parallel and series, can predict both the median time to failure (MTTF) and the deviation in the time to failure (DTTF) in Al metal lines, but in its earlier form it could describe them only qualitatively. In our model, the probability functions of the failure units in both single-grain segments and polygrain segments are considered, instead of in polygrain segments alone. Based on our model, we calculated the MTTF, DTTF, and activation energy for different median grain sizes, grain size distributions, linewidths, line lengths, current densities, and temperatures. Comparisons between our results and published experimental data showed good agreement, and our model could explain previously unexplained phenomena. Our advanced failure unit model might be further applied to other electromigration characteristics of metal lines.
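
    A toy weakest-link reading of the failure-unit idea is easy to simulate: treat a line as a series of independent failure units, each with its own lognormal time-to-failure, and let the line fail at its earliest unit failure; longer lines then show a shorter median lifetime and a different spread. The unit-lifetime parameters in the Python sketch below are invented, and the sketch does not reproduce the single-grain/polygrain probability functions of the model.

    # Toy weakest-link view of a failure-unit model: a metal line is a series of
    # independent failure units and fails at its earliest unit failure.
    # Lognormal unit lifetimes with invented parameters; not the paper's model.
    import numpy as np

    rng = np.random.default_rng(4)
    mu, sigma = np.log(300.0), 0.4           # unit lifetime: median 300 h, lognormal sigma

    for n_units in (10, 50, 200):            # longer lines contain more failure units
        unit_tf = rng.lognormal(mu, sigma, size=(20_000, n_units))
        line_tf = unit_tf.min(axis=1)        # series system: first unit failure kills the line
        mttf = np.median(line_tf)            # median time to failure
        dttf = np.std(np.log(line_tf))       # spread of the line lifetimes (log scale)
        print("units=%4d  MTTF=%6.1f h  sigma(log t)=%.3f" % (n_units, mttf, dttf))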

  13. Randomized clinical trial of encapsulated and hand-mixed glass-ionomer ART restorations: one-year follow-up.

    PubMed

    Freitas, Maria Cristina Carvalho de Almendra; Fagundes, Ticiane Cestari; Modena, Karin Cristina da Silva; Cardia, Guilherme Saintive; Navarro, Maria Fidela de Lima

    2018-01-18

    This prospective, randomized, split-mouth clinical trial evaluated the clinical performance of a conventional glass ionomer cement (GIC; Riva Self-Cure, SDI), supplied in capsules or in powder/liquid kits and placed in Class I cavities in permanent molars by the Atraumatic Restorative Treatment (ART) approach. A total of 80 restorations were randomly placed in 40 patients aged 11-15 years. Each patient received one restoration with each type of GIC. The restorations were evaluated after periods of 15 days (baseline), 6 months, and 1 year, according to ART criteria. Wilcoxon matched pairs, multivariate logistic regression, and Gehan-Wilcoxon tests were used for statistical analysis. Patients were evaluated after 15 days (n=40), 6 months (n=34), and 1 year (n=29). Encapsulated GIC showed significantly superior clinical performance compared with hand-mixed GIC at baseline (p=0.017), 6 months (p=0.001), and 1 year (p=0.026). For hand-mixed GIC, a statistically significant difference was only observed over the period from baseline to 1 year (p=0.001). Encapsulated GIC presented statistically significant differences for the following periods: 6 months to 1 year (p=0.028) and baseline to 1 year (p=0.002). Encapsulated GIC presented a superior cumulative survival rate compared with hand-mixed GIC over one year. Importantly, both GICs exhibited decreased survival over time. Encapsulated GIC promoted better ART performance, with an annual failure rate of 24%; in contrast, hand-mixed GIC demonstrated a failure rate of 42%.

  14. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    PubMed

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome-underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
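
    Assuming the Python lifelines package is available, the comparison can be sketched by simulating Weibull survival times with a binary treatment effect and administrative censoring, then fitting both a Weibull AFT model and a Cox PH model. The parameters below are invented and the sketch only loosely mirrors the simulation design; the study itself used the SAS procedures LIFEREG and PHREG.

    # Sketch: Weibull-distributed survival times with a binary treatment effect and
    # administrative censoring, fitted with a Weibull AFT model and a Cox PH model.
    # Invented parameters; the study itself used SAS LIFEREG / PHREG.
    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter, CoxPHFitter

    rng = np.random.default_rng(5)
    n = 500
    treat = rng.integers(0, 2, n)
    shape = 1.5                                   # Weibull shape
    scale = np.exp(1.0 + 0.5 * treat)             # treatment lengthens survival (AFT form)
    t_true = scale * rng.weibull(shape, n)
    c = 6.0                                       # administrative censoring time
    df = pd.DataFrame({"T": np.minimum(t_true, c),
                       "E": (t_true <= c).astype(int),
                       "treat": treat})

    aft = WeibullAFTFitter().fit(df, duration_col="T", event_col="E")
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    print(aft.summary[["coef"]])                  # AFT: treat coefficient near +0.5
    print(cph.summary[["coef"]])                  # PH: treat coefficient negative (lower hazard)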

  15. Reliability evaluation methodology for NASA applications

    NASA Technical Reports Server (NTRS)

    Taneja, Vidya S.

    1992-01-01

    Liquid rocket engine technology has been characterized by the development of complex systems containing a large number of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have been focusing mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability in general has not been considered as one of the system parameters like cost or performance. Until now, quantification of reliability has not been a consideration during system design and development in the liquid rocket industry. Engineers and managers have long been aware of the fact that the reliability of the system increases during development, but no serious attempts have been made to quantify reliability. As a result, a method to quantify reliability during design and development is needed. This includes the application of probabilistic models which utilize both engineering analysis and test data. Classical methods require the use of operating data for reliability demonstration. In contrast, the method described in this paper is based on similarity, analysis, and testing combined with Bayesian statistical analysis.

  16. Characterization of the Failure Site Distribution in MIM Devices Using Zoomed Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Muñoz-Gorriz, J.; Monaghan, S.; Cherkaoui, K.; Suñé, J.; Hurley, P. K.; Miranda, E.

    2018-05-01

    Angular wavelet analysis is applied to the study of the spatial distribution of breakdown (BD) spots in Pt/HfO2/Pt capacitors with square and circular areas. The method was originally developed for rectangular areas, so a zoomed approach needs to be considered when the observation window does not coincide with the device area. The BD spots appear as a consequence of the application of electrical stress to the device. The stress generates defects within the dielectric film, a process that ends with the formation of a percolation path between the electrodes and the melting of the top metal layer because of the high release of energy. The BD spots have lateral sizes ranging from 1 μm to 3 μm and they appear as a point pattern that can be studied using spatial statistics methods. In this paper, we report the application of the angular wavelet method as a complementary tool for the analysis of the distribution of failure sites in large-area metal-insulator-metal (MIM) devices. The differences between considering a continuous or a discrete wavelet and the role played by the number of BD spots are also investigated.

  17. Systematic analysis of coding and noncoding DNA sequences using methods of statistical linguistics

    NASA Technical Reports Server (NTRS)

    Mantegna, R. N.; Buldyrev, S. V.; Goldberger, A. L.; Havlin, S.; Peng, C. K.; Simons, M.; Stanley, H. E.

    1995-01-01

    We compare the statistical properties of coding and noncoding regions in eukaryotic and viral DNA sequences by adapting two tests developed for the analysis of natural languages and symbolic sequences. The data set comprises all 30 sequences of length above 50 000 base pairs in GenBank Release No. 81.0, as well as the recently published sequences of C. elegans chromosome III (2.2 Mbp) and yeast chromosome XI (661 Kbp). We find that for the three chromosomes we studied the statistical properties of noncoding regions appear to be closer to those observed in natural languages than those of coding regions. In particular, (i) an n-tuple Zipf analysis of noncoding regions reveals a regime close to power-law behavior, whereas the coding regions show logarithmic behavior over a wide interval, and (ii) an n-gram entropy measurement shows that the noncoding regions have a lower n-gram entropy (and hence a larger "n-gram redundancy") than the coding regions. In contrast to the three chromosomes, we find that for vertebrates such as primates and rodents and for viral DNA, the difference between the statistical properties of coding and noncoding regions is not pronounced and therefore the results of the analyses of the investigated sequences are less conclusive. After noting the intrinsic limitations of the n-gram redundancy analysis, we also briefly discuss the failure of the zeroth- and first-order Markovian models or simple nucleotide repeats to account fully for these "linguistic" features of DNA. Finally, we emphasize that our results by no means prove the existence of a "language" in noncoding DNA.
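
    The two "linguistic" tests can be sketched on a toy sequence: count all overlapping n-tuples, examine the rank-frequency (Zipf) relation, and compute the n-gram Shannon entropy. In the Python sketch below a random four-letter sequence stands in for a real GenBank record, and the tuple length n = 6 is an arbitrary choice.

    # Toy n-tuple Zipf and n-gram entropy analysis of a DNA-like symbol sequence.
    # A random four-letter sequence stands in for a real GenBank record.
    from collections import Counter
    import numpy as np

    rng = np.random.default_rng(6)
    seq = "".join(rng.choice(list("ACGT"), size=50_000))

    n = 6                                                # "word" length
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
    probs = freqs / freqs.sum()

    ranks = np.arange(1, len(freqs) + 1)
    slope = np.polyfit(np.log(ranks), np.log(freqs), 1)[0]   # Zipf rank-frequency slope
    entropy = -np.sum(probs * np.log2(probs))                # n-gram Shannon entropy, bits

    print("distinct %d-tuples: %d" % (n, len(freqs)))
    print("Zipf slope: %.2f   H_%d: %.2f bits (max %.2f)" % (slope, n, entropy, 2.0 * n))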

  18. Predicting duration of mechanical ventilation in patients with carbon monoxide poisoning: a retrospective study.

    PubMed

    Shen, Chih-Hao; Peng, Chung-Kan; Chou, Yu-Ching; Pan, Ke-Ting; Chang, Shun-Cheng; Chang, Shan-Yueh; Huang, Kun-Lun

    2015-02-01

    Patients with severe carbon monoxide (CO) poisoning may develop acute respiratory failure, which requires endotracheal intubation and mechanical ventilation (MV). The objective of this study was to identify the predictors of the duration of MV in patients with severe CO poisoning and acute respiratory failure. This is a retrospective observational study of 796 consecutive patients diagnosed with acute CO poisoning who presented to the emergency department. Patients who received MV were divided into 2 groups: the early extubation (EE) group, consisting of patients on MV for less than 72 hours, and the non-early extubation (NEE) group, consisting of patients on MV for more than 72 hours. Demographic and clinical data of the two groups were extracted for analysis. The intubation rate of all CO-poisoned patients was 23.4%. A total of 168 patients were enrolled in this study. The main source of CO exposure was intentional CO poisoning by charcoal burning (137 patients). A positive toxicology screening result was found in 104 patients (61.9%). The EE group had 105 patients (62.5%). On arrival at the emergency department, a high incidence of hypotension; a high white blood cell count; and elevation of blood urea nitrogen, creatinine, aspartate aminotransferase, alanine aminotransferase, creatine kinase, and troponin-I levels were statistically significantly associated with the NEE group (P < .05). A positive toxicology screening result was statistically significantly associated with the EE group (P < .05). In a multivariate analysis, elevation of the troponin-I level was an independent factor for NEE (odds ratio, 1.305; 95% confidence interval, 1.024-1.663; P = .032). A positive toxicology screening result was an independent factor for EE (odds ratio, 0.222; 95% confidence interval, 0.101-0.489; P = .001). A positive toxin screen predicts extubation within the first 72 hours for patients with severe CO poisoning and acute respiratory failure. On the other hand, elevation of the initial troponin-I level is a predictor of a longer duration of MV. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Fatigue behavior of resin-modified monolithic CAD-CAM RNC crowns and endocrowns.

    PubMed

    Rocca, G T; Sedlakova, P; Saratti, C M; Sedlacek, R; Gregor, L; Rizcalla, N; Feilzer, A J; Krejci, I

    2016-12-01

    To evaluate the influence of different types of modifications with resin on fatigue resistance and failure behavior of CAD-CAM resin nano ceramic (RNC) restorations for maxillary first premolars. Sixty standardized resin composite root dies received CAD-CAM RNC endocrowns (n=30) and crowns (n=30) (Lava Ultimate, 3M Espe). Restorations were divided into six groups: full anatomic endocrowns (group A) and crowns (group D), buccal resin veneered endocrowns (group B) and crowns (group E), and buccal resin veneered endocrowns (group C) and crowns (group F) with a central groove resin filling. A nano-hybrid resin composite was used to veneer the restorations (Filtek Supreme, 3M Espe). All specimens were first submitted to thermo-mechanical cyclic loading (1.7 Hz, 49 N, 600,000 cycles, 1,500 thermo-cycles) and then submitted to cyclic isometric stepwise loading (5 Hz) until completion of 105,000 cycles or failure: 5,000 cycles at 200 N, followed by 20,000 cycles at 400 N, 600 N, 800 N, 1,000 N and 1,200 N. In case of fracture, fragments were analyzed using SEM and modes of failure were determined. Results were statistically analyzed by Kaplan-Meier life survival analysis and log rank test (p=0.05). The differences in survival between groups were not statistically significant, except between groups D and F (p=0.039). Endocrowns fractured predominantly with a mesio-distal wedge-opening fracture (82%). Partial cusp fractures were observed above all in crowns (70%). Analysis of the fractured specimens revealed that the origin of the fracture was mainly at the occlusal contact points of the stepwise loading. Veneering of CAD-CAM RNC restorations has no influence on their fatigue resistance except when monolithic crowns are modified on their occlusal central groove. Copyright © 2016 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  20. Bayes Analysis and Reliability Implications of Stress-Rupture Testing a Kevlar/Epoxy COPV Using Temperature and Pressure Acceleration

    NASA Technical Reports Server (NTRS)

    Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.

    2009-01-01

    Composite Overwrapped Pressure Vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Flight certification is dependent on the reliability analysis to quantify the risk of stress rupture failure in existing flight vessels. Full certification of this reliability model would require a statistically significant number of lifetime tests to be performed and is impractical given the cost and limited flight hardware for certification testing purposes. One approach to confirm the reliability model is to perform a stress rupture test on a flight COPV. Currently, testing of such a Kevlar49 (Dupont)/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyze the possible failure time results of this test and to assess the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure, and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test. The latter has been uncertain due to major differences between COPVs in the database and the actual COPVs in service. Any information obtained that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely the more optimistic stress ratio model is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one "nine," that is, reducing the predicted probability of failure by an order of magnitude. However, testing one vessel does not change the uncertainty on the Weibull shape parameter for lifetime since testing several vessels would be necessary.
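
    The flavor of the Bayesian update can be conveyed with a deliberately simplified two-hypothesis example: each candidate stress ratio implies a different Weibull lifetime distribution, and surviving the test to time t multiplies each hypothesis's prior weight by its survival probability. All numbers in the Python sketch below (priors, Weibull scales, and shape) are placeholders, not values from the certification model.

    # Two-hypothesis Bayesian update: surviving a stress-rupture test to time t
    # re-weights two candidate stress-ratio models via their Weibull survival
    # probabilities. Priors, scales, and shape are placeholders only.
    import numpy as np

    beta = 1.2                                    # assumed Weibull shape for lifetime
    eta = {"optimistic SR": 5.0e5, "pessimistic SR": 2.0e4}   # assumed scales, hours
    prior = {"optimistic SR": 0.5, "pessimistic SR": 0.5}

    def survival(t, eta_i):
        return np.exp(-(t / eta_i) ** beta)       # Weibull reliability function

    for t in (1e3, 1e4, 5e4):                     # hypothetical test survival times
        post = {m: prior[m] * survival(t, eta[m]) for m in prior}
        z = sum(post.values())
        post = {m: p / z for m, p in post.items()}
        print("survived %.0e h -> P(optimistic model) = %.3f" % (t, post["optimistic SR"]))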

  1. Iterative Assessment of Statistically-Oriented and Standard Algorithms for Determining Muscle Onset with Intramuscular Electromyography.

    PubMed

    Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A

    2017-12-01

    The onset of muscle activity, as measured by electromyography (EMG), is a commonly applied metric in biomechanics. Intramuscular EMG is often used to examine deep musculature and there are currently no studies examining the effectiveness of algorithms for intramuscular EMG onset. The present study examines standard surface EMG onset algorithms (linear envelope, Teager-Kaiser Energy Operator, and sample entropy) and novel algorithms (time series mean-variance analysis, sequential/batch processing with parametric and nonparametric methods, and Bayesian changepoint analysis). Thirteen male and 5 female subjects had intramuscular EMG collected during isolated biceps brachii and vastus lateralis contractions, resulting in 103 trials. EMG onset was visually determined twice by 3 blinded reviewers. Since the reliability of visual onset was high (ICC (1,1) : 0.92), the mean of the 6 visual assessments was contrasted with the algorithmic approaches. Poorly performing algorithms were stepwise eliminated via (1) root mean square error analysis, (2) algorithm failure to identify onset/premature onset, (3) linear regression analysis, and (4) Bland-Altman plots. The top performing algorithms were all based on Bayesian changepoint analysis of rectified EMG and were statistically indistinguishable from visual analysis. Bayesian changepoint analysis has the potential to produce more reliable, accurate, and objective intramuscular EMG onset results than standard methodologies.
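
    A minimal single-changepoint version of the idea can be written directly: rectify a synthetic EMG trace, assume Gaussian segments with a mean shift at an unknown sample, and compute a posterior over the shift location with the segment means profiled out (a simplification of the full Bayesian treatment). The Python sketch below is a toy of the changepoint approach, not one of the algorithms benchmarked in the study; the sampling rate, noise scale, and burst amplitude are assumptions.

    # Toy single-changepoint detector on rectified synthetic EMG: Gaussian segments
    # with a mean shift at an unknown sample; the posterior over the shift location
    # is computed with the segment means profiled out (a simplification).
    import numpy as np

    rng = np.random.default_rng(7)
    fs, onset_true = 1000, 0.600                        # Hz; true onset at 600 ms
    t = np.arange(0.0, 1.0, 1 / fs)
    emg = rng.normal(0, 0.05, t.size)                   # baseline noise
    emg[t >= onset_true] += rng.normal(0, 0.25, (t >= onset_true).sum())   # burst
    x = np.abs(emg)                                     # rectified EMG

    sigma = 0.05                                        # assumed (known) noise scale
    n = x.size
    logpost = np.full(n, -np.inf)
    for k in range(10, n - 10):                         # candidate changepoint locations
        a, b = x[:k], x[k:]
        rss = ((a - a.mean()) ** 2).sum() + ((b - b.mean()) ** 2).sum()
        logpost[k] = -rss / (2 * sigma ** 2)            # flat prior over k

    post = np.exp(logpost - logpost.max())
    post /= post.sum()
    print("estimated onset: %.3f s (true %.3f s)" % (t[np.argmax(post)], onset_true))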

  2. Relative strength of tailor's bunion osteotomies and fixation techniques.

    PubMed

    Haddon, Todd B; LaPointe, Stephan J

    2013-01-01

    A paucity of data is available on the mechanical strength of fifth metatarsal osteotomies. The present study was designed to provide that information. Five osteotomies were mechanically tested to failure using a materials testing machine and compared with an intact fifth metatarsal using a hollow saw bone model, with a sample size of 10 for each construct. The osteotomies tested were the distal reverse chevron fixated with a Kirschner wire, the long plantar reverse chevron osteotomy fixated with 2 screws, a mid-diaphyseal sagittal plane osteotomy fixated with 2 screws, the mid-diaphyseal sagittal plane osteotomy fixated with 2 screws and an additional cerclage wire, and a transverse closing wedge osteotomy fixated with a box wire technique. Analysis of variance was performed, resulting in a statistically significant difference among the data at p < .0001. The Tukey-Kramer honestly significant difference test with least significant differences was performed post hoc to separate out the pairs at a minimum α of 0.05. The chevron was statistically the strongest construct at 130 N, followed by the long plantar osteotomy at 78 N. The chevron compared well with the control at 114 N, and both fractured at the proximal model-to-fixture interface. The other osteotomies were statistically and significantly weaker than both the chevron and the long plantar constructs, with no statistically significant difference among them at 36, 39, and 48 N. In conclusion, the chevron osteotomy was superior in strength to the sagittal and transverse plane osteotomies and similar in strength and failure to the intact model. Copyright © 2013 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  3. Assessing the independent contribution of maternal educational expectations to children's educational attainment in early adulthood: a propensity score matching analysis.

    PubMed

    Pingault, Jean Baptiste; Côté, Sylvana M; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E

    2015-01-01

    Parental educational expectations have been associated with children's educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation—measuring educational attainment—was determined through the Quebec Ministry of Education when the participants were aged 22-23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs.
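
    The matching strategy can be sketched compactly: estimate propensity scores with a logistic regression, match each "exposed" unit to the nearest-propensity unexposed unit, and compare outcome rates in the matched sample. The Python example below uses simulated covariates and outcomes and a bare 1:1 nearest-neighbour match; the study's own covariate set and matching specification are richer.

    # Sketch of propensity score matching: logistic-regression propensity scores,
    # 1:1 nearest-neighbour matching on the score, then a matched comparison.
    # Simulated covariates and outcome; not the study's data or full specification.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(8)
    n = 2000
    X = rng.normal(size=(n, 4))                           # confounders (stand-ins)
    p_exposed = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))
    exposed = rng.random(n) < p_exposed                   # "low expectations" indicator
    p_fail = 1 / (1 + np.exp(-(-1.0 + 0.8 * X[:, 0] + 0.4 * exposed)))
    failure = rng.random(n) < p_fail                      # non-graduation indicator

    ps = LogisticRegression(max_iter=1000).fit(X, exposed).predict_proba(X)[:, 1]

    # Match each exposed unit to the unexposed unit with the closest propensity score
    nn = NearestNeighbors(n_neighbors=1).fit(ps[~exposed].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[exposed].reshape(-1, 1))
    matched_ctrl = np.where(~exposed)[0][idx.ravel()]

    rate_exposed = failure[exposed].mean()
    rate_matched = failure[matched_ctrl].mean()
    print("failure rate, exposed: %.3f  matched controls: %.3f  difference: %+.3f"
          % (rate_exposed, rate_matched, rate_exposed - rate_matched))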

  4. Bonding of contemporary glass ionomer cements to dentin.

    PubMed

    Yip, H K; Tay, F R; Ngo, H C; Smales, R J; Pashley, D H

    2001-09-01

    The objective of this study was to investigate the microtensile bond strength (microTBS) of contemporary glass ionomer cements (GIC) to sound coronal dentin. Three specimen teeth were prepared for each material tested: Fuji IX GP (GC), ChemFlex (Dentsply) and Ketac-Molar Aplicap (ESPE). GIC buildups were made according to the manufacturers' instructions. After being stored at 37 degrees C, 100% humidity for 24h, the teeth were vertically sectioned into 1x1mm beams for microTBS evaluation. Representative fractured beams were prepared for scanning (SEM) and transmission electron microscopic (TEM) examination. Results of the microTBS test were: Fuji IX GP (12.4+/-8.6MPa), ChemFlex (15.0+/-9.3MPa) and Ketac-Molar Aplicap (11.4+/-7.7MPa). One-way ANOVA and a multiple comparison test showed that ChemFlex had a statistically higher microTBS (p<0.05). SEM fractographic analysis showed that the predominant failure modes were interfacial and mixed failures. The GIC side of the fractured beams revealed dehydration cracks, a high level of porosity, and voids with an eggshell-like crust. TEM analysis of the demineralized dentin sides of the fractured beams revealed the presence of an intermediate layer along the GIC-dentin interface. This zone was present on the fractured dentin surface in the case of interfacial failure, and beneath GIC remnants in specimens that exhibited a mixed failure mode. The findings suggest that the bonding of GIC to dentin is not weak and that the microTBS values probably represent the weak yield strengths of GICs under tension.

  5. Predictors and Patterns of Local, Regional, and Distant Failure in Squamous Cell Carcinoma of the Vulva.

    PubMed

    Bogani, Giorgio; Cromi, Antonella; Serati, Maurizio; Uccella, Stefano; Donato, Violante Di; Casarin, Jvan; Naro, Edoardo Di; Ghezzi, Fabio

    2017-06-01

    To identify factors predicting for recurrence in vulvar cancer patients undergoing surgical treatment. We retrospectively evaluated data of consecutive patients with squamous cell vulvar cancer treated between January 1, 1990 and December 31, 2013. Basic descriptive statistics and multivariable analysis were used to design models predicting outcomes. Five-year disease-free survival (DFS) and overall survival (OS) were analyzed using the Cox model. The study included 101 patients affected by vulvar cancer: 64 (63%) stage I, 12 (12%) stage II, 20 (20%) stage III, and 5 (5%) stage IV. After a mean (SD) follow-up of 37.6 (22.1) months, 21 (21%) recurrences occurred. Local, regional, and distant failures were recorded in 14 (14%), 6 (6%), and 3 (3%) patients, respectively. Five-year DFS and OS were 77% and 82%, respectively. On multivariate analysis, only stromal invasion >2 mm (hazard ratio: 4.9 [95% confidence interval, 1.17-21.1]; P=0.04) and extracapsular lymph node involvement (hazard ratio: 9.0 [95% confidence interval, 1.17-69.5]; P=0.03) correlated with worse DFS, whereas no factor independently correlated with OS. Looking at factors influencing local and regional failure, we observed that stromal invasion >2 mm was the only factor predicting for local recurrence, whereas lymph node extracapsular involvement predicted for regional recurrence. Stromal invasion >2 mm and lymph node extracapsular spread are the most important factors predicting for local and regional failure, respectively. Studies evaluating the effectiveness of adjuvant treatment in high-risk patients are warranted.
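
    The Cox proportional hazards analysis of disease-free survival mentioned above can be sketched with the lifelines package; the toy data frame and column names are hypothetical and only illustrate the model call.

    ```python
    # Sketch of a Cox proportional hazards model for disease-free survival.
    # Durations, events, and covariates below are invented for illustration.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "dfs_months":       [12, 40, 60, 25, 55, 8, 60, 30],   # follow-up time
        "recurrence":       [1,  0,  0,  1,  0,  1, 0,  1],    # 1 = recurrence observed
        "stromal_gt_2mm":   [1,  1,  0,  1,  0,  1, 0,  0],    # stromal invasion > 2 mm
        "extracapsular_ln": [0,  1,  0,  1,  0,  0, 0,  1],    # extracapsular node spread
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="dfs_months", event_col="recurrence")
    cph.print_summary()   # hazard ratios with 95% confidence intervals
    ```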

  6. Reliability study of refractory gate gallium arsenide MESFETS

    NASA Technical Reports Server (NTRS)

    Yin, J. C. W.; Portnoy, W. M.

    1981-01-01

    Refractory gate MESFET's were fabricated as an alternative to aluminum gate devices, which have been found to be unreliable as RF power amplifiers. In order to determine the reliability of the new structures, statistics of failure and information about mechanisms of failure in refractory gate MESFET's are given. Test transistors were stressed under conditions of high temperature and forward gate current to enhance failure. Results of work at 150 C and 275 C are reported.

  7. Reliability study of refractory gate gallium arsenide MESFETS

    NASA Astrophysics Data System (ADS)

    Yin, J. C. W.; Portnoy, W. M.

    Refractory gate MESFET's were fabricated as an alternative to aluminum gate devices, which have been found to be unreliable as RF power amplifiers. In order to determine the reliability of the new structures, statistics of failure and information about mechanisms of failure in refractory gate MESFET's are given. Test transistors were stressed under conditions of high temperature and forward gate current to enhance failure. Results of work at 150 C and 275 C are reported.

  8. [Melamine related urinary calculus and acute renal failure in infants].

    PubMed

    Sun, Ning; Shen, Ying; Sun, Qiang; Li, Xu-ran; Jia, Li-qun; Zhang, Gui-ju; Zhang, Wei-ping; Chen, Zhi; Fan, Jian-feng; Jiang, Ye-ping; Feng, Dong-chuan; Zhang, Rui-feng; Zhu, Xiao-yu; Xiao, Hong-zhan

    2008-11-01

    To summarize the clinical characteristics, diagnosis and treatment of infants who developed urinary calculus and acute renal failure after being fed melamine-tainted formula milk. Data of infant patients with urinary calculus and acute renal failure due to melamine-tainted formula milk admitted to the Beijing Children's Hospital affiliated to the Capital Medical University and the Xuzhou Children's Hospital in 2008 were used to analyze the epidemiological characteristics, clinical manifestations, image features as well as the effects of 4 types of therapies. All 34 infants with urinary calculus were complicated with acute renal failure; their blood urea nitrogen (BUN) was (24.1 +/- 8.2) mmol/L and creatinine (Cr) was (384.2 +/- 201.2) micromol/L. Chemical analysis of the urinary calculi sampled from 14 of the infants showed that the calculi contained melamine and uric acid. The time needed to return Cr to normal was (3.5 +/- 1.9) d for the cystoscopy group, (2.7 +/- 1.1) d for the lithotomy group, (3.8 +/- 2.3) d for the dialysis group, and (2.7 +/- 1.6) d for the medical treatment group, with no statistically significant difference among the four therapies (P = 0.508). Renal failure of all 34 infants was relieved within 1 to 7 days, averaging (3.0 +/- 1.8) d. Melamine-tainted formula milk may cause urinary calculus and obstructive acute renal failure. It is suggested that patients with urinary calculus complicated with acute renal failure should first be treated with dialysis or medication to correct electrolyte disturbances, in particular hyperkalemia, and that the obstruction should then be relieved with available medical and surgical methods as soon as possible. The short-term prognosis is observed to be satisfactory.

  9. Solid Lymph Nodes as an Imaging Biomarker for Risk Stratification in Human Papillomavirus-Related Oropharyngeal Squamous Cell Carcinoma.

    PubMed

    Rath, T J; Narayanan, S; Hughes, M A; Ferris, R L; Chiosea, S I; Branstetter, B F

    2017-07-01

    Human papillomavirus-related oropharyngeal squamous cell carcinoma is associated with cystic lymph nodes on CT and has a favorable prognosis. A subset of patients with aggressive disease experience treatment failure. Our aim was to determine whether the extent of cystic lymph node burden on staging CT can serve as an imaging biomarker to predict treatment failure in human papillomavirus-related oropharyngeal squamous cell carcinoma. We identified patients with human papilloma virus-related oropharyngeal squamous cell carcinoma and staging neck CTs. Demographic and clinical variables were recorded. We retrospectively classified the metastatic lymph node burden on CT as cystic or solid and assessed radiologic extracapsular spread. Biopsy, subsequent imaging, or clinical follow-up was the reference standard for treatment failure. The primary end point was disease-free survival. Cox proportional hazard regression analyses of clinical, demographic, and anatomic variables for treatment failure were performed. One hundred eighty-three patients were included with a mean follow-up of 38 months. In univariate analysis, the following variables had a statistically significant association with treatment failure: solid-versus-cystic lymph nodes, clinical T-stage, clinical N-stage, and radiologic evidence of extracapsular spread. The multivariate Cox proportional hazard model resulted in a model that included solid-versus-cystic lymph nodes, T-stage, and radiologic evidence of extracapsular spread as independent predictors of treatment failure. Patients with cystic nodal metastasis at staging had significantly better disease-free survival than patients with solid lymph nodes. In human papilloma virus-related oropharyngeal squamous cell carcinoma, patients with solid lymph node metastases are at higher risk for treatment failure with worse disease-free survival. Solid lymph nodes may serve as an imaging biomarker to tailor individual treatment regimens. © 2017 by American Journal of Neuroradiology.

  10. New medicinal products for chronic heart failure: advances in clinical trial design and efficacy assessment.

    PubMed

    Cowie, Martin R; Filippatos, Gerasimos S; Alonso Garcia, Maria de Los Angeles; Anker, Stefan D; Baczynska, Anna; Bloomfield, Daniel M; Borentain, Maria; Bruins Slot, Karsten; Cronin, Maureen; Doevendans, Pieter A; El-Gazayerly, Amany; Gimpelewicz, Claudio; Honarpour, Narimon; Janmohamed, Salim; Janssen, Heidi; Kim, Albert M; Lautsch, Dominik; Laws, Ian; Lefkowitz, Martin; Lopez-Sendon, Jose; Lyon, Alexander R; Malik, Fady I; McMurray, John J V; Metra, Marco; Figueroa Perez, Santiago; Pfeffer, Marc A; Pocock, Stuart J; Ponikowski, Piotr; Prasad, Krishna; Richard-Lordereau, Isabelle; Roessig, Lothar; Rosano, Giuseppe M C; Sherman, Warren; Stough, Wendy Gattis; Swedberg, Karl; Tyl, Benoit; Zannad, Faiez; Boulton, Caroline; De Graeff, Pieter

    2017-06-01

    Despite the availability of a number of different classes of therapeutic agents with proven efficacy in heart failure, the clinical course of heart failure patients is characterized by a reduction in life expectancy, a progressive decline in health-related quality of life and functional status, as well as a high risk of hospitalization. New approaches are needed to address the unmet medical needs of this patient population. The European Medicines Agency (EMA) is undertaking a revision of its Guideline on Clinical Investigation of Medicinal Products for the Treatment of Chronic Heart Failure. The draft version of the Guideline was released for public consultation in January 2016. The Cardiovascular Round Table of the European Society of Cardiology (ESC), in partnership with the Heart Failure Association of the ESC, convened a dedicated two-day workshop to discuss three main topic areas of major interest in the field and addressed in this draft EMA guideline: (i) assessment of efficacy (i.e. endpoint selection and statistical analysis); (ii) clinical trial design (i.e. issues pertaining to patient population, optimal medical therapy, run-in period); and (iii) research approaches for testing novel therapeutic principles (i.e. cell therapy). This paper summarizes the key outputs from the workshop, reviews areas of expert consensus, and identifies gaps that require further research or discussion. Collaboration between regulators, industry, clinical trialists, cardiologists, health technology assessment bodies, payers, and patient organizations is critical to address the ongoing challenge of heart failure and to ensure the development and market access of new therapeutics in a scientifically robust, practical and safe way. © 2017 The Authors. European Journal of Heart Failure © 2017 European Society of Cardiology.

  11. Relationship between sponsorship and failure rate of dental implants: a systematic approach.

    PubMed

    Popelut, Antoine; Valet, Fabien; Fromentin, Olivier; Thomas, Aurélie; Bouchard, Philippe

    2010-04-21

    The number of dental implant treatments increases annually. Dental implants are manufactured by competing companies. Systematic reviews and meta-analyses have shown a clear association between pharmaceutical industry funding of clinical trials and pro-industry results. So far, the impact of industry sponsorship on the outcomes and conclusions of dental implant clinical trials has never been explored. The aim of the present study was to examine the financial sponsorship of dental implant trials and to evaluate whether research funding sources may affect the annual failure rate. A systematic approach was used to identify systematic reviews published between January 1993 and December 2008 that specifically deal with the length of survival of dental implants. Primary articles were extracted from these reviews. The failure rate of the dental implants included in the trials was calculated. Data on publication year, Impact Factor, prosthetic design, periodontal status reporting, number of dental implants included in the trials, methodological quality of the studies, presence of a statistical advisor, and financial sponsorship were extracted by two independent reviewers (kappa = 0.90; 95% CI [0.77-1.00]). Univariate quasi-Poisson regression models and multivariate analysis were used to identify variables that were significantly associated with failure rates. Five systematic reviews were identified, from which 41 analyzable trials were extracted. The mean annual failure rate estimate was 1.09% (95% CI [0.84-1.42]). The funding source was not reported in 63% of the trials (26/41). Sixty-six percent of the trials were considered as having a risk of bias (27/41). Given study age, both industry-associated (OR = 0.21; 95% CI [0.12-0.38]) and unknown funding source trials (OR = 0.33; 95% CI [0.21-0.51]) had lower annual failure rates compared with non-industry-associated trials. A conflict of interest statement was disclosed in 2 trials. When controlling for other factors, the probability of annual failure for industry-associated trials is significantly lower compared with non-industry-associated trials. This bias may have significant implications for tooth extraction decision making, research on tooth preservation, and governmental health care policies.
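
    A minimal sketch of the univariate quasi-Poisson regression referred to above, assuming failure counts with an implant-years exposure offset; the data and variable names are invented for illustration, not taken from the included trials.

    ```python
    # Quasi-Poisson model for annual implant failure counts with an exposure
    # offset (implant-years). All numbers below are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.DataFrame({
        "failures":        [3, 1, 7, 2, 5, 0],
        "implant_years":   [310, 150, 420, 260, 505, 120],
        "industry_funded": [1, 1, 0, 0, 1, 0],
    })

    X = sm.add_constant(df[["industry_funded"]])
    model = sm.GLM(df["failures"], X,
                   family=sm.families.Poisson(),
                   offset=np.log(df["implant_years"]))
    # scale="X2" estimates the dispersion parameter, giving quasi-Poisson
    # standard errors rather than assuming Poisson variance.
    result = model.fit(scale="X2")
    print(np.exp(result.params))   # rate ratios by funding source
    print(result.summary())
    ```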

  12. Regional instability following cervicothoracic junction surgery.

    PubMed

    Steinmetz, Michael P; Miller, Jared; Warbel, Ann; Krishnaney, Ajit A; Bingaman, William; Benzel, Edward C

    2006-04-01

    The cervicothoracic junction (CTJ) is the transitional region between the cervical and thoracic sections of the spinal axis. Because it is a transitional zone between the mobile lordotic cervical and rigid kyphotic thoracic spines, the CTJ is a region of potential instability. This potential for instability may be exaggerated by surgical intervention. A retrospective review of all patients who underwent surgery involving the CTJ in the Department of Neurosurgery at the Cleveland Clinic Foundation during a 5-year period was performed. The CTJ was strictly defined as encompassing the C-7 vertebra and C7-T1 disc interspace. Patients were examined after surgery to determine if treatment had failed. Failure was defined as construct failure, deformity (progression or de novo), or instability. Variables possibly associated with treatment failure were analyzed. Statistical comparisons were performed using the Fisher exact test. Between January 1998 and November 2003, 593 CTJ operations were performed. Treatment failed in 14 patients. Of all variables studied, failure was statistically associated with laminectomy and multilevel ventral corpectomies with fusion across the CTJ. Other factors statistically associated with treatment failure included histories of cervical surgery, tobacco use, and surgery for the correction of deformity. The CTJ is a vulnerable region, and this vulnerability is exacerbated by surgery. Results of the present study indicate that laminectomy across the CTJ should be supplemented with instrumentation (and fusion). Multilevel ventral corpectomies across the CTJ should also be supplemented with dorsal instrumentation. Supplemental instrumentation should be considered for patients who have undergone prior cervical surgery, have a history of tobacco use, or are undergoing surgery for deformity correction.

  13. SES, Heart Failure, and N-terminal Pro-b-type Natriuretic Peptide: The Atherosclerosis Risk in Communities Study.

    PubMed

    Vart, Priya; Matsushita, Kunihiro; Rawlings, Andreea M; Selvin, Elizabeth; Crews, Deidra C; Ndumele, Chiadi E; Ballantyne, Christie M; Heiss, Gerardo; Kucharska-Newton, Anna; Szklo, Moyses; Coresh, Josef

    2018-02-01

    Compared with coronary heart disease and stroke, the association between SES and the risk of heart failure is less well understood. In 12,646 participants of the Atherosclerosis Risk in Communities Study cohort free of heart failure history at baseline (1987-1989), the association of income, educational attainment, and area deprivation index with subsequent heart failure-related hospitalization or death was examined while accounting for cardiovascular disease risk factors and healthcare access. Because SES may affect threshold of identifying heart failure and admitting for heart failure management, secondarily the association between SES and N-terminal pro-b-type natriuretic peptide (NT-proBNP) levels, a marker reflecting cardiac overload, was investigated. Analysis was conducted in 2016. During a median follow-up of 24.3 years, a total of 2,249 participants developed heart failure. In a demographically adjusted model, the lowest-SES group had 2.2- to 2.5-fold higher risk of heart failure compared with the highest SES group for income, education, and area deprivation. With further adjustment for time-varying cardiovascular disease risk factors and healthcare access, these associations were attenuated but remained statistically significant (e.g., hazard ratio=1.92, 95% CI=1.69, 2.19 for the lowest versus highest income), with no racial interaction (p>0.05 for all SES measures). Similarly, compared with high SES, low SES was associated with both higher baseline level of NT-proBNP in a multivariable adjusted model (15% higher, p<0.001) and increase over time (~1% greater per year, p=0.023). SES was associated with clinical heart failure as well as NT-proBNP levels inversely and independently of traditional cardiovascular disease factors and healthcare access. Copyright © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  14. A statistical overview of mass movement characteristics on the North American Atlantic outer continental margin

    USGS Publications Warehouse

    Booth, James S.; O'Leary, Dennis W.

    1992-01-01

    An analysis of 179 mass movements on the North American Atlantic continental slope and upper rise shows that slope failures have occurred throughout the geographic extent of the outer margin. Although the slope failures show no striking affinity for a particular depth as an origination level, there is a broad, primary mode centered at about 900 m. The resulting slides terminate at almost all depths and have a primary mode at 1100 m, but the slope/rise boundary (at 2200 m) also is an important mode. Slope failures have occurred at declivities ranging from 1° to 30° (typically, 4°); the resultant mass movement deposits vary in width from 0.2 to 50 km (typically, 1-2 km) and in length from 0.3 to 380 km (typically, 2-4 km), and they have been reported to be as thick as 650 m. On a numeric basis, mass movements are slightly more prevalent on open slopes than in other physiographic settings, and both translational and rotational failure surfaces are common. The typical mass movement is disintegrative in nature. Open slope slides tend to occur at lower slope angles and are larger than canyon slides. Further, large-scale slides rather than small-scale slides tend to originate on gentle slopes (≈ 3-4°). Rotational slope failures appear to have a slightly greater chance of occurring in canyons, but there is no analogous bias associated with translational failures. Similarly, disintegrative slides seem more likely to be associated with rotational slope failures than translational ones and are longer than their nondisintegrative counterparts. The occurrence of such a variety of mass movements at low declivities implies that a regional failure mechanism has prevailed. We suggest that earthquakes or, perhaps in some areas, gas hydrates are the most likely cause of the slope failures.

  15. Heart failure after conventional metal-on-metal hip replacements

    PubMed Central

    Gillam, Marianne H; Pratt, Nicole L; Inacio, Maria C S; Roughead, Elizabeth E; Shakib, Sepehr; Nicholls, Stephen J; Graves, Stephen E

    2017-01-01

    Background and purpose — It is unclear whether metal particles and ions produced by mechanical wear and corrosion of hip prostheses with metal-on-metal (MoM) bearings have systemic adverse effects on health. We compared the risk of heart failure in patients with conventional MoM total hip arthroplasty (THA) and in those with metal-on-polyethylene (MoP) THA. Patients and methods — We conducted a retrospective cohort study using data from the Australian Government Department of Veterans’ Affairs health claims database on patients who received conventional THA for osteoarthritis between 2004 and 2012. The MoM THAs were classified into groups: Articular Surface Replacement (ASR) XL Acetabular System, other large-head (LH) (> 32 mm) MoM, and small-head (SH) (≤ 32 mm) MoM. The primary outcome was hospitalization for heart failure after THA. Results — 4,019 patients with no history of heart failure were included (56% women). Men with an ASR XL THA had a higher rate of hospitalization for heart failure than men with MoP THA (hazard ratio (HR) = 3.2, 95% CI: 1.6–6.5). No statistically significant difference in the rate of heart failure was found with the other LH MoM or SH MoM compared to MoP in men. There was no statistically significant difference in heart failure rate between exposure groups in women. Interpretation — An association between ASR XL and hospitalization for heart failure was found in men. While causality between ASR XL and heart failure could not be established in this study, it highlights an urgent need for further studies to investigate the possibility of systemic effects associated with MoM THA. PMID:27759468

  16. Patterns of failure after involved field radiotherapy for locally advanced esophageal squamous cell carcinoma.

    PubMed

    Li, Duo-Jie; Li, Hong-Wei; He, Bin; Wang, Geng-Ming; Cai, Han-Fei; Duan, Shi-Miao; Liu, Jing-Jing; Zhang, Ya-Jun; Cui, Zhen; Jiang, Hao

    2016-01-01

    To retrospectively analyze the patterns of failure and the treatment effects of involved-field irradiation (IFI) on patients treated for locally advanced esophageal squamous cell carcinoma (ESCC) and to determine whether IFI is practicable in these patients. A total of 79 patients with locally advanced ESCC underwent three-dimensional conformal radiotherapy (3D-CRT) or intensity-modulated radiotherapy (IMRT) using IFI or elective nodal irradiation (ENI) according to the target volume. The patterns of failure were defined as local/regional, in-field, out-of-field regional lymph node (LN) and distant failure. With a median follow-up of 32.0 months, failures were observed in 66 (83.6%) patients. The cumulative incidence of local/regional failure (55.8 vs 52.8%) and in-field regional lymph node failure (25.6 vs 19.4%) showed no statistically significant difference between the IFI and the ENI group (p=0.526 and 0.215, respectively). An out-of-field nodal relapse rate of only 7.0% was seen in the IFI group. Three-year survival rates for the ENI and IFI group were 22.2 and 18.6%, respectively (p=0.240), and 3-year distant metastasis rates were 27.8 and 32.6%, respectively (p=0.180). The lung V10, V20, V30 and mean lung dose of the ENI group were greater than those of the IFI group, and the differences in mean lung dose and V10 were statistically significant. The patterns of failure and survival rates in the IFI group were similar to those in the ENI group; regional recurrence and distant metastasis are the main causes of treatment failure. IFI is feasible for locally advanced ESCC. Further investigation is needed to increase local control and decrease distant metastasis in these patients.

  17. Implant failure in lower limb long bone diaphyseal fractures at a tertiary hospital in Ile-Ife, Nigeria.

    PubMed

    Esan, O; Ikem, I C; Orimolade, E A; Esan, O T

    2014-06-01

    The objectives of this study included determining the aetiology of implant failure and comparing the failure rate of implant fixations using solid intramedullary nails and DCP. A retrospective study was conducted at the Orthopaedic Department, Obafemi Awolowo University Teaching Hospital, Ile-Ife, Nigeria. Records of all operated cases of lower limb long bone diaphyseal fractures, including those with failed fixations, from August 2006 to July 2011 were reviewed. Data retrieved included the type of implant used and the aetiology and characteristics of implant failure. Data were analysed using SPSS version 16. Frequency distributions of the variables of interest were produced. The difference in failure rate of intramedullary nails versus DCP was tested using the chi-square test. Statistical significance was inferred at p<0.05. A total of 280 patients were studied, of whom two hundred and twenty-one had long bone diaphyseal fractures and met the inclusion criteria; 135 had intramedullary nail fixation and 86 had DCP. The rate of implant failure with intramedullary nails was 1.5%, while it was 5.8% in patients with DCP (p=0.113; OR=4.10; 95% CI=0.65-43.77). Implant fracture was the commonest type of failure seen (100% versus 60%) and non-union was the commonest cause of failure seen (50% versus 40%) in the intramedullary nailing and DCP groups, respectively. The likelihood of a failed implant is higher in fixations done with DCP compared with intramedullary nails, although the difference was not statistically significant. The commonest reason for failure in both groups was non-union. Findings from this study may guide surgeons in the choice of implant in the management of long bone fractures.

  18. Surface modification for bonding between amalgam and orthodontic brackets.

    PubMed

    Wongsamut, Wittawat; Satrawaha, Sirichom; Wayakanon, Kornchanok

    2017-01-01

    This study tested methods to enhance the shear bond strength (SBS) between orthodontic metal brackets and amalgam using sandblasting and different primers. Three hundred samples of amalgam restorations (KerrAlloy®) were prepared in self-cured acrylic blocks, polished, and divided into two groups: nonsandblasted and sandblasted. Each group was divided into five subgroups with different primers used in the surface treatment methods, with a control group of brackets bonded to human mandibular incisors. Following the surface treatments, mandibular incisor brackets (Unitek®) were bonded to the amalgam with adhesive resin (Transbond XT®). The SBS of the samples was tested. The adhesive remnant index (ARI) and failure modes were then determined under a stereo-microscope. Two-way analysis of variance, Chi-square, and Kruskal-Wallis tests were performed to calculate the correlations between and among the SBS and ARI values, the failure modes, and the surface roughness results. There were statistically significant differences in SBS among the different adhesive primers and sandblasting methods (P < 0.05). The sandblasted amalgam with Assure Plus® showed the highest SBS (P < 0.001). Samples mainly showed an ARI score of 1 and mixed-mode failure. There was a statistically significant difference in surface roughness between nonsandblasted and sandblasted amalgam (P < 0.05), but no significant differences among priming agents (P > 0.05). Using adhesive primers together with sandblasting effectively enhances the SBS between orthodontic metal brackets and amalgam. The two primers containing the methacryloxydecyl dihydrogen phosphate (MDP) monomer, Alloy Primer® and Assure Plus®, were the most effective. Including sandblasting in the treatment is essential to achieve the required bonding strength.
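
    The two-way analysis of variance (primer by sandblasting) reported above can be sketched with statsmodels' formula interface; the cell means, group labels and sample sizes below are hypothetical.

    ```python
    # Sketch of a two-way ANOVA on shear bond strength (primer x sandblasting).
    # The simulated data stand in for the study's measurements.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    rng = np.random.default_rng(1)
    rows = []
    for primer in ["AssurePlus", "AlloyPrimer", "NoPrimer"]:
        for sandblasted in (0, 1):
            base = 6 + 4 * sandblasted + (3 if primer != "NoPrimer" else 0)
            for sbs in rng.normal(base, 1.5, 10):   # 10 specimens per cell
                rows.append((primer, sandblasted, sbs))
    df = pd.DataFrame(rows, columns=["primer", "sandblasted", "sbs_mpa"])

    model = ols("sbs_mpa ~ C(primer) * C(sandblasted)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))          # main effects and interaction
    ```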

  19. Pericentric Inversion of Human Chromosome 9 Epidemiology Study in Czech Males and Females.

    PubMed

    Šípek, A; Panczak, A; Mihalová, R; Hrčková, L; Suttrová, E; Sobotka, V; Lonský, P; Kaspříková, N; Gregor, V

    2015-01-01

    Pericentric inversion of human chromosome 9 [inv(9)] is a relatively common cytogenetic finding. It is largely considered a clinically insignificant variant of the normal human karyotype. However, numerous studies have suggested its possible association with certain pathologies, e.g., infertility, habitual abortions or schizophrenia. We analysed the incidence of inv(9) and the spectrum of clinical indications for karyotyping among inv(9) carriers in three medical genetics departments in Prague. In their cytogenetic databases, among 26,597 total records we identified 421 (1.6 %) cases of inv(9) without any concurrent cytogenetic pathology. This study represents the world's largest epidemiological study on inv(9) to date. The incidence of inv(9) calculated in this way from diagnostic laboratory data does not differ from the incidence of inv(9) in three specific population-based samples of healthy individuals (N = 4,166) karyotyped for preventive (amniocentesis for advanced maternal age, gamete donation) or legal reasons (children awaiting adoption). The most frequent clinical indication in inv(9) carriers was "idiopathic reproductive failure" - 37.1 %. The spectra and percentages of indications in individuals with inv(9) were further statistically evaluated for one of the departments (N = 170) by comparing individuals with inv(9) to a control group of 661 individuals with normal karyotypes without this inversion. The proportion of clinical referrals for "idiopathic reproductive failure" among inv(9) cases remains higher than in controls, but the difference is not statistically significant for both genders combined. Analysis in separated genders showed that the incidence of "idiopathic reproductive failure" could differ among inv(9) female and male carriers.

  20. The Role of Omega-3 Polyunsaturated Fatty Acids in Heart Failure: A Meta-Analysis of Randomised Controlled Trials

    PubMed Central

    Wang, Chunbin; Xiong, Bo; Huang, Jing

    2016-01-01

    Many new clinical trials about the effect of omega-3 polyunsaturated fatty acids (PUFAs) in heart failure (HF) patients have shown inconsistent results. Therefore, a meta-analysis of randomised controlled trials (RCTs) was performed to determine the benefits of omega-3 PUFAs in HF patients. Articles were obtained from PubMed, EMBASE, and the Cochrane Library. RCTs comparing omega-3 PUFAs with placebo for HF were included. Two reviewers independently extracted the data from the selected publications. The I2 statistic was used to assess heterogeneity. The pooled mean difference and associated 95% confidence intervals were calculated, and a fixed or random-effects model was used for the meta-analysis. A total of nine RCTs involving 800 patients were eligible for inclusion. Compared with patients taking placebo, HF patients who received omega-3 PUFAs experienced decreased brain natriuretic peptide levels and serum norepinephrine levels. Although the left ventricular ejection fraction (LVEF) and clinical outcomes (Tei index, peak oxygen consumption) did not improve, subgroup analysis showed that the LVEF increased in dilated cardiomyopathy (DCM) patients. Overall, omega-3 PUFA supplements might be beneficial in HF patients, especially in DCM patients, but further studies are needed to confirm these benefits. PMID:28042816
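
    For readers unfamiliar with the pooling machinery mentioned above, the following sketch computes a fixed-effect inverse-variance pooled mean difference, Cochran's Q, the I² statistic, and a DerSimonian-Laird random-effects estimate; the per-trial effects and standard errors are hypothetical, not those of the included RCTs.

    ```python
    # Inverse-variance pooling of mean differences with I² and a
    # DerSimonian-Laird random-effects estimate (hypothetical trial data).
    import numpy as np
    from scipy import stats

    effects = np.array([-30.0, -12.0, -25.0, -5.0])   # mean difference per trial
    se      = np.array([10.0,   8.0,  12.0,  6.0])    # standard error per trial

    v = se**2
    w_fixed = 1.0 / v
    pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

    # Heterogeneity: Cochran's Q and I²
    q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)
    dof = len(effects) - 1
    i2 = max(0.0, (q - dof) / q) * 100.0

    # DerSimonian-Laird between-trial variance and random-effects pooling
    tau2 = max(0.0, (q - dof) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))
    w_rand = 1.0 / (v + tau2)
    pooled_rand = np.sum(w_rand * effects) / np.sum(w_rand)
    ci = pooled_rand + np.array([-1, 1]) * stats.norm.ppf(0.975) / np.sqrt(np.sum(w_rand))

    print(f"I2 = {i2:.1f}%, fixed = {pooled_fixed:.1f}, "
          f"random = {pooled_rand:.1f}, 95% CI = {ci}")
    ```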

  1. The Role of Omega-3 Polyunsaturated Fatty Acids in Heart Failure: A Meta-Analysis of Randomised Controlled Trials.

    PubMed

    Wang, Chunbin; Xiong, Bo; Huang, Jing

    2016-12-30

    Many new clinical trials about the effect of omega-3 polyunsaturated fatty acids (PUFAs) in heart failure (HF) patients have shown inconsistent results. Therefore, a meta-analysis of randomised controlled trials (RCTs) was performed to determine the benefits of omega-3 PUFAs in HF patients. Articles were obtained from PubMed, EMBASE, and the Cochrane Library. RCTs comparing omega-3 PUFAs with placebo for HF were included. Two reviewers independently extracted the data from the selected publications. The I² statistic was used to assess heterogeneity. The pooled mean difference and associated 95% confidence intervals were calculated, and a fixed or random-effects model was used for the meta-analysis. A total of nine RCTs involving 800 patients were eligible for inclusion. Compared with patients taking placebo, HF patients who received omega-3 PUFAs experienced decreased brain natriuretic peptide levels and serum norepinephrine levels. Although the left ventricular ejection fraction (LVEF) and clinical outcomes (Tei index, peak oxygen consumption) did not improve, subgroup analysis showed that the LVEF increased in dilated cardiomyopathy (DCM) patients. Overall, omega-3 PUFA supplements might be beneficial in HF patients, especially in DCM patients, but further studies are needed to confirm these benefits.

  2. Derailment-based Fault Tree Analysis on Risk Management of Railway Turnout Systems

    NASA Astrophysics Data System (ADS)

    Dindar, Serdar; Kaewunruen, Sakdirat; An, Min; Gigante-Barrera, Ángel

    2017-10-01

    Railway turnouts are fundamental mechanical infrastructures that allow rolling stock to divert from one track to another. Because they comprise a large number of engineering subsystems, e.g. track, signalling and earthworks, these subsystems can fail through various mechanisms, any of which could contribute to a catastrophic event. A derailment, one of the undesirable events in railway operation, although rare, often results in damage to rolling stock and railway infrastructure, disrupts service, and has the potential to cause casualties and even loss of life. As a result, a well-designed risk analysis is essential to create awareness of hazards and to identify which parts of the system may be at risk. This study focuses on all types of environment-based failures arising from the numerous contributing factors officially noted in accident reports. The risk analysis is designed to help industry minimise the occurrence of accidents at railway turnouts. The methodology relies on accurate assessment of derailment likelihood and is based on a statistical, multiple-factor accident rate analysis. The study establishes product risks and faults and shows the impact of potential failure processes using Boolean algebra.
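
    A minimal sketch of the Boolean gate arithmetic that underlies such a fault tree is given below; the basic events, gate structure, and probabilities are illustrative assumptions, not values from the accident reports analysed in the paper.

    ```python
    # Fault-tree top-event probability via Boolean gate logic (independence
    # assumed). Events and probabilities below are hypothetical.
    def and_gate(*p):
        """All input events must occur: product of probabilities."""
        prob = 1.0
        for pi in p:
            prob *= pi
        return prob

    def or_gate(*p):
        """At least one input event occurs: 1 - prod(1 - p_i)."""
        prob = 1.0
        for pi in p:
            prob *= (1.0 - pi)
        return 1.0 - prob

    # Hypothetical annual probabilities of environment-related basic events
    p_flooded_ballast   = 2e-3
    p_ice_in_switch     = 5e-3
    p_debris_on_blade   = 1e-3
    p_detection_failure = 1e-2

    # Turnout obstruction occurs if any environmental cause is present...
    p_obstruction = or_gate(p_flooded_ballast, p_ice_in_switch, p_debris_on_blade)
    # ...and a derailment requires the obstruction AND failure to detect it.
    p_derailment = and_gate(p_obstruction, p_detection_failure)
    print(f"Top event (derailment) probability = {p_derailment:.2e}")
    ```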

  3. Validation of accelerated ageing of Thales rotary Stirling cryocoolers for the estimation of MTTF

    NASA Astrophysics Data System (ADS)

    Seguineau, C.; Cauquil, J.-M.; Martin, J.-Y.; Benschop, T.

    2016-05-01

    Cooled IR detectors are used in a wide range of applications. Most of the time, the cryocoolers are one of the components dimensioning the lifetime of the system. Current market needs call for reliability figures higher than 15,000 hrs in "standard conditions". Field returns are hardly usable, mostly because of the uncertain environmental conditions of use or the differences in user profiles. A previous paper explained how Thales Cryogenics has developed an approach based on accelerated ageing and statistical analysis [1]. The aim of the current paper is to compare results obtained from accelerated ageing with specific field returns for which the conditions of use are well known. The comparison between the predicted and the observed failure rate is discussed. Moreover, a specific focus is placed on how some new applications of cryocoolers (continuous operation at a specific temperature) can increase the MTTF. Assumptions are also presented on how the failure modes, effects and criticality analysis evolves for continuous operation at a specific temperature, and these are compared with experimental data.
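
    The paper's statistical procedure is not reproduced here, but the following generic sketch shows one common way to translate an accelerated test into an MTTF estimate, assuming exponential lifetimes, a known acceleration factor, and a chi-square lower confidence bound; all numbers are hypothetical.

    ```python
    # Generic MTTF estimate from an accelerated, time-terminated test
    # (exponential lifetime assumption; not Thales' actual procedure).
    from scipy import stats

    def mttf_from_accelerated_test(unit_hours, failures, acceleration_factor, conf=0.60):
        """Translate accelerated test hours to use-condition MTTF with a lower bound."""
        equivalent_hours = unit_hours * acceleration_factor
        point_estimate = equivalent_hours / max(failures, 1)
        # One-sided lower confidence limit for the exponential MTTF
        dof = 2 * failures + 2
        lower_bound = 2 * equivalent_hours / stats.chi2.ppf(conf, dof)
        return point_estimate, lower_bound

    # Hypothetical numbers: 20 coolers x 2,000 h each at accelerated conditions,
    # 3 failures, acceleration factor of 4 relative to "standard conditions".
    mttf, mttf_lower = mttf_from_accelerated_test(20 * 2000, 3, 4.0)
    print(f"MTTF point estimate = {mttf:,.0f} h, lower bound = {mttf_lower:,.0f} h")
    ```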

  4. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  5. Principle of maximum entropy for reliability analysis in the design of machine components

    NASA Astrophysics Data System (ADS)

    Zhang, Yimin

    2018-03-01

    We studied the reliability of machine components with parameters that follow an arbitrary statistical distribution using the principle of maximum entropy (PME). We used PME to select the statistical distribution that best fits the available information. We also established a probability density function (PDF) and a failure probability model for the parameters of mechanical components using the concept of entropy and the PME. We obtained the first four moments of the state function for reliability analysis and design. Furthermore, we attained an estimate of the PDF with the fewest human bias factors using the PME. This function was used to calculate the reliability of the machine components, including a connecting rod, a vehicle half-shaft, a front axle, a rear axle housing, and a leaf spring, which have parameters that typically follow a non-normal distribution. Simulations were conducted for comparison. This study provides a design methodology for the reliability of mechanical components for practical engineering projects.
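
    The following sketch illustrates the maximum-entropy idea described above: fit an exponential-family density whose first four moments match given values, then integrate the negative tail of the state function to estimate a failure probability. The moments, grid, and optimiser settings are assumptions, not the authors' implementation.

    ```python
    # Maximum-entropy density constrained by the first four moments of a state
    # function g, followed by a tail integration for the failure probability.
    # The moment values are hypothetical.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.integrate import trapezoid

    moments = np.array([1.2, 2.1, 4.9, 14.0])   # E[g], E[g^2], E[g^3], E[g^4] (assumed)
    x = np.linspace(-6.0, 8.0, 4001)            # support truncated to a finite grid
    powers = np.vstack([x**k for k in range(1, 5)])

    def dual(lam):
        # Convex dual of the maximum-entropy problem with moment constraints
        with np.errstate(over="ignore"):
            z = trapezoid(np.exp(-lam @ powers), x)
        return np.log(z) + lam @ moments

    res = minimize(dual, x0=np.array([0.0, 0.1, 0.0, 0.01]), method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-10, "fatol": 1e-12})
    lam = res.x
    pdf = np.exp(-lam @ powers)
    pdf /= trapezoid(pdf, x)                    # normalise to a proper density

    p_failure = trapezoid(np.where(x < 0.0, pdf, 0.0), x)   # P(g < 0)
    print(f"Estimated failure probability = {p_failure:.3e}")
    ```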

  6. Longevity of metal-ceramic crowns cemented with self-adhesive resin cement: a prospective clinical study

    PubMed

    Brondani, Lucas Pradebon; Pereira-Cenci, Tatiana; Wandsher, Vinicius Felipe; Pereira, Gabriel Kalil; Valandro, Luis Felipe; Bergoli, César Dalmolin

    2017-04-10

    Resin cements are often used for single crown cementation due to their physical properties. Self-adhesive resin cements have gained widespread use due to their simplified technique compared with regular resin cements. However, clinical evidence on the long-term behavior of this material is lacking. The aim of this prospective clinical trial was to assess the survival rate of metal-ceramic crowns cemented with self-adhesive resin cement over up to six years. One hundred and twenty-nine subjects received 152 metal-ceramic crowns. The cementation procedures were standardized and performed by previously trained operators. The crowns were assessed for the primary outcome (debonding) and by FDI criteria. Statistical analysis was performed using Kaplan-Meier statistics and descriptive analysis. Three failures occurred (debonding), resulting in a 97.6% survival rate. FDI criteria assessment resulted in scores of 1 and 2 (clinically acceptable) for all surviving crowns. The use of self-adhesive resin cement is a feasible alternative for metal-ceramic crown cementation, achieving high and adequate survival rates.
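
    The Kaplan-Meier survival estimate used in the study can be sketched with the lifelines package; the crown follow-up times and debonding indicators below are hypothetical.

    ```python
    # Kaplan-Meier survival estimate for crown debonding (hypothetical data;
    # 1 = debonding observed, 0 = censored at last follow-up).
    import pandas as pd
    from lifelines import KaplanMeierFitter

    crowns = pd.DataFrame({
        "months_in_function": [72, 70, 68, 15, 60, 55, 33, 72, 48, 24],
        "debonded":           [0,  0,  0,  1,  0,  0,  1,  0,  0,  1],
    })

    kmf = KaplanMeierFitter()
    kmf.fit(crowns["months_in_function"], event_observed=crowns["debonded"],
            label="metal-ceramic crowns, self-adhesive cement")
    print(kmf.survival_function_)                 # step-wise survival estimate
    print(f"Survival at 72 months: {kmf.predict(72):.3f}")
    ```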

  7. The influence of depression and anxiety in the development of heart failure after coronary angioplasty.

    PubMed

    Gegenava, T; Gegenava, M; Kavtaradze, G

    2009-03-01

    The aim of our study was to investigate the association between a history of depressive episodes and anxiety and complications in patients 6 months after coronary artery angioplasty. The research was conducted on 70 patients in whom a grade of coronary occlusion that would not respond to therapeutic treatment and required coronary angioplasty had been established. Complications were evaluated in 60 patients 6 months after coronary angioplasty. Depression was evaluated with the Beck depression scale; anxiety was assessed with the Spielberger State-Trait Anxiety Scale. Statistical analysis of the data was performed using variation statistics with Student's criterion and the STATISTICA w 5.0 program. Complications were discovered in 36 (60%) patients; 24 (40%) patients had no complications. No statistically significant differences in the degree of depression and anxiety were found between the angioplasty period and 6 months after coronary angioplasty. Our study demonstrated that complications occurred in patients who had high degrees of depression and anxiety.

  8. [Non-operative treatment for severe forms of infantile idiopathic scoliosis].

    PubMed

    Trobisch, P D; Samdani, A; O'Neil, C; Betz, R; Cahill, P

    2012-02-01

    Infantile idiopathic scoliosis (IIS) is a rare orthopaedic condition. Braces and casts are popular options in the treatment of IIS, but there is a paucity of studies commenting on the outcome of non-operative treatment. The purpose of this study was to analyse failure and success after non-operative treatment for severe forms of IIS. We retrospectively reviewed the data of all children who had been treated for IIS between 2003 and 2009 at a single institution. After calculating the failure and success rates, we additionally performed a risk factor analysis for patients who failed non-operative treatment. Chi-square and t tests were used for statistical analysis, with significance set at p < 0.05. Twenty-five children with an average age of 11 months and a Cobb angle of 46 degrees at presentation were analysed. Seven (28%) patients were considered to have failed non-operative treatment after an average follow-up of 28 months. The pretreatment Cobb angle was identified as the single significant risk factor for failure (55 versus 42 degrees), while neither age, gender, nor RVAD seemed to influence the outcome. In children who were considered successfully treated, the Cobb angle decreased from 42 to 18 degrees. Non-operative treatment for IIS is successful in 3 out of 4 patients. © Georg Thieme Verlag KG Stuttgart · New York.

  9. SEM and fractography analysis of screw thread loosening in dental implants.

    PubMed

    Scarano, A; Quaranta, M; Traini, T; Piattelli, M; Piattelli, A

    2007-01-01

    Biological and technical failures of implants have already been reported. Mechanical factors are certainly of importance in implant failures, even if their exact nature has not yet been established. Abutment screw fracture or loosening represents a rare but quite unpleasant failure. The aim of the present research was the analysis and structural examination of screw threads or abutments with loosening, compared with screw threads or abutments without loosening. Loosened screw threads were compared with screw threads without loosening from three different implant systems: Branemark (Nobel Biocare, Gothenburg, Sweden), T.B.R. implant systems (Benax, Ancona, Italy) and Restore (Lifecore Biomedical, Chaska, Minnesota, USA). Broken screws were excluded from this study. A total of 16 screw thread loosenings were observed (Group I) (4 Branemark, 4 T.B.R and 5 Restore), 10 screw threads without loosening were removed (Group II), and 6 screw threads as received by the manufacturer (unused) (Group III) were used as controls (2 Branemark, 2 T.B.R and 2 Restore). The loosened abutment screws were retrieved and analyzed under SEM. Many alterations and deformations were present in the concavities and convexities of the screw threads in Group I. No macroscopic alterations or deformations were observed in Groups II and III. A statistical difference in the presence of microcracks was observed between screw threads with abutment loosening and screw threads without abutment loosening.

  10. Identification and classification of failure modes in laminated composites by using a multivariate statistical analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Baccar, D.; Söffker, D.

    2017-11-01

    Acoustic Emission (AE) is a suitable method to monitor the health of composite structures in real time. However, AE-based failure mode identification and classification are still complex to apply because AE waves are generally released simultaneously from all AE-emitting damage sources. Hence, the use of advanced signal processing techniques in combination with pattern recognition approaches is required. In this paper, AE signals generated from a laminated carbon fiber reinforced polymer (CFRP) subjected to an indentation test are examined and analyzed. A new pattern recognition approach, involving a number of processing steps that can be implemented in real time, is developed. Unlike common classification approaches, here only CWT coefficients are extracted as relevant features. Firstly, the Continuous Wavelet Transform (CWT) is applied to the AE signals. Furthermore, a dimensionality reduction process using Principal Component Analysis (PCA) is carried out on the coefficient matrices. The PCA-based feature distribution is analyzed using Kernel Density Estimation (KDE), allowing the determination of a specific pattern for each fault-specific AE signal. Moreover, the waveform and frequency content of the AE signals are examined in depth and compared with fundamental assumptions reported in this field. A correlation between the identified patterns and failure modes is established. The introduced method improves damage classification and can be used as a non-destructive evaluation tool.
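
    A rough sketch of the described pipeline (CWT of an AE signal, PCA on the coefficient matrix, KDE of the reduced features) is given below, using pywt and scikit-learn as assumed stand-ins for the authors' tooling and a synthetic burst in place of real AE data; the orientation of the PCA step is a simplification.

    ```python
    # CWT -> PCA -> KDE feature pipeline on a synthetic acoustic-emission burst.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KernelDensity

    fs = 1_000_000                                    # 1 MHz sampling (assumed)
    t = np.arange(0, 0.002, 1 / fs)
    burst = np.sin(2 * np.pi * 150e3 * t) * np.exp(-t / 2e-4)   # synthetic AE burst
    burst += 0.05 * np.random.default_rng(0).normal(size=t.size)

    # 1) Continuous wavelet transform -> time-scale coefficient matrix
    scales = np.arange(1, 64)
    coeffs, freqs = pywt.cwt(burst, scales, "morl", sampling_period=1 / fs)

    # 2) Dimensionality reduction of the coefficient matrix
    features = PCA(n_components=3).fit_transform(np.abs(coeffs))

    # 3) Kernel density estimate of the reduced features -> a per-signal "pattern"
    kde = KernelDensity(bandwidth=0.5).fit(features)
    pattern = kde.score_samples(features)             # log-density per scale row
    print(pattern[:5])
    ```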

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provide qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  12. Assessment of ECG and respiration recordings from simulated emergency landings of ultra light aircraft.

    PubMed

    Bruna, Ondřej; Levora, Tomáš; Holub, Jan

    2018-05-08

    Pilots of ultra light aircraft have limited training resources, but with the use of low cost simulators it might be possible to conduct and test some parts of their training on the ground. The purpose of this paper is to examine the possibility of stress inducement on a low cost flight simulator. Stress is assessed from the electrocardiogram and respiration. Engine failure during flight served as the stress-inducing stimulus. For one flight, pilots had access to an emergency navigation system. Statistically significant changes were recorded in parameters related to breathing frequency. Although no significant change was observed in the ECG parameters, there appears to be an effect on respiration parameters. Physiological signals processed with analysis of variance suggest that the moment of engine failure and the approach for landing affected average breathing frequency. The presence of the navigation interface does not appear to have a significant effect on the pilots.

  13. Interlaboratory study for nickel alloy 625 made by laser powder bed fusion to quantify mechanical property variability.

    PubMed

    Brown, Christopher U; Jacob, Gregor; Stoudt, Mark; Moylan, Shawn; Slotwinski, John; Donmez, Alkan

    2016-08-01

    Six different organizations participated in this interlaboratory study to quantify the variability in the tensile properties of Inconel 625 specimens manufactured using laser-powder-bed-fusion additive manufacturing machines. The tensile specimens were heat treated and tensile tests conducted until failure. The properties measured were yield strength, ultimate tensile strength, elastic modulus, and elongation. Statistical analysis revealed that between-participant variability for yield strength, ultimate tensile strength, and elastic modulus values were significantly higher (up to 4 times) than typical within-participant variations. Only between-participant and within-participant variability were both similar for elongation. A scanning electron microscope was used to examine one tensile specimen for fractography. The fracture surface does not have many secondary cracks or other features that would reduce the mechanical properties. In fact, the features largely consist of microvoid coalescence and are entirely consistent with ductile failure.

  14. Can arthroscopic revision surgery for shoulder instability be a fair option?

    PubMed

    De Giorgi, Silvana; Garofalo, Raffaele; Tafuri, Silvio; Cesari, Eugenio; Rose, Giacomo Delle; Castagna, Alessandro

    2014-04-01

    The aim of this study was to evaluate the role of arthroscopic capsuloplasty in the treatment of failed primary arthroscopic treatment of glenohumeral instability. We retrospectively examined, at a minimum 3-year follow-up, 22 patients who underwent arthroscopic treatment between 1999 and 2007 for recurrent anterior shoulder instability after post-surgical failure. A statistical analysis was performed to evaluate which variables could influence the definitive result and the clinical outcomes at final follow-up. A p value of less than 0.05 was considered significant. After revision surgery we observed an overall failure rate of 8/22 (36.4%), including frank dislocations, subluxations, and apprehension that seriously inhibits the patient's quality of life. No significant differences were observed in the examined parameters. According to our outcomes, we generally do not recommend an arthroscopic revision procedure for failed instability surgery.

  15. Interlaboratory study for nickel alloy 625 made by laser powder bed fusion to quantify mechanical property variability

    PubMed Central

    Brown, Christopher U.; Jacob, Gregor; Stoudt, Mark; Moylan, Shawn; Slotwinski, John; Donmez, Alkan

    2017-01-01

    Six different organizations participated in this interlaboratory study to quantify the variability in the tensile properties of Inconel 625 specimens manufactured using laser-powder-bed-fusion additive manufacturing machines. The tensile specimens were heat treated and tensile tests conducted until failure. The properties measured were yield strength, ultimate tensile strength, elastic modulus, and elongation. Statistical analysis revealed that between-participant variability for yield strength, ultimate tensile strength, and elastic modulus values were significantly higher (up to 4 times) than typical within-participant variations. Only between-participant and within-participant variability were both similar for elongation. A scanning electron microscope was used to examine one tensile specimen for fractography. The fracture surface does not have many secondary cracks or other features that would reduce the mechanical properties. In fact, the features largely consist of microvoid coalescence and are entirely consistent with ductile failure. PMID:28243032

  16. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provide qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution which is then compared to experimental output using appropriate statistical methods. Lastly, the model form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  17. Interlaboratory Study for Nickel Alloy 625 Made by Laser Powder Bed Fusion to Quantify Mechanical Property Variability

    NASA Astrophysics Data System (ADS)

    Brown, Christopher U.; Jacob, Gregor; Stoudt, Mark; Moylan, Shawn; Slotwinski, John; Donmez, Alkan

    2016-08-01

    Six different organizations participated in this interlaboratory study to quantify the variability in the tensile properties of Inconel 625 specimens manufactured using laser powder bed fusion-additive manufacturing machines. The tensile specimens were heat treated and tensile tests were conducted until failure. The properties measured were yield strength, ultimate tensile strength, elastic modulus, and elongation. Statistical analysis revealed that between-participant variability for yield strength, ultimate tensile strength, and elastic modulus values were significantly higher (up to four times) than typical within-participant variations. Only between-participant and within-participant variability were both similar for elongation. A scanning electron microscope was used to examine one tensile specimen for fractography. The fracture surface does not have many secondary cracks or other features that would reduce the mechanical properties. In fact, the features largely consist of microvoid coalescence and are entirely consistent with ductile failure.

  18. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

    failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National

  19. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators. The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
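
    The sketch below illustrates, in miniature, the kind of derivation/validation workflow described above (mean-based imputation, a classifier, and a c-statistic on a held-out set) using scikit-learn on synthetic data; it is not the authors' pipeline, and all variable names are illustrative.

    ```python
    # Minimal sketch of a derivation/validation split with imputation and a
    # c-statistic (ROC AUC), in the spirit of the pipeline described above.
    # Synthetic data; variable names are illustrative only.
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2787, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2787) > 1).astype(int)
    X[rng.random(X.shape) < 0.1] = np.nan          # simulate incomplete records

    X_deriv, X_valid, y_deriv, y_valid = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    model = make_pipeline(SimpleImputer(strategy="mean"),   # mean-based imputation
                          LogisticRegression(max_iter=1000))
    model.fit(X_deriv, y_deriv)

    c_stat = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
    print(f"c-statistic on validation set: {c_stat:.3f}")
    ```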

  20. [Benefits of Measures to Promote Development in Language, Mathematics and Singing in Kindergartners: Analysis of Data Collected at School Entrance Examination in the County of Biberach].

    PubMed

    Hart, Ulrike; Wildner, Manfred; Krämer, Daniela; Crispin, Alexander

    2018-02-01

    To evaluate the benefits of implementing measures to promote skills in the areas of language, mathematics and singing in kindergartners by statistical analysis of data collected during the school entrance examination (ESU) of 4-5-year-old children from the county of Biberach. Study 1 employs multivariate regression analysis to analyse - in chronological order - the ESU data on 4 cohorts (2011-2014; n=7 148) of children of the Biberach county. Study 2 qualitatively compares identical data representative of the entire state of Baden-Württemberg (N=3×80 000) with the Biberach results. Study 3 focuses on the 2014 cohort in Biberach county (n=1 783) and employs logistic regression techniques to correlate curriculum content and child development. There are significant performance improvements in the Biberach population (2011-2014) in the development of language and early mathematics, as well as in visual comprehension and visuomotor skills, but not in the area of gross motor skills. Similar improvements are much more difficult to demonstrate for the entire state of Baden-Württemberg. The detailed analysis of the 2014 Biberach county data reveals that kindergartners with increased exposure to mathematics have a decreased risk of failure in early mathematics (OR 0.72) and grammar skills (OR 0.53-0.75). Children with speech impairment or children not fluent in German who had extra language tutorials, typically in small groups and 4 times a week for 30 min, still have a higher risk of failure in all developmental aspects except gross motor skills (e.g. OR 3.32 in grammar skills, OR 3.08 for hyperactivity). Programs with an emphasis on singing have little effect on the above data. The risk of failure in German language is high (OR 2.78) for children of non-German backgrounds, but lower for visuomotor skills (OR 0.52) and hyperactivity (OR 0.51). Statistical analyses show a positive correlation between curriculum content and early child development for the kindergartens in Biberach county. The gains in performance are consistent with those reported from kindergartens known for pedagogical excellence. © Georg Thieme Verlag KG Stuttgart · New York.
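
    As a sketch of how odds ratios such as those quoted above are typically obtained, the following fits a logistic regression with statsmodels on synthetic data; the predictors and coefficients are invented for illustration and do not come from the study.

    ```python
    # Sketch: odds ratios from a logistic regression, as in the reported OR
    # values; synthetic data, illustrative variable names only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 1783
    math_exposure = rng.integers(0, 2, n)          # 1 = extra mathematics programme
    language_support = rng.integers(0, 2, n)       # 1 = extra language tutorial
    logit = -1.0 - 0.33 * math_exposure + 1.2 * language_support
    fail_math = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(np.column_stack([math_exposure, language_support]))
    res = sm.Logit(fail_math, X).fit(disp=False)

    odds_ratios = np.exp(res.params)               # exponentiated coefficients
    conf_int = np.exp(res.conf_int())
    print("odds ratios (const, math exposure, language support):",
          np.round(odds_ratios, 2))
    print("95% confidence intervals:", np.round(conf_int, 2))
    ```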

  1. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for detecting abrupt changes (such as failures) in stochastic dynamical systems are surveyed. The survey concentrates on the class of linear systems, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.

  2. Is it beneficial to approximate pre-failure topography to predict landslide susceptibility with empirical models?

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Schmaltz, Elmar; Glade, Thomas

    2017-04-01

    Empirical landslide susceptibility maps spatially depict the areas where future slope failures are likely due to specific environmental conditions. The underlying statistical models are based on the assumption that future landsliding is likely to occur under similar circumstances (e.g. topographic conditions, lithology, land cover) as past slope failures. This principle is operationalized by applying a supervised classification approach (e.g. a regression model with a binary response: landslide presence/absence) that enables discrimination between conditions that favored past landslide occurrences and the circumstances typical for landslide absences. The derived empirical relation is then transferred to each spatial unit of an area. Literature reveals that the specific topographic conditions representative for landslide presences are frequently extracted from derivatives of digital terrain models at locations where past landslides were mapped. The underlying morphology-based landslide identification becomes possible due to the fact that the topography at a specific locality usually changes after landslide occurrence (e.g. hummocky surface, concave and steep scarp). In a strict sense, this implies that topographic predictors used within conventional statistical landslide susceptibility models relate to post-failure topographic conditions - and not to the required pre-failure situation. This study examines the assumption that models calibrated on the basis of post-failure topographies may not be appropriate to predict future landslide locations, because (i) post-failure and pre-failure topographic conditions may differ and (ii) areas where future landslides will occur do not yet exhibit such a distinct post-failure morphology. The study was conducted for an area located in the Walgau region (Vorarlberg, western Austria), where a detailed inventory consisting of shallow landslides was available. The methodology comprised multiple systematic comparisons of models generated on the basis of post-failure conditions (i.e. the standard approach) with models based on an approximated pre-failure topography. Pre-failure topography was approximated by (i) erasing the area of mapped landslide polygons within a digital terrain model and (ii) filling these "empty" areas by interpolating elevation points located outside the mapped landslides. Landslide presence information was extracted from the respective landslide scarp locations, while an equal number of randomly sampled points represented landslide absences. After an initial exploratory data analysis, mixed-effects logistic regression was applied to model landslide susceptibility on the basis of two predictor sets (post-failure versus pre-failure predictors). Furthermore, all analyses were conducted separately for five different modelling resolutions to examine the suspicion that the degree of generalization of topographic parameters may also play a role in how the respective models differ. Model evaluation was conducted by means of multiple procedures (i.e. odds ratios, k-fold cross validation, permutation-based variable importance, difference maps of predictions). The results revealed that models based on the highest resolutions (e.g. 1 m, 2.5 m) and post-failure topography performed best from a purely quantitative perspective.
A comparison of models (post-failure versus pre-failure based models) built at an identical modelling resolution showed that validation results, modelled relationships and the prediction pattern tended to converge with decreasing raster resolution. Based on the results, we concluded that an approximation of pre-failure topography does not significantly contribute to improved landslide susceptibility models in cases where (i) the underlying inventory consists of small landslide features and (ii) the models are based on coarse raster resolutions (e.g. 25 m). However, where modelling with high raster resolutions is envisaged (e.g. 1 m, 2.5 m) or the inventory mainly consists of larger events, a reconstruction of pre-failure conditions might be highly expedient, even though conventional validation results might indicate an opposite tendency. Finally, we recommend considering that topographic predictors highly useful for detecting past slope movements (e.g. roughness) are not necessarily valuable for predicting future slope instabilities.
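
    A much-simplified sketch of the model comparison described above follows: two predictor sets (post-failure versus approximated pre-failure topography) are compared by cross-validated AUC. The study used mixed-effects logistic regression; a plain logistic regression and synthetic data are substituted here, so the code is an illustration of the comparison, not the authors' workflow.

    ```python
    # Simplified sketch of comparing two predictor sets (post-failure vs.
    # approximated pre-failure topography) by k-fold cross-validated AUC.
    # Synthetic data and plain logistic regression for illustration only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    rng = np.random.default_rng(3)
    n = 2000
    y = rng.integers(0, 2, n)                      # 1 = landslide scarp, 0 = absence point
    slope_post = rng.normal(30, 8, n) + 5 * y      # slope from post-failure DTM
    slope_pre = slope_post - 3 * y + rng.normal(0, 2, n)   # approximated pre-failure slope
    curvature = rng.normal(0, 1, n) + 0.5 * y

    X_post = np.column_stack([slope_post, curvature])
    X_pre = np.column_stack([slope_pre, curvature])

    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    for name, X in [("post-failure", X_post), ("pre-failure", X_pre)]:
        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                              cv=cv, scoring="roc_auc")
        print(f"{name:12s} mean CV AUC = {auc.mean():.3f}")
    ```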

  3. Exploring partners' perspectives on participation in heart failure home care: a mixed-method design.

    PubMed

    Näsström, Lena; Luttik, Marie Louise; Idvall, Ewa; Strömberg, Anna

    2017-05-01

    To describe the partners' perspectives on participation in the care of patients with heart failure receiving home care. Partners are often involved in the care of patients with heart failure and have an important role in improving patients' well-being and self-care. Partners have described both negative and positive experiences of involvement, but knowledge of how partners of patients with heart failure view participation in care when the patient receives home care is lacking. A convergent parallel mixed-method design was used, including data from interviews and questionnaires. A purposeful sample of 15 partners was used. Data collection lasted from February 2010 to December 2011. Interviews were analysed with content analysis, and data from questionnaires (participation, caregiving, health-related quality of life, depressive symptoms) were analysed statistically. Finally, results were merged, interpreted and labelled as comparable and convergent or as being inconsistent. Partners were satisfied with most aspects of participation, information and contact. Qualitative findings revealed four different aspects of participation: adapting to the caring needs and illness trajectory, coping with caregiving demands, interacting with healthcare providers, and the need for knowledge to comprehend the health situation. The merged results were convergent and confirmatory, and the expanded knowledge gave a broader understanding of partner participation in this context. The results revealed different levels of partner participation. Heart failure home care included good opportunities for both participation and contact during home visits, necessary to meet partners' ongoing need for information to comprehend the situation. © 2016 John Wiley & Sons Ltd.

  4. Comparison of clinical outcomes and genomic characteristics of single focus and multifocal glioblastoma

    PubMed Central

    Paulsson, Anna K.; Holmes, Jordan A.; Peiffer, Ann M.; Miller, Lance D.; Liu, Wennuan; Xu, Jianfeng; Hinson, William H.; Lesser, Glenn J.; Laxton, Adrian W.; Tatter, Stephen B.; Debinski, Waldemar

    2014-01-01

    We investigate the differences in molecular signature and clinical outcomes between multiple lesion glioblastoma (GBM) and single focus GBM in the modern treatment era. Between August 2000 and May 2010, 161 patients with GBM were treated with modern radiotherapy techniques. Of this group, 33 were considered to have multiple lesion GBM (25 multifocal and 8 multicentric). Patterns of failure, time to progression and overall survival were compared based on whether the tumor was considered a single focus or multiple lesion GBM. Genomic groupings and methylation status were also investigated as a possible predictor of multifocality in a cohort of 41 patients with available tissue for analysis. There was no statistically significant difference in overall survival (p < 0.3) between the multiple lesion tumors (8.2 months) and single focus GBM (11 months). Progression-free survival was superior in the single focus tumors (7.1 months) as compared to multifocal tumors (5.6 months, p = 0.02). For patients with single focus, multifocal and multicentric GBM, 81, 76 and 88 % of treatment failures occurred in the 60 Gy volume (p < 0.5), while 54, 72, and 38 % of treatment failures occurred in the 46 Gy volume (p < 0.4). Out-of-field failures were rare in both single focus and multiple foci GBM (7 vs 3 %). Genomic groupings and methylation status were not found to predict for multifocality. Patterns of failure, survival and genomic signatures for multiple lesion GBM do not appreciably differ when compared to single focus tumors. PMID:24990827

  5. Correlation between inner strength and health-promoting behaviors in women with heart failure.

    PubMed

    Hosseini, Meimanat; Vasli, Parvaneh; Rashidi, Sakineh; Shahsavari, Soodeh

    2016-08-01

    Inner strength is a factor for mental health and well-being and, consequently, a dynamic component of holistic healing. Health-promoting behaviors are appropriate activities to improve health status and prevent the progression of the functional defect resulting from heart failure. The present study aimed to determine the correlation between inner strength and health-promoting behaviors in women with heart failure referred to hospitals affiliated with Shahid Beheshti University of Medical Sciences (SBMU) in 2013. In this cross-sectional study, 145 women with heart failure were selected through convenience sampling from the clients referred to hospitals affiliated with SBMU. The data collection tool included a three-section questionnaire of personal characteristics, inner strength, and the health-promoting lifestyle profile II (HPLP II). The data analysis used descriptive statistical tests and the Pearson correlation coefficient in SPSS version 20. A direct significant correlation was found between inner strength and all dimensions of health-promoting behaviors and overall health-promoting behaviors (p=0.000), as well as between all dimensions of inner strength (except for the dimension of knowing and searching with physical activity, and the dimension of connectedness with personal accountability in healthcare and with physical activity) and health-promoting behaviors (p=0.000 to p=0.008). To improve the level of health and well-being and reduce the costs of care services in women with heart failure, close attention should be paid to developing and empowering their inner strength.

  6. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  7. Survivorship of standard versus modified posterior surgical approaches in metal-on-metal hip resurfacing.

    PubMed

    M Takamura, K; Maher, P; Nath, T; Su, E P

    2014-05-01

    Metal-on-metal hip resurfacing (MOMHR) is available as an alternative option for younger, more active patients. There are failure modes that are unique to MOMHR, which include loosening of the femoral head and fractures of the femoral neck. Previous studies have speculated that changes in the vascularity of the femoral head may contribute to these failure modes. This study compares the survivorship between the standard posterior approach (SPA) and modified posterior approach (MPA) in MOMHR. A retrospective clinical outcomes study was performed examining 351 hips (279 male, 72 female) replaced with Birmingham Hip Resurfacing (BHR, Smith and Nephew, Memphis, Tennessee) in 313 patients with a pre-operative diagnosis of osteoarthritis. The mean follow-up period for the SPA group was 2.8 years (0.1 to 6.1) and for the MPA, 2.2 years (0.03 to 5.2); this difference in follow-up period was statistically significant (p < 0.01). Survival analysis was completed using the Kaplan-Meier method. At four years, the Kaplan-Meier survival curve for the SPA was 97.2% and 99.4% for the MPA; this was statistically significant (log-rank; p = 0.036). There were eight failures in the SPA and two in the MPA. There was a 3.5% incidence of femoral head collapse or loosening in the SPA and 0.4% in the MPA, which represented a significant difference (p = 0.041). There was a 1.7% incidence of fractures of the femoral neck in the SPA and none in the MPA (p = 0.108). This study found a significant difference in survivorship at four years between the SPA and the MPA (p = 0.036). The clinical outcomes of this study suggest that preserving the vascularity of the femoral neck by using the MPA results in fewer vascular-related failures in MOMHRs. Cite this article: Bone Joint Res 2014;3:150-4. ©2014 The British Editorial Society of Bone & Joint Surgery.
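
    The sketch below shows a generic Kaplan-Meier estimate and log-rank comparison of two groups, the kind of survival analysis named above, using the lifelines package on synthetic follow-up data. It does not reproduce the study's dataset or results, and the event rates and follow-up times are invented.

    ```python
    # Sketch: Kaplan-Meier survival estimates and a log-rank test for two
    # surgical approaches, on synthetic follow-up data (months).
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(4)
    t_spa = rng.exponential(60, 200).clip(max=48)          # follow-up times, months
    e_spa = rng.random(200) < 0.03                          # True = revision (failure)
    t_mpa = rng.exponential(120, 150).clip(max=48)
    e_mpa = rng.random(150) < 0.01

    km = KaplanMeierFitter()
    km.fit(t_spa, event_observed=e_spa, label="SPA")
    print(km.survival_function_.tail(1))                    # survival near end of follow-up

    result = logrank_test(t_spa, t_mpa, event_observed_A=e_spa, event_observed_B=e_mpa)
    print(f"log-rank p-value: {result.p_value:.3f}")
    ```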

  8. Survivorship of standard versus modified posterior surgical approaches in metal-on-metal hip resurfacing

    PubMed Central

    M. Takamura, K.; Maher, P.; Nath, T.; Su, E. P.

    2014-01-01

    Objectives Metal-on-metal hip resurfacing (MOMHR) is available as an alternative option for younger, more active patients. There are failure modes that are unique to MOMHR, which include loosening of the femoral head and fractures of the femoral neck. Previous studies have speculated that changes in the vascularity of the femoral head may contribute to these failure modes. This study compares the survivorship between the standard posterior approach (SPA) and modified posterior approach (MPA) in MOMHR. Methods A retrospective clinical outcomes study was performed examining 351 hips (279 male, 72 female) replaced with Birmingham Hip Resurfacing (BHR, Smith and Nephew, Memphis, Tennessee) in 313 patients with a pre-operative diagnosis of osteoarthritis. The mean follow-up period for the SPA group was 2.8 years (0.1 to 6.1) and for the MPA, 2.2 years (0.03 to 5.2); this difference in follow-up period was statistically significant (p < 0.01). Survival analysis was completed using the Kaplan–Meier method. Results At four years, the Kaplan–Meier survival curve for the SPA was 97.2% and 99.4% for the MPA; this was statistically significant (log-rank; p = 0.036). There were eight failures in the SPA and two in the MPA. There was a 3.5% incidence of femoral head collapse or loosening in the SPA and 0.4% in the MPA, which represented a significant difference (p = 0.041). There was a 1.7% incidence of fractures of the femoral neck in the SPA and none in the MPA (p = 0.108). Conclusion This study found a significant difference in survivorship at four years between the SPA and the MPA (p = 0.036). The clinical outcomes of this study suggest that preserving the vascularity of the femoral neck by using the MPA results in fewer vascular-related failures in MOMHRs. Cite this article: Bone Joint Res 2014;3:150–4 PMID:24842931

  9. Differential diagnosis of cardiovascular diseases and T-wave alternans

    NASA Astrophysics Data System (ADS)

    Ramasamy, Mouli; Varadan, Vijay K.

    2016-04-01

    T-wave alternans (TWA) is the variation of the T wave in the electrocardiogram observed between alternating beats. TWA is one of the important precursors used in predicting the risk of sudden cardiac death (SCD). Several clinical studies have tried to determine the significance of using TWA analysis to detect abnormalities that may lead to ventricular arrhythmias, as well as to establish metrics for risk stratification of cardiovascular patients with prior cardiac episodes. The statistical significance of TWA in predicting ventricular arrhythmias has been established in patients across several diagnoses. Studies have also shown the predictive value of TWA analysis in post-myocardial infarction patients, risk of SCD, congestive heart failure, ischemic cardiomyopathy, and Chagas disease.

  10. A comparative analysis of the results from 4 trials of beta-blocker therapy for heart failure: BEST, CIBIS-II, MERIT-HF, and COPERNICUS.

    PubMed

    Domanski, Michael J; Krause-Steinrauf, Heidi; Massie, Barry M; Deedwania, Prakash; Follmann, Dean; Kovar, David; Murray, David; Oren, Ron; Rosenberg, Yves; Young, James; Zile, Michael; Eichhorn, Eric

    2003-10-01

    Recent large randomized, controlled trials (BEST [Beta-blocker Evaluation of Survival Trial], CIBIS-II [Cardiac Insufficiency Bisoprolol Trial II], COPERNICUS [Carvedilol Prospective Randomized Cumulative Survival Study], and MERIT-HF [Metoprolol Randomized Intervention Trial in Congestive Heart Failure]) have addressed the usefulness of beta-blockade in the treatment of advanced heart failure. CIBIS-II, COPERNICUS, and MERIT-HF have shown that beta-blocker treatment with bisoprolol, carvedilol, and metoprolol XL, respectively, reduces mortality in advanced heart failure patients, whereas BEST found a statistically nonsignificant trend toward reduced mortality with bucindolol. We conducted a post hoc analysis to determine whether the response to beta-blockade in BEST could be related to differences in the clinical and demographic characteristics of the study populations. We generated a sample from BEST to resemble the patient cohorts studied in CIBIS-II and MERIT-HF to find out whether the response to beta-blocker therapy was similar to that reported in the other trials. These findings are further compared with COPERNICUS, which entered patients with more severe heart failure. To achieve conformity with the entry criteria for CIBIS-II and MERIT-HF, the BEST study population was adjusted to exclude patients with systolic blood pressure <100 mm Hg, heart rate <60 bpm, and age >80 years (exclusion criteria employed in those trials). The BEST comparison subgroup (BCG) was further modified to more closely reflect the racial demographics reported for patients enrolled in CIBIS-II and MERIT-HF. The association of beta-blocker therapy with overall survival and survival free of cardiac death, sudden cardiac death, and progressive pump failure in the BCG was assessed. In the BCG subgroup, bucindolol treatment was associated with a significantly lower risk of death from all causes (hazard ratio (HR)=0.77 [95% CI=0.65, 0.92]), cardiovascular death (HR=0.71 [0.58, 0.86]), sudden death (HR=0.77 [0.59, 0.999]), and pump failure death (HR=0.64 [0.45, 0.91]). Although not excluding the possibility of differences resulting from chance alone or from different properties among beta-blockers, this study suggests that different heart failure population subgroups may have different responses to beta-blocker therapy.

  11. The Deviant University Student: Historical Discourses about Student Failure and "Wastage" in the Antipodes

    ERIC Educational Resources Information Center

    Manathunga, Catherine

    2014-01-01

    The emergence of academic development in Anglophone higher education was linked to post Second World War massification and concerns about student failure. These concerns were driven by increasing statistical investigations into student attrition and degree times to completion, particularly in Australia and Aotearoa, New Zealand. There was a…

  12. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Astrophysics Data System (ADS)

    Hendricks, R. C.; McDonald, G.

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.
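
    As a sketch of the distribution fitting described above, the following fits normal and log-normal models to a small synthetic sample with roughly the reported mean and standard deviation and compares them with a Kolmogorov-Smirnov statistic; the data are simulated, not the original measurements.

    ```python
    # Sketch: fitting normal and log-normal models to thermal-cycle life data
    # with roughly the reported mean (1330 cycles) and SD (520 cycles).
    # Synthetic sample of 22 specimens; not the original dataset.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    cycles = rng.normal(1330, 520, 22).clip(min=100)

    mu, sigma = stats.norm.fit(cycles)
    shape, loc, scale = stats.lognorm.fit(cycles, floc=0)

    # Kolmogorov-Smirnov statistics as a rough goodness-of-fit comparison;
    # with n = 22 neither model is likely to be rejected, echoing the abstract.
    ks_norm = stats.kstest(cycles, "norm", args=(mu, sigma))
    ks_logn = stats.kstest(cycles, "lognorm", args=(shape, loc, scale))
    print(f"normal:     mean={mu:.0f}, sd={sigma:.0f}, KS p={ks_norm.pvalue:.2f}")
    print(f"log-normal: KS p={ks_logn.pvalue:.2f}")
    ```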

  13. Assessment of variations in thermal cycle life data of thermal barrier coated rods

    NASA Technical Reports Server (NTRS)

    Hendricks, R. C.; Mcdonald, G.

    1981-01-01

    An analysis of thermal cycle life data for 22 thermal barrier coated (TBC) specimens was conducted. The ZrO2-8Y2O3/NiCrAlY plasma spray coated Rene 41 rods were tested in a Mach 0.3 Jet A/air burner flame. All specimens were subjected to the same coating and subsequent test procedures in an effort to control three parametric groups: material properties, geometry, and heat flux. Statistically, the data sample space had a mean of 1330 cycles with a standard deviation of 520 cycles. The data were described by normal or log-normal distributions, but other models could also apply; the sample size must be increased to clearly delineate a statistical failure model. The statistical methods were also applied to adhesive/cohesive strength data for 20 TBC discs of the same composition, with similar results. The sample space had a mean of 9 MPa with a standard deviation of 4.2 MPa.

  14. Efficiency and Safety of Prolonged Levosimendan Infusion in Patients with Acute Heart Failure

    PubMed Central

    Aidonidis, Georgios; Kanonidis, Ioannis; Koutsimanis, Vasileios; Neumann, Till; Erbel, Raimund; Sakadamis, Georgios

    2011-01-01

    Background. Levosimendan is an inotropic drug with unique pharmacological advantages in patients with acute heart failure. The scope of this study is to determine whether longer infusion patterns without the hypotension-inducing loading dose could justify an effective and safe alternative approach. Methods. 70 patients admitted to the emergency department with decompensated chronic heart failure received intravenous levosimendan without a loading dose for up to 72 hours. Clinical parameters, BNP (Brain Natriuretic Peptide) and signal-averaged ECG (SAECG) data were recorded up to 72 hours. Results. The 48-hour group demonstrated a statistically significant BNP decrease (P < .001) after 48 hours, which was also maintained after 72 hours. The 72-hour group demonstrated a borderline decrease of BNP after 48 hours (P = .039), necessitating an additional 24-hour infusion to achieve a significant reduction after 72 hours (P < .004). SAECG data demonstrated a statistically significant decrease after 72 hours (P < .04). Apart from two deaths due to advanced heart failure, no major complications were observed. Conclusion. Prolonged infusion of levosimendan without a loading dose is associated with an acceptable clinical and neurohumoral response. PMID:21559263

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaffer, Richard, E-mail: rickyshaffer@yahoo.co.u; Department of Clinical Oncology, Imperial College London National Health Service Trust, London; Pickles, Tom

    Purpose: Prior studies have derived low values of the alpha-beta ratio (α/β) for prostate cancer of approximately 1-2 Gy. These studies used poorly matched groups, differing definitions of biochemical failure, and insufficient follow-up. Methods and Materials: National Comprehensive Cancer Network low- or low-intermediate risk prostate cancer patients, treated with external beam radiotherapy or permanent prostate brachytherapy, were matched for prostate-specific antigen, Gleason score, T-stage, percentage of positive cores, androgen deprivation therapy, and era, yielding 118 patient pairs. The Phoenix definition of biochemical failure was used. The best-fitting value for α/β was found for up to 90-month follow-up using maximum likelihood analysis, and the 95% confidence interval using the profile likelihood method. Linear quadratic formalism was applied with the radiobiological parameters of relative biological effectiveness = 1.0, potential doubling time = 45 days, and repair half-time = 1 hour. Bootstrap analysis was performed to estimate uncertainties in outcomes, and hence in α/β. Sensitivity analysis was performed by varying the values of the radiobiological parameters to extreme values. Results: The value of α/β best fitting the outcomes data was >30 Gy, with lower 95% confidence limit of 5.2 Gy. This was confirmed on bootstrap analysis. Varying parameters to extreme values still yielded best-fit α/β of >30 Gy, although the lower 95% confidence interval limit was reduced to 0.6 Gy. Conclusions: Using carefully matched groups, long follow-up, the Phoenix definition of biochemical failure, and well-established statistical methods, the best estimate of α/β for low and low-tier intermediate-risk prostate cancer is likely to be higher than that of normal tissues, although a low value cannot be excluded.
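
    For orientation, the snippet below evaluates only the fractionation term of the linear-quadratic formalism, biologically effective dose BED = n·d·(1 + d/(α/β)), for a few candidate α/β values. The study's full model also includes repair half-time and repopulation terms, which are omitted here, and the schedules shown are hypothetical.

    ```python
    # Sketch: sensitivity of biologically effective dose (BED) to alpha/beta
    # in the linear-quadratic model; schedules and values are illustrative.
    def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
        """Biologically effective dose (Gy) for n fractions of size d (Gy)."""
        return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

    # Example: a hypothetical 74 Gy in 2 Gy fractions vs. 57 Gy in 3 Gy fractions.
    for ab in (1.5, 3.0, 10.0, 30.0):
        print(f"alpha/beta = {ab:4.1f} Gy: "
              f"BED(37 x 2 Gy) = {bed(37, 2.0, ab):6.1f} Gy, "
              f"BED(19 x 3 Gy) = {bed(19, 3.0, ab):6.1f} Gy")
    ```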

  16. SU-E-T-117: Analysis of the ArcCHECK Dosimetry Gamma Failure Using the 3DVH System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, S; Choi, W; Lee, H

    2015-06-15

    Purpose: To evaluate gamma analysis failures in VMAT patient-specific QA using the ArcCHECK cylindrical phantom. The 3DVH system (Sun Nuclear, FL) was used to analyze the dose difference statistics between the measured dose and the treatment planning system (TPS) calculated dose. Methods: Four cases of gamma analysis failure were selected retrospectively. Our institution's gamma analysis criteria for ArcCHECK dosimetry were absolute dose, 3%/3mm, and a 90% pass rate. The collapsed cone convolution superposition (CCCS) dose calculation algorithm was used for VMAT. Dose delivery was performed with an Elekta Agility. The A1SL chamber (Standard Imaging, WI) and a cavity plug were used for point dose measurement. Delivery QA plans and images were used as the 3DVH reference data instead of the patient plan and image. The measured '.txt' file was used for comparison at the diodes to acquire a global dose level. The '.acml' file was used for AC-PDP and to calculate the point dose. Results: The global dose of 3DVH was calculated as 1.10, 1.13, 1.01 and 0.2 Gy, respectively. The global dose of the 0.2 Gy case was caused by a distance discrepancy. The TPS calculated point dose was 2.33 Gy to 2.77 Gy and the 3DVH calculated dose was 2.33 Gy to 2.68 Gy. The maximum dose differences were −2.83% and −3.1% for TPS vs. measured dose and TPS vs. 3DVH calculated dose, respectively, in the same case. The difference between measured and 3DVH was 0.1% in that case. The 3DVH gamma pass rate was 98% to 99.7%. Conclusion: We found a TPS calculation error by 3DVH calculation using the ArcCHECK measured dose. It appears that our CCCS-based RTP system overestimated dose in the central region and underestimated scattering at the peripheral diode detector points. Relative gamma analysis and point dose measurement are recommended for VMAT DQA in cases of gamma failure in ArcCHECK dosimetry.
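
    A toy one-dimensional version of the gamma index (3%/3mm, global normalisation) underlying the pass rates discussed above is sketched below; it is a simplified illustration of the metric, not the ArcCHECK or 3DVH implementation, and the dose profiles are synthetic.

    ```python
    # Simplified 1-D gamma-index (3%/3mm, global) between a "measured" and a
    # "calculated" dose profile; a toy illustration of the pass-rate metric.
    import numpy as np

    def gamma_1d(x, dose_meas, dose_calc, dta_mm=3.0, dd_frac=0.03):
        d_max = dose_calc.max()                     # global normalisation
        gammas = []
        for xm, dm in zip(x, dose_meas):
            dist2 = ((x - xm) / dta_mm) ** 2
            dose2 = ((dose_calc - dm) / (dd_frac * d_max)) ** 2
            gammas.append(np.sqrt(np.min(dist2 + dose2)))
        return np.array(gammas)

    x = np.linspace(0, 100, 201)                    # positions in mm
    dose_calc = 2.0 * np.exp(-((x - 50) / 25) ** 2)
    dose_meas = dose_calc * (1 + 0.02 * np.sin(x / 5))   # small synthetic deviation

    gamma = gamma_1d(x, dose_meas, dose_calc)
    print(f"gamma pass rate (gamma <= 1): {100 * np.mean(gamma <= 1):.1f}%")
    ```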

  17. Quality compensation programs: are they worth all the hype? A comparison of outcomes within a Medicare advantage heart failure population.

    PubMed

    Esse, Tara; Serna, Omar; Chitnis, Abhishek; Johnson, Michael; Fernandez, Nelson

    2013-05-01

    Quality compensation programs (QCPs), also known as pay-for-performance programs, are becoming more common within managed care entities. QCPs are believed to yield better patient outcomes, yet the programs lack the evidence needed to support these claims. We evaluated a QCP offered to network primary care physicians (PCPs) within a Medicare managed care plan to determine if a positive correlation between outcomes and the program exists. To compare outcomes of heart failure members under the care of PCPs enrolled in a Medicare Advantage Prescription Drug (MAPD) Plan QCP with those who are not affiliated with a QCP. Retrospective analysis was conducted on the heart failure population of a MAPD in Texas. Heart failure members were identified using ICD-9-CM codes from inpatient and outpatient claims for 2010. These members must have been continuously eligible all 12 months of the year to be included in the analysis. The primary intervention was enrollment by the member's PCP into the QCP. Measurable outcomes included acute (hospital) admits, emergency room (ER) visits, appropriate laboratory tests, and prescriptions of medications that are evidence based and guideline driven. Centers for Medicare and Medicaid Services (CMS) risk scores and comorbidities were used to risk-adjust outcomes. A total of 4,240 members were included in the analysis. From that population, 1,225 members (28.8%) were followed by PCPs enrolled in a QCP; 3,015 members (71.1%) were followed by PCPs not enrolled in a QCP. The adjusted analysis showed that none of the drug comparisons statistically differed between the QCP and non-QCP groups, whereas all of the lab tests, including low-density lipoprotein cholesterol (LDL-C), hemoglobin A1c, creatinine, and microalbumin, as well as the acquisition of the flu vaccine, occurred more frequently in the QCP group. Acute admits and ER visits in the QCP and non-QCP groups were similar before and after adjustment. The QCP group was significantly older, with a statistically significant higher prevalence of renal failure and higher CMS risk scores. After evaluation of our QCP's impact on the quality of care provided to our Medicare beneficiaries, we have concluded that there is potential for health care improvement through pay-for-performance programs. In our MAPD heart failure population enrolled in a QCP during 2010, we observed an older age, higher CMS risk scores, poorer renal function, and a higher proportion of women. Yet, the outcomes of this group (hospitalizations, ER visits, acquisition of lab tests, etc.) were similar when compared with younger, healthier members not enrolled in a QCP. We feel the clinical relevance of the data indicates that, overall, the quality of care is somewhat improved for QCP-enrolled providers when compared with non-QCP providers with regard to achieving certain quality metrics (e.g., immunizations, HbA1c, LDL-C). Further research is needed to determine whether health care costs and clinical outcomes, in the long term, are improved for members enrolled in these QCP programs, as well as their impact upon a health plan's Medicare Star rating.

  18. Behavioral pattern identification for structural health monitoring in complex systems

    NASA Astrophysics Data System (ADS)

    Gupta, Shalabh

    Estimation of structural damage and quantification of structural integrity are critical for safe and reliable operation of human-engineered complex systems, such as electromechanical, thermofluid, and petrochemical systems. Damage due to fatigue crack is one of the most commonly encountered sources of structural degradation in mechanical systems. Early detection of fatigue damage is essential because the resulting structural degradation could potentially cause catastrophic failures, leading to loss of expensive equipment and human life. Therefore, for reliable operation and enhanced availability, it is necessary to develop capabilities for prognosis and estimation of impending failures, such as the onset of widespread fatigue crack damage in mechanical structures. This dissertation presents information-based online sensing of fatigue damage using the analytical tools of symbolic time series analysis (STSA). Anomaly detection using STSA is a pattern recognition method that has been recently developed based upon a fixed-structure, fixed-order Markov chain. The analysis procedure is built upon the principles of Symbolic Dynamics, Information Theory and Statistical Pattern Recognition. The dissertation demonstrates real-time fatigue damage monitoring based on time series data of ultrasonic signals. Statistical pattern changes are measured using STSA to monitor the evolution of fatigue damage. Real-time anomaly detection is presented as a solution to the forward (analysis) problem and the inverse (synthesis) problem. (1) The forward problem: the primary objective of the forward problem is identification of the statistical changes in the time series data of ultrasonic signals due to gradual evolution of fatigue damage. (2) The inverse problem: the objective of the inverse problem is to infer the anomalies from the observed time series data in real time, based on the statistical information generated during the forward problem. A computer-controlled special-purpose fatigue test apparatus, equipped with multiple sensing devices (e.g., ultrasonics and optical microscope) for damage analysis, has been used to experimentally validate the STSA method for early detection of anomalous behavior. The sensor information is integrated with a software module consisting of the STSA algorithm for real-time monitoring of fatigue damage. Experiments have been conducted under different loading conditions on specimens constructed from the ductile aluminium alloy 7075-T6. The dissertation has also investigated the application of the STSA method for early detection of anomalies in other engineering disciplines. Two primary applications include combustion instability in a generic thermal pulse combustor model and the whirling phenomenon in a typical misaligned shaft.
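
    A minimal sketch of the STSA idea follows: a signal is partitioned into symbols, a state-probability vector is estimated, and its distance from a nominal reference serves as the anomaly measure. The toy signals, partitioning scheme, and distance metric are illustrative, not the dissertation's implementation.

    ```python
    # Minimal sketch of the symbolic time series analysis (STSA) idea:
    # partition a signal into symbols, estimate the state-probability vector,
    # and use its distance from a nominal reference as an anomaly measure.
    import numpy as np

    def state_probabilities(signal, n_symbols=8):
        # Uniform partitioning of the signal range into symbols (states).
        edges = np.linspace(signal.min(), signal.max(), n_symbols + 1)[1:-1]
        symbols = np.digitize(signal, edges)
        counts = np.bincount(symbols, minlength=n_symbols).astype(float)
        return counts / counts.sum()

    rng = np.random.default_rng(7)
    t = np.linspace(0, 100, 5000)
    nominal = np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=t.size)
    damaged = (np.sin(2 * np.pi * t) + 0.3 * np.sin(6 * np.pi * t)
               + 0.1 * rng.normal(size=t.size))

    p_ref = state_probabilities(nominal)
    p_now = state_probabilities(damaged)
    anomaly = np.linalg.norm(p_now - p_ref)        # one common STSA-style anomaly measure
    print(f"anomaly measure: {anomaly:.3f}")
    ```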

  19. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
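
    The "flipping" trick that lets right-censored Kaplan-Meier machinery handle left-censored concentrations is sketched below using the Python lifelines package (the paper's own software is written in S, so this is only an analogue); the concentrations and detection limits are invented.

    ```python
    # Sketch of the flipping trick: subtract values from a constant larger
    # than the maximum so nondetects become right-censored, fit Kaplan-Meier,
    # then transform the estimate back.
    import numpy as np
    from lifelines import KaplanMeierFitter

    conc = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 3.5, 4.0, 6.0, 8.0, 12.0])
    detected = np.array([0, 0, 1, 0, 1, 1, 1, 1, 1, 1], dtype=bool)  # 0 = "<DL"

    flip = conc.max() + 1.0
    km = KaplanMeierFitter()
    km.fit(flip - conc, event_observed=detected)   # left-censoring becomes right-censoring

    median_flipped = km.median_survival_time_
    print(f"K-M estimate of the median concentration: {flip - median_flipped:.2f}")
    ```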

  20. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.

  1. A statistical test to show negligible trend

    Treesearch

    Philip M. Dixon; Joseph H.K. Pechmann

    2005-01-01

    The usual statistical tests of trend are inappropriate for demonstrating the absence of trend. This is because failure to reject the null hypothesis of no trend does not prove that null hypothesis. The appropriate statistical method is based on an equivalence test. The null hypothesis is that the trend is not zero, i.e., outside an a priori specified equivalence region...
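
    A TOST-style sketch of the equivalence idea follows: a trend is declared negligible only if the confidence interval for the slope lies entirely within an a priori equivalence region. The synthetic data, equivalence bound, and significance level are illustrative, and the code is not the authors' procedure.

    ```python
    # Sketch of an equivalence (TOST-style) test for negligible trend.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)
    years = np.arange(20)
    counts = 100 + 0.2 * years + rng.normal(0, 3, 20)   # nearly flat population index

    res = sm.OLS(counts, sm.add_constant(years)).fit()
    lo, hi = res.conf_int(alpha=0.10)[1]                # 90% CI on the slope (TOST at 5%)

    delta = 1.0                                         # a priori equivalence bound (units/yr)
    negligible = (lo > -delta) and (hi < delta)
    print(f"slope = {res.params[1]:.2f}, 90% CI = ({lo:.2f}, {hi:.2f}), "
          f"negligible trend: {negligible}")
    ```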

  2. Exponential order statistic models of software reliability growth

    NASA Technical Reports Server (NTRS)

    Miller, D. R.

    1985-01-01

    Failure times of a software reliability growth process are modeled as order statistics of independent, nonidentically distributed exponential random variables. The Jelinski-Moranda, Goel-Okumoto, Littlewood, Musa-Okumoto Logarithmic, and Power Law models are all special cases of Exponential Order Statistic Models, but there are many additional examples as well. Various characterizations, properties and examples of this class of models are developed and presented.
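
    A small simulation of the order-statistic view follows: each fault has an exponential detection time, and the observed failure times of the debugging process are their order statistics; with equal per-fault rates this reduces to the Jelinski-Moranda special case. The fault count and hazard value are illustrative only.

    ```python
    # Sketch: failure times as order statistics of independent exponential
    # detection times; equal rates give the Jelinski-Moranda special case.
    import numpy as np

    rng = np.random.default_rng(12)
    n_faults = 50
    phi = 0.02                                     # per-fault hazard (illustrative)
    rates = np.full(n_faults, phi)

    # Each fault has an exponential detection time; the observed failure
    # times are the sorted (order-statistic) detection times.
    detection_times = rng.exponential(1.0 / rates)
    failure_times = np.sort(detection_times)

    inter_failure = np.diff(np.concatenate([[0.0], failure_times]))
    print("first five inter-failure times:", inter_failure[:5].round(2))
    ```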

  3. Fault detection and diagnosis using neural network approaches

    NASA Technical Reports Server (NTRS)

    Kramer, Mark A.

    1992-01-01

    Neural networks can be used to detect and identify abnormalities in real-time process data. Two basic approaches can be used, the first based on training networks using data representing both normal and abnormal modes of process behavior, and the second based on statistical characterization of the normal mode only. Given data representative of process faults, radial basis function networks can effectively identify failures. This approach is often limited by the lack of fault data, but can be facilitated by process simulation. The second approach employs elliptical and radial basis function neural networks and other models to learn the statistical distributions of process observables under normal conditions. Analytical models of failure modes can then be applied in combination with the neural network models to identify faults. Special methods can be applied to compensate for sensor failures, to produce real-time estimation of missing or failed sensors based on the correlations codified in the neural network.
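
    As a minimal sketch of the second approach (statistical characterisation of the normal mode only), the code below fits a multivariate normal model to normal-operation data and flags departures with a Mahalanobis-distance threshold; this simple model stands in for the elliptical basis function characterisation and uses synthetic data.

    ```python
    # Sketch: characterise normal-mode observables statistically and flag
    # departures; a Mahalanobis-distance threshold on synthetic data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    normal_ops = rng.multivariate_normal([10.0, 50.0], [[1.0, 0.4], [0.4, 2.0]], 500)

    mean = normal_ops.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(normal_ops, rowvar=False))

    def mahalanobis_sq(x):
        d = x - mean
        return d @ cov_inv @ d

    # 99th-percentile chi-square threshold for 2 observables.
    threshold = stats.chi2.ppf(0.99, df=2)

    new_sample = np.array([13.5, 55.0])            # e.g., a drifting sensor reading
    print("fault flagged:", mahalanobis_sq(new_sample) > threshold)
    ```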

  4. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Technical Reports Server (NTRS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-01-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous, i.e. quantitative, interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.
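
    A sketch of how a mean strain-cycle curve can be combined with assumed scatter to yield a failure-probability prediction over a design life is given below; the Coffin-Manson constants, scatter parameter, and duty cycle are hypothetical, not the paper's values.

    ```python
    # Sketch: turning a mean strain-cycle (Coffin-Manson type) curve into a
    # statistical fatigue prediction by assuming log-normal scatter in cycles
    # to failure.  All parameter values are hypothetical.
    import numpy as np
    from scipy import stats

    def mean_cycles_to_failure(strain_range, C=0.3, c_exp=-0.5):
        """Mean N_f from a Coffin-Manson style relation: strain = C * N_f**c_exp."""
        return (strain_range / C) ** (1.0 / c_exp)

    strain = 0.002                                  # applied strain range per thermal cycle
    n_mean = mean_cycles_to_failure(strain)
    sigma_log = 0.4                                 # log-normal scatter (hypothetical)

    design_cycles = 20 * 365                        # one thermal cycle per day, 20 years
    p_fail = stats.norm.cdf((np.log(design_cycles) - np.log(n_mean)) / sigma_log)
    print(f"mean life = {n_mean:.0f} cycles, "
          f"predicted cumulative failure fraction = {100 * p_fail:.2f}%")
    ```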

  5. Interconnect fatigue design for terrestrial photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Mon, G. R.; Moore, D. M.; Ross, R. G., Jr.

    1982-03-01

    The results of a comprehensive investigation of interconnect fatigue that has led to the definition of useful reliability-design and life-prediction algorithms are presented. Experimental data indicate that the classical strain-cycle (fatigue) curve for the interconnect material is a good model of mean interconnect fatigue performance, but it fails to account for the broad statistical scatter, which is critical to reliability prediction. To fill this shortcoming, the classical fatigue curve is combined with experimental cumulative interconnect failure rate data to yield statistical fatigue curves (having failure probability as a parameter) which enable (1) the prediction of cumulative interconnect failures during the design life of an array field, and (2) the unambiguous, i.e. quantitative, interpretation of data from field-service qualification (accelerated thermal cycling) tests. Optimal interconnect cost-reliability design algorithms are derived based on minimizing the cost of energy over the design life of the array field.

  6. Statistical Mechanics Model of Solids with Defects

    NASA Astrophysics Data System (ADS)

    Kaufman, M.; Walters, P. A.; Ferrante, J.

    1997-03-01

    Previously (M. Kaufman, J. Ferrante, NASA Tech. Memor., 1996), we examined the phase diagram for the failure of a solid under isotropic expansion and compression as a function of stress and temperature, with the "springs" modelled by the universal binding energy relation (UBER) (J. H. Rose, J. R. Smith, F. Guinea, J. Ferrante, Phys. Rev. B 29, 2963 (1984)). In the previous calculation we assumed that the "springs" failed independently and that the strain is uniform. In the present work, we have extended this statistical model of mechanical failure by allowing for correlations between "springs" and for thermal fluctuations in strains. The springs are now modelled in the harmonic approximation with a failure threshold energy E0, as an intermediate step in future studies to reinclude the full non-linear dependence of the UBER for modelling the interactions. We use the Migdal-Kadanoff renormalization-group method to determine the phase diagram of the model and to compute the free energy.

  7. Quantum error-correction failure distributions: Comparison of coherent and stochastic error models

    NASA Astrophysics Data System (ADS)

    Barnes, Jeff P.; Trout, Colin J.; Lucarelli, Dennis; Clader, B. D.

    2017-06-01

    We compare failure distributions of quantum error correction circuits for stochastic errors and coherent errors. We utilize a fully coherent simulation of a fault-tolerant quantum error correcting circuit for a d = 3 Steane and surface code. We find that the output distributions are markedly different for the two error models, showing that no simple mapping between the two error models exists. Coherent errors create very broad and heavy-tailed failure distributions. This suggests that they are susceptible to outlier events and that mean statistics, such as pseudothreshold estimates, may not provide the key figure of merit. This provides further statistical insight into why coherent errors can be so harmful for quantum error correction. These output probability distributions may also provide a useful metric that can be utilized when optimizing quantum error correcting codes and decoding procedures for purely coherent errors.

  8. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  9. Sudden Death in Heart Failure With Preserved Ejection Fraction: A Competing Risks Analysis From the TOPCAT Trial.

    PubMed

    Vaduganathan, Muthiah; Claggett, Brian L; Chatterjee, Neal A; Anand, Inder S; Sweitzer, Nancy K; Fang, James C; O'Meara, Eileen; Shah, Sanjiv J; Hegde, Sheila M; Desai, Akshay S; Lewis, Eldrin F; Rouleau, Jean; Pitt, Bertram; Pfeffer, Marc A; Solomon, Scott D

    2018-03-04

    Sudden death (SD) may be an important mode of death in heart failure with preserved ejection fraction (HFpEF). This study investigated the rates and predictors of SD or aborted cardiac arrest (ACA) in HFpEF. We studied 1,767 patients with HFpEF (EF ≥45%) enrolled in the Americas region of the TOPCAT (Aldosterone Antagonist Therapy for Adults With Heart Failure and Preserved Systolic Function) trial. We identified independent predictors of composite SD/ACA with stepwise backward selection using competing risks regression analysis that accounted for nonsudden causes of death. During a median 3.0-year (25th to 75th percentile: 1.9 to 4.4 years) follow-up, 77 patients experienced SD/ACA, and 312 experienced non-SD/ACA. Corresponding incidence rates were 1.4 events/100 patient-years (25th to 75th percentile: 1.1 to 1.8 events/100 patient-years) and 5.8 events/100 patient-years (25th to 75th percentile: 5.1 to 6.4 events/100 patient-years). SD/ACA was numerically lower but not statistically reduced in those randomized to spironolactone: 1.2 events/100 patient-years (25th to 75th percentile: 0.9 to 1.7 events/100 patient-years) versus 1.6 events/100 patient-years (25th to 75th percentile: 1.2 to 2.2 events/100 patient-years); the subdistributional hazard ratio was 0.74 (95% confidence interval: 0.47 to 1.16; p = 0.19). After accounting for competing risks of non-SD/ACA, male sex and insulin-treated diabetes mellitus were independently predictive of composite SD/ACA (C-statistic = 0.65). Covariates, including eligibility criteria, age, ejection fraction, coronary artery disease, left bundle branch block, and baseline therapies, were not independently associated with SD/ACA. Sex and diabetes mellitus status remained independent predictors in sensitivity analyses, excluding patients with implantable cardioverter-defibrillators and when predicting SD alone. SD accounted for approximately 20% of deaths in HFpEF. Male sex and insulin-treated diabetes mellitus identified patients at higher risk for SD/ACA with modest discrimination. These data might guide future SD preventative efforts in HFpEF. (Aldosterone Antagonist Therapy for Adults With Heart Failure and Preserved Systolic Function [TOPCAT]); NCT00094302. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
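
    The sketch below computes a nonparametric (Aalen-Johansen type) cumulative incidence of sudden death in the presence of the competing risk of non-sudden death, directly in NumPy on synthetic follow-up data; it is not the trial's analysis, which used competing risks regression, and all event proportions are invented.

    ```python
    # Sketch: nonparametric cumulative incidence with a competing risk,
    # computed directly from synthetic follow-up data (no ties among events).
    import numpy as np

    rng = np.random.default_rng(10)
    n = 1767
    t = rng.exponential(8.0, n).clip(max=5.0)      # follow-up time, years
    event = (t < 5.0) & (rng.random(n) < 0.7)      # some subjects are censored early
    # 0 = censored, 1 = sudden death/ACA, 2 = non-sudden death (synthetic mix)
    cause = np.where(event, rng.choice([1, 2], n, p=[0.2, 0.8]), 0)

    order = np.argsort(t)
    t, cause = t[order], cause[order]

    at_risk = n - np.arange(n)                     # risk-set size just before each time
    surv = np.cumprod(1 - (cause > 0) / at_risk)   # overall Kaplan-Meier, S(t_i)
    surv_before = np.concatenate([[1.0], surv[:-1]])        # S(t_i-)

    cif_sudden = np.cumsum(surv_before * (cause == 1) / at_risk)
    idx = np.searchsorted(t, 3.0) - 1              # last event/censoring time before 3 years
    print(f"3-year cumulative incidence of sudden death: {cif_sudden[idx]:.3f}")
    ```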

  10. Effect of fiber post length and abutment height on fracture resistance of endodontically treated premolars prepared for zirconia crowns.

    PubMed

    Lin, Jie; Matinlinna, Jukka Pekka; Shinya, Akikazu; Botelho, Michael George; Zheng, Zhiqiang

    2018-04-01

    The purpose of this study was to compare the fracture resistance, mode of fracture, and stress distribution of endodontically treated teeth prepared with three different fiber post lengths and two different abutment heights, using both experimental and finite element (FE) approaches. Forty-eight human maxillary premolars with two roots were selected and endodontically treated. The teeth were randomly distributed into six equally sized groups (n = 8) with different combinations of post lengths (7.5, 11, and 15 mm) and abutment heights (3 and 5 mm). All the teeth were restored with a glass fiber post (Rely X Fiber Post, 3M ESPE, USA) and a full zirconia crown. All the specimens were thermocycled and then loaded to failure at an oblique angle of 135°. Statistical analysis was performed for the effects of post length and abutment height on failure loads using ANOVA and Tukey's honestly significant difference test. In addition, corresponding FE models of a premolar restored with a glass fiber post were developed to examine mechanical responses. The factor of post length (P < 0.01) had a significant effect on failure load. The abutment height (P > 0.05) did not have a significant effect on failure load. The highest mean fracture resistance was recorded for the 15 mm post length and 5 mm abutment height test group, which was significantly more resistant to fracture than the 7.5 mm post and 5 mm abutment height group (P < 0.05). The FE analysis showed that the peak compression and tension stress values for the 7.5 mm post length were higher than those for the 11 and 15 mm post lengths. The stress in the remaining tooth decreased as the post length was increased. Within the limitations of this experimental and FE analysis study, increasing the post length inside the root of endodontically treated premolar teeth restored with glass-fiber posts increases the fracture resistance to non-axial forces. Failure mode is more favorable with reduced abutment heights.
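
    A sketch of the kind of analysis named above (ANOVA for post length and abutment height followed by Tukey's HSD) is given below using statsmodels on synthetic failure-load data; the group means, scatter, and sample values are invented, not the study's measurements.

    ```python
    # Sketch: two-way ANOVA for post length and abutment height, followed by
    # a Tukey HSD comparison of the six groups; synthetic failure loads (N).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(11)
    rows = []
    for post in (7.5, 11, 15):
        for abutment in (3, 5):
            loads = rng.normal(600 + 15 * post, 60, 8)     # 8 specimens per group
            rows += [{"post": post, "abutment": abutment, "load": float(x)}
                     for x in loads]
    df = pd.DataFrame(rows)

    anova = sm.stats.anova_lm(
        smf.ols("load ~ C(post) * C(abutment)", data=df).fit(), typ=2)
    print(anova)

    groups = df["post"].astype(str) + " mm / " + df["abutment"].astype(str) + " mm"
    print(pairwise_tukeyhsd(df["load"], groups))
    ```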

  11. Cycles till failure of silver-zinc cells with competing failure modes - Preliminary data analysis

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.; Leibecki, H. F.; Bozek, J. M.

    1980-01-01

    The data analysis of cycles to failure of silver-zinc electrochemical cells with competing failure modes is presented. The test ran 129 cells through charge-discharge cycles until failure; the preliminary data analysis consisted of a response surface estimate of life. Batteries fail through a low-voltage condition and an internal shorting condition; a competing failure modes analysis was made using maximum likelihood estimation for the extreme value life distribution. Extensive residual plotting and probability plotting were used to verify data quality and model selection.

  12. Predicting short-term mortality in advanced decompensated heart failure - role of the updated acute decompensated heart failure/N-terminal pro-B-type natriuretic Peptide risk score.

    PubMed

    Scrutinio, Domenico; Ammirati, Enrico; Passantino, Andrea; Guida, Pietro; D'Angelo, Luciana; Oliva, Fabrizio; Ciccone, Marco Matteo; Iacoviello, Massimo; Dentamaro, Ilaria; Santoro, Daniela; Lagioia, Rocco; Sarzi Braga, Simona; Guzzetti, Daniela; Frigerio, Maria

    2015-01-01

    The first few months after admission are the most vulnerable period in patients with acute decompensated heart failure (ADHF). We assessed the association of the updated ADHF/N-terminal pro-B-type natriuretic peptide (NT-proBNP) risk score with 90-day and in-hospital mortality in 701 patients admitted with advanced ADHF, defined as severe symptoms of worsening HF, severely depressed left ventricular ejection fraction, and the need for i.v. diuretic and/or inotropic drugs. A total of 15.7% of the patients died within 90 days of admission and 5.2% underwent ventricular assist device (VAD) implantation or urgent heart transplantation (UHT). The C-statistic of the ADHF/NT-proBNP risk score for 90-day mortality was 0.810 (95% CI: 0.769-0.852). Predicted and observed mortality rates were in close agreement. When the composite outcome of death/VAD/UHT at 90 days was considered, the C-statistic decreased to 0.741. During hospitalization, 7.6% of the patients died. The C-statistic for in-hospital mortality was 0.815 (95% CI: 0.761-0.868), and the Hosmer-Lemeshow χ² was 3.71 (P = 0.716). The updated ADHF/NT-proBNP risk score outperformed the Acute Decompensated Heart Failure National Registry, the Organized Program to Initiate Lifesaving Treatment in Patients Hospitalized for Heart Failure, and the American Heart Association Get with the Guidelines Program predictive models. The updated ADHF/NT-proBNP risk score is a valuable tool for predicting short-term mortality in severe ADHF, outperforming existing inpatient predictive models.
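
    The two performance measures quoted here, the C-statistic (discrimination) and the Hosmer-Lemeshow χ² (calibration), can be sketched as follows on simulated predicted probabilities and outcomes. None of the numbers below are study data.

```python
# Hedged sketch: C-statistic via ROC AUC and a decile-based Hosmer-Lemeshow test.
import numpy as np
import pandas as pd
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
p_pred = rng.beta(2, 10, size=701)            # simulated predicted 90-day mortality risk
died = rng.binomial(1, p_pred)                # simulated observed outcomes

print(f"C-statistic: {roc_auc_score(died, p_pred):.3f}")

# Hosmer-Lemeshow: compare observed vs expected deaths within risk deciles
df = pd.DataFrame({"p": p_pred, "y": died})
df["decile"] = pd.qcut(df["p"], 10, labels=False)
g = df.groupby("decile").agg(n=("y", "size"), obs=("y", "sum"), exp=("p", "sum"))
hl = (((g["obs"] - g["exp"]) ** 2) / (g["exp"] * (1 - g["exp"] / g["n"]))).sum()
print(f"Hosmer-Lemeshow chi2 = {hl:.2f}, p = {chi2.sf(hl, df=len(g) - 2):.3f}")
```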

  13. Reasons for revision of failed hemiarthroplasty: Are there any differences between unipolar and bipolar?

    PubMed

    Iamthanaporn, Khanin; Chareancholvanich, Keerati; Pornrattanamaneewong, Chaturong

    2018-03-16

    Hemiarthroplasty (HA) is an effective procedure for the treatment of femoral neck fractures. However, it is debatable whether unipolar or bipolar HA is the most suitable implant. The purpose of this study was to compare the causes of failure and longevity in both types of HA. We retrospectively reviewed 133 cases that underwent revision surgery of HA between 2002 and 2012. The causes of revision surgery were identified and stratified into early (≤ 5 years) failure and late (> 5 years) failure. Survival analyses were performed for each implant type. The common causes for revision were aseptic loosening (49.6%), infection (22.6%) and acetabular erosion (15.0%). Unipolar and bipolar HA did not differ in causes for revision, but the unipolar group had a statistically significantly higher number of acetabular erosion events compared with the bipolar group (p = 0.002). In the early period, 24 unipolar HA (52.9%) and 28 bipolar HA (34.1%) failed. There were no statistically significant differences in the numbers of revised HA in each period between the two groups (p = 0.138). The median survival times in the unipolar and bipolar groups were 84.0 ± 24.5 and 120.0 ± 5.5 months, respectively. However, the survival times of both implants were not statistically significantly different. Aseptic loosening was the most common reason for revision surgery after hemiarthroplasty in early and late failures. Unipolar and bipolar hemiarthroplasty did not differ in terms of causes of failure or survivorship, except that bipolar hemiarthroplasty had far fewer acetabular erosion events.
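
    The implant-survival comparison described here (median survival times per implant type plus a test of whether the survival curves differ) can be sketched with a Kaplan-Meier fit and a log-rank test. The follow-up times and event indicators below are simulated assumptions.

```python
# Hedged sketch: Kaplan-Meier survival per implant type and a log-rank comparison.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
months_uni = rng.exponential(90, 45)             # unipolar follow-up (months)
months_bi = rng.exponential(115, 88)             # bipolar follow-up (months)
event_uni = rng.binomial(1, 0.7, 45)             # 1 = revised, 0 = censored
event_bi = rng.binomial(1, 0.7, 88)

kmf = KaplanMeierFitter()
kmf.fit(months_uni, event_uni, label="unipolar")
print("Unipolar median survival (months):", kmf.median_survival_time_)
kmf.fit(months_bi, event_bi, label="bipolar")
print("Bipolar median survival (months):", kmf.median_survival_time_)

result = logrank_test(months_uni, months_bi,
                      event_observed_A=event_uni, event_observed_B=event_bi)
print("Log-rank p-value:", result.p_value)
```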

  14. Effects of unplanned treatment interruptions on HIV treatment failure - results from TAHOD.

    PubMed

    Jiamsakul, Awachana; Kerr, Stephen J; Ng, Oon Tek; Lee, Man Po; Chaiwarith, Romanee; Yunihastuti, Evy; Van Nguyen, Kinh; Pham, Thuy Thanh; Kiertiburanakul, Sasisopin; Ditangco, Rossana; Saphonn, Vonthanak; Sim, Benedict L H; Merati, Tuti Parwati; Wong, Wingwai; Kantipong, Pacharee; Zhang, Fujie; Choi, Jun Yong; Pujari, Sanjay; Kamarulzaman, Adeeba; Oka, Shinichi; Mustafa, Mahiran; Ratanasuwan, Winai; Petersen, Boondarika; Law, Matthew; Kumarasamy, Nagalingeswaran

    2016-05-01

    Treatment interruptions (TIs) of combination antiretroviral therapy (cART) are known to lead to unfavourable treatment outcomes but do still occur in resource-limited settings. We investigated the effects of TI associated with adverse events (AEs) and non-AE-related reasons, including their durations, on treatment failure after cART resumption in HIV-infected individuals in Asia. Patients initiating cART between 2006 and 2013 were included. TI was defined as stopping cART for >1 day. Treatment failure was defined as confirmed virological, immunological or clinical failure. Time to treatment failure during cART was analysed using Cox regression, not including periods off treatment. Covariables with P < 0.10 in univariable analyses were included in multivariable analyses, where P < 0.05 was considered statistically significant. Of 4549 patients from 13 countries in Asia, 3176 (69.8%) were male and the median age was 34 years. A total of 111 (2.4%) had TIs due to AEs and 135 (3.0%) had TIs for other reasons. Median interruption times were 22 days for AE and 148 days for non-AE TIs. In multivariable analyses, interruptions >30 days were associated with failure (31-180 days: HR = 2.66, 95% CI 1.70-4.16; 181-365 days: HR = 6.22, 95% CI 3.26-11.86; and >365 days: HR = 9.10, 95% CI 4.27-19.38; all P < 0.001, compared to 0-14 days). The reason for the previous TI was not statistically significantly associated with failure (P = 0.158). An interruption duration of more than 30 days was the key factor associated with large increases in the subsequent risk of treatment failure. If TI is unavoidable, its duration should be minimised to reduce the risk of failure after treatment resumption. © 2016 John Wiley & Sons Ltd.
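
    The model described here, a Cox regression of time to treatment failure on interruption-duration categories, can be sketched as below. The data, category labels, and effect sizes are simulated assumptions, not the TAHOD dataset.

```python
# Hedged sketch: Cox proportional hazards regression with a categorical covariate.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1000
ti_cat = rng.choice(["0-14d", "15-30d", "31-180d", ">180d"], size=n, p=[0.7, 0.1, 0.1, 0.1])
hazard_mult = {"0-14d": 1.0, "15-30d": 1.2, "31-180d": 2.5, ">180d": 6.0}
time_to_failure = rng.exponential([5.0 / hazard_mult[c] for c in ti_cat])   # years
observed = rng.binomial(1, 0.6, n)                 # 1 = failure observed, 0 = censored

df = pd.get_dummies(pd.DataFrame({"T": time_to_failure, "E": observed, "ti": ti_cat}),
                    columns=["ti"], drop_first=True, dtype=float)  # "0-14d" is the reference
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()                                # hazard ratios = exp(coef)
```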

  15. A survey of design methods for failure detection in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1975-01-01

    A number of methods for the detection of abrupt changes (such as failures) in stochastic dynamical systems were surveyed. The class of linear systems was emphasized, but the basic concepts, if not the detailed analyses, carry over to other classes of systems. The methods surveyed range from the design of specific failure-sensitive filters, to the use of statistical tests on filter innovations, to the development of jump process formulations. Tradeoffs in complexity versus performance are discussed.
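
    One of the surveyed ideas, statistical tests on filter innovations, can be sketched with a scalar Kalman filter whose normalized innovation squared is compared against a chi-squared threshold. The system, noise levels, and injected failure below are illustrative assumptions.

```python
# Hedged sketch: chi-squared test on Kalman filter innovations for failure detection.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
a, q, r = 0.95, 0.1, 0.5              # state transition, process noise, measurement noise
x, x_hat, p = 0.0, 0.0, 1.0
threshold = chi2.ppf(0.99, df=1)      # 1% false-alarm rate per step

for k in range(200):
    # true system; inject an abrupt failure (bias) at k = 120
    x = a * x + rng.normal(0, np.sqrt(q)) + (2.0 if k >= 120 else 0.0)
    z = x + rng.normal(0, np.sqrt(r))

    # Kalman predict/update for the nominal (failure-free) model
    x_pred = a * x_hat
    p_pred = a * p * a + q
    innovation = z - x_pred
    s = p_pred + r                    # innovation variance
    nis = innovation ** 2 / s         # ~ chi2(1) while the system is healthy
    if nis > threshold:
        print(f"k={k}: possible failure (NIS = {nis:.1f})")
    k_gain = p_pred / s
    x_hat = x_pred + k_gain * innovation
    p = (1 - k_gain) * p_pred
```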

  16. Computer Science and Statistics. Proceedings of the Symposium on the Interface (18th) Held on March 19-21, 1986 in Fort Collins, Colorado.

    DTIC Science & Technology

    1987-08-26

    The indexed excerpts reference expert systems research that would benefit from attracting statisticians to assist in research projects, citing as examples the Acute Renal Failure [15] system, the INTERNIST-1 [22] system for diagnosis, and the MEDAS system, as well as an entropy-based discrimination measure used when reasoning about a case.

  17. Obesity-related decrease in intraoperative blood flow is associated with maturation failure of radiocephalic arteriovenous fistula.

    PubMed

    Kim, Jwa-Kyung; Jeong, Jae Han; Song, Young Rim; Kim, Hyung Jik; Lee, Won Yong; Kim, Kun Il; Kim, Sung Gyun

    2015-10-01

    Successful arteriovenous fistula (AVF) maturation is often challenging in obese patients. Optimal initial intraoperative blood flow (IOBF) is essential for adequate AVF maturation. This study was conducted to elucidate the effect of obesity on IOBF and radiocephalic AVF maturation. Patients with a newly created radiocephalic AVF were included (N = 252). Obesity was defined as a baseline body mass index (BMI) ≥25 kg/m², and primary maturation failure was defined as failure to use the AVF successfully by 3 months after its creation. IOBF was measured immediately after construction of the AVF with a VeriQ system (MediStim, Oslo, Norway). The mean BMI was 24.1 ± 3.9 kg/m², and the prevalence of obesity was 31.3%. In particular, 8.3% (21 patients) had a BMI ≥30 kg/m². Primary maturation failure occurred in 100 patients (39.7%), and an IOBF <190 mL/min was closely associated with the risk of maturation failure (relative risk, 3.05; 95% confidence interval, 1.52-6.11). Compared with nonobese patients, obese subjects had a significantly higher prevalence of diabetes and elevated high-sensitivity C-reactive protein levels, whereas diameters of vessels were similar. When the patients were further divided into three groups (BMI <25, 25 to 29.9, and ≥30 kg/m²), those in the higher BMI groups showed significantly lower IOBF and higher maturation failure rates. According to multivariate analysis, the statistically significant variables that determined maturation failure were obesity, previous vascular disease, increased high-sensitivity C-reactive protein levels, and IOBF <190 mL/min. Obese patients had a significantly lower IOBF, and both obesity and low IOBF contributed to the primary maturation failure of AVF. Obesity-associated inflammation and atherosclerosis might play roles in this association. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  18. Hyporesponsiveness to Darbepoetin Alfa in Patients With Heart Failure and Anemia in the RED-HF Study (Reduction of Events by Darbepoetin Alfa in Heart Failure): Clinical and Prognostic Associations.

    PubMed

    van der Meer, Peter; Grote Beverborg, Niels; Pfeffer, Marc A; Olson, Kurt; Anand, Inder S; Westenbrink, B Daan; McMurray, John J V; Swedberg, Karl; Young, James B; Solomon, Scott D; van Veldhuisen, Dirk J

    2018-02-01

    A poor response to erythropoiesis-stimulating agents such as darbepoetin alfa has been associated with adverse outcomes in patients with diabetes mellitus, chronic kidney disease, and anemia; whether this is also true in heart failure is unclear. We performed a post hoc analysis of the RED-HF trial (Reduction of Events by Darbepoetin Alfa in Heart Failure), in which 1008 patients with systolic heart failure and anemia (hemoglobin level, 9.0-12.0 g/dL) were randomized to darbepoetin alfa. We examined the relationship between the hematopoietic response to darbepoetin alfa and the incidence of all-cause death or first heart failure hospitalization during a follow-up of 28 months. For the purposes of the present study, patients in the lowest quartile of hemoglobin change after 4 weeks were considered nonresponders. The median initial hemoglobin change in nonresponders (n=252) was -0.25 g/dL, compared with +1.00 g/dL in the remaining patients (n=756). Worse renal function, lower sodium levels, and less use of angiotensin-converting enzyme inhibitors or angiotensin receptor blockers were independently associated with nonresponse. Although a low endogenous erythropoietin level helped to differentiate responders from nonresponders, its predictive value in a multivariable model was poor (C statistic=0.69). Nonresponders had a higher rate of all-cause death or first heart failure hospitalization (hazard ratio, 1.25; 95% confidence interval, 1.02-1.54) and a higher risk of all-cause mortality (hazard ratio, 1.30; 95% confidence interval, 1.04-1.63) than responders. A poor response to darbepoetin alfa was associated with worse outcomes in heart failure patients with anemia. Patients with a poor response were difficult to identify using clinical and biochemical biomarkers. URL: https://www.clinicaltrials.gov. Unique identifier: NCT00358215. © 2018 American Heart Association, Inc.

  19. Effect of artificial aging and surface treatment on bond strengths to dental zirconia.

    PubMed

    Perdigão, J; Fernandes, S D; Pinto, A M; Oliveira, F A

    2013-01-01

    The objective of this project was to study the influence of artificial aging and surface treatment on the microtensile bond strengths (μTBS) between zirconia and a phosphate monomer-based self-adhesive cement. Thirty zirconia disks (IPS e.max ZirCAD, Ivoclar Vivadent) were randomly assigned to two aging regimens: AR, used as received, which served as a control, and AG, artificial aging to simulate low-temperature degradation. Subsequently, the disks of each aging regimen were assigned to three surface treatments: NT, no surface treatment; CO, surface silicatization with CoJet sand (3M ESPE); and ZP, zirconia surface treated with Z-Prime Plus (Bisco Inc). Thirty disks were made of Filtek Z250 (3M ESPE) composite resin and luted to the zirconia disks using RelyX Unicem (3M ESPE). The specimens were sectioned with a diamond blade in X and Y directions to obtain bonded beams with a cross-section of 1.0 ± 0.2 mm. The beams were tested in tensile mode in a universal testing machine at a speed of 0.5 mm/min to measure μTBS. Selected beams were examined fractographically under the SEM. Statistical analysis was carried out with two-way analysis of variance and the Dunnett T3 post hoc test at a 5% significance level. The mean μTBS for the three AR subgroups (AR-NT, AR-CO, and AR-ZP) were significantly higher than those of the corresponding AG groups (p<0.0001). Both AR-CO and AR-ZP resulted in statistically significant higher mean bond strengths than the group AR-NT (p<0.006 and p<0.0001, respectively). Both AG-CO and AG-ZP resulted in statistically significant higher mean bond strengths than the group AG-NT (both at p<0.0001). Overall, AG decreased mean μTBS. Under the SEM, mixed failures showed residual cement attached to the zirconia side of the beams. CO resulted in a characteristic roughness of the zirconia surface. AR-ZP was the only group for which the amount of residual cement occupied at least 50% of the interface in mixed failures.

  20. Assessment of the knowledge and attitudes of intern doctors to medication prescribing errors in a Nigeria tertiary hospital

    PubMed Central

    Ajemigbitse, Adetutu A.; Omole, Moses Kayode; Ezike, Nnamdi Chika; Erhun, Wilson O.

    2013-01-01

    Context: Junior doctors are reported to make most of the prescribing errors in the hospital setting. Aims: The aim of the following study is to determine the knowledge intern doctors have about prescribing errors and the circumstances that contribute to making them. Settings and Design: A structured questionnaire was distributed to intern doctors in National Hospital Abuja Nigeria. Subjects and Methods: Respondents gave information about their experience with prescribing medicines, the extent to which they agreed with the definition of a clinically meaningful prescribing error and events that constituted such. Their experience with prescribing certain categories of medicines was also sought. Statistical Analysis Used: Data were analyzed with Statistical Package for the Social Sciences (SPSS) software version 17 (SPSS Inc Chicago, Ill, USA). Chi-squared analysis contrasted differences in proportions; P < 0.05 was considered to be statistically significant. Results: The response rate was 90.9% and 27 (90%) had <1 year of prescribing experience. 17 (56.7%) respondents totally agreed with the definition of a clinically meaningful prescribing error. The most common reasons for prescribing mistakes were a failure to check prescriptions with a reference source (14, 25.5%) and failure to check for adverse drug interactions (14, 25.5%). Omitting some essential information such as duration of therapy (13, 20%), patient age (14, 21.5%) and dosage errors (14, 21.5%) were the most common types of prescribing errors made. Respondents considered workload (23, 76.7%), multitasking (19, 63.3%), rushing (18, 60.0%) and tiredness/stress (16, 53.3%) as important factors contributing to prescribing errors. Interns were least confident prescribing antibiotics (12, 25.5%), opioid analgesics (12, 25.5%), cytotoxics (10, 21.3%) and antipsychotics (9, 19.1%) unsupervised. Conclusions: Respondents seemed to have a low awareness of making prescribing errors. Principles of rational prescribing and events that constitute prescribing errors should be taught in the practice setting. PMID:24808682
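
    The chi-squared comparison of proportions mentioned here can be sketched in a few lines. The 2x2 table below is entirely illustrative (for example, confident vs not confident respondents in two groups), not the survey data.

```python
# Hedged sketch: chi-squared test on a 2x2 contingency table of proportions.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[12, 35],     # group A: confident / not confident
                  [25, 22]])    # group B: confident / not confident
chi2_stat, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2_stat:.2f}, dof = {dof}, p = {p_value:.3f}")
```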

  1. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of three successful and three failed projects, are reviewed, with success and failure defined by the project owner.

  2. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and failure mode is also considered in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
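
    The final numerical-integration step described here can be sketched as a classical stress-strength reliability calculation: the probability of failure is the probability that strength falls below the applied load. The normal distributions and their parameters below are illustrative assumptions.

```python
# Hedged sketch: P(failure) = P(strength < load) by numerical integration.
import numpy as np
from scipy import stats
from scipy.integrate import quad

load = stats.norm(loc=300.0, scale=40.0)        # applied stress (MPa), assumed
strength = stats.norm(loc=450.0, scale=50.0)    # material strength (MPa), assumed

# P(failure) = integral over s of f_load(s) * P(strength < s) ds
p_fail, _ = quad(lambda s: load.pdf(s) * strength.cdf(s), -np.inf, np.inf)
print(f"Probability of failure: {p_fail:.2e}")
print(f"Reliability: {1 - p_fail:.6f}")

# For two normal distributions this has a closed form, a handy cross-check:
p_fail_exact = stats.norm.cdf((300.0 - 450.0) / np.hypot(40.0, 50.0))
print(f"Closed-form check: {p_fail_exact:.2e}")
```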

  3. Misinterpretation of statistical distance in security of quantum key distribution shown by simulation

    NASA Astrophysics Data System (ADS)

    Iwakoshi, Takehisa; Hirota, Osamu

    2014-10-01

    This study tests an interpretation in quantum key distribution (QKD) that the trace distance between the distributed quantum state and the ideal mixed state is a maximum failure probability of the protocol. Around 2004, this interpretation was proposed and standardized to satisfy both key uniformity in the context of universal composability and the operational meaning of the failure probability of the key extraction. However, this proposal has not been verified concretely for many years, and H. P. Yuen and O. Hirota have cast doubt on this interpretation since 2009. To examine this interpretation, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory after a quantum measurement is made, and then compared it with the claimed failure probability to determine whether universal composability was obtained. As a result, the statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution was very large. It is also explained why the trace distance is not suitable to guarantee security in QKD from the viewpoint of quantum binary decision theory.
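
    The comparison made here, the statistical (total variation) distance between an empirical key distribution and the ideal uniform distribution, can be sketched as follows. The key length and sample size are illustrative assumptions; the "keys" below are simulated random integers, not output of a real QKD system.

```python
# Hedged sketch: total variation distance of an empirical distribution from uniform.
import numpy as np

rng = np.random.default_rng(7)
n_keys, key_bits = 100_000, 8                 # tiny 8-bit "keys" for illustration
keys = rng.integers(0, 2 ** key_bits, size=n_keys)

counts = np.bincount(keys, minlength=2 ** key_bits)
empirical = counts / n_keys
uniform = np.full(2 ** key_bits, 1 / 2 ** key_bits)

tv_distance = 0.5 * np.abs(empirical - uniform).sum()
# Even an ideal generator shows a nonzero value at finite sample size, which is
# part of the interpretive subtlety discussed in the abstract above.
print(f"Total variation distance from uniform: {tv_distance:.4f}")
```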

  4. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    A microsponge drug delivery system (MDDC) was prepared by a double emulsion-solvent-diffusion technique using rotor-stator homogenization. The quality by design (QbD) concept was implemented for the development of MDDC with the potential to be incorporated into a semisolid dosage form (gel). The quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and critical process parameters (CPP) were identified using a quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified the amounts of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and the water ratio in the primary/multiple emulsions as CMA, and the rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between the identified CPP and particle size as CQA was described in the design space using a one-factor response surface design of experiments. Results from the statistically designed experiments enabled the establishment of mathematical models and equations that were used for detailed characterization of the influence of the identified CPP on MDDC particle size and particle size distribution, and for their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
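
    The multivariate step mentioned here, PCA of formulation/process variables followed by a PLS model relating them to a response such as particle size, can be sketched as below. The variables, batch count, and response relationship are illustrative assumptions.

```python
# Hedged sketch: PCA plus PLS regression on simulated formulation data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(8)
n_batches, n_factors = 30, 7                       # e.g. polymer, surfactant, solvent ratios
X = rng.normal(size=(n_batches, n_factors))
particle_size = 20 + 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(0, 1, n_batches)

pca = PCA(n_components=3).fit(X)
print("Explained variance ratio:", pca.explained_variance_ratio_.round(2))

pls = PLSRegression(n_components=2).fit(X, particle_size)
print("PLS X-loadings (which factors drive the response):")
print(pls.x_loadings_.round(2))
```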

  5. Efficacy of platelet-rich plasma in arthroscopic repair of full-thickness rotator cuff tears: a meta-analysis.

    PubMed

    Cai, You-zhi; Zhang, Chi; Lin, Xiang-jin

    2015-12-01

    The use of platelet-rich plasma (PRP) is an innovative clinical therapy, especially in arthroscopic rotator cuff repair. The purpose of this study was to compare the clinical improvement and tendon-to-bone healing with and without PRP therapy in arthroscopic rotator cuff repair. A systematic search was done in the major medical databases to evaluate the studies using PRP therapy (PRP+) or with no PRP (PRP-) for the treatment of patients with rotator cuff tears. We reviewed clinical scores such as the Constant score, the American Shoulder and Elbow Surgeons score, the University of California at Los Angeles (UCLA) Shoulder Rating Scale, the Simple Shoulder Test, and the failure-to-heal rate by magnetic resonance imaging between PRP+ and PRP- groups. Five studies included in this review were used for a meta-analysis based on data availability. There were no statistically significant differences between PRP+ and PRP- groups for overall outcome scores (P > .05). However, the PRP+ group exhibited better healing rates postoperatively than the PRP- group (P = .03) in small/moderate full-thickness tears. The use of PRP therapy in full-thickness rotator cuff repairs showed no statistically significant difference compared with no PRP therapy in clinical outcome scores, but the failure-to-heal rate was significantly decreased when PRP was used for treatment of small-to-moderately sized tears. PRP therapy may improve tendon-to-bone healing in patients with small or moderate rotator cuff tears. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
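
    The pooling step described here can be sketched as an inverse-variance meta-analysis of failure-to-heal risk ratios on the log scale, together with the Q and I² heterogeneity statistics. The five study rows below are made-up numbers for illustration, not the trials included in this review.

```python
# Hedged sketch: inverse-variance pooled risk ratio and heterogeneity statistics.
import numpy as np
from scipy.stats import norm

# columns: events_prp, n_prp, events_ctrl, n_ctrl (illustrative values)
studies = np.array([[3, 25, 7, 25],
                    [4, 30, 9, 28],
                    [2, 20, 5, 22],
                    [6, 40, 10, 38],
                    [3, 27, 6, 26]], dtype=float)

a, n1, c, n2 = studies.T
log_rr = np.log((a / n1) / (c / n2))
var = 1 / a - 1 / n1 + 1 / c - 1 / n2              # variance of each log risk ratio
w = 1 / var

pooled = (w * log_rr).sum() / w.sum()
se = np.sqrt(1 / w.sum())
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se)
p = 2 * norm.sf(abs(pooled / se))
print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), p = {p:.3f}")

Q = (w * (log_rr - pooled) ** 2).sum()
I2 = max(0.0, (Q - (len(studies) - 1)) / Q) * 100
print(f"Heterogeneity: Q = {Q:.2f}, I2 = {I2:.0f}%")
```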

  6. Association of chronic disease prevalence and quality of life with suicide-related ideation and suicide attempt among Korean adults

    PubMed Central

    Joshi, Pankaj; Song, Han-Byol; Lee, Sang-Ah

    2017-01-01

    Aims: The aim of this study is to find the association of chronic disease prevalence (CDP) with suicide-related ideation (SI) and suicide attempt (SA) and to determine the combined effect of CDP and quality of life (QoL) with SI or SA. Design: This was a cross-sectional study. Materials and Methods: The data were collected from the nationally representative Korea National Health and Nutrition Examination Survey IV and V (2007–2012). For the analysis, a total of 35,075 adult participants were selected as the final sample, which included 5773 participants with SI and 331 with SA. Statistical Analysis: Multiple logistic regression models were used to estimate odds ratios after adjusting for age, sex, marital status, education, occupation, and household income. Results and Conclusion: SI was positively associated with selected CDP, such as cardiovascular disease (CVD), stroke, ischemic heart disease (IHD), cancer, diabetes, renal failure, and depression, except hypertension. Subjects with CVD, IHD, renal failure, and depression were likely to have increased odds of SA as compared to non-SA controls. Lower QoL strongly affected SI and SA. Furthermore, the likelihood of SI increased for depressed and cancer subjects who had low QoL in comparison to subjects with high QoL and without chronic disease. Similarly, a statistically significant interaction was observed between lower QoL and depression in relation to SA compared with non-SA controls. These data suggest that suicide-related behavior could be predicted by the prevalence of chronic disease and low QoL. PMID:29085096

  7. Complications of short versus long cephalomedullary nail for intertrochanteric femur fractures, minimum 1 year follow-up.

    PubMed

    Vaughn, Josh; Cohen, Eric; Vopat, Bryan G; Kane, Patrick; Abbood, Emily; Born, Christopher

    2015-05-01

    Hip fractures are becoming increasingly common, resulting in significant morbidity, mortality, and rising healthcare costs. Both short and long cephalomedullary devices are currently employed to treat intertrochanteric hip fractures. However, which device is optimal continues to be debated as each implant has unique characteristics and theoretical advantages. This study looked to identify rates of complications associated with both long and short cephalomedullary nails for the treatment of intertrochanteric hip fractures. We retrospectively reviewed charts from 2006 to 2011 and identified 256 patients with AO class 31.1-32.3 fractures. Sixty were treated with short nails and 196 with long nails. Radiographs and charts were then analysed for failures and hardware complications. Catastrophic failure and hardware complication rates were not statistically different between short or long cephalomedullary nails. The overall catastrophic failure rate was 3.1 %; there was a 5 % failure rate in the short-nail group compared with a 2.6 % failure rate in the long-nail group (p = 0.191). There was a 3.33 % secondary femur fracture rate in the short-nail group, compared with none in the long-nail cohort (p = 0.054). The rate of proximal fixation failure was 1.67 % for the short-nail group and 2.0 % in the long-nail group (p = 0.406). Our data suggest comparable outcomes, as measured by similar catastrophic failure rates, between short and long cephalomedullary nails for intertrochanteric femur fractures. However, there was an increased risk of secondary femur fracture with short cephalomedullary nails when compared to long nails that approached statistical significance.

  8. Strength of bone tunnel versus suture anchor and push-lock construct in Broström repair.

    PubMed

    Giza, Eric; Nathe, Ryan; Nathe, Tyler; Anderson, Matthew; Campanelli, Valentina

    2012-06-01

    Operative treatment of mechanical ankle instability is indicated for patients who have had multiple sprains and have continued episodes of instability despite bracing and rehabilitation. Anatomic reconstruction has been shown to have improved outcomes and return to sport as compared with nonanatomic reconstruction. The hypothesis was that the use of 2 suture anchors and a push-lock anchor is equal to 2 bone tunnels in strength to failure for anatomic Broström repair. Controlled laboratory study. In 7 matched pairs of human cadaver ankles, the calcaneofibular ligament (CFL) and anterior talofibular ligament (ATFL) were incised from their origin on the fibula. A No. 2 Fiberwire suture was placed into the CFL and a separate suture into the ATFL in a running Krackow fashion with a total of 4 locking loops. In 1 ankle of the matched pair, the ligaments were repaired to their anatomic insertion with bone tunnels. In the other, 2 suture anchors were used to reattach the ligaments to their anatomic origins, and a push-lock was used proximally to reinforce these suture anchors. The ligaments were cyclically loaded 20 times and then tested to failure. Torque to failure, degrees to failure, and stiffness were measured. The authors performed a matched pair analysis. An a priori power analysis (power = 0.8) demonstrated that 6 pairs were needed to show a difference of 30% with a 15% standard error at a significance level of .05. There was no difference in degrees to failure, torque to failure, or stiffness. A post hoc power analysis of torque to failure showed a power of .89 with 7 samples. Power for initial stiffness was .97 with 7 samples. Eleven of 14 specimens failed at either the suture anchor or the bone tunnel. There was no statistically significant difference in strength or stiffness for a suture anchor and push-lock construct as compared with a bone tunnel construct for an anatomic repair of the lateral ligaments of the ankle. The use of suture anchors in lateral ligament stabilization allows for a smaller incision, less surgical dissection, and improved surgical efficiency. It is up to the discretion of the performing surgeon based on preference, ease of use, operative time, and cost profile to choose either of these constructs for anatomic repair of the lateral ligaments of the ankle. The suture repair at the ligament was sufficiently strong that the majority of ankles failed at the bone interface.
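
    The a priori power calculation described here can be sketched as a paired (matched-pair) t-test power computation on within-pair differences. The effect size below is an assumption chosen only to illustrate the statsmodels call, not the study's actual inputs.

```python
# Hedged sketch: solve for the number of matched pairs needed in a paired t-test.
from statsmodels.stats.power import TTestPower

# Effect size in standard-deviation units of the paired differences; for example,
# a 30% difference with a 15% SD of differences would give d = 0.30 / 0.15 = 2.0
# (an assumed reading of the abstract's numbers).
effect_size = 2.0
n_pairs = TTestPower().solve_power(effect_size=effect_size, alpha=0.05, power=0.80)
print(f"Pairs required: {n_pairs:.1f}")
```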

  9. Vaccine stability study design and analysis to support product licensure.

    PubMed

    Schofield, Timothy L

    2009-11-01

    Stability evaluation supporting vaccine licensure includes studies of bulk intermediates as well as final container product. Long-term and accelerated studies are performed to support shelf life and to determine release limits for the vaccine. Vaccine shelf life is best determined utilizing a formal statistical evaluation outlined in the ICH guidelines, while minimum release is calculated to help assure adequate potency through handling and storage of the vaccine. In addition to supporting release potency determination, accelerated stability studies may be used to support a strategy to recalculate product expiry after an unintended temperature excursion such as a cold storage unit failure or mishandling during transport. Appropriate statistical evaluation of vaccine stability data promotes strategic stability study design, in order to reduce the uncertainty associated with the determination of the degradation rate, and the associated risk to the customer.
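
    A common ICH-style shelf-life calculation of the kind referenced here fits a linear degradation model to stability data and finds the time at which the lower confidence bound on mean potency crosses the specification. The potency values, time points, and specification below are illustrative assumptions, not a real stability dataset.

```python
# Hedged sketch: shelf-life estimate from a linear regression of potency vs time.
import numpy as np
import statsmodels.api as sm

months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
potency = np.array([101.2, 100.1, 99.5, 98.9, 98.0, 96.8, 95.4])   # % of label claim
spec = 95.0                                                        # assumed specification

fit = sm.OLS(potency, sm.add_constant(months)).fit()

grid = np.linspace(0, 60, 601)                       # evaluate out to 60 months
pred = fit.get_prediction(sm.add_constant(grid))
lower = pred.conf_int(alpha=0.10)[:, 0]              # 95% one-sided lower bound on the mean
shelf_life = grid[lower >= spec].max()
print(f"Estimated shelf life: {shelf_life:.1f} months")
```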

  10. Scientific, statistical, practical, and regulatory considerations in design space development.

    PubMed

    Debevec, Veronika; Srčič, Stanko; Horvat, Matej

    2018-03-01

    The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.

  11. Effectiveness and Factors Determining the Success of Management Programs for Patients With Heart Failure: A Systematic Review and Meta-analysis.

    PubMed

    Oyanguren, Juana; Latorre García, Pedro María; Torcal Laguna, Jesús; Lekuona Goya, Iñaki; Rubio Martín, Susana; Maull Lafuente, Elena; Grandes, Gonzalo

    2016-10-01

    Heart failure management programs reduce hospitalizations. Some studies also show reduced mortality. The determinants of program success are unknown. The aim of the present study was to update our understanding of the reductions in mortality and readmissions produced by these programs, elucidate their components, and identify the factors determining program success. Systematic literature review (1990-2014; PubMed, EMBASE, CINAHL, Cochrane Library) and manual search of relevant journals. The studies were selected by 3 independent reviewers. Methodological quality was evaluated in a blinded manner by an external researcher (Jadad scale). These results were pooled using random effects models. Heterogeneity was evaluated with the I² statistic, and its explanatory factors were determined using metaregression analysis. Of the 3914 studies identified, 66 randomized controlled clinical trials were selected (18 countries, 13 535 patients). We determined the relative risks to be 0.88 for death (95% confidence interval [95%CI], 0.81-0.96; P < .002; I², 6.1%), 0.92 for all-cause readmissions (95%CI, 0.86-0.98; P < .011; I², 58.7%), and 0.80 for heart failure readmissions (95%CI, 0.71-0.90; P < .0001; I², 52.7%). Factors associated with program success were implementation after 2001, program location outside the United States, greater baseline use of angiotensin-converting enzyme inhibitors/angiotensin receptor blockers, a higher number of intervention team members and components, specialized heart failure cardiologists and nurses, protocol-driven education and its assessment, self-monitoring of signs and symptoms, detection of deterioration, flexible diuretic regimen, early care-seeking among patients and prompt health care response, psychosocial intervention, professional coordination, and program duration. We confirm the reductions in mortality and readmissions with heart failure management programs. Their success is associated with various structural and intervention variables. Copyright © 2016 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  12. Rabies Vaccination: Higher Failure Rates in Imported Dogs than in those Vaccinated in Italy.

    PubMed

    Rota Nodari, E; Alonso, S; Mancin, M; De Nardi, M; Hudson-Cooke, S; Veggiato, C; Cattoli, G; De Benedictis, P

    2017-03-01

    The current European Union (EU) legislation decrees that pets entering the EU from a rabies-infected third country have to obtain a satisfactory virus-neutralizing antibody level, while those moving within the EU require only rabies vaccination as the risk of moving a rabid pet within the EU is considered negligible. A number of factors driving individual variations in dog vaccine response have been previously reported, including a high rate of vaccine failure in puppies, especially those subject to commercial transport. A total of 21 001 observations collected from dogs (2006-2012) vaccinated in compliance with the current EU regulations were statistically analysed to assess the effect of different risk factors related to rabies vaccine efficacy. Within this framework, we were able to compare the vaccination failure rate in a group of dogs entering the Italian border from EU and non-EU countries to those vaccinated in Italy prior to international travel. Our analysis identified that cross-breeds and two breed categories showed high vaccine success rates, while Beagles and Boxers were the least likely to show a successful response to vaccination (88.82% and 90.32%, respectively). Our analysis revealed diverse performances among the commercially available vaccines, in terms of serological peak windows, and marked differences according to geographical area. Of note, we found a higher vaccine failure rate in imported dogs (13.15%) than in those vaccinated in Italy (5.89%). Our findings suggest that the choice of vaccine may influence the likelihood of an animal achieving a protective serological level and that time from vaccination to sampling should be considered when interpreting serological results. A higher vaccine failure in imported compared to Italian dogs highlights the key role that border controls still have in assessing the full compliance of pet movements with EU legislation to minimize the risk of rabies being reintroduced into a disease-free area. © 2016 The Authors. Zoonoses and Public Health Published by Blackwell Verlag GmbH.

  13. Failure Analysis of Discrete Damaged Tailored Extension-Shear-Coupled Stiffened Composite Panels

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.

    2005-01-01

    The results of an analytical and experimental investigation of the failure of composite stiffener panels with extension-shear coupling are presented. This tailored concept, when used in the cover skins of a tiltrotor aircraft wing, has the potential for increasing the aeroelastic stability margins and improving the aircraft productivity. The extension-shear coupling is achieved by using unbalanced 45° plies in the skin. The failure analysis of two tailored panel configurations that have the center stringer and adjacent skin severed is presented. Finite element analysis of the damaged panels was conducted using the STAGS (STructural Analysis of General Shells) general-purpose finite element program, which includes a progressive failure capability for laminated composite structures based on point-stress analysis, traditional failure criteria, and ply discounting for material degradation. The progressive failure analysis predicted the failure path and the maximum load capability. There is less than 12 percent difference between the predicted failure load and the experimental failure load. There is a good match of the panel stiffness and strength between the progressive failure analysis and the experimental results. The results indicate that the tailored concept would be feasible to use in the wing skin of a tiltrotor aircraft.

  14. Probing the Statistical Validity of the Ductile-to-Brittle Transition in Metallic Nanowires Using GPU Computing.

    PubMed

    French, William R; Pervaje, Amulya K; Santos, Andrew P; Iacovella, Christopher R; Cummings, Peter T

    2013-12-10

    We perform a large-scale statistical analysis (>2000 independent simulations) of the elongation and rupture of gold nanowires, probing the validity and scope of the recently proposed ductile-to-brittle transition that occurs with increasing nanowire length [Wu et al. Nano Lett. 2012, 12, 910-914]. To facilitate a high-throughput simulation approach, we implement the second-moment approximation to the tight-binding (TB-SMA) potential within HOOMD-Blue, a molecular dynamics package which runs on massively parallel graphics processing units (GPUs). In a statistical sense, we find that the nanowires obey the ductile-to-brittle model quite well; however, we observe several unexpected features from the simulations that build on our understanding of the ductile-to-brittle transition. First, occasional failure behavior is observed that qualitatively differs from the model prediction; this is attributed to stochastic thermal motion of the Au atoms and occurs at temperatures as low as 10 K. In addition, we also find that the ductile-to-brittle model, which was developed using classical dislocation theory, holds for nanowires as small as 3 nm in diameter. Finally, we demonstrate that the nanowire critical length is higher at 298 K relative to 10 K, a result that is not predicted by the ductile-to-brittle model. These results offer practical design strategies for adjusting nanowire failure and structure and also demonstrate that GPU computing is an excellent tool for studies requiring a large number of independent trajectories in order to fully characterize a system's behavior.

  15. Randomized clinical trial of encapsulated and hand-mixed glass-ionomer ART restorations: one-year follow-up

    PubMed Central

    Freitas, Maria Cristina Carvalho de Almendra; Fagundes, Ticiane Cestari; Modena, Karin Cristina da Silva; Cardia, Guilherme Saintive; Navarro, Maria Fidela de Lima

    2018-01-01

    Abstract Objective This prospective, randomized, split-mouth clinical trial evaluated the clinical performance of conventional glass ionomer cement (GIC; Riva Self-Cure, SDI), supplied in capsules or in powder/liquid kits and placed in Class I cavities in permanent molars by the Atraumatic Restorative Treatment (ART) approach. Material and Methods A total of 80 restorations were randomly placed in 40 patients aged 11-15 years. Each patient received one restoration with each type of GIC. The restorations were evaluated after periods of 15 days (baseline), 6 months, and 1 year, according to ART criteria. Wilcoxon matched pairs, multivariate logistic regression, and Gehan-Wilcoxon tests were used for statistical analysis. Results Patients were evaluated after 15 days (n=40), 6 months (n=34), and 1 year (n=29). Encapsulated GICs showed significantly superior clinical performance compared with hand-mixed GICs at baseline (p=0.017), 6 months (p=0.001), and 1 year (p=0.026). For hand-mixed GIC, a statistically significant difference was observed only over the period from baseline to 1 year (p=0.001). Encapsulated GIC presented statistically significant differences for the following periods: 6 months to 1 year (p=0.028) and baseline to 1 year (p=0.002). Encapsulated GIC presented a superior cumulative survival rate compared with hand-mixed GIC over one year. Importantly, both GICs exhibited decreased survival over time. Conclusions Encapsulated GIC promoted better ART performance, with an annual failure rate of 24%; in contrast, hand-mixed GIC demonstrated a failure rate of 42%. PMID:29364343

  16. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
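
    The parametric AFT approach discussed here can be sketched in Python with lifelines' Weibull AFT fitter, a rough analogue of the SAS LIFEREG procedure named in the abstract. The treatment/mediator structure, effect sizes, and censoring below are simulated assumptions used only to show the call pattern.

```python
# Hedged sketch: Weibull accelerated failure time model on simulated survival data.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(9)
n = 400
treatment = rng.binomial(1, 0.5, n)
mediator = 0.5 * treatment + rng.normal(0, 1, n)
# AFT data-generating process: log(time) shifted by treatment and mediator effects
T = np.exp(1.0 + 0.3 * treatment + 0.4 * mediator) * rng.weibull(1.5, n)
censor = rng.uniform(0, 15, n)
df = pd.DataFrame({"T": np.minimum(T, censor), "E": (T <= censor).astype(int),
                   "treatment": treatment, "mediator": mediator})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="T", event_col="E")
aft.print_summary()        # coefficients are on the log(time) scale, as in an AFT model
```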

  17. Effects of enhanced external counterpulsation on skeletal muscle gene expression in patients with severe heart failure.

    PubMed

    Melin, Michael; Montelius, Andreas; Rydén, Lars; Gonon, Adrian; Hagerman, Inger; Rullman, Eric

    2018-01-01

    Enhanced external counterpulsation (EECP) is a non-invasive treatment in which leg cuff compressions increase diastolic aortic pressure and coronary perfusion. EECP is offered to patients with refractory angina pectoris and increases physical capacity. Benefits in heart failure patients have been noted, but EECP is still considered to be experimental and its effects must be confirmed. The mechanism of action is still unclear. The aim of this study was to evaluate the effect of EECP on skeletal muscle gene expression and physical performance in patients with severe heart failure. Patients (n = 9) in NYHA III-IV despite pharmacological therapy were subjected to 35 h of EECP over 7 weeks. Before and after treatment, vastus lateralis muscle biopsies were obtained, and functional capacity was evaluated with a 6-min walk test. Skeletal muscle gene expression was evaluated using Affymetrix Hugene 1.0 arrays. Maximum walking distance increased by 15%, which is on par with that achieved after aerobic exercise training in similar patients. Skeletal muscle gene expression analysis using Ingenuity Pathway Analysis showed an increased expression of two networks of genes with FGF-2 and IGF-1 as central regulators. The increase in gene expression was quantitatively small and no overlap with gene expression profiles after exercise training could be detected despite adequate statistical power. EECP treatment leads to a robust improvement in walking distance in patients with severe heart failure and does induce a skeletal muscle transcriptional response, but this response is small, with no significant overlap with the transcriptional signature seen after exercise training. © 2016 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  18. Utility of the Seattle Heart Failure Model in patients with advanced heart failure.

    PubMed

    Kalogeropoulos, Andreas P; Georgiopoulou, Vasiliki V; Giamouzis, Grigorios; Smith, Andrew L; Agha, Syed A; Waheed, Sana; Laskar, Sonjoy; Puskas, John; Dunbar, Sandra; Vega, David; Levy, Wayne C; Butler, Javed

    2009-01-27

    The aim of this study was to validate the Seattle Heart Failure Model (SHFM) in patients with advanced heart failure (HF). The SHFM was developed primarily from clinical trial databases and extrapolated the benefit of interventions from published data. We evaluated the discrimination and calibration of SHFM in 445 advanced HF patients (age 52 +/- 12 years, 68.5% male, 52.4% white, ejection fraction 18 +/- 8%) referred for cardiac transplantation. The primary end point was death (n = 92), urgent transplantation (n = 14), or left ventricular assist device (LVAD) implantation (n = 3); a secondary analysis was performed on mortality alone. Patients were receiving optimal therapy (angiotensin-II modulation 92.8%, beta-blockers 91.5%, aldosterone antagonists 46.3%), and 71.0% had an implantable device (defibrillator 30.4%, biventricular pacemaker 3.4%, combined 37.3%). During a median follow-up of 21 months, 109 patients (24.5%) had an event. Although discrimination was adequate (c-statistic >0.7), the SHFM overall underestimated absolute risk (observed vs. predicted event rate: 11.0% vs. 9.2%, 21.0% vs. 16.6%, and 27.9% vs. 22.8% at 1, 2, and 3 years, respectively). Risk underprediction was more prominent in patients with an implantable device. The SHFM had different calibration properties in white versus black patients, leading to net underestimation of absolute risk in blacks. Race-specific recalibration improved the accuracy of predictions. When analysis was restricted to mortality, the SHFM exhibited better performance. In patients with advanced HF, the SHFM offers adequate discrimination, but absolute risk is underestimated, especially in blacks and in patients with devices. This is more prominent when including transplantation and LVAD implantation as an end point.

  19. Energy transfer mechanism and probability analysis of submarine pipe laterally impacted by dropped objects

    NASA Astrophysics Data System (ADS)

    Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui

    2016-06-01

    The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. To study the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impact on pipes is obtained by statistical analysis of the experimental results based on the effective yield strength, which provides an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulations. In addition, sensitivity analysis of the finite element simulation shows that impact contact area and impact time are the major factors influencing energy transfer.
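
    The statistical step described here, fitting a normal distribution to measured energy transfer ratios and using it for risk estimates, can be sketched as follows. The sample values and the 0.65 exceedance threshold are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch: fit a normal distribution to energy transfer ratios and estimate
# the probability of exceeding a given transfer fraction.
import numpy as np
from scipy import stats

ratios = np.array([0.42, 0.55, 0.48, 0.61, 0.50, 0.45, 0.58, 0.52, 0.47, 0.56])
mu, sigma = stats.norm.fit(ratios)
print(f"Energy transfer ratio ~ N(mu={mu:.2f}, sigma={sigma:.2f})")

# e.g. probability that more than 65% of the impact energy reaches the pipe
p_exceed = stats.norm.sf(0.65, mu, sigma)
print(f"P(ratio > 0.65) = {p_exceed:.3f}")
```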

  20. Flexor tendon repair with a knotless, bidirectional barbed suture: an in vivo biomechanical analysis.

    PubMed

    Maddox, Grady E; Ludwig, Jonathan; Craig, Eric R; Woods, David; Joiner, Aaron; Chaudhari, Nilesh; Killingsworth, Cheryl; Siegal, Gene P; Eberhardt, Alan; Ponce, Brent

    2015-05-01

    To compare and analyze biomechanical properties and histological characteristics of flexor tendons either repaired by a 4-strand modified Kessler technique or using barbed suture with a knotless repair technique in an in vivo model. A total of 25 chickens underwent surgical transection of the flexor digitorum profundus tendon followed by either a 4-strand Kessler repair or a knotless repair with barbed suture. Chickens were randomly assigned to 1 of 3 groups with various postoperative times to death. Harvested tendons were subjected to biomechanical testing or histologic analysis. Harvested tendons revealed failures in 25% of knotless repairs (8 of 32) and 8% of 4-strand Kessler repairs (2 of 24). Biomechanical testing revealed no significant difference in tensile strength between 4-strand Kessler and barbed repairs; however, this lack of difference may be attributed to lower statistical power. We noted a trend toward a gradual decrease in strength over time for barbed repairs, whereas we noticed the opposite for the 4-strand Kessler repairs. Mode of failure during testing differed between repair types. The barbed repairs tended toward suture breakage as opposed to 4-strand Kessler repairs, which demonstrated suture pullout. Histological analysis identified no difference in the degree of inflammation or fibrosis; however, there was a vigorous foreign body reaction around the 4-strand Kessler repair and no such response around the barbed repairs. In this model, knotless barbed repairs trended toward higher in vivo failure rates and biomechanical inferiority under physiologic conditions, with each repair technique differing in mode of failure and respective histologic reaction. We are unable to recommend the use of knotless barbed repair over the 4-strand modified Kessler technique. For the repair techniques tested, surgeons should prefer standard Kessler repairs over the described knotless technique with barbed suture. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  1. Usefulness of combining admission brain natriuretic peptide (BNP) plus hospital discharge bioelectrical impedance vector analysis (BIVA) in predicting 90 days cardiovascular mortality in patients with acute heart failure.

    PubMed

    Santarelli, Simona; Russo, Veronica; Lalle, Irene; De Berardinis, Benedetta; Navarin, Silvia; Magrini, Laura; Piccoli, Antonio; Codognotto, Marta; Castello, Luigi Maria; Avanzi, Gian Carlo; Villacorta, Humberto; Precht, Bernardo Luiz Campanário; de Araújo Porto, Pilar Barreto; Villacorta, Aline Sterque; Di Somma, Salvatore

    2017-06-01

    Heart failure is a disease characterized by high prevalence and mortality, and frequent rehospitalizations. The aim of this study is to investigate the prognostic power of combining brain natriuretic peptide (BNP) and congestion status detected by bioelectrical impedance vector analysis (BIVA) in acute heart failure patients. This was an observational, prospective, multicentre study. BNP was measured upon hospital arrival, while BIVA analysis was performed at the time of discharge. Cardiovascular deaths were evaluated at 90 days by a follow-up phone call. 292 patients were enrolled. BNP was higher in non-survivors than in survivors (mean value 838 vs 515 pg/ml, p < 0.001). At discharge, BIVA showed a statistically significant difference in hydration status between survivors and non-survivors [respectively, hydration index (HI) 85 vs 74, p < 0.001; reactance (Xc) 26.7 vs 37, p < 0.001; resistance (R) 445 vs 503, p < 0.01]. Discharge BIVA shows a prognostic value in predicting cardiovascular death [HI: area under the curve (AUC) 0.715, 95% confidence interval (95% CI) 0.65-0.76; p < 0.004; Xc: AUC 0.712, 95% CI 0.655-0.76, p < 0.007; R: AUC 0.65, 95% CI 0.29-0.706, p < 0.0247]. The combination of BIVA with BNP gives a greater prognostic power for cardiovascular mortality [combined receiver operating characteristic (ROC): AUC 0.74; 95% CI 0.68-0.79; p < 0.001]. In acute heart failure patients, higher BNP levels upon hospital admission, and congestion detected by BIVA at discharge, have a significant predictive value for 90 days cardiovascular mortality. The combined use of admission BNP and discharge BIVA seems to be a useful tool for increasing prognostic power in these patients.
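
    The "combined ROC" idea quoted here, combining two markers in a logistic model and comparing the AUC of the combination with each marker alone, can be sketched as below. The marker distributions, outcome model, and sample size are simulated assumptions, not the study's data.

```python
# Hedged sketch: AUC of two markers individually vs combined via logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(10)
n = 292
bnp = rng.lognormal(mean=6.3, sigma=0.6, size=n)          # pg/ml, assumed distribution
hydration = rng.normal(78, 6, size=n)                     # BIVA hydration index (%), assumed
risk = 1 / (1 + np.exp(-(0.002 * (bnp - 600) + 0.15 * (hydration - 78) - 2.5)))
died = rng.binomial(1, risk)                              # simulated 90-day outcomes

X_combined = np.column_stack([np.log(bnp), hydration])
clf = LogisticRegression().fit(X_combined, died)
score = clf.predict_proba(X_combined)[:, 1]

print(f"AUC, BNP alone:       {roc_auc_score(died, bnp):.2f}")
print(f"AUC, hydration alone: {roc_auc_score(died, hydration):.2f}")
print(f"AUC, combined:        {roc_auc_score(died, score):.2f}")
```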

  2. Organization of heart failure management in European Society of Cardiology member countries: survey of the Heart Failure Association of the European Society of Cardiology in collaboration with the Heart Failure National Societies/Working Groups.

    PubMed

    Seferovic, Petar M; Stoerk, Stefan; Filippatos, Gerasimos; Mareev, Viacheslav; Kavoliuniene, Ausra; Ristic, Arsen D; Ponikowski, Piotr; McMurray, John; Maggioni, Aldo; Ruschitzka, Frank; van Veldhuisen, Dirk J; Coats, Andrew; Piepoli, Massimo; McDonagh, Theresa; Riley, Jillian; Hoes, Arno; Pieske, Burkert; Dobric, Milan; Papp, Zoltan; Mebazaa, Alexandre; Parissis, John; Ben Gal, Tuvia; Vinereanu, Dragos; Brito, Dulce; Altenberger, Johann; Gatzov, Plamen; Milinkovic, Ivan; Hradec, Jaromír; Trochu, Jean-Noel; Amir, Offer; Moura, Brenda; Lainscak, Mitja; Comin, Josep; Wikström, Gerhard; Anker, Stefan

    2013-09-01

    The aim of this document was to obtain a real-life contemporary analysis of the demographics and heart failure (HF) statistics, as well as the organization and major activities of the Heart Failure National Societies (HFNS) in European Society of Cardiology (ESC) member countries. Data from 33 countries were collected from HFNS presidents/representatives during the first Heart Failure Association HFNS Summit (Belgrade, Serbia, 29 October 2011). Data on incidence and/or prevalence of HF were available for 22 countries, and the prevalence of HF ranged between 1% and 3%. In five European and one non-European ESC country, heart transplantation was reported as not available. Natriuretic peptides and echocardiography are routinely applied in the management of acute HF in a median of 80% and 90% of centres, respectively. Eastern European and Mediterranean countries have lower availability of natriuretic peptide testing for acute HF patients, compared with other European countries. Almost all countries have organizations dealing specifically with HF. Societies for HF patients exist in only 12 countries, while HF patient education programmes are active in 16. Most HFNS reported that no national HF registry exists in their country. Fifteen HFNS produced national HF guidelines, while 19 have translated the ESC HF guidelines. Most HFNS (n = 23) participated in the organization of the European HF Awareness Day. This document demonstrated significant heterogeneity in the organization of HF management and in the activities of the national HF working groups/associations. High availability of natriuretic peptide and echocardiographic measurements was revealed, with differences between developed countries and countries in transition.

  3. Retrograde pyelography predicts retrograde ureteral stenting failure and reduces unnecessary stenting trials in patients with advanced non-urological malignant ureteral obstruction

    PubMed Central

    Kim, Sung Han; Park, Boram; Joo, Jungnam; Joung, Jae Young; Seo, Ho Kyung; Chung, Jinsoo; Lee, Kang Hyun

    2017-01-01

    Objective: To evaluate predictive factors for retrograde ureteral stent failure in patients with non-urological malignant ureteral obstruction. Materials and methods: Between 2005 and 2014, medical records of 284 malignant ureteral obstruction patients with 712 retrograde ureteral stent trials, including 63 (22.2%) having bilateral malignant ureteral obstruction, were retrospectively reviewed. Retrograde ureteral stent failure was defined as the inability to place ureteral stents by cystoscopy, recurrent stent obstruction within one month, or non-relief of azotemia within one week from the prior retrograde ureteral stent. The clinicopathological parameters and first retrograde pyelographic (RGP) findings were analyzed to investigate the predictive factors for retrograde ureteral stent failure and conversion to percutaneous nephrostomy in multivariate analysis, with a statistical significance of p < 0.05. Results: Retrograde ureteral stent failure was detected in 14.1% of patients. The mean number of retrograde ureteral stent placements and indwelling duration of the ureteral stents were 2.5 ± 2.6 times and 8.6 ± 4.0 months, respectively. Multivariate analyses identified several specific RGP findings as significant predictive factors for retrograde ureteral stent failure (p < 0.05). The significant retrograde pyelographic findings included grade 4 hydronephrosis (hazard ratio 4.10, 95% confidence interval 1.39–12.09), irreversible ureteral kinking (hazard ratio 2.72, confidence interval 1.03–7.18), presence of bladder invasion (hazard ratio 4.78, confidence interval 1.81–12.63), and multiple lesions of ureteral stricture (hazard ratio 3.46, confidence interval 1.35–8.83) (p < 0.05). Conclusion: Retrograde pyelography might prevent unnecessary and ineffective retrograde ureteral stent trials in patients with advanced non-urological malignant ureteral obstruction. PMID:28931043

  4. Being on sick leave due to heart failure: self-rated health, encounters with healthcare professionals and social insurance officers and self-estimated ability to return to work.

    PubMed

    Nordgren, Lena; Söderlund, Anne

    2015-01-01

    Younger people with heart failure often experience poor self-rated health. Furthermore, poor self-rated health is associated with long-term sick leave and disability pension. Socio-demographic factors affect the ability to return to work. However, little is known about people on sick leave due to heart failure. The aim of this study was to investigate associations between self-rated health, mood, socio-demographic factors, sick leave compensation, encounters with healthcare professionals and social insurance officers, and self-estimated ability to return to work for people on sick leave due to heart failure. This population-based investigation had a cross-sectional design. Data were collected in Sweden in 2012 from two official registries and from a postal questionnaire. In total, 590 subjects, aged 23-67, responded (response rate 45.8%). Descriptive statistics, correlation analyses (Spearman bivariate analysis) and logistic regression analyses were used to investigate associations. Poor self-rated health was strongly associated with full sick leave compensation (OR = 4.1, p < .001). Self-rated health was moderately associated with low income (OR = 0.6, p = .003). Good self-rated health was strongly associated with positive encounters with healthcare professionals (OR = 3.0, p = .022) and with a positive impact of such encounters on self-estimated ability to return to work (OR = 3.3, p < .001). People with heart failure are sick-listed for long periods of time and to a great extent receive disability pension. Not being able to work reduces quality of life. Positive encounters with healthcare professionals and social insurance officers can be supportive when people with heart failure struggle to remain in working life.
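
    The association analysis described above (Spearman correlations plus logistic regression reported as odds ratios with confidence intervals) can be sketched as follows. The data, predictor names, and effect sizes are synthetic stand-ins; only the sample size (590) comes from the abstract, and the use of statsmodels/scipy is an assumption, not the authors' analysis code.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from scipy.stats import spearmanr

      rng = np.random.default_rng(1)
      n = 590                                              # respondents, as reported in the abstract
      df = pd.DataFrame({
          "full_compensation": rng.binomial(1, 0.4, n),    # full vs. partial sick leave compensation (synthetic)
          "positive_encounter": rng.binomial(1, 0.6, n),   # positive encounters with healthcare professionals (synthetic)
      })
      # synthetic outcome: poor self-rated health, loosely tied to the two predictors
      logit_p = -1.0 + 1.4 * df["full_compensation"] - 1.1 * df["positive_encounter"]
      df["poor_srh"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

      # Spearman bivariate association, then a logistic model reported as odds ratios with 95% CIs
      print(spearmanr(df["poor_srh"], df["full_compensation"]))
      model = sm.Logit(df["poor_srh"], sm.add_constant(df[["full_compensation", "positive_encounter"]])).fit(disp=0)
      summary = pd.concat([np.exp(model.params).rename("OR"),
                           np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})], axis=1)
      print(summary)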

  5. Update of Dutch multicenter dose-escalation trial of radiotherapy for localized prostate cancer.

    PubMed

    Al-Mamgani, Abrahim; van Putten, Wim L J; Heemsbergen, Wilma D; van Leenders, Geert J L H; Slot, Annerie; Dielwart, Michel F H; Incrocci, Luca; Lebesque, Joos V

    2008-11-15

    To update the analysis of the Dutch dose-escalation trial of radiotherapy for prostate cancer. A total of 669 patients with localized prostate cancer were randomly assigned to receive 68 or 78 Gy. The patients were stratified by age, institution, use of neoadjuvant or adjuvant hormonal therapy, and treatment group. The primary endpoint was freedom from failure (FFF), with failure defined as clinical or biochemical failure. Two definitions of biochemical failure were used: the American Society for Therapeutic Radiology and Oncology definition (three consecutive increases in prostate-specific antigen level) and the Phoenix definition (nadir plus 2 μg/L). The secondary endpoints were freedom from clinical failure, overall survival, and genitourinary and gastrointestinal toxicity. After a median follow-up of 70 months, the FFF using the American Society for Therapeutic Radiology and Oncology definition was significantly better in the 78-Gy arm than in the 68-Gy arm (7-year FFF rate, 54% vs. 47%, respectively; p = 0.04). The FFF using the Phoenix definition was also significantly better in the 78-Gy arm than in the 68-Gy arm (7-year FFF rate, 56% vs. 45%, respectively; p = 0.03). However, no differences in freedom from clinical failure or overall survival were observed. The incidence of late Grade 2 or greater genitourinary toxicity was similar in both arms (40% and 41% at 7 years; p = 0.6). However, the cumulative incidence of late Grade 2 or greater gastrointestinal toxicity was increased in the 78-Gy arm compared with the 68-Gy arm (35% vs. 25% at 7 years; p = 0.04). The results of our study have shown a statistically significant improvement in FFF in prostate cancer patients treated with 78 Gy but with a greater rate of late gastrointestinal toxicity.
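
    A freedom-from-failure comparison of this kind is typically summarized with Kaplan-Meier curves and a log-rank test; the sketch below shows one way to do that with the lifelines library on synthetic failure times. The arm sizes and the 7-year horizon are loosely based on the abstract, the exponential failure-time distributions are invented, and the abstract does not state that this software or this exact analysis was used.

      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(2)
      n_per_arm = 334                               # roughly half of the 669 randomized patients
      # synthetic failure times (months); scales chosen only to land near the reported 7-year rates
      t68 = rng.exponential(110, n_per_arm)         # 68-Gy arm (synthetic)
      t78 = rng.exponential(140, n_per_arm)         # 78-Gy arm (synthetic)
      follow_up = 84.0                              # censor everyone at 7 years for the illustration
      e68, e78 = t68 < follow_up, t78 < follow_up   # event = clinical or biochemical failure
      t68, t78 = np.minimum(t68, follow_up), np.minimum(t78, follow_up)

      km = KaplanMeierFitter()
      km.fit(t78, e78, label="78 Gy")
      print("7-year FFF, 78 Gy:", round(float(km.predict(84.0)), 2))
      km.fit(t68, e68, label="68 Gy")
      print("7-year FFF, 68 Gy:", round(float(km.predict(84.0)), 2))
      print("log-rank p =", logrank_test(t68, t78, e68, e78).p_value)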

  6. Evaluation of the Effect of Perceived Social Support on Promoting Self-Care Behaviors of Heart Failure Patients Referred to The Cardiovascular Research Center of Isfahan

    PubMed Central

    Khaledi, Gholam Hassan; Mostafavi, Firoozeh; Eslami, Ahmad Ali; Rooh Afza, Hamidreza; Mostafavi, Firoozeh; Akbar, Hassanzadeh

    2015-01-01

    Background: Self-care is one of the most important aspects of treatment in patients with heart failure and ranks among the most important coping strategies against the events and stresses of life. Perceived social support plays an important role in performing self-care behaviors in these patients. Objectives: This study was conducted to evaluate the effect of perceived social support on promoting self-care behaviors among heart failure patients. Patients and Methods: This educational intervention with a randomized control group was performed on 64 heart failure patients referred to The Cardiovascular Research Center of Isfahan. The study population was randomly divided into intervention and control groups. Both groups completed the self-care behavior and perceived social support instruments before, immediately after, and 2 months after the intervention. The intervention group received educational interventions in 120-minute sessions once a week for 4 weeks. Data were analyzed with SPSS software (version 20) using descriptive and inferential statistics. Results: Based on the obtained results, the educational intervention was effective in improving perceived social support among our heart failure patients. The results also showed that an increase in perceived social support significantly promoted self-care behaviors in the case group after the intervention compared with the control group (P < 0.001). Conclusions: Perceived social support played an important role in improving the performance of self-care behaviors in our heart failure patients. Given the strengths of the present study, these findings can be considered in future research in this domain. PMID:26328063

  7. Bond strength with various etching times on young permanent teeth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, W.N.; Lu, T.C.

    1991-07-01

    Tensile bond strengths of an orthodontic resin cement were compared for 15-, 30-, 60-, 90-, or 120-second etching times, with a 37% phosphoric acid solution on the enamel surfaces of young permanent teeth. Fifty extracted premolars from 9- to 16-year-old children were used for testing. An orthodontic composite resin was used to bond the bracket directly onto the buccal surface of the enamel. The tensile bond strengths were tested with an Instron machine. Bond failure interfaces between bracket bases and tooth surfaces were examined with a scanning electron microscope and characterized with energy-dispersive x-ray spectrometry mapping. The tensile bond strengths for 15-, 30-, 60-, and 90-second etching times were not statistically different; for the 120-second etching time, the decrease was significant. Of the bond failures, 43%-49% occurred at the bracket-resin interface, 12%-24% within the resin itself, 32%-40% at the resin-tooth interface, and 0%-4% contained enamel fragments. There was no statistical difference in the percentage distribution of bond failure interfaces between bracket base and resin, resin and enamel, or enamel detachment. Cohesive failure within the resin itself at the 120-second etching time was significantly less frequent than at other etching times. To achieve good retention, to decrease enamel loss, and to reduce moisture contamination in the clinic, as well as to save chairside time, a 15-second etching time is suggested for teenage orthodontic patients.
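
    A comparison of bond strengths across several etching times, as described above, is commonly handled with a one-way ANOVA followed by a pairwise post hoc test. The sketch below illustrates that workflow on synthetic strength values; the group means, the sample size per group, and the choice of Tukey's HSD are assumptions for illustration, not the study's data or its exact analysis.

      import numpy as np
      from scipy.stats import f_oneway
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(3)
      etch_times = [15, 30, 60, 90, 120]
      # synthetic tensile bond strengths (MPa), 10 teeth per group; the 120-s group set slightly lower
      strengths = {t: rng.normal(9.5 if t < 120 else 7.5, 1.5, 10) for t in etch_times}

      print(f_oneway(*strengths.values()))          # overall difference among etching times

      values = np.concatenate(list(strengths.values()))
      groups = np.repeat(etch_times, 10).astype(str)
      print(pairwise_tukeyhsd(values, groups))      # which etching times differ pairwise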

  8. Military Health Service System Ambulatory Work Unit (AWU).

    DTIC Science & Technology

    1988-04-01

    No abstract is available for this record; the indexed excerpt consists only of appendix table-of-contents entries listing Ambulatory Work Unit distribution screen passes and failures for individual clinics (e.g., Neurosurgery and Ophthalmology).

  9. A detailed description of the sequential probability ratio test for 2-IMU FDI

    NASA Technical Reports Server (NTRS)

    Rich, T. M.

    1976-01-01

    The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
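
    The core of the SPRT can be written down compactly: accumulate the log-likelihood ratio of the "failure" versus "no failure" hypotheses over successive measurements and stop as soon as it crosses either Wald threshold. The sketch below implements this for a simple Gaussian mean-shift model on synthetic residuals; it is a generic textbook SPRT, not the flight subroutine documented in the report, and the hypothesized means, noise level, and error rates are illustrative assumptions.

      import numpy as np

      def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
          """Wald SPRT for a mean shift in Gaussian measurements.
          H0: mean = mu0 (no failure); H1: mean = mu1 (soft failure).
          Returns ("H0" | "H1" | "undecided", number of samples consumed)."""
          upper = np.log((1 - beta) / alpha)      # cross upward -> accept H1 (declare failure)
          lower = np.log(beta / (1 - alpha))      # cross downward -> accept H0 (no failure)
          llr = 0.0
          for k, x in enumerate(samples, start=1):
              # incremental log-likelihood ratio for one Gaussian observation
              llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
              if llr >= upper:
                  return "H1", k
              if llr <= lower:
                  return "H0", k
          return "undecided", len(samples)

      rng = np.random.default_rng(4)
      healthy = rng.normal(0.0, 1.0, 200)          # residuals with no failure (synthetic)
      drifting = rng.normal(0.8, 1.0, 200)         # residuals with a soft bias failure (synthetic)
      print(sprt(healthy))                         # expected to accept H0 quickly
      print(sprt(drifting))                        # expected to accept H1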

  10. Metabolomic analysis based on 1H-nuclear magnetic resonance spectroscopy metabolic profiles in tuberculous, malignant and transudative pleural effusion

    PubMed Central

    Wang, Cheng; Peng, Jingjin; Kuang, Yanling; Zhang, Jiaqiang; Dai, Luming

    2017-01-01

    Pleural effusion is a common clinical manifestation with various causes. Current diagnostic and therapeutic methods have exhibited numerous limitations. By analyzing dynamic changes in low molecular weight catabolites, metabolomics has been widely applied to various types of disease and has provided platforms to distinguish many novel biomarkers. However, to the best of our knowledge, there are few studies regarding metabolic profiling of pleural effusion. In the current study, 58 pleural effusion samples were collected, among which 20 were malignant pleural effusions, 20 were tuberculous pleural effusions and 18 were transudative pleural effusions. The small molecule metabolite spectra were obtained using 1H nuclear magnetic resonance technology, and pattern-recognition multivariate statistical analysis was used to screen out differential metabolites. One-way analysis of variance, the Student-Newman-Keuls test and the Kruskal-Wallis test were adopted for statistical analysis. Over 400 metabolites were identified in the untargeted metabolomic analysis and 26 metabolites were identified as significantly different among tuberculous, malignant and transudative pleural effusions. These metabolites were predominantly involved in the metabolic pathways of amino acid metabolism, glycometabolism and lipid metabolism. Statistical analysis revealed that eight metabolites contributed to the distinction between the three groups: tuberculous, malignant and transudative pleural effusion. In the current study, the feasibility of identifying small molecule biochemical profiles in different types of pleural effusion was investigated to reveal novel biological insights into the underlying mechanisms. The results provide specific insights into the biology of tuberculous, malignant and transudative pleural effusion and may offer novel strategies for the diagnosis and therapy of associated diseases, including tuberculosis, advanced lung cancer and congestive heart failure. PMID:28627685
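
    Group comparisons of individual metabolite levels across the three effusion types, as described above, can be illustrated with the parametric and non-parametric tests named in the abstract. The sketch below uses synthetic intensities for a single metabolite; the distributions and effect sizes are invented, and only the group sizes (20/20/18) follow the abstract.

      import numpy as np
      from scipy.stats import f_oneway, kruskal

      rng = np.random.default_rng(5)
      # synthetic relative intensities of one metabolite in the three effusion types
      tuberculous = rng.lognormal(0.0, 0.3, 20)
      malignant = rng.lognormal(0.4, 0.3, 20)
      transudative = rng.lognormal(-0.2, 0.3, 18)

      print("one-way ANOVA :", f_oneway(tuberculous, malignant, transudative))
      print("Kruskal-Wallis:", kruskal(tuberculous, malignant, transudative))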

  11. Geographic Hotspots of Critical National Infrastructure.

    PubMed

    Thacker, Scott; Barr, Stuart; Pant, Raghav; Hall, Jim W; Alderson, David

    2017-12-01

    Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location. © 2017 Society for Risk Analysis.
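
    The hotspot step described above, turning discrete per-asset criticality values into a continuous surface with kernel density estimation and then flagging high-density areas, can be sketched as follows. The asset locations, criticality weights, grid resolution, and the 95th-percentile hotspot threshold are all illustrative assumptions; the study's own significance testing of hotspots is more involved than this.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(6)
      # synthetic asset locations (km) and criticality = users disrupted if that asset fails
      xy = rng.uniform(0, 100, size=(2, 500))
      criticality = rng.pareto(1.5, 500) * 1e3

      # criticality-weighted kernel density over space: high values indicate candidate hotspots
      kde = gaussian_kde(xy, weights=criticality)
      gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
      surface = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

      # flag grid cells above the 95th percentile of the surface as candidate hotspots
      hotspots = surface > np.percentile(surface, 95)
      print("candidate hotspot cells:", int(hotspots.sum()))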

  12. Analysis of recent failures of disease modifying therapies in Alzheimer's disease suggesting a new methodology for future studies.

    PubMed

    Amanatkar, Hamid Reza; Papagiannopoulos, Bill; Grossberg, George Thomas

    2017-01-01

    Pharmaceutical companies and the NIH have invested heavily in a variety of potential disease-modifying therapies for Alzheimer's disease (AD), but unfortunately all double-blind placebo-controlled Phase III studies of these drugs have failed to show statistically significant results supporting their clinical efficacy on cognitive measures. These negative results are surprising, as most of these medications have the capability to impact the biomarkers associated with progression of Alzheimer's disease. Areas covered: This contradiction prompted us to review all study phases of Intravenous Immunoglobulin (IVIG), Bapineuzumab, Solanezumab, Avagacestat and Dimebolin to shed more light on these recent failures. We critically analyzed these studies, drawing seven lessons from these failures that should not be overlooked. Expert commentary: We suggest a new methodology for future treatment research in Alzheimer's disease, considering early intervention with more focus on cognitive decline as a screening tool, more sophisticated exclusion criteria with more reliance on biomarkers, stratification of subjects based on the rate of cognitive decline to reduce heterogeneity, and a longer study duration with periodic assessment of cognition and activities of daily living during the study and also after a washout period.

  13. Arteriovenous fistula maturation in patients with permanent access created prior to or after hemodialysis initiation.

    PubMed

    Duque, Juan C; Martinez, Laisel; Tabbara, Marwan; Dvorquez, Denise; Mehandru, Sushil K; Asif, Arif; Vazquez-Padron, Roberto I; Salman, Loay H

    2017-05-15

    Multiple factors and comorbidities have been implicated in the ability of arteriovenous fistulas (AVF) to mature, including vessel anatomy, advanced age, and the presence of coronary artery disease or peripheral vascular disease. However, little is known about the role of uremia on AVF primary failure. In this study, we attempt to evaluate the effect of uremia on AVF maturation by comparing AVF outcomes between pre-dialysis chronic kidney disease (CKD) stage five patients and those who had their AVF created after hemodialysis (HD) initiation. We included 612 patients who underwent AVF creation between 2003 and 2015 at the University of Miami Hospital and Jackson Memorial Hospital. Effects of uremia on primary failure were evaluated using univariate statistical comparisons and multivariate logistic regression analyses. Primary failure occurred in 28.1% and 26.3% of patients with an AVF created prior to or after HD initiation, respectively (p = 0.73). The time of HD initiation was not associated with AVF maturation in multivariate logistic regression analysis (p = 0.57). In addition, pre-operative blood urea nitrogen (p = 0.78), estimated glomerular filtration rate (p = 0.66), and serum creatinine levels (p = 0.14) were not associated with AVF primary failure in pre-dialysis patients. Our results show that clearance of uremia with regular HD treatments prior to AVF creation does not improve the frequency of vascular access maturation.

  14. Microshear bond strength and finite element analysis of resin composite adhesion to press-on-metal ceramic for repair actions after various conditioning methods.

    PubMed

    Kanat, Burcu; Cömlekoğlu, M Erhan; Cömlekoğlu, Mine Dündar; Culha, Osman; Ozcan, Mutlu; Güngör, Mehmet Ali

    2014-02-01

    This study evaluated the repair bond strength of differently surface-conditioned press-on-metal ceramic to repair composites and determined the location of the accumulated stresses by finite element analysis. Press-on-metal ceramic disks (IPS InLine PoM, Ivoclar Vivadent) (N = 45, diameter: 3 mm, height: 2 mm) were randomly divided into 3 groups (n = 15 per group) and conditioned with one of the following methods: 9.5% hydrofluoric acid (HF) (Porcelain etch), tribochemical silica coating (TS) (CoJet), and an unconditioned group acted as the control (C). Each group was divided into three subgroups depending on the repair composite resin: a) Arabesk Top (V, a microhybrid; VOCO), b) Filtek Z250 (F, a hybrid; 3M ESPE), c) Tetric EvoCeram (T, a nanohybrid; Ivoclar Vivadent) (n = 5 per subgroup). Repair composite disks (diameter: 1 mm, height: 1 mm) were photopolymerized on each ceramic block. Microshear bond strength (MSB) tests were performed (1 mm/min) and the obtained data were statistically analyzed using 2-way ANOVA and Tukey's post hoc test (α = 0.05). Failure types were analyzed under SEM. Vickers indentation hardness, Young's modulus, and finite element analysis (FEA) were performed complementary to the MSB tests to determine stress accumulation areas. MSB results were significantly affected by the surface conditioning methods (p = 0.0001), whereas the repair composite types did not show a significant effect (p = 0.108). The interaction between the repair composite and the surface conditioning method was also statistically significant (p = 0.0001). The lowest MSB values (MPa ± SD) were obtained in the control group (V = 4 ± 0.8; F = 3.9 ± 0.7; T = 4.1 ± 0.7) (p < 0.05). While the group treated with the T composite showed significantly lower MSB values for HF conditioning (T = 4.1 ± 0.8) compared with the other composites (V = 8.1 ± 2.6; F = 7.6 ± 2.2) (p < 0.05), there were no significant differences when TS was used as the conditioning method (V = 5 ± 1.7; F = 4.7 ± 1; T = 6.2 ± 0.8) (p > 0.05). The control group presented exclusively adhesive failures. Cohesive failures in the composite, followed by mixed failure types, were more common in the HF and TS conditioned groups. The elastic moduli of the composites were 22.9, 12.09, and 10.41 GPa for F, T, and V, respectively. The Vickers hardness values of the composites were 223, 232, and 375 HV for V, T, and F, respectively. In the FEA, von Mises stresses for the V and T composites spread over a large area owing to the low elastic modulus of these composites, whereas the F composite accumulated more stress at the bonded interface. Press-on-metal ceramic could best be repaired using tribochemical silica coating followed by silanization, regardless of the repair composite type in combination with their corresponding adhesive resins, provided that no cohesive ceramic failure was observed.
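
    The two-way ANOVA described above (surface conditioning by repair composite, with their interaction) can be sketched as follows on synthetic bond-strength values. The subgroup size of 5 and the factor levels follow the abstract; the group means, the error variance, and the use of statsmodels are illustrative assumptions, not the study's data or software.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      rng = np.random.default_rng(7)
      rows = []
      for cond in ["C", "HF", "TS"]:                        # surface conditioning methods
          for comp in ["V", "F", "T"]:                      # repair composites
              base = {"C": 4.0, "HF": 7.0, "TS": 5.5}[cond] # synthetic subgroup means (MPa)
              for _ in range(5):                            # n = 5 per subgroup, as in the abstract
                  rows.append({"conditioning": cond, "composite": comp,
                               "msb": rng.normal(base, 1.0)})
      df = pd.DataFrame(rows)

      # two-way ANOVA with interaction, mirroring the analysis described above
      model = ols("msb ~ C(conditioning) * C(composite)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))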

  15. Randomized controlled trial comparing nasal intermittent positive pressure ventilation and nasal continuous positive airway pressure in premature infants after tracheal extubation.

    PubMed

    Komatsu, Daniela Franco Rizzo; Diniz, Edna Maria de Albuquerque; Ferraro, Alexandre Archanjo; Ceccon, Maria Esther Jurvest Rivero; Vaz, Flávio Adolfo Costa

    2016-09-01

    To analyze the frequency of extubation failure in premature infants weaned from conventional mechanical ventilation (MV) to either nasal intermittent positive pressure ventilation (nIPPV) or nasal continuous positive airway pressure (nCPAP) after extubation. Seventy-two premature infants with respiratory failure were studied, with a gestational age (GA) ≤ 36 weeks and birth weight (BW) > 750 g, who required tracheal intubation and mechanical ventilation. The study was randomized and controlled; randomization was performed at the time of extubation using sealed envelopes. Extubation failure was defined as the need for re-intubation and mechanical ventilation during the first 72 hours after extubation. Among the 36 premature infants randomized to nIPPV, six (16.6%) presented extubation failure, in comparison to 11 (30.5%) of the 36 premature infants randomized to nCPAP. There was no statistical difference between the two study groups regarding BW, GA, classification of the premature infant, and MV time. The main cause of extubation failure was the occurrence of apnea. Gastrointestinal and neurological complications did not occur in the premature infants participating in the study. Although extubation failure in the group of premature infants submitted to nIPPV was numerically less frequent than in the premature infants submitted to nCPAP, there was no statistically significant difference between the two modes of ventilatory support after extubation.
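
    With the failure counts reported above (6 of 36 for nIPPV versus 11 of 36 for nCPAP), the between-group comparison can be illustrated with standard tests for a 2 x 2 table. The abstract does not state which test the authors used, so the choice of Fisher's exact test and the chi-square test here is an assumption for illustration only.

      from scipy.stats import chi2_contingency, fisher_exact

      # 2 x 2 table built from the failure counts in the abstract
      #                 failure  success
      table = [[ 6, 30],        # nIPPV: 6 of 36 failed
               [11, 25]]        # nCPAP: 11 of 36 failed

      odds_ratio, p_fisher = fisher_exact(table)
      chi2, p_chi2, dof, expected = chi2_contingency(table)
      print(f"Fisher exact p = {p_fisher:.3f}; chi-square p = {p_chi2:.3f}")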

  16. Health-Related Quality of Life in Heart Failure Patients With Varying Levels of Health Literacy Receiving Telemedicine and Standardized Education.

    PubMed

    Yehle, Karen S; Plake, Kimberly S; Nguyen, Patricia; Smith, Diane

    2016-05-01

    The purpose of this study was to examine the effect of telemonitoring plus education by home healthcare nurses on health-related quality of life in patients with heart failure who had varying health literacy levels. In this pretest/posttest, treatment-only study, 35 patients with a diagnosis of heart failure received home healthcare nurse visits, including education and telemonitoring. Heart failure education was provided by nurses at each home healthcare visit for approximately 15 to 20 minutes. All participants completed the Short-Form Test of Functional Health Literacy in Adults (S-TOFHLA) and the Minnesota Living with Heart Failure Questionnaire (MLHFQ) during the first week of home healthcare services. The MLHFQ was administered again at the completion of the covered home healthcare services period (1-3 visits per week for 10 weeks). Most participants were older adults (mean age 70.91 ± 12.47 years) and had adequate health literacy (51.4%). Almost half of the participants were NYHA Class III (47.1%). All participants received individual heart failure education, but this did not result in statistically significant improvements in health-related quality-of-life scores. With telemonitoring and home healthcare nurse visits, quality-of-life scores improved by the conclusion of home healthcare services (a clinically significant change), but the change was not statistically significant. Individuals with marginal and inadequate health literacy were able to correctly use the telemonitoring devices.

  17. Retrieval and clinical analysis of distraction-based dual growing rod constructs for early-onset scoliosis.

    PubMed

    Hill, Genevieve; Nagaraja, Srinidhi; Akbarnia, Behrooz A; Pawelek, Jeff; Sponseller, Paul; Sturm, Peter; Emans, John; Bonangelino, Pablo; Cockrum, Joshua; Kane, William; Dreher, Maureen

    2017-10-01

    Growing rod constructs are an important contribution for treating patients with early-onset scoliosis. These devices experience high failure rates, including rod fractures. The objective of this study was to identify the failure mechanism of retrieved growing rods, and to identify differences between patients with failed and intact constructs. Growing rod patients who had implant removal and were previously enrolled in a multicenter registry were eligible for this study. Forty dual-rod constructs were retrieved from 36 patients across four centers, and 34 of those constructs met the inclusion criteria. Eighteen constructs failed due to rod fracture. Sixteen intact constructs were removed due to final fusion (n=7), implant exchange (n=5), infection (n=2), or implant prominence (n=2). Analyses of clinical registry data, radiographs, and retrievals were the outcome measures. Retrievals were analyzed with microscopic imaging (optical and scanning electron microscopy) for areas of mechanical failure, damage, and corrosion. Failure analyses were conducted on the fracture surfaces to identify failure mechanism(s). Statistical analyses were performed to determine significant differences between the failed and intact groups. The failed rods fractured due to bending fatigue under flexion motion. Construct configuration and loading dictate high bending stresses at three distinct locations along the construct: (1) mid-construct, (2) adjacent to the tandem connector, or (3) adjacent to the distal anchor foundation. In addition, high torques used to insert set screws may create an initiation point for fatigue. Syndromic scoliosis, prior rod fractures, increase in patient weight, and rigid constructs consisting of tandem connectors and multiple crosslinks were associated with failure. This is the first study to examine retrieved, failed growing rod implants across multiple centers. Our analysis found that rod fractures are due to bending fatigue, and that stress concentrations play an important role in rod fractures. Recommendations are made on surgical techniques, such as the use of torque-limiting wrenches or not exceeding the prescribed torques. Additional recommendations include frequent rod replacement in select patients during scheduled surgeries. Published by Elsevier Inc.
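
    The role of bending stress in these fatigue fractures can be made concrete with the standard beam formula sigma = M * c / I for a solid circular rod. The rod diameter and flexion moment below are purely illustrative numbers, not measurements from the retrieved implants; the point is only that plausible loads concentrate high cyclic stresses at the locations listed above.

      import math

      def peak_bending_stress(moment_nmm, diameter_mm):
          """Peak bending stress (MPa) in a solid circular rod: sigma = M * c / I,
          with c = d / 2 and I = pi * d**4 / 64."""
          c = diameter_mm / 2.0
          moment_of_inertia = math.pi * diameter_mm ** 4 / 64.0
          return moment_nmm * c / moment_of_inertia

      # purely illustrative numbers: a 4.5 mm rod carrying a 5 N*m flexion moment
      sigma = peak_bending_stress(5_000.0, 4.5)
      print(f"peak bending stress ≈ {sigma:.0f} MPa")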

  18. A Novel MiRNA-Based Predictive Model for Biochemical Failure Following Post-Prostatectomy Salvage Radiation Therapy

    PubMed Central

    Stegmaier, Petra; Drendel, Vanessa; Mo, Xiaokui; Ling, Stella; Fabian, Denise; Manring, Isabel; Jilg, Cordula A.; Schultze-Seemann, Wolfgang; McNulty, Maureen; Zynger, Debra L.; Martin, Douglas; White, Julia; Werner, Martin; Grosu, Anca L.; Chakravarti, Arnab

    2015-01-01

    Purpose: To develop a microRNA (miRNA)-based predictive model for prostate cancer patients of 1) time to biochemical recurrence after radical prostatectomy and 2) biochemical recurrence after salvage radiation therapy following documented biochemical disease progression post-radical prostatectomy. Methods: Forty-three patients who had undergone salvage radiation therapy following biochemical failure after radical prostatectomy, with greater than 4 years of follow-up data, were identified. Formalin-fixed, paraffin-embedded tissue blocks were collected for all patients and total RNA was isolated from 1-mm cores enriched for tumor (>70%). Eight hundred miRNAs were analyzed simultaneously using the nCounter human miRNA v2 assay (NanoString Technologies; Seattle, WA). Univariate and multivariate Cox proportional hazards regression models as well as receiver operating characteristic analyses were used to identify statistically significant miRNAs that were predictive of biochemical recurrence. Results: Eighty-eight miRNAs were identified as significantly (p < 0.05) associated with biochemical failure post-prostatectomy by multivariate analysis and clustered into two groups that correlated with early (≤ 36 months) versus late recurrence (> 36 months). Nine miRNAs were identified as significantly (p < 0.05) associated by multivariate analysis with biochemical failure after salvage radiation therapy. A new predictive model for biochemical recurrence after salvage radiation therapy was developed; this model consisted of miR-4516 and miR-601 together with Gleason score and lymph node status. The area under the ROC curve (AUC) was improved to 0.83 compared with 0.66 for Gleason score and lymph node status alone. Conclusion: miRNA signatures can distinguish patients who fail soon after radical prostatectomy from those who fail late, giving insight into which patients may need adjuvant therapy. Notably, two novel miRNAs (miR-4516 and miR-601) were identified that significantly improve prediction of biochemical failure post-salvage radiation therapy compared with clinico-histopathological factors, supporting the use of miRNAs within clinically used predictive models. Both findings warrant further validation studies. PMID:25760964
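
    The modelling step described above, a multivariate Cox proportional hazards model for time to biochemical failure, can be sketched with the lifelines library. Everything in the sketch is synthetic: the miRNA expression values, Gleason scores, node status, failure times, and censoring are simulated, only the cohort size (43) and the covariate names follow the abstract, and this is not the authors' model or code.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(8)
      n = 43                                        # cohort size from the abstract
      df = pd.DataFrame({
          "mir_4516": rng.normal(0, 1, n),          # normalized miRNA expression (synthetic)
          "mir_601": rng.normal(0, 1, n),
          "gleason": rng.integers(6, 10, n),
          "node_positive": rng.binomial(1, 0.2, n),
      })
      # synthetic times to biochemical failure, loosely driven by the covariates
      risk = 0.8 * df["mir_4516"] + 0.6 * df["mir_601"] + 0.4 * (df["gleason"] - 7)
      df["months_to_failure"] = rng.exponential(48 * np.exp(-risk))
      df["failed"] = rng.binomial(1, 0.7, n)        # event indicator (synthetic censoring)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months_to_failure", event_col="failed")
      cph.print_summary()                           # hazard ratios with 95% CIs per covariate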

  19. Anatomic and Biomechanical Comparison of Traditional Bankart Repair With Bone Tunnels and Bankart Repair Utilizing Suture Anchors

    PubMed Central

    Judson, Christopher H.; Charette, Ryan; Cavanaugh, Zachary; Shea, Kevin P.

    2016-01-01

    Background: Traditional Bankart repair using bone tunnels has a reported failure rate between 0% and 5% in long-term studies. Arthroscopic Bankart repair using suture anchors has become more popular; however, reported failure rates have been cited between 4% and 18%. There have been no satisfactory explanations for the differences in these outcomes. Hypothesis: Bone tunnels will provide increased coverage of the native labral footprint and demonstrate greater load to failure and stiffness and decreased cyclic displacement in biomechanical testing. Study Design: Controlled laboratory study. Methods: Twenty-two fresh-frozen cadaveric shoulders were used. For footprint analysis, the labral footprint area was marked and measured using a Microscribe technique in 6 specimens. A 3-suture anchor repair was performed, and the area of the uncovered footprint was measured. This was repeated with traditional bone tunnel repair. For the biomechanical analysis, 8 paired specimens were randomly assigned to bone tunnel or suture anchor repair with the contralateral specimen assigned to the other technique. Each specimen underwent cyclic loading (5-25 N, 1 Hz, 100 cycles) and load to failure (15 mm/min). Displacement was measured using a digitized video recording system. Results: Bankart repair with bone tunnels provided significantly more coverage of the native labral footprint than repair with suture anchors (100% vs 27%, P < .001). Repair with bone tunnels (21.9 ± 8.7 N/mm) showed significantly greater stiffness than suture anchor repair (17.1 ± 3.5 N/mm, P = .032). Mean load to failure and gap formation after cyclic loading were not statistically different between bone tunnel (259 ± 76.8 N, 0.209 ± 0.064 mm) and suture anchor repairs (221.5 ± 59.0 N [P = .071], 0.161 ± 0.51 mm [P = .100]). Conclusion: Bankart repair with bone tunnels completely covered the footprint anatomy while suture anchor repair covered less than 30% of the native footprint. Repair using bone tunnels resulted in significantly greater stiffness than repair with suture anchors. Load to failure and gap formation were not significantly different. PMID:26779555
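
    Because the biomechanical comparison above used paired (contralateral) specimens, a paired test is a natural way to compare stiffness between the two repair techniques. The sketch below uses synthetic stiffness values centred on the reported group means; the paired t-test is an assumption for illustration, as the abstract does not name the exact statistical test used.

      import numpy as np
      from scipy.stats import ttest_rel

      rng = np.random.default_rng(9)
      n_pairs = 8                                   # paired cadaveric shoulders, as in the study design
      # synthetic stiffness values (N/mm) centred on the reported group means
      bone_tunnel = rng.normal(21.9, 8.7, n_pairs)
      suture_anchor = rng.normal(17.1, 3.5, n_pairs)

      t_stat, p_value = ttest_rel(bone_tunnel, suture_anchor)
      print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")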

Top