Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.
1999-01-01
A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C1 shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since they may cause failure mechanisms such as debonding or delamination.
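As a concrete illustration of the ply-discounting idea described above, the following Python sketch applies a maximum strain check and a stiffness knockdown inside a load-stepping loop. It is a minimal sketch, not the COMET implementation; the ply properties, degradation factor, and load range are all assumed.

```python
# Illustrative ply-discounting progressive failure loop (not the COMET implementation).
# A laminate under axial load per unit width N carries a uniform strain; a ply fails
# when that strain exceeds its allowable (maximum strain criterion), after which its
# modulus is degraded and the laminate is re-analyzed at the same load.

plies = [  # (modulus E [GPa], thickness t [mm], allowable strain) -- assumed values
    (140.0, 0.125, 0.010),   # 0-degree ply
    (10.0,  0.125, 0.004),   # 90-degree ply
    (10.0,  0.125, 0.004),
    (140.0, 0.125, 0.010),
]
E = [E0 for E0, _, _ in plies]
DEGRADE = 1e-3                                   # stiffness retention after ply failure

for N in [0.005 * k for k in range(1, 120)]:     # axial load per width [kN/mm]
    while True:                                  # redistribute until no new failures
        strain = N / sum(Ei * t for Ei, (_, t, _) in zip(E, plies))
        failed = [i for i, (E0, t, ea) in enumerate(plies)
                  if E[i] == E0 and strain > ea]
        if not failed:
            break
        for i in failed:
            E[i] = DEGRADE * plies[i][0]         # ply discounting
            print(f"N={N:.3f} kN/mm: ply {i} fails (strain={strain:.4f})")
    if all(Ei < E0 for Ei, (E0, _, _) in zip(E, plies)):
        print(f"final failure at N={N:.3f} kN/mm")
        break
```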
An efficient scan diagnosis methodology according to scan failure mode for yield enhancement
NASA Astrophysics Data System (ADS)
Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok
2008-12-01
Yield has always been a driving consideration in the fabrication of modern semiconductor devices. Statistically, the largest portion of wafer yield loss comes from defective scan failures. This paper presents efficient failure analysis methods, based on scan diagnosis, for initial yield ramp-up and ongoing products. Results of our analysis show that more than 60% of the scan failure dies fall into the category of shift mode in very deep submicron (VDSM) devices. However, localization of a scan shift mode failure is much more difficult than that of a capture mode failure because it is caused by a malfunction of the scan chain itself. Addressing this challenge, we propose the most suitable analysis method for each scan failure mode (capture/shift) for yield enhancement. For the capture failure mode, this paper describes a method that integrates the scan diagnosis flow with backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution, efficient backside probing, and a signal analysis method. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of the shift failure mode. The combination of the two methods contributes to yield enhancement. We confirm the failure candidates with physical failure analysis (PFA) methods. The direct feedback from visualizing the defects is useful for bringing devices to mass production in a shorter time. The experimental data on mass products show that our method produces an average 13.7% reduction in defective SCAN & SRAM-BIST failure rates and an 18.2% improvement in wafer yield rates.
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
2008-01-01
Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
An overview of computational simulation methods for composite structures failure and life analysis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1993-01-01
Three parallel computational simulation methods are being developed at the LeRC Structural Mechanics Branch (SMB) for composite structures failure and life analysis: progressive fracture (CODSTRAN); hierarchical methods for high-temperature composites; and probabilistic evaluation. Results to date demonstrate that these methods are effective in simulating composite structures failure/life/reliability.
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of a dam failure and the resulting life loss. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The life loss associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. The risk analysis of dam failure has essential significance for reducing dam failure probability and improving the dam risk management level.
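A toy version of the fuzzy event tree calculation can make the approach concrete. The Python sketch below propagates assumed triangular fuzzy probabilities along a single flood-overtopping-breach path; the tree, numbers, and centroid defuzzification are illustrative, not the paper's model.

```python
# Toy sketch of a fuzzy event tree for dam failure (illustrative numbers, not the paper's).
# Branch probabilities are triangular fuzzy numbers (low, mode, high); the probability of a
# failure path is the product of its branch probabilities, approximated component-wise.

def tmul(x, y):
    """Approximate product of triangular fuzzy numbers (a, m, b)."""
    return (x[0] * y[0], x[1] * y[1], x[2] * y[2])

flood   = (0.008, 0.010, 0.015)   # annual probability of the initiating flood event
overtop = (0.05,  0.10,  0.20)    # P(overtopping | flood)
breach  = (0.20,  0.30,  0.50)    # P(breach | overtopping)

path = tmul(tmul(flood, overtop), breach)
print("fuzzy annual failure probability (low, mode, high): "
      f"({path[0]:.2e}, {path[1]:.2e}, {path[2]:.2e})")

# A crisp estimate can be recovered by defuzzification, e.g. the centroid of the triangle:
crisp = sum(path) / 3.0
print(f"defuzzified estimate: {crisp:.2e}")
```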
NASA Technical Reports Server (NTRS)
Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.
1992-01-01
Non-linear analysis methods were adapted and incorporated in a finite element based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading. These methods include the Arc Length method and a target point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive. A direct application of this method is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal or load) failure procedure and Hashin's failure criteria provides added capability in the failure predictions. Interactive Stiffened Panel Analysis modules were developed as interactive pre- and post-processors. Each module provides the means of performing self-initiated finite element based analysis of primary structures such as a flat or curved stiffened panel, a corrugated flat sandwich panel, and a curved geodesic fuselage panel. This module brings finite element analysis into the design of composite structures without requiring the user to know much about the techniques and procedures needed to actually perform a finite element analysis from scratch. An interactive finite element code was developed to predict bolted joint strength considering material and geometrical non-linearity. The developed method conducts an ultimate strength failure analysis using a set of material degradation models.
Laboratory and 3-D-distinct element analysis of failure mechanism of slope under external surcharge
NASA Astrophysics Data System (ADS)
Li, N.; Cheng, Y. M.
2014-09-01
Landslides are a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations in slope stability analysis in the past, while the detailed post-failure conditions of the slopes have not been considered in sufficient detail. There is however increasing interest in the consequences after the initiation of failure, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, the microcosmic failure mechanisms, and the post-failure mechanisms of slopes is carried out. The numerical modeling method and the various findings from the present work can provide an alternate method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.
Reliability Evaluation of Machine Center Components Based on Cascading Failure Analysis
NASA Astrophysics Data System (ADS)
Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang
2017-07-01
In order to rectify the problems that the component reliability model exhibits deviation and that the evaluation result is low because failure propagation is overlooked in the traditional reliability evaluation of machine center components, a new reliability evaluation method based on cascading failure analysis and failure-influence degree assessment is proposed. A directed graph model of cascading failure among components is established according to cascading failure mechanism analysis and graph theory. The failure-influence degrees of the system components are assessed by the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, and it shows the following: 1) the reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) the difference between the comprehensive and inherent reliability of a system component is positively correlated with the failure-influence degree of that component, which provides a theoretical basis for reliability allocation of the machine center system.
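A minimal sketch of the failure-influence step, assuming a small illustrative propagation graph (not the paper's machine-center data): PageRank is run on the transposed adjacency matrix so that components whose failures reach many others receive high influence scores.

```python
import numpy as np

# Sketch of failure-influence assessment with PageRank on a directed cascading-failure
# graph. A[i, j] = 1 means a failure of component i can propagate to component j.
# The 4-component graph and damping factor are illustrative assumptions.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

def pagerank(adj, d=0.85, iters=200):
    n = adj.shape[0]
    out = adj.sum(axis=1)
    P = np.empty((n, n))
    for i in range(n):                      # column i: where node i's rank flows
        P[:, i] = adj[i] / out[i] if out[i] > 0 else 1.0 / n
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * P @ r
    return r

influence = pagerank(A.T)                   # reversed edges rank failure *sources* highly
for i, v in enumerate(influence):
    print(f"component {i}: failure-influence degree {v:.3f}")
```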
User-Defined Material Model for Progressive Failure Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)
2006-01-01
An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on the fuzzy neural network of regression (FR), called DCFRM, is proposed with the integration of the distributed collaborative response surface method and the fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea with DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. The comparison of methods shows that the DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.
Failure-Modes-And-Effects Analysis Of Software Logic
NASA Technical Reports Server (NTRS)
Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David
1996-01-01
Rigorous analysis applied early in the design effort. A method of identifying potential inadequacies and the modes and effects of failures caused by those inadequacies (failure-modes-and-effects analysis, or "FMEA" for short) was devised for application to software logic.
NASA Technical Reports Server (NTRS)
Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl
2017-01-01
The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.
NASA Astrophysics Data System (ADS)
Zheng, W.; Gao, J. M.; Wang, R. X.; Chen, K.; Jiang, Y.
2017-12-01
This paper puts forward a new method of technical characteristics deployment based on Reliability Function Deployment (RFD), arrived at by analysing the advantages and shortcomings of related research on mechanical reliability design. The matrix decomposition structure of RFD was used to describe the correlative relation between failure mechanisms, soft failures and hard failures. By considering the correlation of multiple failure modes, the reliability loss of one failure mode to the whole part was defined, and a calculation and analysis model for the reliability loss was presented. According to the reliability loss, the reliability index value of the whole part was allocated to each failure mode. On the basis of the deployment of the reliability index value, the inverse reliability method was employed to acquire the values of the technical characteristics. The feasibility and validity of the proposed method were illustrated by a development case of a machining centre's transmission system.
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although the single failure mode issue can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimal cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definition of probability weight and FRPN, resulting in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
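A small Python sketch of the FWGM RPN for a single failure mode follows; the weights, fuzzy ratings, and centroid defuzzification are assumptions for illustration, not the paper's calibrated values.

```python
# Sketch of a fuzzy weighted geometric mean (FWGM) RPN for one failure mode.
# Severity, occurrence, and detection ratings are triangular fuzzy numbers
# (pessimistic, most likely, optimistic) on a 1-10 scale; weights are assumed.

w = {"S": 0.4, "O": 0.35, "D": 0.25}          # importance weights, sum to 1 (assumed)
ratings = {"S": (6, 7, 8), "O": (4, 5, 7), "D": (3, 4, 5)}

def fwgm(ratings, w):
    # Component-wise weighted geometric mean of the triangular parameters.
    out = []
    for k in range(3):                         # lower, mode, upper
        g = 1.0
        for name, tri in ratings.items():
            g *= tri[k] ** w[name]
        out.append(g)
    return tuple(out)

frpn = fwgm(ratings, w)
crisp = sum(frpn) / 3.0                        # centroid defuzzification
print(f"fuzzy RPN: ({frpn[0]:.2f}, {frpn[1]:.2f}, {frpn[2]:.2f}), crisp: {crisp:.2f}")
```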
Failure Mode, Effects, and Criticality Analysis (FMECA)
1993-04-01
Preliminary Failure Modes, Effects and Criticality Analysis (FMECA) of the Brayton Isotope Power System Ground Demonstration System, Report No. TID 27301... No. TID/SNA-3015, Aerojet Nuclear Systems Co., Sacramento, California: 1970. 95. Taylor, J.R. A Formalization of Failure Mode Analysis of Control... Roskilde, Denmark: 1973. 96. Taylor, J.R. A Semi-Automatic Method for Qualitative Failure Mode Analysis. Report No. RISO-M-1707. Available from a
The Identification of Software Failure Regions
1990-06-01
be used to detect non-obviously redundant test cases. A preliminary examination of the manual analysis method is performed with a set of programs... Failure regions are defined and a method of failure region analysis is described in detail. The thesis describes how this analysis may be used to detect... is the termination of the ability of a functional unit to perform its required function. (Glossary, 1983) The presence of faults in program code
Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh
2017-03-01
Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to landfills. These extraordinary landfills are facing high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario is then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
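The scenario logic lends itself to a Monte Carlo sketch. The Python below is a toy version under stated assumptions (normally distributed shear-strength parameters, an infinite-slope factor of safety, a lognormal run-out length); it is not the paper's model or the New Delhi case data.

```python
import random, math

# Monte Carlo sketch of the fatality-risk idea: sample MSW shear strength, check slope
# stability, and if the slope fails sample a run-out length and compare it with the
# distance to the nearest settlement. All distributions and numbers are assumptions.

random.seed(1)
N = 100_000
SLOPE = math.radians(34)      # side-slope angle
HEIGHT = 60.0                 # landfill height [m]
GAMMA = 10.5                  # MSW unit weight [kN/m^3]
SETTLEMENT_DIST = 80.0        # distance from toe to houses [m]

failures = reach = 0
for _ in range(N):
    phi = math.radians(random.gauss(30, 4))      # friction angle [deg -> rad]
    c = max(random.gauss(15, 5), 0.1)            # cohesion [kPa]
    # simple infinite-slope factor of safety (dry conditions)
    fs = (c / (GAMMA * HEIGHT * math.sin(SLOPE) * math.cos(SLOPE))
          + math.tan(phi) / math.tan(SLOPE))
    if fs < 1.0:
        failures += 1
        runout = random.lognormvariate(math.log(50), 0.5)   # run-out length [m]
        if runout > SETTLEMENT_DIST:
            reach += 1

print(f"P(slope failure)         = {failures / N:.3f}")
print(f"P(failure reaches homes) = {reach / N:.4f}")
```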
NASA Astrophysics Data System (ADS)
Mahmood, Faleh H.; Kadhim, Hussein T.; Resen, Ali K.; Shaban, Auday H.
2018-05-01
Failures such as air-gap irregularity, rubbing, and scraping between the stator and rotor of the generator arise unavoidably and may cause extremely serious consequences for a wind turbine. Therefore, more attention should be paid to detecting and identifying their cause, bearing failure, in wind turbines to improve operational reliability. The current paper uses the power spectral density analysis method for detecting inner-race and outer-race bearing failures in a micro wind turbine by estimating the stator current signal of the generator. The results show that the failure detection method is well suited and effective for bearing failure detection.
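The detection idea can be sketched with a synthetic stator current. In the Python below, scipy's Welch estimator stands in for whatever PSD implementation the authors used, and the 137 Hz defect frequency is an assumed placeholder for a computed race defect frequency.

```python
import numpy as np
from scipy import signal

# Sketch of bearing-fault screening via power spectral density of the stator current.
# The 50 Hz supply component and the fault characteristic frequency are synthetic; a
# real analysis would compare measured PSD peaks against the bearing's computed
# inner/outer-race defect frequencies.

fs = 5000.0                                   # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
supply = np.sin(2 * np.pi * 50 * t)           # fundamental stator current
fault = 0.05 * np.sin(2 * np.pi * 137.0 * t)  # weak component at an assumed defect frequency
x = supply + fault + 0.1 * np.random.default_rng(0).standard_normal(t.size)

f, pxx = signal.welch(x, fs=fs, nperseg=4096)
band = (f > 100) & (f < 200)                  # search away from the supply line
peak = f[band][np.argmax(pxx[band])]
print(f"dominant PSD peak in 100-200 Hz band: {peak:.1f} Hz")  # ~137 Hz
```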
NASA Technical Reports Server (NTRS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.
2008-01-01
Predicting failure in a composite can be done with ply level mechanisms and/or micro level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
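For reference, two of the closed-form checks compared in such studies can be written in a few lines. The Python sketch below evaluates maximum stress and Tsai-Wu indices for one ply under in-plane stresses; the strength values are illustrative, not the WWFE data.

```python
# Minimal ply-level failure check comparing maximum stress and Tsai-Wu, with
# illustrative strengths for a carbon/epoxy ply (not the WWFE material data).

Xt, Xc = 1500.0, 1200.0   # longitudinal tensile/compressive strength [MPa]
Yt, Yc = 50.0, 200.0      # transverse strengths [MPa]
S = 70.0                  # in-plane shear strength [MPa]

def max_stress(s1, s2, t12):
    return max(s1 / Xt if s1 >= 0 else -s1 / Xc,
               s2 / Yt if s2 >= 0 else -s2 / Yc,
               abs(t12) / S)                  # failure when index >= 1

def tsai_wu(s1, s2, t12):
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * (F11 * F22) ** 0.5           # common default interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2*F12*s1*s2)

s = (800.0, 30.0, 25.0)                       # ply stresses (s1, s2, t12) [MPa]
print(f"max-stress index: {max_stress(*s):.2f}")   # >= 1 means failure
print(f"Tsai-Wu index:    {tsai_wu(*s):.2f}")
```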
NASA Astrophysics Data System (ADS)
Zhao, Yong; Yang, Tianhong; Bohnhoff, Marco; Zhang, Penghai; Yu, Qinglei; Zhou, Jingren; Liu, Feiyue
2018-05-01
To quantitatively understand the failure process and failure mechanism of a rock mass during the transformation from open-pit mining to underground mining, the Shirengou Iron Mine was selected as an engineering project case study. The study area was determined using the rock mass basic quality classification method and the kinematic analysis method. Based on the analysis of the variations in apparent stress and apparent volume over time, the rock mass failure process was analyzed. According to the recent research on the temporal and spatial change of microseismic events in location, energy, apparent stress, and displacement, the migration characteristics of rock mass damage were studied. A hybrid moment tensor inversion method was used to determine the rock mass fracture source mechanisms, the fracture orientations, and the fracture scales. The fracture area can be divided into three zones: Zone A, Zone B, and Zone C. A statistical analysis of the orientations of the fracture planes was carried out, and four dominant fracture planes were obtained. Finally, the slip tendency analysis method was employed, and the unstable fracture planes were obtained. The results show: (1) microseismic monitoring and hybrid moment tensor analysis can effectively analyze the failure process and failure mechanism of a rock mass; (2) during the transformation from open-pit to underground mining, the failure type of the rock mass is mainly shear failure, and tensile failure is mostly concentrated in the roofs of goafs; and (3) the rock mass at the pit bottom and the upper part of goaf No. 18 may experience further damage.
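Slip tendency itself is a compact calculation. The Python sketch below resolves an assumed principal stress tensor onto candidate fracture planes and reports Ts = tau / sigma_n; the stresses, plane normals, and friction threshold are illustrative, not the Shirengou data.

```python
import numpy as np

# Sketch of slip tendency analysis: resolve an assumed stress tensor onto candidate
# fracture planes and compute Ts = tau / sigma_n (compression positive). Planes whose
# Ts approaches the rock friction coefficient (~0.6) are candidates for reactivation.

sigma = np.diag([30.0, 20.0, 12.0])    # principal stresses [MPa], assumed

def slip_tendency(normal):
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    t = sigma @ n                      # traction vector on the plane
    sn = t @ n                         # normal stress
    tau = np.linalg.norm(t - sn * n)   # resolved shear stress
    return tau / sn

for n_vec in [(0.6, 0.0, 0.8), (0.7, 0.5, 0.5), (0.0, 0.9, 0.4)]:
    print(f"plane normal {n_vec}: Ts = {slip_tendency(n_vec):.2f}")
```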
Evaluating wood failure in plywood shear by optical image analysis
Charles W. McMillin
1984-01-01
This exploratory study evaluates the potential of using an automatic image analysis method to measure percent wood failure in plywood shear specimens. The results suggest that this method may be as accurate as the visual method in tracking long-term gluebond quality. With further refinement, the method could lead to automated equipment replacing the subjective visual...
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method.
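The comparison can be reproduced in miniature. The Python sketch below scores made-up handoff failures with the traditional RPN (severity x occurrence x detection, each 1-10, critical at >= 300) and an assumed simplified high/medium/low rule, then reports agreement; the rule is a stand-in, since the paper's exact simplified criteria are not reproduced here.

```python
# Sketch of the congruence comparison between traditional RPN scoring and an
# assumed simplified high/medium/low rule. Data are made up for illustration.

failures = [  # (name, severity, occurrence, detectability), each rated 1-10
    ("wrong ventilator settings", 9, 5, 7),
    ("missing handoff form",      6, 8, 4),
    ("mislabeled infusion",       8, 3, 9),
    ("delayed lab results",       4, 6, 3),
]

def simplified(s, o, d):
    # assumed rule: 'high' when severity and occurrence are both elevated
    if s >= 7 and o >= 5:
        return "high"
    if s >= 5 or o >= 5:
        return "medium"
    return "low"

critical_rpn = {n for n, s, o, d in failures if s * o * d >= 300}
critical_simple = {n for n, s, o, d in failures if simplified(s, o, d) == "high"}
agree = len(critical_rpn & critical_simple) / max(len(critical_rpn), 1)
print(f"RPN-critical:    {critical_rpn}")
print(f"simplified-high: {critical_simple}")
print(f"agreement: {agree:.0%}")
```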
Risk management of key issues of FPSO
NASA Astrophysics Data System (ADS)
Sun, Liping; Sun, Hai
2012-12-01
Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of floating production, storage and offloading (FPSO) units. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on information already researched in modules of Relex Reliability Studio (RRS). Equipment failures were also analyzed qualitatively by establishing a fault tree and a Boolean structure function, given the shortage of failure cases and statistical data, and risk control measures were examined. Failure modes of fire accidents were classified according to the different areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative analysis of the FTA gave basic insight into the failure modes of FPSO offloading, and the fire FMEA gave priorities and suggested processes. The research has practical importance for the security analysis of FPSO.
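The qualitative fault tree step reduces to a Boolean structure function over basic events. The Python sketch below enumerates basic-event states exactly for a small assumed tree; the events and probabilities are illustrative, not the FPSO study's data.

```python
from itertools import product

# Sketch of a fault tree reduced to a Boolean structure function. The top event here
# is an assumed "offloading failure" = pump failure OR (valve stuck AND backup stuck).

p = {"pump": 0.02, "valve": 0.05, "backup": 0.10}   # basic-event probabilities (assumed)

def top(x):                       # structure function over basic-event states
    return x["pump"] or (x["valve"] and x["backup"])

# Exact top-event probability by enumerating all basic-event states.
prob = 0.0
for states in product([0, 1], repeat=3):
    x = dict(zip(p, states))
    if top(x):
        pr = 1.0
        for name, s in x.items():
            pr *= p[name] if s else 1 - p[name]
        prob += pr
print(f"P(top event) = {prob:.4f}")   # matches 1 - (1-0.02)*(1 - 0.05*0.10) = 0.0249
```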
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
An overwhelming number of different risk analysis techniques are in use today, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of the failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. The FMEA is used to capture potential failures/risks and their impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and this is why it is often not done so. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
NASA Astrophysics Data System (ADS)
Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin
2015-03-01
Reliability allocation for computerized numerical control (CNC) lathes is very important in industry. Traditional allocation methods focus only on high failure rate components rather than moderate failure rate components, which is not applicable in some conditions. Aiming at solving the problem of reliability allocation for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established in order to overcome these limitations. Properties of the new transformed function are discussed by considering the failure severity and the failure occurrence. Designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with those of traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
Nouri Gharahasanlou, Ali; Mokhtarei, Ashkan; Khodayarei, Aliasqar; Ataei, Mohammad
2014-01-01
Evaluating and analyzing risk in the mining industry is a new approach for improving machinery performance. Reliability, safety, and maintenance management based on risk analysis can enhance the overall availability and utilization of mining technological systems. This study investigates the failure occurrence probability of the crushing and mixing bed hall department at the Azarabadegan Khoy cement plant by using the fault tree analysis (FTA) method. The results of the analysis for a 200 h operating interval show that the probability of failure occurrence for the crushing system, the conveyor system, and the crushing and mixing bed hall department is 73, 64, and 95 percent, respectively, and the conveyor belt subsystem is found to be the most failure-prone subsystem. Finally, maintenance as a method to control and prevent the occurrence of failures is proposed.
The role of shear and tensile failure in dynamically triggered landslides
Gipprich, T.L.; Snieder, R.K.; Jibson, R.W.; Kimman, W.
2008-01-01
Dynamic stresses generated by earthquakes can trigger landslides. Current methods of landslide analysis such as pseudo-static analysis and Newmark's method focus on the effects of earthquake accelerations on the landslide mass to characterize dynamic landslide behaviour. One limitation of these methods is their use of Mohr-Coulomb failure criteria, which account only for shear failure; the role of tensile failure is not accounted for. We develop a limit-equilibrium model to investigate the dynamic stresses generated by a given ground motion due to a plane wave and use this model to assess the role of shear and tensile failure in the initiation of slope instability. We do so by incorporating a modified Griffith failure envelope, which combines shear and tensile failure into a single criterion. Tests of dynamic stresses in both homogeneous and layered slopes demonstrate that two modes of failure exist: tensile failure in the uppermost meters of a slope and shear failure at greater depth. Further, we derive equations that express the dynamic stress in the near-surface in terms of the acceleration measured at the surface. These equations are used to approximately define the depth range for each mechanism of failure. The depths at which these failure mechanisms occur suggest that shear and tensile failure might collaborate in generating slope failure.
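A simplified stand-in for the combined criterion can be coded directly. The Python sketch below uses the classical plane Griffith criterion (rather than the paper's modified Griffith envelope) to classify assumed principal stress states as tensile or shear failure.

```python
# Sketch of classifying a stress state as shear or tensile failure using the plane
# Griffith criterion, a simplified stand-in for the paper's modified Griffith
# envelope. Compression positive; T0 is tensile strength. Numbers are illustrative.

T0 = 2.0          # tensile strength [MPa]

def griffith_failure(s1, s3):
    """Return failure mode for principal stresses s1 >= s3 (compression positive)."""
    if s1 + 3 * s3 <= 0:
        # tension-dominated region: failure when the least stress reaches -T0
        return "tensile" if s3 <= -T0 else "none"
    # shear-dominated region of the Griffith parabola
    return "shear" if (s1 - s3) ** 2 >= 8 * T0 * (s1 + s3) else "none"

for s1, s3 in [(0.5, -2.5),    # near-surface: dynamic tension -> tensile failure
               (20.0, 1.0),    # deeper: confined, large deviatoric stress -> shear
               (3.0, 0.5)]:    # modest stresses -> stable
    print(f"s1={s1:5.1f}, s3={s3:5.1f} MPa -> {griffith_failure(s1, s3)}")
```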
Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks on the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
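The decomposition step is essentially an SVD of the centered component-by-failure-mode matrix. The Python sketch below projects made-up occurrence counts onto the first two principal components; the components and counts are illustrative, not the NTSB data.

```python
import numpy as np

# Sketch of the PCA idea: rows are components, columns are failure modes, entries are
# occurrence counts from accident reports (made-up data). A low-dimensional projection
# groups components that fail in similar ways.

labels = ["gearbox", "rotor hub", "hydraulic pump", "oil cooler"]
X = np.array([[12, 3, 0, 5],     # fatigue, corrosion, leak, wear counts (illustrative)
              [10, 2, 1, 6],
              [1,  4, 9, 2],
              [0,  5, 8, 1]], dtype=float)

Xc = X - X.mean(axis=0)                       # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * s[:2]                     # projection onto first two PCs
for name, (pc1, pc2) in zip(labels, scores):
    print(f"{name:14s} PC1={pc1:7.2f} PC2={pc2:6.2f}")
# components with similar failure behaviour land close together in the PC1-PC2 plane
```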
Preliminary analysis techniques for ring and stringer stiffened cylindrical shells
NASA Technical Reports Server (NTRS)
Graham, J.
1993-01-01
This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.
NASA Astrophysics Data System (ADS)
Simola, Kaisa; Laakso, Kari
1992-01-01
Eight years of operating experiences of 104 motor operated closing valves in different safety systems in nuclear power units were analyzed in a systematic way. The qualitative methods used were Failure Mode and Effect Analysis (FMEA) and Maintenance Effects and Criticality Analysis (MECA). These reliability engineering methods are commonly used in the design stage of equipment. The successful application of these methods for analysis and utilization of operating experiences was demonstrated.
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
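Moment-based screening of this kind is simple to sketch. In the Python below, a synthetic impact train raises the kurtosis of an otherwise Gaussian vibration signal well above the Gaussian value of 3; the signals and defect model are assumptions, not SSME data.

```python
import numpy as np
from scipy import stats

# Sketch of moment-based screening for incipient failure: a healthy vibration signal
# is near-Gaussian (kurtosis ~ 3), while periodic impacts from a developing defect
# raise the kurtosis well above 3. The signals are synthetic.

rng = np.random.default_rng(0)
fs, dur = 10_000, 1.0
t = np.arange(0, dur, 1 / fs)
healthy = rng.standard_normal(t.size)

faulty = healthy.copy()
faulty[::500] += 8.0                    # impact train from a hypothetical defect

for name, x in [("healthy", healthy), ("faulty", faulty)]:
    print(f"{name:8s} rms={np.sqrt(np.mean(x**2)):.2f} "
          f"skew={stats.skew(x):.2f} kurtosis={stats.kurtosis(x, fisher=False):.2f}")
```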
Computational Methods for Failure Analysis and Life Prediction
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)
1993-01-01
This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.
NASA Astrophysics Data System (ADS)
Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori
This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be influenced by failures or operational mistakes of the DMCS. To avoid such situations, sufficient risk assessment has to be conducted on the DMCS and precautions taken. We propose an operational RM method using FMEA for the DMCS. To develop the method, we gathered and compared the FMEA results of DMCS and developed a list that contains failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that have not been found before.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Song-Hua; Chang, James Y. H.; Boring,Ronald L.
2010-03-01
The Office of Nuclear Regulatory Research (RES) at the US Nuclear Regulatory Commission (USNRC) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method's middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.
Failure analysis in the identification of synergies between cleaning monitoring methods.
Whiteley, Greg S; Derry, Chris; Glasbey, Trevor
2015-02-01
The 4 monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring failure risk arising through each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these 4 cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any 1 can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method.
A Comparison of Functional Models for Use in the Function-Failure Design Method
NASA Technical Reports Server (NTRS)
Stock, Michael E.; Stone, Robert B.; Tumer, Irem Y.
2006-01-01
When failure analysis and prevention, guided by historical design knowledge, are coupled with product design at its conception, shorter design cycles are possible. By decreasing the design time of a product in this manner, design costs are reduced and the product will better suit the customer's needs. Prior work indicates that similar failure modes occur in products (or components) with similar functionality. To capitalize on this finding, a knowledge base of historical failure information linked to functionality is assembled for use by designers. One possible use for this knowledge base is within the Elemental Function-Failure Design Method (EFDM). This design methodology and failure analysis tool begins at conceptual design and keeps the designer cognizant of failures that are likely to occur based on the product's functionality. The EFDM offers potential improvement over current failure analysis methods, such as FMEA, FMECA, and Fault Tree Analysis, because it can be implemented hand in hand with other conceptual design steps and carried throughout a product's design cycle. These other failure analysis methods can only truly be effective after a physical design has been completed. The EFDM, however, is only as good as the knowledge base that it draws from, and therefore it is of utmost importance to develop a knowledge base that will be suitable for use across a wide spectrum of products. One fundamental question that arises in using the EFDM is: at what level of detail should functional descriptions of components be encoded? This paper explores two approaches to populating a knowledge base with actual failure occurrence information from Bell 206 helicopters. Functional models expressed at various levels of detail are investigated to determine the necessary detail for an applicable knowledge base that can be used by designers in both new designs as well as redesigns. High level and more detailed functional descriptions are derived for each failed component based on NTSB accident reports. To best record this data, standardized functional and failure mode vocabularies are used. Two separate function-failure knowledge bases are then created and compared. Results indicate that encoding failure data using more detailed functional models allows for a more robust knowledge base. Interestingly however, when applying the EFDM, high level descriptions continue to produce useful results when using the knowledge base generated from the detailed functional models.
Space Shuttle Stiffener Ring Foam Failure Analysis, a Non-Conventional Approach
NASA Technical Reports Server (NTRS)
Howard, Philip M.
2015-01-01
The Space Shuttle Program made use of the excellent properties of rigid polyurethane foam for cryogenic tank insulation and as structural protection on the solid rocket boosters. When foam applications de-bonded, classical methods of failure analysis did not provide the root cause of the foam failure. Realizing that foam is an ideal medium to document and preserve its own mode of failure, thin sectioning was seen as a logical approach for foam failure analysis, allowing the three-dimensional morphology of the foam cells to be observed. The cell morphology of the foam provided a much greater understanding of the failure modes than previously achieved.
A simplified fragility analysis of fan type cable stayed bridges
NASA Astrophysics Data System (ADS)
Khan, R. A.; Datta, T. K.; Ahmad, S.
2005-06-01
A simplified fragility analysis of fan type cable stayed bridges using the Probabilistic Risk Analysis (PRA) procedure is presented for determining their failure probability under random ground motion. Seismic input to the bridge support is considered to be a risk-consistent response spectrum which is obtained from a separate analysis. For the response analysis, the bridge deck is modeled as a beam supported on springs at different points. The stiffnesses of the springs are determined by a separate 2D static analysis of the cable-tower-deck system. The analysis provides a coupled stiffness matrix for the spring system. A continuum method of analysis using dynamic stiffness is used to determine the dynamic properties of the bridges. The response of the bridge deck is obtained by the response spectrum method of analysis as applied to a multidegree of freedom system, which duly takes into account the quasi-static component of bridge deck vibration. The fragility analysis includes uncertainties arising due to the variation in ground motion, material property, modeling, method of analysis, ductility factor and damage concentration effect. The probability of failure of the bridge deck is determined by the First Order Second Moment (FOSM) method of reliability. A three span double plane symmetrical fan type cable stayed bridge of total span 689 m is used as an illustrative example. The fragility curves for bridge deck failure are obtained under a number of parametric variations. Some of the important conclusions of the study indicate that (i) not only the vertical component but also the horizontal component of ground motion has a considerable effect on the probability of failure; (ii) ground motion with no time lag between support excitations provides a smaller probability of failure as compared to ground motion with a very large time lag between support excitations; and (iii) the probability of failure may considerably increase under soft soil conditions.
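A single fragility point under FOSM takes one line of algebra: beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2) and Pf = Phi(-beta). The Python sketch below sweeps an assumed demand model over ground-motion intensity to trace a fragility curve; all numbers are illustrative, not the 689 m bridge's values.

```python
import math

# Sketch of a FOSM fragility curve: capacity R and demand S are treated as normal
# variables, the reliability index is beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2),
# and P(failure) = Phi(-beta). The demand model and numbers are assumptions.

def phi(x):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

mu_R, sd_R = 250.0, 40.0                      # deck capacity (e.g. stress, MPa)
for pga in [0.1, 0.2, 0.3, 0.4, 0.5]:
    mu_S, sd_S = 400.0 * pga, 120.0 * pga     # demand grows with intensity (assumed)
    beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)
    print(f"PGA={pga:.1f} g: beta={beta:5.2f}, Pf={phi(-beta):.4f}")
```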
NASA Technical Reports Server (NTRS)
Anstead, R. J. (Editor); Goldberg, E. (Editor)
1975-01-01
Failure analysis test methods are presented for use in analyzing candidate electronic parts and in improving future design reliability. Each test is classified as nondestructive, semidestructive, or destructive. The effects upon applicable part types (i.e., integrated circuit, transistor) are discussed. Methodology is given for performing the following: immersion tests, radiographic tests, dewpoint tests, gas ambient analysis, cross sectioning, and ultraviolet examination.
Sensor Failure Detection of FASSIP System using Principal Component Analysis
NASA Astrophysics Data System (ADS)
Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina
2018-02-01
In the Fukushima Daiichi nuclear reactor accident in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Thus research on passive cooling systems for nuclear power plants is performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. The accuracy of the sensor measurements of the FASSIP system is essential, because they are the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failures can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling statistic criteria for detecting sensor failure indications. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
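A minimal PCA sensor-fault monitor can be sketched as follows, assuming six simulated correlated sensors; the control limits here are simple training percentiles rather than the formal SPE and Hotelling T^2 limits.

```python
import numpy as np

# Sketch of PCA-based sensor fault detection with SPE and Hotelling's T^2 statistics.
# Normal operating data are simulated as six correlated sensors; control limits are
# simple 99th-percentile training values, not the formal chi-square/F limits.

rng = np.random.default_rng(0)
base = rng.standard_normal((500, 1))
train = np.hstack([base + 0.05 * rng.standard_normal((500, 1)) for _ in range(6)])

mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
k = 2
P = Vt[:k].T                                   # retained loadings
lam = (s[:k] ** 2) / (train.shape[0] - 1)      # retained eigenvalues

def spe_t2(x):
    d = x - mean
    t = P.T @ d                                # scores in the PCA subspace
    spe = float(d @ d - t @ t)                 # residual off the PCA subspace
    t2 = float(np.sum(t**2 / lam))             # Hotelling's T^2
    return spe, t2

stats_train = np.array([spe_t2(row) for row in train])
spe_lim, t2_lim = np.percentile(stats_train, 99, axis=0)

faulty = train[0].copy()
faulty[3] += 4.0                               # hypothetical stuck/drifted sensor 3
spe, t2 = spe_t2(faulty)
print(f"SPE={spe:.2f} (limit {spe_lim:.2f}), T2={t2:.2f} (limit {t2_lim:.2f})")
```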
Ouyang, Min; Tian, Hui; Wang, Zhenghua; Hong, Liu; Mao, Zijun
2017-01-17
This article studies a general type of initiating events in critical infrastructures, called spatially localized failures (SLFs), which are defined as the failure of a set of infrastructure components distributed in a spatially localized area due to damage sustained, while other components outside the area do not directly fail. These failures can be regarded as a special type of intentional attack, such as a bomb or explosive assault, or a generalized modeling of the impact of localized natural hazards on large-scale systems. This article introduces three SLFs models: node centered SLFs, district-based SLFs, and circle-shaped SLFs, and proposes a SLFs-induced vulnerability analysis method from three aspects: identification of critical locations, comparisons of infrastructure vulnerability to random failures, topologically localized failures and SLFs, and quantification of infrastructure information value. The proposed SLFs-induced vulnerability analysis method is finally applied to the Chinese railway system and can also be easily adapted to analyze other critical infrastructures for valuable protection suggestions.
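A circle-shaped SLF experiment is easy to prototype. The Python sketch below removes every node within an assumed radius of a candidate center on a toy grid network and scores the surviving largest connected component; scanning centers identifies critical locations. The grid stands in for a real infrastructure topology.

```python
from collections import deque

# Sketch of circle-shaped spatially localized failures on a toy grid network: remove
# every node within radius r of a chosen center and measure the surviving largest
# connected component. Scanning the center over the network flags critical locations.

N = 15                                            # 15 x 15 grid of nodes
nodes = [(i, j) for i in range(N) for j in range(N)]

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < N and 0 <= j + dj < N:
            yield (i + di, j + dj)

def largest_component(alive):
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        q, size = deque([start]), 0
        seen.add(start)
        while q:
            u = q.popleft()
            size += 1
            for v in neighbors(*u):
                if v in alive and v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, size)
    return best

r = 3.0
worst = min(
    (largest_component({n for n in nodes
                        if (n[0] - c[0])**2 + (n[1] - c[1])**2 > r**2}), c)
    for c in nodes)
print(f"most critical failure center: {worst[1]}, surviving component: {worst[0]} nodes")
```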
A Framework for Creating a Function-based Design Tool for Failure Mode Identification
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Knowledge of potential failure modes during design is critical for the prevention of failures. Currently, industries use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), or Failure Modes, Effects and Criticality Analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract maximum benefit from the above procedures. This work describes a function-based failure identification methodology, which would act as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration, as well as enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.
Simplified methods for evaluating road prism stability
William J. Elliot; Mark Ballerini; David Hall
2003-01-01
Mass failure is one of the most common failures of low-volume roads in mountainous terrain. Current methods for evaluating stability of these roads require a geotechnical specialist. A stability analysis program, XSTABL, was used to estimate the stability of 3,696 combinations of road geometry, soil, and groundwater conditions. A sensitivity analysis was carried out to...
Analysis of the STS-126 Flow Control Valve Structural-Acoustic Coupling Failure
NASA Technical Reports Server (NTRS)
Jones, Trevor M.; Larko, Jeffrey M.; McNelis, Mark E.
2010-01-01
During Space Transportation System mission STS-126, one of the main engine's flow control valves incurred an unexpected failure: a section of the valve broke off during liftoff. It is theorized that an acoustic mode of the flowing fuel coupled with a structural mode of the valve, causing a high cycle fatigue failure. This report documents the analysis efforts conducted to verify this theory. Hand calculations, computational fluid dynamics, and finite element methods are all implemented, and analyses are performed using steady-state methods in addition to transient analysis methods. The conclusion of the analyses is that there is a critical acoustic mode that aligns with a structural mode of the valve.
Rogers, Jennifer K; Pocock, Stuart J; McMurray, John J V; Granger, Christopher B; Michelson, Eric L; Östergren, Jan; Pfeffer, Marc A; Solomon, Scott D; Swedberg, Karl; Yusuf, Salim
2014-01-01
Heart failure is characterized by recurrent hospitalizations, but often only the first event is considered in clinical trial reports. In chronic diseases, such as heart failure, analysing all events gives a more complete picture of treatment benefit. We describe methods of analysing repeat hospitalizations, and illustrate their value in one major trial. The Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity (CHARM)-Preserved study compared candesartan with placebo in 3023 patients with heart failure and preserved systolic function. The heart failure hospitalization rates were 12.5 and 8.9 per 100 patient-years in the placebo and candesartan groups, respectively. The repeat hospitalizations were analysed using the Andersen-Gill, Poisson, and negative binomial methods. Death was incorporated into analyses by treating it as an additional event. The win ratio method and a method that jointly models hospitalizations and mortality were also considered. Using repeat events gave larger treatment benefits than time to first event analysis. The negative binomial method for the composite of recurrent heart failure hospitalizations and cardiovascular death gave a rate ratio of 0.75 [95% confidence interval (CI) 0.62-0.91, P = 0.003], whereas the hazard ratio for time to first heart failure hospitalization or cardiovascular death was 0.86 (95% CI 0.74-1.00, P = 0.050). In patients with preserved ejection fraction, candesartan reduces the rate of admissions for worsening heart failure to a greater extent than is apparent from analysing only first hospitalizations. Recurrent events should be routinely incorporated into the analysis of future clinical trials in heart failure. © 2013 The Authors. European Journal of Heart Failure © 2013 European Society of Cardiology.
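The following schematic sketch shows the kind of negative binomial rate-ratio computation described above, applied to simulated recurrent-event data; the arm sizes, event rates, and gamma frailty are invented, and this is not the CHARM data or analysis code.

# Schematic negative binomial analysis of recurrent hospitalizations on
# simulated data; follow-up time enters as the exposure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1500
treated = np.repeat([0, 1], n)                      # control vs. treated arm
followup = rng.uniform(1.0, 3.0, size=2 * n)        # years of follow-up
true_rate = np.where(treated == 1, 0.09, 0.12)      # events per patient-year
# Gamma frailty induces the over-dispersion the negative binomial captures
frailty = rng.gamma(shape=2.0, scale=0.5, size=2 * n)
events = rng.poisson(true_rate * frailty * followup)

X = sm.add_constant(treated.astype(float))
model = sm.GLM(events, X, family=sm.families.NegativeBinomial(),
               exposure=followup).fit()
print(np.exp(model.params[1]))          # rate ratio; about 0.09/0.12 = 0.75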
Failure Mode Identification Through Clustering Analysis
NASA Technical Reports Server (NTRS)
Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
Research has shown that nearly 80% of costs and problems are created in product development and that cost and quality are essentially designed into products in the conceptual stage. Currently, failure identification procedures (such as Failure Modes and Effects Analysis (FMEA), Failure Modes, Effects and Criticality Analysis (FMECA), and Fault Tree Analysis (FTA)) and design of experiments are used for quality control and for the detection of potential failure modes during the detail design stage or after product launch. Though all of these methods have their own advantages, they do not indicate the predominant failures that a designer should focus on while designing a product. This work uses a functional approach to identify failure modes, which hypothesizes that similarities exist between different failure modes based on the functionality of the product or component. In this paper, a statistical clustering procedure is proposed to retrieve information on the set of predominant failures that a function experiences. The various stages of the methodology are illustrated using a hypothetical design example.
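A toy sketch of the clustering step, under the assumption that each function is described by a vector of failure-mode counts; the functions, modes, and counts below are invented for illustration.

# Clustering functions by their failure-mode profiles (hypothetical data).
import numpy as np
from sklearn.cluster import KMeans

failure_modes = ["fatigue", "corrosion", "wear", "fracture", "leak"]
functions = ["transmit torque", "seal fluid", "guide motion", "store energy"]
# rows: functions; columns: counts of reported failures per mode (invented)
counts = np.array([[8, 1, 6, 5, 0],
                   [0, 7, 2, 0, 9],
                   [3, 1, 9, 2, 1],
                   [6, 0, 2, 7, 0]])

profiles = counts / counts.sum(axis=1, keepdims=True)   # normalize per function
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
for f, label in zip(functions, km.labels_):
    print(f, "-> cluster", label)   # similar failure behavior ends up together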
NASA Astrophysics Data System (ADS)
Sotokoba, Yasumasa; Okajima, Kenji; Iida, Toshiaki; Tanaka, Tadatsugu
We propose the trenchless box culvert construction method to construct box culverts under shallow soil cover while keeping roads or tracks open. When this construction method is used, it is necessary to clarify the deformation and shear failure caused by ground excavation. To investigate the soil behavior, model experiments and elasto-plastic finite element analysis were performed. The model experiments showed that shear failure developed from the end of the roof to the toe of the boundary surface. In the finite element analysis, a shear band effect was introduced. Comparing the shear bands observed in the model experiments with the computed maximum shear strain contours, it was found that the observed direction of the shear band could be simulated reasonably well by the finite element analysis. We may therefore say that the finite element method used in this study is a useful tool for this construction method.
Chen, Ling; Feng, Yanqin; Sun, Jianguo
2017-10-01
This paper discusses regression analysis of clustered failure time data, which occur when the failure times of interest are collected from clusters. In particular, we consider the situation where the correlated failure times of interest may be related to cluster size. For inference, we present two estimation procedures, a weighted estimating equation-based method and a within-cluster resampling-based method, for the case where the correlated failure times of interest arise from a class of additive transformation models. The former uses the inverse of cluster size as the weight in the estimating equations, while the latter can be easily implemented using existing software packages for right-censored failure time data. An extensive simulation study is conducted and indicates that the proposed approaches work well both with and without informative cluster size. They are applied to the dental study that motivated this work.
NASA Astrophysics Data System (ADS)
Belmonte, D.; Vedova, M. D. L. Dalla; Ferro, C.; Maggiore, P.
2017-06-01
Prognostic algorithms able to identify precursors of incipient failures of primary flight command electromechanical actuators (EMAs) are beneficial for anticipating the incoming failure: an early and correct interpretation of the degradation pattern can trigger an early alert to the maintenance crew, who can properly schedule the servomechanism replacement. An innovative prognostic model-based approach, able to recognize progressive EMA degradations before the anomalous behaviors become critical, is proposed: the Fault Detection and Identification (FDI) of the considered incipient failures is performed by analyzing proper system operational parameters, able to reveal the corresponding degradation path, by means of a numerical algorithm based on spectral analysis techniques. Subsequently, these operational parameters are correlated with the actual EMA health condition by means of failure maps created by a reference monitoring model-based algorithm. In this work, the proposed method has been tested on an EMA affected by combined progressive failures: in particular, a partial stator single-phase turn-to-turn short circuit and rotor static eccentricity are considered. In order to evaluate the prognostic method, a numerical test bench has been conceived. Results show that the method exhibits adequate robustness and a high degree of confidence in its ability to identify an incipient malfunction early, minimizing the risk of false alarms or unannounced failures.
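A minimal sketch of the spectral-analysis idea, assuming the fault manifests as a growing harmonic in a monitored motor signal; the sampling rate, frequencies, and amplitudes are illustrative assumptions, not the paper's EMA model.

# Spectral fault-feature extraction: track a fault harmonic relative to the
# fundamental; a rising ratio is the degradation precursor being watched.
import numpy as np

fs = 10_000
t = np.arange(0, 1.0, 1 / fs)                       # 1 s at 10 kHz
supply = np.sin(2 * np.pi * 400 * t)                # 400 Hz phase current
fault = 0.05 * np.sin(2 * np.pi * 800 * t)          # harmonic raised by a short
signal = supply + fault + 0.01 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def band_amplitude(f0, width=5.0):
    """Peak spectral amplitude within +/- width Hz of f0."""
    band = (freqs > f0 - width) & (freqs < f0 + width)
    return spectrum[band].max()

ratio = band_amplitude(800.0) / band_amplitude(400.0)
print(f"fault-harmonic ratio: {ratio:.3f}")         # ~0.05 for this signal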
Failure Analysis of CCD Image Sensors Using SQUID and GMR Magnetic Current Imaging
NASA Technical Reports Server (NTRS)
Felt, Frederick S.
2005-01-01
During electrical testing of a full-field CCD image sensor, electrical shorts were detected on three of six devices. These failures occurred after the parts were soldered to the PCB. Failure analysis was performed to determine the cause and locations of these failures on the devices. After removing the fiber optic faceplate, optical inspection was performed on the CCDs to understand the design and package layout. Optical inspection revealed that the device had a light shield ringing the CCD array; this structure complicated the failure analysis. Alternative methods of analysis were considered, including liquid crystal, light and thermal emission, LT/A, TT/A, SQUID, and MP. Of these, the SQUID and MP techniques were pursued for further analysis. Magnetoresistive current imaging technology is also discussed and compared to SQUID.
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincident with the design process and is an iterative process that allows design changes to overcome deficiencies identified in the analysis. Risk Registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
Probabilistic analysis on the failure of reactivity control for the PWR
NASA Astrophysics Data System (ADS)
Sony Tjahyani, D. T.; Deswandri; Sunaryo, G. R.
2018-02-01
The fundamental safety functions of a power reactor are to control reactivity, to remove heat from the reactor, and to confine radioactive material. Safety analysis is used to ensure that each function is fulfilled in the design and is performed by deterministic and probabilistic methods. Analysis of reactivity control is important because its failure will affect the other fundamental safety functions. The purpose of this research is to determine the failure probability of reactivity control and the contributors to its failure for a PWR design. The analysis is carried out by determining the intermediate events that cause the failure of reactivity control. Furthermore, the basic events are determined by a deductive method using fault tree analysis. The AP1000 is used as the object of the research. The probability data for component failures and human errors used in the analysis are collected from IAEA, Westinghouse, NRC, and other published documents. The results show that there are six intermediate events that can cause the failure of reactivity control: uncontrolled rod bank withdrawal at low power or at full power, malfunction of boron dilution, misalignment of control rod withdrawal, malfunction due to improper position of a fuel assembly, and ejection of a control rod. The failure probability of reactivity control is 1.49E-03 per year. The failure causes affected by human factors are boron dilution, misalignment of control rod withdrawal, and malfunction due to improper position of a fuel assembly. Based on the assessment, it is concluded that the failure probability of reactivity control for the PWR is still within the IAEA criteria.
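For illustration, a fault tree whose independent intermediate events feed an OR gate can be evaluated as below; the event list echoes the abstract, but the probabilities are invented stand-ins, not the AP1000 figures.

# Toy fault-tree evaluation; both basic gate types are shown for reference.
def or_gate(probs):
    """P(at least one event occurs), events independent."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all events occur), events independent."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical per-year probabilities of the intermediate events
intermediate = {
    "uncontrolled rod bank withdrawal": 4.0e-4,
    "boron dilution malfunction": 6.0e-4,
    "control rod misalignment": 3.0e-4,
    "improper fuel assembly position": 1.0e-4,
    "control rod ejection": 1.0e-4,
}
print(f"top event: {or_gate(intermediate.values()):.2e} per year")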
Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure
NASA Astrophysics Data System (ADS)
Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak
2017-09-01
The study of RR interval time series for congestive heart failure has been an active area of research using various methods, including non-linear methods. In this article, the cardiac dynamics of the heart beat are explored in the light of complex network analysis, namely the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure, 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased and normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences are observed. This quantitative parameter obtained using visibility graph analysis can thereby be used as a potential bio-marker and as a subsequent alarm generation mechanism for predicting the onset of congestive heart failure.
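A reference implementation of the natural visibility graph construction is short enough to sketch; the RR values below are synthetic, and the degree sequence stands in for whatever quantitative parameter the study derives.

# Natural visibility graph of a time series (O(n^2) reference version):
# two samples are linked iff every sample between them lies below the
# straight line connecting them.
import numpy as np

def visibility_edges(y):
    """Edges (a, b) such that samples a and b 'see' each other."""
    n, edges = len(y), []
    for a in range(n):
        for b in range(a + 1, n):
            if all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

rr = np.array([0.82, 0.85, 0.80, 0.91, 0.78, 0.88, 0.84])  # seconds, synthetic
edges = visibility_edges(rr)
degree = np.bincount(np.ravel(edges), minlength=len(rr))
print(edges)
print("degree sequence:", degree)   # degree statistics feed the bio-marker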
Method of Testing and Predicting Failures of Electronic Mechanical Systems
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, Frances A.
1996-01-01
A method employing a knowledge base of human expertise comprising a reliability model analysis implemented for diagnostic routines is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting system operation. The reliability analysis contains a wealth of human expertise that is used to build automatic diagnostic routines and provides a knowledge base that can be used to solve other artificial intelligence problems.
Probabilistic finite elements for fracture and fatigue analysis
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.
1989-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. The performance and accuracy of the method are demonstrated on a classical mode 1 fatigue problem.
Progressive Failure Analysis of Composite Stiffened Panels
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Yarrington, Phillip W.; Collier, Craig S.; Arnold, Steven M.
2006-01-01
A new progressive failure analysis capability for stiffened composite panels has been developed based on the combination of the HyperSizer stiffened panel design/analysis/optimization software with the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC). MAC/GMC discretizes a composite material's microstructure into a number of subvolumes and solves for the stress and strain state in each while also providing the homogenized composite properties. As a result, local failure criteria may be employed to predict local subvolume failure and the effects of these local failures on the overall composite response. When combined with HyperSizer, MAC/GMC is employed to represent the ply-level composite material response within the laminates that constitute a stiffened panel. The effects of local subvolume failures can then be tracked as loading on the stiffened panel progresses. Sample progressive failure results are presented at both the composite laminate and the composite stiffened panel levels. Deformation and failure model predictions are compared with experimental data from the World Wide Failure Exercise for AS4/3501-6 graphite/epoxy laminates.
NASA Astrophysics Data System (ADS)
Rucitra, A. L.
2018-03-01
Pusat Koperasi Induk Susu (PKIS) Sekar Tanjung, East Java, is one of the modern dairy industries producing Ultra High Temperature (UHT) milk. A problem that often occurs in the production process at PKIS Sekar Tanjung is a mismatch between the production process and the predetermined standard. The purpose of applying the Analytical Hierarchy Process (AHP) was to identify the most significant potential cause of failure in the milk production process. The Multi Attribute Failure Mode Analysis (MAFMA) method was used to eliminate or reduce the possibility of failure when viewed from the failure causes. This method integrates the severity, occurrence, detection, and expected cost criteria, obtained from in-depth interviews with the head of the production department as an expert. The AHP approach was used to formulate the priority ranking of the causes of failure in the milk production process. At level 1, severity had the highest weight, 0.41 or 41%, compared to the other criteria. At level 2, identifying failure in the UHT milk production process, the most significant potential cause was an average mixing temperature of more than 70 °C, higher than the standard temperature (≤70 °C). This failure cause contributed a weight of 0.47, or 47%, across all criteria. Therefore, this study suggested that the company control the mixing temperature to minimise or eliminate failure in this process.
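The AHP weighting step can be sketched as the principal eigenvector of a pairwise comparison matrix; the matrix below is a made-up example chosen so that severity dominates, not the study's elicited judgments.

# AHP priority computation: the principal eigenvector of a pairwise
# comparison matrix gives the criterion weights.
import numpy as np

criteria = ["severity", "occurrence", "detection", "expected cost"]
# A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale)
A = np.array([[1.0, 3.0, 4.0, 2.0],
              [1/3, 1.0, 2.0, 1/2],
              [1/4, 1/2, 1.0, 1/2],
              [1/2, 2.0, 2.0, 1.0]])

eigval, eigvec = np.linalg.eig(A)
principal = np.real(eigvec[:, np.argmax(np.real(eigval))])
weights = principal / principal.sum()
for c, w in zip(criteria, weights):
    print(f"{c}: {w:.2f}")   # severity gets the largest weight, as in the study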
Improving FMEA risk assessment through reprioritization of failures
NASA Astrophysics Data System (ADS)
Ungureanu, A. L.; Stan, G.
2016-08-01
Most of the current methods used to assess failures and to identify industrial equipment defects are based on the determination of the Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology presents some limitations, such as the large number of duplicate values and the difficulty of assessing the RPN indices. In order to eliminate the aforementioned shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account the failure and defect evolution in time, from failure appearance to breakdown.
Preventing blood transfusion failures: FMEA, an effective assessment method.
Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza
2017-06-30
Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harms to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was conducted in our study, performed in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform the FMEA of the blood transfusion process, comprising five steps: establishing the context, selecting team members, analysing the processes, performing hazard analysis, and developing a risk reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, among which four failure modes were associated with RPNs over 75. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80), and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Provision of training to personnel on blood transfusion, raising knowledge of hazards and appropriate preventative measures, and developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.
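A minimal sketch of the RPN bookkeeping behind such a ranking; the occurrence/severity/detection splits below are invented so that the products echo the reported RPNs of 100, 100, 80, and 75.

# Classical FMEA arithmetic: RPN = O x S x D, each index on a 1-10 scale.
failure_modes = [
    # (name, occurrence, severity, detection) -- illustrative scores
    ("labelling error",            5, 10, 2),
    ("wrong blood/component",      2, 10, 5),
    ("patient misidentification",  4, 10, 2),
    ("sampling error",             3,  5, 5),
]

ranked = sorted(((o * s * d, name) for name, o, s, d in failure_modes),
                reverse=True)
for rpn, name in ranked:
    flag = "  <- corrective action" if rpn >= 75 else ""
    print(f"{name}: RPN={rpn}{flag}")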
NASA Technical Reports Server (NTRS)
Gaebler, John A.; Tolson, Robert H.
2010-01-01
In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input variable uncertainties have the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
Computational methods for efficient structural reliability and reliability sensitivity analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.
1993-01-01
This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
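A simplified, non-adaptive version of the importance-sampling idea can be sketched for a scalar limit state; the fixed sampling shift stands in for the adaptive updating of the AIS method, and the limit state is a textbook example rather than the turbine blade problem.

# Importance-sampling estimate of a small failure probability: shift the
# sampling density toward the failure domain and reweight by the PDF ratio.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
beta_true = 4.0
g = lambda u: beta_true - u          # failure when standard normal u >= 4

n = 20_000
shift = beta_true                    # centre sampling density near the failure region
u = rng.normal(loc=shift, size=n)
weights = stats.norm.pdf(u) / stats.norm.pdf(u, loc=shift)
p_is = np.mean((g(u) <= 0) * weights)

print(f"IS estimate: {p_is:.2e}, exact: {stats.norm.sf(beta_true):.2e}")
# Crude Monte Carlo would need ~1e7 samples to see even a few failures here.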
Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications
1992-09-01
STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Homce, G.T.; Thalimer, J.R.
1996-05-01
Most electric motor predictive maintenance methods have drawbacks that limit their effectiveness in the mining environment. The US Bureau of Mines (USBM) is developing an alternative approach to detect winding insulation breakdown in advance of complete motor failure. In order to evaluate the analysis algorithms necessary for this approach, the USBM has designed and installed a system to monitor 120 electric motors in a coal preparation plant. The computer-based experimental system continuously gathers, stores, and analyzes electrical parameters for each motor. The results are then correlated to data from conventional motor-maintenance methods and in-service failures to determine if the analysis algorithms can detect signs of insulation deterioration and impending failure. This paper explains the on-line testing approach used in this research and describes the monitoring system design and implementation. At this writing, data analysis is underway, but conclusive results are not yet available.
System and method for floating-substrate passive voltage contrast
Jenkins, Mark W. [Albuquerque, NM]; Cole, Edward I., Jr.; Tangyunyong, Paiboon [Albuquerque, NM]; Soden, Jerry M. [Placitas, NM]; Walraven, Jeremy A. [Albuquerque, NM]; Pimentel, Alejandro A. [Albuquerque, NM]
2009-04-28
A passive voltage contrast (PVC) system and method are disclosed for analyzing ICs to locate defects and failure mechanisms. During analysis, the device side of a semiconductor die containing the IC is maintained in an electrically floating condition, without any ground connection, while a charged particle beam is scanned over the device side. Secondary particle emission from the device side of the IC is detected to form an image of device features, including electrical vias connected to transistor gates or to other structures in the IC. A difference in image contrast allows the defects or failure mechanisms to be pinpointed. Varying the scan rate can, in some instances, produce an image reversal to facilitate precisely locating the defects or failure mechanisms in the IC. The system and method are useful for failure analysis of ICs formed on substrates (e.g., bulk semiconductor substrates and SOI substrates) and other types of structures.
Medication management strategies used by older adults with heart failure: A systems-based analysis.
Mickelson, Robin S; Holden, Richard J
2017-09-01
Older adults with heart failure use strategies to cope with the constraining barriers impeding medication management. Strategies are behavioral adaptations that allow goal achievement despite these constraining conditions. When strategies do not exist, or are ineffective or maladaptive, medication performance and health outcomes are at risk. While constraints to medication adherence are described in the literature, the strategies patients use to manage medications are less well described or understood. Guided by cognitive engineering concepts, the aim of this study was to describe and analyze the strategies used by older adults with heart failure to achieve their medication management goals. This mixed-methods study employed an empirical strategies analysis method to elicit the medication management strategies used by older adults with heart failure. Observation and interview data collected from 61 older adults with heart failure and 31 caregivers were analyzed using qualitative content analysis to derive categories, patterns, and themes within and across cases. Thematic sub-categories derived from the data described planned and ad hoc methods of strategic adaptation. Stable strategies proactively adjusted the medication management process, the environment, or the patients themselves. Patients applied situational strategies (planned or ad hoc) to irregular or unexpected situations. Medication non-adherence was a strategy employed when life goals conflicted with medication adherence. The health system was a source of constraints without providing commensurate strategies. Patients strived to control their medication system and achieve their goals using adaptive strategies. Future patient self-management research can benefit from the methods and theories used to study professional work, such as strategies analysis.
Yousefinezhadi, Taraneh; Jannesar Nobari, Farnaz Attar; Goodari, Faranak Behzadi; Arab, Mohammad
2016-01-01
Introduction: In any complex human system, human error is inevitable and cannot be eliminated by blaming wrongdoers. With the aim of improving the reliability of hospital Intensive Care Units (ICUs), this research tries to identify and analyze ICU process failure modes from the standpoint of a systems approach to errors. Methods: In this descriptive study, data were gathered qualitatively by observations, document reviews, and Focus Group Discussions (FGDs) with the process owners in two selected ICUs in Tehran in 2014. Data analysis was quantitative, based on the failures' Risk Priority Numbers (RPNs) according to the Failure Modes and Effects Analysis (FMEA) method. In addition, some causes of failures were analyzed using the qualitative Eindhoven Classification Model (ECM). Results: Through the FMEA methodology, 378 potential failure modes from 180 ICU activities in hospital A and 184 potential failure modes from 99 ICU activities in hospital B were identified and evaluated. Then, with 90% reliability (RPN≥100), a total of 18 failures in hospital A and 42 in hospital B were identified as non-acceptable risks, and their causes were analyzed by ECM. Conclusions: Applying a modified PFMEA to improve the reliability of processes in two ICUs in two different kinds of hospitals shows that this method empowers staff to identify, evaluate, prioritize, and analyze all potential failure modes, and also makes them eager to identify the causes, recommend corrective actions, and even participate in improving the process without feeling blamed by top management. Moreover, by combining FMEA and ECM, team members can easily identify failure causes from a health care perspective. PMID:27157162
Modelling Coastal Cliff Recession Based on the GIM-DDD Method
NASA Astrophysics Data System (ADS)
Gong, Bin; Wang, Shanyong; Sloan, Scott William; Sheng, Daichao; Tang, Chun'an
2018-04-01
The unpredictable and instantaneous collapse behaviour of coastal rocky cliffs may cause damage that extends significantly beyond the area of failure. Gravitational movements that occur during coastal cliff recession involve two major stages: the small deformation stage and the large displacement stage. In this paper, a method of simulating the entire progressive failure process of coastal rocky cliffs is developed based on the gravity increase method (GIM), the rock failure process analysis method and the discontinuous deformation analysis method, and it is referred to as the GIM-DDD method. The small deformation stage, which includes crack initiation, propagation and coalescence processes, and the large displacement stage, which includes block translation and rotation processes during the rocky cliff collapse, are modelled using the GIM-DDD method. In addition, acoustic emissions, stress field variations, crack propagation and failure mode characteristics are further analysed to provide insights that can be used to predict, prevent and minimize potential economic losses and casualties. The calculation and analytical results are consistent with previous studies, which indicate that the developed method provides an effective and reliable approach for performing rocky cliff stability evaluations and coastal cliff recession analyses and has considerable potential for improving the safety and protection of seaside cliff areas.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
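As a generic illustration of combining an analysis-based estimate with test experience, the sketch below performs a conjugate beta-binomial update; this is a stand-in for the idea of modifying failure probability distributions with flight and test data, not the PFA statistical procedure itself, and the prior and test counts are invented.

# Generic update of a failure-probability estimate with test experience;
# NOT the PFA procedure, just the conjugate beta-binomial textbook version.
from scipy import stats

# Engineering analysis suggests p_fail near 1e-3; encode as a beta prior
a, b = 1.0, 999.0                    # prior mean a/(a+b) = 1e-3
tests, failures = 50, 0              # fifty successful firings, say

posterior = stats.beta(a + failures, b + tests - failures)
print(f"posterior mean: {posterior.mean():.2e}")
print(f"95% upper bound: {posterior.ppf(0.95):.2e}")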
Global resilience analysis of water distribution systems.
Diao, Kegong; Sweetapple, Chris; Farmani, Raziyeh; Fu, Guangtao; Ward, Sarah; Butler, David
2016-12-01
Evaluating and enhancing resilience in water infrastructure is a crucial step towards more sustainable urban water management. As a prerequisite to enhancing resilience, a detailed understanding is required of the inherent resilience of the underlying system. Differing from traditional risk analysis, here we propose a global resilience analysis (GRA) approach that shifts the objective from analysing multiple and unknown threats to analysing the more identifiable and measurable system responses to extreme conditions, i.e. potential failure modes. GRA aims to evaluate a system's resilience to a possible failure mode regardless of the causal threat(s) (known or unknown, external or internal). The method is applied to test the resilience of four water distribution systems (WDSs) with various features to three typical failure modes (pipe failure, excess demand, and substance intrusion). The study reveals GRA provides an overview of a water system's resilience to various failure modes. For each failure mode, it identifies the range of corresponding failure impacts and reveals extreme scenarios (e.g. the complete loss of water supply with only 5% pipe failure, or still meeting 80% of demand despite over 70% of pipes failing). GRA also reveals that increased resilience to one failure mode may decrease resilience to another and increasing system capacity may delay the system's recovery in some situations. It is also shown that selecting an appropriate level of detail for hydraulic models is of great importance in resilience analysis. The method can be used as a comprehensive diagnostic framework to evaluate a range of interventions for improving system resilience in future studies. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Fatigue of notched fiber composite laminates. Part 1: Analytical model
NASA Technical Reports Server (NTRS)
Mclaughlin, P. V., Jr.; Kulkarni, S. V.; Huang, S. N.; Rosen, B. W.
1975-01-01
A description is given of a semi-empirical, deterministic analysis for prediction and correlation of fatigue crack growth, residual strength, and fatigue lifetime for fiber composite laminates containing notches (holes). The failure model used for the analysis is based upon composite heterogeneous behavior and experimentally observed failure modes under both static and fatigue loading. The analysis is consistent with the wearout philosophy. Axial cracking and transverse cracking failure modes are treated together in the analysis; cracking off-axis is handled by modifying the axial cracking analysis. The analysis predicts notched laminate failure from unidirectional material fatigue properties using constant strain laminate analysis techniques. For multidirectional laminates, it is necessary to know lamina fatigue behavior under axial normal stress, transverse normal stress, and axial shear stress. Examples of the analysis method are given.
Probabilistic Analysis of a Composite Crew Module
NASA Technical Reports Server (NTRS)
Mason, Brian H.; Krishnamurthy, Thiagarajan
2011-01-01
An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. The probability of first-ply failure is estimated using three methods: the first-order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first-ply failure in one region of the CCM at the high-altitude abort load set. The final predicted probability of failure is on the order of 10(exp -11) due to the conservative nature of the factors of safety on the deterministic loads.
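For a linear limit state with independent normal variables, the FORM estimate reduces to a closed-form reliability index, as sketched below; the means and standard deviations are invented, not the CCM values.

# First-order reliability estimate for g = R - S with independent normal
# strength R and load S; a crude Monte Carlo run cannot resolve 1e-10.
from math import sqrt
from scipy import stats

mu_R, sd_R = 1.00, 0.05              # strength-based failure index allowable
mu_S, sd_S = 0.60, 0.04              # applied index at the abort load set

beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)   # reliability index
p_f = stats.norm.sf(beta)                        # FORM failure probability
print(f"beta = {beta:.2f}, P(failure) = {p_f:.1e}")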
To the systematization of failure analysis for perturbed systems (in German)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haller, U.
1974-01-01
The paper investigates the reliable functioning of complex technical systems. Of main importance is the question of how the functioning of technical systems that may fail, or whose design still has some faults, can be determined in the very earliest planning stages. The present paper develops a functioning schedule and looks for possible methods of systematic failure analysis of systems with stochastic failures. (RW/AK)
Risk assessment of failure modes of gas diffuser liner of V94.2 siemens gas turbine by FMEA method
NASA Astrophysics Data System (ADS)
Mirzaei Rafsanjani, H.; Rezaei Nasab, A.
2012-05-01
Failure of the welding connection between the gas diffuser liner and the exhaust casing is one of the failure modes of V94.2 gas turbines that has occurred in some power plants. This defect is one of the uncertainties customers face when deciding whether to accept the final commissioning of this product. Accordingly, the risk priority of this failure was evaluated by the failure modes and effects analysis (FMEA) method to find out whether this failure is catastrophic for turbine performance and harmful to humans. Using the service history of 110 gas turbines of this model operating in several power plants, the severity number, occurrence number, and detection number of the failure were determined, and consequently the Risk Priority Number (RPN) of the failure was determined. Finally, a criticality matrix of potential failures was created, illustrating that the failure modes are located in the safe zone.
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model-based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
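One way to picture digraph fault isolation: candidate root causes are the source nodes whose downstream effects cover every observed fault indication. The sketch below applies that set intersection to a made-up failure model, not the Space Station Freedom digraphs.

# Digraph fault isolation by set intersection over ancestor sets.
import networkx as nx

model = nx.DiGraph([
    ("pump failure", "low flow"), ("low flow", "high temp alarm"),
    ("valve stuck", "low flow"), ("sensor drift", "high temp alarm"),
    ("pump failure", "low pressure alarm"),
])

def candidates(model, indications):
    """Failure sources whose effects cover all observed indications."""
    cands = None
    for ind in indications:
        sources = nx.ancestors(model, ind) | {ind}
        cands = sources if cands is None else cands & sources
    return {c for c in cands if model.in_degree(c) == 0}

print(candidates(model, ["high temp alarm", "low pressure alarm"]))
# -> {'pump failure'}: the single root cause explaining both indications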
An improved method for risk evaluation in failure modes and effects analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Rachieru, N.; Belu, N.; Anghel, D. C.
2015-11-01
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing, and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors: the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal, or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O, and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank the risks associated with failure modes that could appear in the operation of a Turn 55 CNC lathe. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN values and produce a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
NASA Langley developments in response calculations needed for failure and life prediction
NASA Technical Reports Server (NTRS)
Housner, Jerrold M.
1993-01-01
NASA Langley developments in response calculations needed for failure and life predictions are discussed. Topics covered include: structural failure analysis in concurrent engineering; accuracy of independent regional modeling demonstrated on a classical example; a functional interface method that accurately joins incompatible finite element models; extension of the interface method for insertion of local detail modeling to a curved pressurized fuselage window panel; an interface concept for joining structural regions; motivation for coupled 2D-3D analysis; a compression panel with discontinuous stiffener, its coupled 2D-3D model, and axial surface strains at the middle of the hat stiffener; use of adaptive refinement with multiple methods; adaptive mesh refinement; and studies quantifying the effect of bow-type initial imperfections on the reliability of stiffened panels.
Lifetime evaluation of large format CMOS mixed signal infrared devices
NASA Astrophysics Data System (ADS)
Linder, A.; Glines, Eddie
2015-09-01
New large-scale foundry processes continue to produce reliable products. These new large-scale devices continue to use industry best practice to screen for failure mechanisms and validate their long lifetimes. Failure-in-time (FIT) analysis, in conjunction with foundry qualification information, can be used to evaluate large format device lifetimes. This analysis is a helpful tool when zero-failure life tests are typical. The reliability of the device is estimated by applying the failure rate to the use conditions. JEDEC publications continue to be the industry-accepted methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stacey M. L. Hendrickson; April M. Whaley; Ronald L. Boring
The Office of Nuclear Regulatory Research (RES) is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency or guidance for the use of multiple methods. As part of this effort, an attempt to develop a comprehensive HRA qualitative approach is being pursued. This paper presents a draft of the method's middle layer, a part of the qualitative analysis phase that links failure mechanisms to performance shaping factors. Starting with a Crew Response Tree (CRT) that has identified human failure events, analysts identify potential failure mechanisms using the mid-layer model. The mid-layer model presented in this paper traces the identification of the failure mechanisms using the Information-Diagnosis/Decision-Action (IDA) model and cognitive models from the psychological literature. Each failure mechanism is grouped according to a phase of IDA. Under each phase of IDA, the cognitive models help identify the relevant performance shaping factors for the failure mechanism. The use of IDA and cognitive models can be traced through fault trees, which provide a detailed complement to the CRT.
Risk analysis by FMEA as an element of analytical validation.
van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M
2009-12-05
We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
Failure analysis of energy storage spring in automobile composite brake chamber
NASA Astrophysics Data System (ADS)
Luo, Zai; Wei, Qing; Hu, Xiaofeng
2015-02-01
This paper takes the energy storage spring of the parking brake cavity, part of the automobile composite brake chamber, as its research object. The fault tree model of the energy storage spring causing parking brake failure was constructed based on the fault tree analysis method. Next, the parking brake failure model of the energy storage spring was established by analyzing the working principle of the composite brake chamber. Finally, data on the working load and push rod stroke measured by a comprehensive valve test bed were used to validate the failure model. The experimental results show that the failure model can distinguish whether the energy storage spring is faulty.
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model.
NASA Astrophysics Data System (ADS)
Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.
2016-03-01
Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed to assess the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
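The box-whisker rule used in the first method amounts to flagging values outside the interquartile fences, as in this sketch; the feature values are synthetic stand-ins for the designed segmentation features.

# Univariate non-parametric failure check via the IQR (box-whisker) rule.
import numpy as np

rng = np.random.default_rng(0)
feature = np.concatenate([rng.normal(1.0, 0.1, 45),   # typical successes
                          [1.9, 0.2, 2.3]])           # three failure-like values

q1, q3 = np.percentile(feature, [25, 75])
iqr = q3 - q1
outliers = (feature < q1 - 1.5 * iqr) | (feature > q3 + 1.5 * iqr)
print(np.where(outliers)[0])   # indices flagged for manual review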
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Advanced Fault Diagnosis Methods in Molecular Networks
Habibi, Iman; Emamian, Effat S.; Abdi, Ali
2014-01-01
Analysis of the failure of cell signaling networks is an important topic in systems biology and has applications in target discovery and drug development. In this paper, some advanced methods for fault diagnosis in signaling networks are developed and then applied to a caspase network and an SHP2 network. The goal is to understand how, and to what extent, the dysfunction of molecules in a network contributes to the failure of the entire network. Network dysfunction (failure) is defined as failure to produce the expected outputs in response to the input signals. Vulnerability level of a molecule is defined as the probability of the network failure, when the molecule is dysfunctional. In this study, a method to calculate the vulnerability level of single molecules for different combinations of input signals is developed. Furthermore, a more complex yet biologically meaningful method for calculating the multi-fault vulnerability levels is suggested, in which two or more molecules are simultaneously dysfunctional. Finally, a method is developed for fault diagnosis of networks based on a ternary logic model, which considers three activity levels for a molecule instead of the previously published binary logic model, and provides equations for the vulnerabilities of molecules in a ternary framework. Multi-fault analysis shows that the pairs of molecules with high vulnerability typically include a highly vulnerable molecule identified by the single fault analysis. The ternary fault analysis for the caspase network shows that predictions obtained using the more complex ternary model are about the same as the predictions of the simpler binary approach. This study suggests that by increasing the number of activity levels the complexity of the model grows; however, the predictive power of the ternary model does not appear to be increased proportionally. PMID:25290670
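A toy version of the single-fault vulnerability computation on an invented two-gate Boolean cascade: vulnerability is the fraction of input combinations for which clamping a molecule to the dysfunctional state changes the network output.

# Single-fault vulnerability in a tiny Boolean signaling model.
from itertools import product

def network(a, b, c, clamp=None):
    """Two-gate cascade; 'clamp' forces one internal molecule to 0."""
    x = a and b
    if clamp == "x":
        x = 0
    y = x or c
    if clamp == "y":
        y = 0
    return y

def vulnerability(molecule):
    """Fraction of the 8 inputs whose output flips when 'molecule' fails."""
    diffs = sum(network(a, b, c) != network(a, b, c, clamp=molecule)
                for a, b, c in product([0, 1], repeat=3))
    return diffs / 8.0

for m in ["x", "y"]:
    print(f"vulnerability of {m}: {vulnerability(m):.3f}")
# The downstream molecule y is the more vulnerable one in this cascade.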
Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya
2016-12-01
To examine the abilities of a traditional failure mode and effects analysis (FMEA) and a modified healthcare FMEA (m-HFMEA) scoring method by comparing their degree of congruence in identifying high-risk failures. The authors applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is the product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed: based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine if the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, when the risk score is determined, the basis of the established HFMEA Decision Tree™ or the failure mode should be more thoroughly investigated. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of high risks and mitigation measures. It is therefore a useful tool for prospective risk analysis in radiotherapy.
NASA Astrophysics Data System (ADS)
Reid, M. E.; Iverson, R. M.; Brien, D. L.; Iverson, N. R.; Lahusen, R. G.; Logan, M.
2004-12-01
Most studies of landslide initiation employ limit equilibrium analyses of slope stability. Owing to a lack of detailed data, however, few studies have tested limit-equilibrium predictions against physical measurements of slope failure. We have conducted a series of field-scale, highly controlled landslide initiation experiments at the USGS debris-flow flume in Oregon; these experiments provide exceptional data to test limit equilibrium methods. In each of seven experiments, we attempted to induce failure in a 0.65 m thick, 2 m wide, 6 m³ prism of loamy sand placed behind a retaining wall in the 31° sloping flume. We systematically investigated triggering of sliding by groundwater injection, by prolonged moderate-intensity sprinkling, and by bursts of high-intensity sprinkling. We also used vibratory compaction to control soil porosity and thereby investigate differences in failure behavior of dense and loose soils. About 50 sensors were monitored at 20 Hz during the experiments, including nests of tiltmeters buried at 7 cm spacing to define subsurface failure geometry, and nests of tensiometers and pore-pressure sensors to define evolving pore-pressure fields. In addition, we performed ancillary laboratory tests to measure soil porosity, shear strength, hydraulic conductivity, and compressibility. In loose soils (porosity of 0.52 to 0.55), abrupt failure typically occurred along the flume bed after substantial soil deformation. In denser soils (porosity of 0.41 to 0.44), gradual failure occurred within the soil prism. All failure surfaces had a maximum length to depth ratio of about 7. In even denser soil (porosity of 0.39), we could not induce failure by sprinkling. The internal friction angle of the soils varied from 28° to 40° with decreasing porosity. We analyzed stability at failure, given the observed pore-pressure conditions just prior to large movement, using a 1-D infinite-slope method and a more complete 2-D Janbu method. Each method provides a static Factor of Safety (FS), and in theory failure occurs when FS ≤ 1. Using the 1-D analysis, all experiments having failure had FS well below 1 (typically 0.5-0.8). Using the 2-D analysis for these same conditions, FS was less than but closer to 1 (typically 0.8-0.9). For the experiment with no failure, the 2-D FS was, reassuringly, > 1. These results indicate that the 2-D Janbu analysis is more accurate than the 1-D infinite-slope method for computing limit-equilibrium slope stability in shallow slides with limited areal extent.
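The 1-D infinite-slope analysis mentioned above reduces to a closed-form factor of safety. A sketch with illustrative soil parameters (not the measured flume values) follows:

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, u):
    """1-D infinite-slope factor of safety.
    c: effective cohesion (Pa), phi_deg: friction angle (deg),
    gamma: unit weight (N/m^3), z: slide depth (m),
    beta_deg: slope angle (deg), u: pore pressure on slip surface (Pa)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    # Effective normal stress and driving shear stress on the slip plane:
    normal = gamma * z * math.cos(beta) ** 2 - u
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return (c + normal * math.tan(phi)) / driving

# Illustrative numbers only: a 0.65 m deep slide on the 31 degree flume,
# light cohesion, positive pore pressure from sprinkling.
print(infinite_slope_fs(c=500.0, phi_deg=35.0, gamma=17000.0,
                        z=0.65, beta_deg=31.0, u=2000.0))  # FS near 1
```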
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next the fundamental aspects of a probabilistic failure analysis are explored and it is shown that deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
Risk Analysis Methods for Deepwater Port Oil Transfer Systems
DOT National Transportation Integrated Search
1976-06-01
This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessment...
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods in their risk management methodology. On the other hand, combining the two methods reduces the drawbacks of each method when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process; those internal risks should then be mitigated based on their risk levels.
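One way such a combination can work in code is to feed FMEA-derived event probabilities into a small fault-tree evaluator. The tree structure, event names, and probabilities below are hypothetical, not the metal-company case study:

```python
# Minimal fault-tree evaluator: gates are ("AND", ...) / ("OR", ...),
# leaves are basic-event names with independent failure probabilities.
def prob(node, p):
    if isinstance(node, str):
        return p[node]
    kind, *children = node
    probs = [prob(c, p) for c in children]
    if kind == "AND":                 # all children must fail
        out = 1.0
        for q in probs:
            out *= q
        return out
    if kind == "OR":                  # at least one child fails
        out = 1.0
        for q in probs:
            out *= (1.0 - q)
        return 1.0 - out
    raise ValueError(kind)

# Hypothetical tree: the top event needs both guards to fail together,
# or one single-point failure. In a combined FMEA/FTA study the basic-event
# probabilities would come from the FMEA occurrence assessment.
tree = ("OR", ("AND", "guard_A_fails", "guard_B_fails"), "single_point")
p = {"guard_A_fails": 0.05, "guard_B_fails": 0.02, "single_point": 0.001}
print(prob(tree, p))   # ~0.002
```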
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
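A toy version of the Monte Carlo CML calculation might look as follows; the events, triangular distributions, downtimes, and affected fractions are invented placeholders, not values from the Swedish case study:

```python
import random

random.seed(1)

# Each hazardous event: (triangular low/mode/high failure prob per year,
# mean downtime in minutes, fraction of customers affected).
events = [
    ((0.01, 0.05, 0.20), 600, 1.00),   # source failure, whole system
    ((0.05, 0.10, 0.30), 120, 0.25),   # treatment quality failure
    ((0.10, 0.20, 0.40), 60,  0.05),   # distribution pipe burst
]

def sample_cml():
    """One Monte Carlo realisation of Customer Minutes Lost per year."""
    total = 0.0
    for (lo, mode, hi), downtime, frac in events:
        p = random.triangular(lo, hi, mode)   # expert-judgement uncertainty
        if random.random() < p:               # does the event occur this year?
            total += downtime * frac          # minutes lost per customer
    return total

samples = [sample_cml() for _ in range(100_000)]
print("mean CML/year:", sum(samples) / len(samples))
```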
Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.
Xin, Cao; Chongshi, Gu
2016-01-01
Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and randomness of the failure criterion, design parameters, and measured data. A credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to the risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
Space Shuttle Stiffener Ring Foam Failure, a Non-Conventional Approach
NASA Technical Reports Server (NTRS)
Howard, Philip M.
2007-01-01
The Space Shuttle makes use of the excellent properties of rigid polyurethane foam for cryogenic tank insulation and as structural protection on the solid rocket boosters. When foam applications debond, classical methods of analysis do not always provide the root cause of the foam failure. Realizing that foam is the ideal medium to document and preserve its own mode of failure, thin sectioning was seen as a logical approach for foam failure analysis. Thin sectioning in two directions, both horizontal and vertical to the application, was chosen to observe the three-dimensional morphology of the foam cells. The foam cell morphology provided a much greater understanding of the failure modes than previously achieved.
Probabilistic structural analysis methods for space transportation propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.
1991-01-01
Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.
Determination of UAV pre-flight Checklist for flight test purpose using qualitative failure analysis
NASA Astrophysics Data System (ADS)
Hendarko; Indriyanto, T.; Syardianto; Maulana, F. A.
2018-05-01
Safety aspects are of paramount importance in flight, especially in the flight test phase. Before performing any flight test of either manned or unmanned aircraft, one should include pre-flight checklists as a required safety document in the flight test plan. This paper reports on the development of a new approach for determining pre-flight checklists for UAV flight tests based on the aircraft's failure analysis. Lapan's LSA (Light Surveillance Aircraft) is used as a case study, assuming this aircraft has been transformed into an unmanned version. Failure analysis is performed on the LSA using the fault tree analysis (FTA) method. The analysis is focused on the propulsion system and the flight control system, as failure of these systems would lead to catastrophic events. The pre-flight checklist of the UAV is then constructed based on the basic causes obtained from the failure analysis.
Comprehensive risk assessment method of catastrophic accident based on complex network properties
NASA Astrophysics Data System (ADS)
Cui, Zhen; Pang, Jun; Shen, Xiaohong
2017-09-01
On the macro level, the structural properties of the network, together with the electrical characteristics of the micro components, determine the risk of cascading failures. Because cascading failures are a dynamically developing process, not only the direct risk but also the potential risk should be considered. In this paper, the direct and potential risks of failures are considered comprehensively based on uncertain risk analysis theory and connection number theory; uncertain correlation is quantified by node degree and node clustering coefficient, and a comprehensive risk indicator of failure is established. The proposed method is validated by simulation on an actual power grid: a network is modeled according to the actual grid and the rationality of the proposed method is verified.
NASA Technical Reports Server (NTRS)
Williams, R. E.; Kruger, R.
1980-01-01
Estimation procedures are described for measuring component failure rates, for comparing the failure rates of two different groups of components, and for formulating confidence intervals for testing hypotheses (based on failure rates) that the two groups perform similarly or differently. Appendix A contains an example of an analysis in which these methods are applied to investigate the characteristics of two groups of spacecraft components. The estimation procedures are adaptable to system level testing and to monitoring failure characteristics in orbit.
Optimized Vertex Method and Hybrid Reliability
NASA Technical Reports Server (NTRS)
Smith, Steven A.; Krishnamurthy, T.; Mason, B. H.
2002-01-01
A method of calculating the fuzzy response of a system is presented. This method, called the Optimized Vertex Method (OVM), is based upon the vertex method but requires considerably fewer function evaluations. The method is demonstrated by calculating the response membership function of strain-energy release rate for a bonded joint with a crack. The possibility of failure of the bonded joint was determined over a range of loads. After completing the possibilistic analysis, the possibilistic (fuzzy) membership functions were transformed to probability density functions and the probability of failure of the bonded joint was calculated. This approach is called a possibility-based hybrid reliability assessment. The possibility and probability of failure are presented and compared to a Monte Carlo Simulation (MCS) of the bonded joint.
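For orientation, the basic (non-optimized) vertex method can be sketched in a few lines. The response function and alpha-cut intervals below are illustrative stand-ins, not the bonded-joint strain-energy-release-rate model, and the OVM's reduction in function evaluations is not reproduced here:

```python
import itertools

def vertex_method(f, intervals):
    """Evaluate f at every vertex of the input-interval hypercube and
    return the response interval. Valid when f is monotonic in each
    variable over the cut, the usual vertex-method assumption."""
    vals = [f(*v) for v in itertools.product(*intervals)]
    return min(vals), max(vals)

# Toy response with the right flavour (load-squared over stiffness);
# this is not the paper's fracture model.
g = lambda P, E: 1e-3 * P ** 2 / E

# Alpha-cut intervals for a fuzzy load P (N) and modulus E (Pa):
print(vertex_method(g, [(900.0, 1100.0), (60e9, 80e9)]))
```

Repeating this over a ladder of alpha-cuts yields the fuzzy response membership function that the paper then transforms into a probability density for the hybrid reliability assessment.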
Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R
2005-01-01
Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities. PMID:15805453
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
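A minimal load-resistance Monte Carlo estimator of the kind recommended might look like this; the distributions and parameters are placeholders, not the paper's:

```python
import random

random.seed(0)

def failure_probability(n=200_000):
    """Crude Monte Carlo on a load-resistance model: failure when the
    sampled load exceeds the sampled resistance."""
    failures = 0
    for _ in range(n):
        load = random.gauss(300.0, 40.0)             # load effect, MPa
        resistance = random.lognormvariate(6.0, 0.08)  # median exp(6) ~ 403 MPa
        if load > resistance:
            failures += 1
    return failures / n

print(failure_probability())   # roughly 2e-2 with these placeholder numbers
```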
Variation of Time Domain Failure Probabilities of Jack-up with Wave Return Periods
NASA Astrophysics Data System (ADS)
Idris, Ahmad; Harahap, Indra S. H.; Ali, Montassir Osman Ahmed
2018-04-01
This study evaluated failure probabilities of jack-up units within the framework of time-dependent reliability analysis, using uncertainty from different sea states representing different return periods of the design wave. Surface elevation for each sea state was represented by the Karhunen-Loeve expansion method using the eigenfunctions of prolate spheroidal wave functions in order to obtain the wave load. The stochastic wave load was propagated through a simplified jack-up model developed in commercial software to obtain the structural response due to the wave loading. Analysis of the stochastic response to determine the failure probability for excessive deck displacement in the framework of time-dependent reliability analysis was performed with Matlab codes developed on a personal computer. Results from the study indicate that the failure probability increases with the severity of the sea state, representing a longer return period. Although the results agree with those of a study of a similar jack-up model using a time-independent method at higher values of maximum allowable deck displacement, they contrast at lower values of the criterion, where that study reported failure probability decreasing with increasing severity of the sea state.
ERIC Educational Resources Information Center
Brookhart, Susan M.; And Others
1997-01-01
Process Analysis is described as a method for identifying and measuring the probability of events that could cause the failure of a program, resulting in a cause-and-effect tree structure of events. The method is illustrated through the evaluation of a pilot instructional program at an elementary school. (SLD)
NASA Astrophysics Data System (ADS)
Mulyana, Cukup; Muhammad, Fajar; Saad, Aswad H.; Mariah, Riveli, Nowo
2017-03-01
The storage tank is the most critical component in an LNG regasification terminal. It carries the risk of failures and accidents that impact human health and the environment. Risk assessment is conducted to detect and reduce the risk of failure in the storage tank. The aim of this research is to determine and calculate the probability of failure in a regasification unit of LNG. In this case, the failure is caused by Boiling Liquid Expanding Vapor Explosion (BLEVE) and jet fire in the LNG storage tank component. The failure probability can be determined using Fault Tree Analysis (FTA). In addition, the impact of the generated heat radiation is calculated. Fault trees for BLEVE and jet fire on the storage tank component were constructed, yielding a failure probability of 5.63 × 10⁻¹⁹ for BLEVE and 9.57 × 10⁻³ for jet fire. The failure probability for jet fire is high enough that it needed to be reduced, which was done by customizing the PID scheme of the LNG regasification unit in pipeline number 1312 and unit 1. The failure probability after customization is 4.22 × 10⁻⁶.
Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin
2017-08-04
In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.
Co-Constructed Failure Narratives in Mathematics Tutoring
ERIC Educational Resources Information Center
DeLiema, David
2017-01-01
The ideas students have about what causes math failure are known to impact motivation. This paper throws light on how attributions of failure are negotiated during math tutoring, between 4th/5th graders and volunteer tutors, at a non-profit STEM-based after-school program. The study employs methods of interaction analysis on a small number of…
ERIC Educational Resources Information Center
Bishop, Matthew J.; Bybee, Taige S.; Lambert, Michael J.; Burlingame, Gary M.; Wells, M. Gawain; Poppleton, Landon E.
2005-01-01
Psychotherapy outcome can be enhanced by early identification of potential treatment failures before they leave treatment. In adults, compelling data are emerging that provide evidence that an early warning system that identifies potential treatment failures can be developed and applied to enhance outcome. The present study reports an analysis of…
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R
2016-09-21
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices such as inkjet printing suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement.
NASA Astrophysics Data System (ADS)
Kuntamalla, Srinivas; Lekkala, Ram Gopal Reddy
2014-10-01
Heart rate variability (HRV) is an important dynamic variable of the cardiovascular system, which operates on multiple time scales. In this study, Multiscale entropy (MSE) analysis is applied to HRV signals taken from Physiobank to discriminate Congestive Heart Failure (CHF) patients from healthy young and elderly subjects. The discrimination power of the MSE method decreases as the amount of data is reduced, and the lowest amount of data at which there is a clear discrimination between CHF and normal subjects is found to be 4000 samples. Further, this method failed to discriminate CHF from healthy elderly subjects. In view of this, the Reduced Data Dualscale Entropy Analysis method is proposed to reduce the data size required (to as low as 500 samples) for clearly discriminating CHF patients from young and elderly subjects with only two scales. Further, an easy-to-interpret index is derived using this new approach for the diagnosis of CHF. This index shows 100% accuracy and correlates well with the pathophysiology of heart failure.
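For readers unfamiliar with MSE, a compact sketch of its two building blocks, coarse-graining and sample entropy, follows. It uses a synthetic RR-interval series rather than Physiobank data, and the O(n²) SampEn implementation is a simple reference version, not an optimized one:

```python
import math
import random

def coarse_grain(x, scale):
    """MSE coarse-graining: average consecutive non-overlapping windows."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r_frac=0.15):
    """SampEn(m, r) = -ln(A/B), with B and A the counts of m- and
    (m+1)-length template matches (self-matches excluded) within
    tolerance r, taken here as a fraction of the series SD."""
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    r = r_frac * sd

    def matches(length):
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length + 1):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(length)):
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

random.seed(2)
rr = [0.8 + 0.05 * random.random() for _ in range(600)]   # synthetic RR series
for scale in (1, 2, 5):
    print(scale, round(sample_entropy(coarse_grain(rr, scale)), 3))
```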
Light water reactor lower head failure analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rempe, J.L.; Chavez, S.A.; Thinnes, G.L.
1993-10-01
This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and for examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response.
NASA Technical Reports Server (NTRS)
Bueno, R. A.
1977-01-01
Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft applications are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found satisfactory, but problems in correctly identifying the mode of a failure may arise. These issues are closely examined, as well as the sensitivity of GLR to modeling errors. The advantages and disadvantages of this technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.
[Comments on the use of the "life-table method" in orthopedics].
Hassenpflug, J; Hahne, H J; Hedderich, J
1992-01-01
In the description of long-term results, e.g. of joint replacements, survivorship analysis is increasingly used in orthopaedic surgery. Survivorship analysis describes the frequency of failure over time and is thus more informative than global statements in percentages. The relative probability of failure for fixed intervals is drawn from the number of patients followed and the frequency of failure. The complementary probabilities of success are linked in their temporal sequence, thus representing the probability of survival at a fixed endpoint. A necessary condition for the use of this procedure is the exact definition of the moment and manner of failure. How to establish survivorship tables is described.
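A minimal actuarial life-table (survivorship) calculation of the kind described, run on invented follow-up data, might look like this:

```python
def survivorship_table(intervals):
    """Classic actuarial life-table for implant survival.
    Each interval: (number at risk at start, failures, withdrawals).
    Withdrawals are assumed at risk for half the interval."""
    survival = 1.0
    for at_risk, failed, withdrawn in intervals:
        effective = at_risk - withdrawn / 2.0   # effective number at risk
        p_fail = failed / effective             # interval failure probability
        survival *= (1.0 - p_fail)              # link complementary successes
        print(f"n={at_risk:4d} failures={failed:2d} "
              f"interval survival={1 - p_fail:.3f} cumulative={survival:.3f}")

# Invented yearly follow-up data for a joint replacement cohort:
survivorship_table([(100, 2, 5), (93, 1, 8), (84, 3, 10), (71, 2, 12)])
```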
Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D.; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio
2017-01-01
AIM To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. METHODS The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases), and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were performed for the impact of each risk factor included in the RSS on graft failure at 1y. RESULTS The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis and lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had a previous blood transfusion; thus, this factor had no impact. CONCLUSION After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y. PMID:28393027
Tutoring for Success: Empowering Graduate Nurses After Failure on the NCLEX-RN.
Lutter, Stacy L; Thompson, Cheryl W; Condon, Marian C
2017-12-01
Failure on the National Council Licensure Examination for Registered Nurses (NCLEX-RN) is a devastating experience. Most research related to NCLEX-RN is focused on predicting and preventing failure. Despite these efforts, more than 20,000 nursing school graduates experience failure on the NCLEX-RN each year, and there is a paucity of literature regarding remediation after failure. The aim of this article is to describe an individualized tutoring approach centered on establishing a trusting relationship and incorporating two core strategies for remediation: the nugget method, and a six-step strategy for question analysis. This individualized tutoring method has been used by three nursing faculty with a 95% success rate on an NCLEX retake attempt. Further research is needed to identify the elements of this tutoring method that influence success. [J Nurs Educ. 2017;56(12):758-761.]. Copyright 2017, SLACK Incorporated.
Kawakubo, Kazumichi; Kawakami, Hiroshi; Toyokawa, Yoshihide; Otani, Koichi; Kuwatani, Masaki; Abe, Yoko; Kawahata, Shuhei; Kubo, Kimitoshi; Kubota, Yoshimasa; Sakamoto, Naoya
2015-01-01
Endoscopic double self-expandable metallic stent (SEMS) placement by the partial stent-in-stent (PSIS) method has been reported to be useful for the management of unresectable hilar malignant biliary obstruction. However, it is technically challenging, and the optimal SEMS for the procedure remains unknown. The aim of this study was to identify the risk factors for technical failure of endoscopic double SEMS placement for unresectable malignant hilar biliary obstruction (MHBO). Between December 2009 and May 2013, 50 consecutive patients with MHBO underwent endoscopic double SEMS placement by the PSIS method. We retrospectively evaluated the rate of successful double SEMS placement and identified the risk factors for technical failure. The technical success rate for double SEMS placement was 82.0% (95% confidence interval [CI]: 69.2-90.2). On univariate analysis, the rate of technical failure was high in patients with metastatic disease and unilateral placement. Multivariate analysis revealed that metastatic disease was a significant risk factor for technical failure (odds ratio: 9.63, 95% CI: 1.11-105.5). The subgroup analysis after double guidewire insertion showed that the rate of technical success was higher in the laser-cut type SEMS with a large mesh and thick delivery system than in the braided type SEMS with a small mesh and thick delivery system. Metastatic disease was a significant risk factor for technical failure of double SEMS placement for unresectable MHBO. The laser-cut type SEMS with a large mesh and thin delivery system might be preferable for the PSIS procedure. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.
NASA Astrophysics Data System (ADS)
DELİCE, Yavuz
2015-04-01
Highways, located within cities and between them, are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur from highway project design through construction and operation, and later during maintenance and repair, have to be taken into consideration, and assessment of the risks posed by such adverse situations is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways; the assets at risk and the impacts of the events must also be examined and rated individually. With the realization of these activities, intended improvements against natural hazards and disasters are made using the Failure Mode Effects Analysis (FMEA) method, and their effects are analyzed in further work. FMEA is a useful method to identify failure modes and their effects, to prioritize failure rates and effects, and to find the most economic and effective solution. Beyond the measures taken for the risks identified by this analysis, the method may also provide public institutions with information about the nature of these risks when required; thus the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments, the most important of which can be listed as follows: natural disasters, comprising meteorologically based events (floods, severe storms, tropical storms, winter storms, avalanches, etc.) and geologically based events (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.); and human-originated disasters, such as transport (traffic) accidents and events originating from road surface defects (icing, signaling malfunctions, fire or explosion, etc.). In this study, risk analysis of urban and intercity motorways against natural disasters and hazards was performed with the FMEA method, and solutions were proposed for these risks. Keywords: Failure Modes Effects Analysis (FMEA), Pareto Analyses (PA), Highways, Risk Management.
Giardina, M; Castiglia, F; Tomarchio, E
2014-12-01
Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
SU-E-T-627: Failure Modes and Effect Analysis for Monthly Quality Assurance of Linear Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Xiao, Y; Wang, J
2014-06-15
Purpose: To develop and implement a failure mode and effect analysis (FMEA) on routine monthly Quality Assurance (QA) tests (the physical tests part) of a linear accelerator. Methods: A systematic failure mode and effect analysis was performed for monthly QA procedures. A detailed process tree of monthly QA was created and potential failure modes were defined. Each failure mode may have many influencing factors. For each factor, a risk probability number (RPN) was calculated from the product of the probability of occurrence (O), the severity of effect (S), and the detectability of the failure (D). The RPN scores are in a range of 1 to 1000, with higher scores indicating stronger correlation to a given influencing factor of a failure mode. Five medical physicists in our institution were responsible for discussing and defining the O, S, and D values. Results: 15 possible failure modes were identified, the RPN scores of all their influencing factors ranged from 8 to 150, and a checklist of FMEA in monthly QA was drawn up. The system showed consistent and accurate response to erroneous conditions. Conclusion: Influencing factors with RPN greater than 50 were considered highly correlated factors of a given out-of-tolerance monthly QA test. FMEA is a fast and flexible tool to develop and implement a quality management (QM) framework for monthly QA, which improved the efficiency of our QA team. The FMEA work may incorporate more quantification and monitoring functions in the future.
Fault tree analysis for system modeling in case of intentional EMI
NASA Astrophysics Data System (ADS)
Genender, E.; Mleczko, M.; Döring, O.; Garbe, H.; Potthast, S.
2011-08-01
The complexity of modern systems on the one hand and the rising threat of intentional electromagnetic interference (IEMI) on the other increase the necessity for systematic risk analysis. Most of the problems cannot be treated deterministically, since slight changes in the configuration (source, position, polarization, ...) can dramatically change the outcome of an event. For that purpose, methods known from probabilistic risk analysis can be applied. One of the most common approaches is fault tree analysis (FTA). The FTA is used to determine the system failure probability and also the main contributors to its failure. In this paper the fault tree analysis is introduced and a possible application of the method is shown using a small computer network as an example. The constraints of this method are explained and conclusions for further research are drawn.
Analysis for the Progressive Failure Response of Textile Composite Fuselage Frames
NASA Technical Reports Server (NTRS)
Johnson, Eric R.; Boitnott, Richard L. (Technical Monitor)
2002-01-01
A part of aviation accident mitigation is a crashworthy airframe structure, and an important measure of merit for a crashworthy structure is the amount of kinetic energy that can be absorbed in the crush of the structure. Prediction of the energy absorbed from finite element analyses requires modeling the progressive failure sequence. Progressive failure modes may include material degradation, fracture and crack growth, and buckling and collapse. The design of crashworthy airframe components will benefit from progressive failure analyses that have been validated by tests. The subject of this research is the development of a progressive failure analysis for a textile composite, circumferential fuselage frame subjected to a quasi-static, crash-type load. The test data for the frame are reported, and these data are used to develop and to validate methods for the progressive failure response.
Magnezi, Racheli; Hemi, Asaf; Hemi, Rina
2016-01-01
Background Risk management in health care systems applies to all hospital employees and directors as they deal with human life and emergency routines. There is a constant need to decrease risk and increase patient safety in the hospital environment. The purpose of this article is to review the laboratory testing procedures for parathyroid hormone and adrenocorticotropic hormone (which are characterized by short half-lives) and to track failure modes and risks, and offer solutions to prevent them. During a routine quality improvement review at the Endocrine Laboratory in Tel Hashomer Hospital, we discovered these tests are frequently repeated unnecessarily due to multiple failures. The repetition of the tests inconveniences patients and leads to extra work for the laboratory and logistics personnel as well as the nurses and doctors who have to perform many tasks with limited resources. Methods A team of eight staff members accompanied by the Head of the Endocrine Laboratory formed the team for analysis. The failure mode and effects analysis model (FMEA) was used to analyze the laboratory testing procedure and was designed to simplify the process steps and indicate and rank possible failures. Results A total of 23 failure modes were found within the process, 19 of which were ranked by level of severity. The FMEA model prioritizes failures by their risk priority number (RPN). For example, the most serious failure was the delay after the samples were collected from the department (RPN =226.1). Conclusion This model helped us to visualize the process in a simple way. After analyzing the information, solutions were proposed to prevent failures, and a method to completely avoid the top four problems was also developed. PMID:27980440
EVALUATION OF SAFETY IN A RADIATION ONCOLOGY SETTING USING FAILURE MODE AND EFFECTS ANALYSIS
Ford, Eric C.; Gaudette, Ray; Myers, Lee; Vanderver, Bruce; Engineer, Lilly; Zellars, Richard; Song, Danny Y.; Wong, John; DeWeese, Theodore L.
2013-01-01
Purpose Failure mode and effects analysis (FMEA) is a widely used tool for prospectively evaluating safety and reliability. We report our experiences in applying FMEA in the setting of radiation oncology. Methods and Materials We performed an FMEA analysis for our external beam radiation therapy service, which consisted of the following tasks: (1) create a visual map of the process, (2) identify possible failure modes; assign risk probability numbers (RPN) to each failure mode based on tabulated scores for the severity, frequency of occurrence, and detectability, each on a scale of 1 to 10; and (3) identify improvements that are both feasible and effective. The RPN scores can span a range of 1 to 1000, with higher scores indicating the relative importance of a given failure mode. Results Our process map consisted of 269 different nodes. We identified 127 possible failure modes with RPN scores ranging from 2 to 160. Fifteen of the top-ranked failure modes were considered for process improvements, representing RPN scores of 75 and more. These specific improvement suggestions were incorporated into our practice with a review and implementation by each department team responsible for the process. Conclusions The FMEA technique provides a systematic method for finding vulnerabilities in a process before they result in an error. The FMEA framework can naturally incorporate further quantification and monitoring. A general-use system for incident and near miss reporting would be useful in this regard. PMID:19409731
Probability techniques for reliability analysis of composite materials
NASA Technical Reports Server (NTRS)
Wetherhold, Robert C.; Ucci, Anthony M.
1994-01-01
Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
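To illustrate the gap between simple Monte Carlo and importance sampling on a rare failure event, consider the following sketch in standard normal space; the linear limit state with reliability index beta = 3 is a textbook toy, not one of the report's composite examples:

```python
import math
import random

random.seed(3)

# Failure when g(u1, u2) < 0; exact pf = Phi(-3) ~ 1.35e-3 for this g.
def g(u1, u2):
    return 3.0 - (u1 + u2) / math.sqrt(2.0)

def phi2(u1, u2, m1=0.0, m2=0.0):
    """Density of two independent unit normals centred at (m1, m2)."""
    return math.exp(-0.5 * ((u1 - m1) ** 2 + (u2 - m2) ** 2)) / (2 * math.pi)

n = 50_000

# Simple (crude) Monte Carlo:
crude = sum(g(random.gauss(0, 1), random.gauss(0, 1)) < 0
            for _ in range(n)) / n

# Importance sampling centred at the most probable failure point
# u* = (3/sqrt(2), 3/sqrt(2)) on the limit-state surface:
m = 3.0 / math.sqrt(2.0)
acc = 0.0
for _ in range(n):
    u1, u2 = random.gauss(m, 1), random.gauss(m, 1)
    if g(u1, u2) < 0:
        acc += phi2(u1, u2) / phi2(u1, u2, m, m)   # likelihood-ratio weight
print("crude MC:", crude, " importance sampling:", acc / n, " exact: 0.00135")
```

With the same sample budget, the importance-sampling estimate scatters far less around the exact value, which is why the advanced technique is attractive for the small failure probabilities typical of composite reliability work.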
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossahebi, S; Feigenberg, S; Nichols, E
Purpose: GammaPod™, the first stereotactic radiotherapy device for early stage breast cancer treatment, has been recently installed and commissioned at our institution. A multidisciplinary working group applied the failure mode and effects analysis (FMEA) approach to perform a risk analysis. Methods: FMEA was applied to the GammaPod™ treatment process by: 1) generating process maps for each stage of treatment; 2) identifying potential failure modes and outlining their causes and effects; 3) scoring the potential failure modes using the risk priority number (RPN) system based on the product of severity, frequency of occurrence, and detectability (each ranging 1-10). An RPN higher than 150 was set as the threshold of minimal concern for risk. For these high-risk failure modes, potential quality assurance procedures and risk control techniques have been proposed. A new set of severity, occurrence, and detectability values was re-assessed in the presence of the suggested mitigation strategies. Results: In the single-day image-and-treat workflow, 19, 22, and 27 sub-processes were identified for the stages of simulation, treatment planning, and delivery, respectively. During the simulation stage, 38 potential failure modes were found and scored, in terms of RPN, in the range of 9-392. 34 potential failure modes were analyzed in treatment planning with a score range of 16-200. For the treatment delivery stage, 47 potential failure modes were found with an RPN score range of 16-392. The most critical failure modes consisted of breast-cup pressure loss and incorrect target localization due to patient upper-body alignment inaccuracies. The final RPN scores of these failure modes, based on the recommended actions, were assessed to be below 150. Conclusion: The FMEA risk analysis technique was applied to the treatment process of GammaPod™, a new stereotactic radiotherapy technology. Application of systematic risk analysis methods is projected to lead to improved quality of GammaPod™ treatments. Ying Niu and Cedric Yu are affiliated with Xcision Medical Systems.
Strength and life criteria for corrugated fiberboard by three methods
Thomas J. Urbanik
1997-01-01
The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Structural Reliability Analysis and Optimization: Use of Approximations
NASA Technical Reports Server (NTRS)
Grandhi, Ramana V.; Wang, Liping
1999-01-01
This report is intended for the demonstration of function approximation concepts and their applicability in reliability analysis and design. Particularly, approximations in the calculation of the safety index, failure probability and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided. Definitions relevant to the stated objectives have been taken from standard text books. The idea of function approximations is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. There are approximations in calculating the failure probability of a limit state function. The first one, which is most commonly discussed, is how the limit state is approximated at the design point. Most of the time this could be a first-order Taylor series expansion, also known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), also known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or the most probable failure point (MPP), is identified. For iteratively finding this point, again the limit state is approximated. The accuracy and efficiency of the approximations make the search process quite practical for analysis-intensive approaches such as the finite element methods; therefore, the crux of this research is to develop excellent approximations for MPP identification and also different approximations including the higher-order reliability methods (HORM) for representing the failure surface. This report is divided into several parts to emphasize different segments of the structural reliability analysis and design. Broadly, it consists of mathematical foundations, methods and applications. Chapter 1 discusses the fundamental definitions of the probability theory, which are mostly available in standard text books. Probability density function descriptions relevant to this work are addressed. In Chapter 2, the concept and utility of function approximation are discussed for a general application in engineering analysis. Various forms of function representations and the latest developments in nonlinear adaptive approximations are presented with comparison studies. Research work accomplished in reliability analysis is presented in Chapter 3. First, the definitions of the safety index and the most probable point of failure are introduced. Efficient ways of computing the safety index with fewer iterations are emphasized. In Chapter 4, the probability of failure prediction is presented using first-order, second-order and higher-order methods. System reliability methods are discussed in Chapter 5. Chapter 6 presents optimization techniques for the modification and redistribution of structural sizes for improving the structural reliability. The report also contains several appendices on probability parameters.
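As a concrete companion to the safety-index discussion, here is a sketch of the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for locating the MPP, followed by the FORM estimate pf ≈ Φ(−β). The mildly nonlinear limit state is a toy chosen so the iteration converges cleanly; it is not taken from the report:

```python
import math

def hlrf(g, grad, u0=(0.0, 0.0), iters=20):
    """Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space.
    g: limit state function; grad: its gradient. Returns (beta, MPP)."""
    u = list(u0)
    for _ in range(iters):
        gv = g(*u)
        dg = grad(*u)
        norm2 = sum(d * d for d in dg)
        # HL-RF update: u_{k+1} = [(dg . u_k - g(u_k)) / |dg|^2] * dg
        scale = (sum(d * ui for d, ui in zip(dg, u)) - gv) / norm2
        u = [scale * d for d in dg]
    beta = math.sqrt(sum(ui * ui for ui in u))
    return beta, u

# Toy nonlinear limit state: g = 4 - u1 - u2 + 0.05 (u1 - u2)^2
gfun = lambda u1, u2: 4.0 - u1 - u2 + 0.05 * (u1 - u2) ** 2
ggrad = lambda u1, u2: (-1.0 + 0.1 * (u1 - u2), -1.0 - 0.1 * (u1 - u2))

beta, u_star = hlrf(gfun, ggrad)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # FORM: pf = Phi(-beta)
print("beta:", beta, "MPP:", u_star, "pf(FORM):", pf)
```

By symmetry the MPP here is (2, 2), giving β = 2√2 and a first-order failure probability near 2.3 × 10⁻³; SORM would add a curvature correction at the same point.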
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schubert, L; Westerly, D; Vinogradskiy, Y
Purpose: Collisions between treatment equipment and patients are potentially catastrophic. Modern technology now commonly involves automated remote motion during imaging and treatment, yet a systematic assessment to identify and mitigate collision risks has yet to be performed. Failure modes and effects analysis (FMEA) is a method of risk assessment that has been increasingly used in healthcare, yet can be resource intensive. This work presents an efficient approach to FMEA to identify collision risks and implement practical interventions within a modern radiation therapy department. Methods: Potential collisions (e.g. failure modes) were assessed for all treatment and simulation rooms by teams consisting of physicists, therapists, and radiation oncologists. Failure modes were grouped into classes according to similar characteristics. A single group meeting was held to identify implementable interventions for the highest priority classes of failure modes. Results: A total of 60 unique failure modes were identified by 6 different teams of physicists, therapists, and radiation oncologists. Failure modes were grouped into four main classes: specific patient setups, automated equipment motion, manual equipment motion, and actions in QA or service mode. Two of these classes, specific patient setups and automated equipment motion, were identified as being high priority in terms of severity of consequence and addressability by interventions. The two highest risk classes consisted of 33 failure modes (55% of the total). In a single one-hour group meeting, 6 interventions were identified. Those interventions addressed 100% of the high-risk classes of failure modes (55% of all failure modes identified). Conclusion: A class-based approach to FMEA was developed to efficiently identify collision risks and implement interventions in a modern radiation oncology department. Failure modes and interventions will be listed, and a comparison of this approach against traditional FMEA methods will be presented.
Least Squares Shadowing sensitivity analysis of chaotic limit cycle oscillations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Qiqi, E-mail: qiqi@mit.edu; Hu, Rui, E-mail: hurui@mit.edu; Blonigan, Patrick, E-mail: blonigan@mit.edu
2014-06-15
The adjoint method, among other sensitivity analysis methods, can fail in chaotic dynamical systems. The result from these methods can be too large, often by orders of magnitude, when the result is the derivative of a long time averaged quantity. This failure is known to be caused by ill-conditioned initial value problems. This paper overcomes this failure by replacing the initial value problem with the well-conditioned “least squares shadowing (LSS) problem”. The LSS problem is then linearized in our sensitivity analysis algorithm, which computes a derivative that converges to the derivative of the infinitely long time average. We demonstrate our algorithm in several dynamical systems exhibiting both periodic and chaotic oscillations.
NASA Technical Reports Server (NTRS)
Sanchez, Christopher M.
2011-01-01
NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation on composites that have been impact damaged is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.
Seismic analysis for translational failure of landfills with retaining walls.
Feng, Shi-Jin; Gao, Li-Ya
2010-11-01
In the seismic impact zone, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for seismic analysis of translational failures of landfills with retaining walls. An approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on the translational failures of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases with increasing seismic coefficient, while it increases quickly with increasing minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall will lead to serious underestimation of the factor of safety. In addition, an approximate solution for the yield acceleration coefficient of the landfill is presented based on the developed method.
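The role of the horizontal seismic coefficient in such a limit-equilibrium analysis can be sketched with a single planar wedge. This is a textbook pseudostatic calculation, not the paper's three-part wedge formulation, and all input values below are illustrative.

```python
import numpy as np

def pseudostatic_fs(W, theta_deg, phi_deg, c, L, kh):
    """Factor of safety of a single planar wedge under a horizontal
    pseudostatic seismic force kh*W (simplified limit equilibrium).

    W     : wedge weight per unit width
    theta : slip plane inclination; phi : friction angle on the slip plane
    c     : cohesion; L : slip plane length; kh : seismic coefficient
    """
    th, ph = np.radians(theta_deg), np.radians(phi_deg)
    resisting = c * L + (W * np.cos(th) - kh * W * np.sin(th)) * np.tan(ph)
    driving = W * np.sin(th) + kh * W * np.cos(th)
    return resisting / driving

# FS drops as the seismic coefficient grows, as the parameter study reports
for kh in (0.0, 0.1, 0.2):
    print(kh, round(pseudostatic_fs(W=500.0, theta_deg=20, phi_deg=25,
                                    c=5.0, L=30.0, kh=kh), 3))
```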
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2006-01-01
A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
Wang, Peijie; Zhao, Hui; Sun, Jianguo
2016-12-01
Interval-censored failure time data occur in many fields, such as demography, economics, medical research, and reliability, and many inference procedures for them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For this problem, an estimated sieve maximum-likelihood approach is proposed for data arising from the proportional hazards frailty model, and a two-step procedure is presented for estimation. In addition, the asymptotic properties of the proposed estimators of the regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study.
Analysis of middle bearing failure in rotor jet engine using tip-timing and tip-clearance techniques
NASA Astrophysics Data System (ADS)
Rzadkowski, R.; Rokicki, E.; Piechowski, L.; Szczepanik, R.
2016-08-01
The reported problem is the failure of the middle bearing in an aircraft rotor engine. Tip-timing, tip-clearance, and variance analyses are carried out on a seventh-stage compressor rotor blade above the middle bearing. The experimental analyses concern both an aircraft engine with a middle bearing in good working order and an engine with a damaged middle bearing. A numerical analysis of seventh-stage blade free vibration is conducted to explain the experimental results. This appears to be an effective method of predicting middle bearing failure. The results show that the variance first increases in the initial stages of bearing failure, then starts to decrease and stabilize, and then decreases again shortly before complete bearing failure.
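A sketch of the variance-trend computation on synthetic tip-deflection data; the rise-then-fall noise envelope is an assumed stand-in for the signature the authors report, and the window length is arbitrary.

```python
import numpy as np

def rolling_variance(deflection, window):
    """Variance of blade-tip deflection over a sliding window of samples.

    A rising-then-falling variance trend is the signature reported for
    progressive middle-bearing failure.
    """
    x = np.asarray(deflection, dtype=float)
    out = np.empty(len(x) - window + 1)
    for i in range(len(out)):
        out[i] = np.var(x[i:i + window])
    return out

# synthetic tip-timing record: noise whose level first grows, then collapses
rng = np.random.default_rng(0)
level = np.concatenate([np.linspace(1, 4, 5000), np.linspace(4, 0.5, 5000)])
signal = rng.normal(0.0, level)
trend = rolling_variance(signal, window=500)
```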
Analysis for the Progressive Failure Response of Textile Composite Fuselage Frames
NASA Technical Reports Server (NTRS)
Johnson, Eric R.; Boitnott, Richard L. (Technical Monitor)
2002-01-01
A part of aviation accident mitigation is a crashworthy airframe structure, and an important measure of merit for a crashworthy structure is the amount of kinetic energy that can be absorbed in the crush of the structure. Prediction of the energy absorbed from finite element analyses requires modeling the progressive failure sequence. Progressive failure modes may include material degradation, fracture and crack growth, and buckling and collapse. The design of crashworthy airframe components will benefit from progressive failure analyses that have been validated by tests. The subject of this research is the development of a progressive failure analysis for textile composite circumferential fuselage frames subjected to a quasi-static, crash-type load. The test data for these frames are reported, and these data, along with stub column test data, are to be used to develop and validate methods for predicting the progressive failure response.
NASA Technical Reports Server (NTRS)
Wolitz, K.; Brockmann, W.; Fischer, T.
1979-01-01
Acoustic emission analysis, as a quasi-nondestructive test method, makes it possible to differentiate clearly, in judging the total behavior of fiber-reinforced plastic composites, between critical failure modes (in the case of unidirectional composites, fiber fractures) and non-critical failure modes (delamination processes or matrix fractures). A particular advantage is that, for varying pressure demands on the composites, the emitted acoustic pulses can be analyzed with regard to their amplitude distribution. In addition, definite indications as to how the damage occurred can be obtained from the time curves of the emitted acoustic pulses as well as from the particular frequency spectrum. Distinct analogies can be drawn between the various analytical methods with respect to whether the failure modes can be classified as critical or non-critical.
Mechanisms of Diagonal-Shear Failure in Reinforced Concrete Beams analyzed by AE-SiGMA
NASA Astrophysics Data System (ADS)
Ohno, Kentaro; Shimozono, Shinichiro; Sawada, Yosuke; Ohtsu, Masayasu
Serious shear failures in reinforced concrete (RC) structures were reported in the Hanshin-Awaji Earthquake. In particular, it was demonstrated that a diagonal-shear failure could lead to disastrous damage. However, the mechanisms of diagonal-shear failure in RC beams have not been completely clarified yet. In this study, diagonal-shear failure in RC beams is investigated by applying the acoustic emission (AE) method. To identify the source mechanisms of AE signals, the SiGMA (Simplified Green's functions for Moment tensor Analysis) procedure was applied. Prior to four-point bending tests of RC beams, theoretical waveforms were calculated to determine the optimal arrangement of AE sensors. Cracking mechanisms in the experiments were then investigated by applying the SiGMA procedure to the AE waveforms. From the results of the SiGMA analysis, the dominant motions of micro-cracks are found to be shear cracks in all loading stages. As the load increased, the number of tensile cracks increased, and eventually diagonal-shear failure occurred in the shear span. Prior to final failure, a cluster of AE micro-cracks was intensely observed in the shear span. To classify AE sources into tensile and shear cracks, AE parameter analysis was also applied; as a result, most AE hits are classified as tensile cracks. The difference between the results obtained by the AE parameter analysis and by the SiGMA analysis is investigated and discussed.
Practical, transparent prospective risk analysis for the clinical laboratory.
Janssens, Pim Mw
2014-11-01
Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent.
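A minimal sketch of such a PRA scoring pass. The abstract does not give the exact formula combining P and C into R, so a multiplicative R = P × C is assumed here, as are the thresholds and the convention that a low D means a failure is hard to detect; the process steps are hypothetical.

```python
def risk_table(steps, r_threshold=25, d_threshold=4):
    """Flag process steps needing detailed analysis.

    Each step carries Probability (P), Consequence (C), and Detection (D)
    scores on 10-point scales. R = P * C is an assumed combination rule;
    steps are flagged when R is high or the failure is poorly detectable.
    """
    flagged = []
    for name, p, c, d in steps:
        r = p * c
        if r >= r_threshold or d <= d_threshold:
            flagged.append((name, r, d))
    return flagged

# hypothetical laboratory process steps: (name, P, C, D)
steps = [("sample labelling", 4, 8, 3),
         ("stat centrifugation", 6, 6, 7),
         ("result transcription", 2, 9, 8)]
print(risk_table(steps))
```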
Analysis of rainfall-induced slope instability using a field of local factor of safety
Lu, Ning; Şener-Kaya, Başak; Wayllace, Alexandra; Godt, Jonathan W.
2012-01-01
Slope-stability analyses are mostly conducted by identifying or assuming a potential failure surface and assessing the factor of safety (FS) of that surface. This approach of assigning a single FS to a potentially unstable slope provides little insight on where the failure initiates or on the ultimate geometry and location of a landslide rupture surface. We describe a method to quantify a scalar field of FS based on the concept of the Coulomb stress and the shift in the state of stress toward failure that results from rainfall infiltration. The FS at each point within a hillslope is called the local factor of safety (LFS) and is defined as the ratio of the Coulomb stress at the current state of stress to the Coulomb stress of the potential failure state under the Mohr-Coulomb criterion. Comparative assessment with limit-equilibrium and hybrid finite element limit-equilibrium methods shows that the proposed LFS is consistent with these approaches and yields additional insight into the geometry and location of the potential failure surface and how instability may initiate and evolve with changes in pore water conditions. Quantitative assessments applying the new LFS field method to slopes under infiltration conditions demonstrate that the LFS has the potential to overcome several major limitations in the classical FS methodologies, such as the shape of the failure surface and the inherent underestimation of slope instability. Comparison with infinite-slope methods, including a recent extension to variably saturated conditions, shows further enhancement in assessing shallow landslide occurrence using the LFS methodology. Although we use only a linear elastic solution for the state of stress, with no post-failure analysis that would require more sophisticated elastoplastic or other theories, the LFS provides a new means to quantify the potential instability zones in hillslopes under variably saturated conditions using stress-field based methods.
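The point-wise factor-of-safety idea can be sketched as a Mohr-Coulomb ratio of available strength to mobilized shear stress; the paper's exact Coulomb-stress definition of the LFS may differ in detail, and the inputs below are illustrative.

```python
import numpy as np

def local_fs(sigma1, sigma3, c, phi_deg):
    """Point-wise factor of safety from a Mohr-Coulomb comparison.

    Ratio of the shear strength available on the potential failure plane to
    the shear stress mobilized there, written in terms of the principal
    stresses (compression positive). Evaluated over a grid of stress states,
    this yields a scalar FS field in the spirit of the LFS.
    """
    phi = np.radians(phi_deg)
    radius = 0.5 * (sigma1 - sigma3)  # radius of the stress circle
    strength = c * np.cos(phi) + 0.5 * (sigma1 + sigma3) * np.sin(phi)
    return strength / radius

# a single stress state (kPa); infiltration would lower the effective stresses
print(local_fs(sigma1=120.0, sigma3=40.0, c=10.0, phi_deg=30))
```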
An analysis of the value of spermicides in contraception.
1979-11-01
Development of the so-called modern methods of contraception has somewhat eclipsed interest in traditional methods. However, spermicides are still important for many couples, and their use appears to be increasing. A brief history of the use of and research into spermicidal contraceptives is presented. The limitations of spermicides are the necessity for use at the time of intercourse and their high failure rate. Estimates of the failure rates of spermicides have ranged from 0.3 pregnancies per 100 woman-years of use to nearly 40, depending on the product used and the population tested. Just as their use depends on various social factors, so does their failure rate. Characteristics of the user determine failure rates. Motivation is important in lowering failure rates, as are education, the intracouple relationship, and previous experience with spermicides. Method failure is also caused by defects in the product, either in the active ingredient of the spermicide or in the base carrier. The main advantage of spermicidal contraception is its safety. Limited research is currently being conducted on spermicides. Areas for improvement in existing spermicides and areas for possible innovation are mentioned.
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Waas, Anthony M.; Bednarcyk, Brett A.; Arnold, Steven M.; Collier, Craig S.
2009-01-01
This preliminary report demonstrates the capabilities of a recently developed software implementation that links the Generalized Method of Cells to explicit finite element analysis, extending a previous development that tied the Generalized Method of Cells to implicit finite elements. The multiscale framework, which uses explicit finite elements at the global scale and the Generalized Method of Cells at the microscale, is detailed. This implementation is suitable both for dynamic mechanics problems and for static problems exhibiting drastic and sudden changes in material properties, which often encounter convergence issues with commercial implicit solvers. Progressive failure analysis of stiffened and un-stiffened fiber-reinforced laminates subjected to normal blast pressure loads was performed and is used to demonstrate the capabilities of this framework. The focus of this report is to document the development of the software implementation; thus, no comparison between the results of the models and experimental data is drawn. However, the validity of the results is assessed qualitatively through observation of failure paths, stress contours, and the distribution of system energies.
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L.; Terés, Lluís; Baumann, Reinhard R.
2016-01-01
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates under ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices); in general, the manufacturing yield is much lower than that of the established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement. PMID:27649784
Reliability analysis and initial requirements for FC systems and stacks
NASA Astrophysics Data System (ADS)
Åström, K.; Fontell, E.; Virtanen, S.
In the year 2000, Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost-competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program, Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of SOFC technology. At Wärtsilä, methods have been implemented for analysing the system with respect to reliability and safety, as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method has been developed for assessing reliability and critical-failure predictability requirements for fuel cell stacks in a system consisting of several stacks. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical-failure predictability and operating strategy on system reliability and availability. An example configuration, consisting of 5 × 5 stacks (a series of 5 sets of 5 parallel stacks), is analysed with respect to stack reliability requirements as a function of the predictability of critical failures and the Weibull shape factor of the failure rate distributions.
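A Monte Carlo sketch of the 5 × 5 example configuration, under assumptions the abstract does not specify: stack lifetimes are drawn from a two-parameter Weibull distribution, each stack is simply working or failed (collapsing the paper's partially/critically failed states), and a series set is taken as functional while at least m_required of its 5 parallel stacks survive.

```python
import numpy as np

def system_reliability(t, shape, scale, n_sets=5, n_par=5, m_required=4,
                       n_samples=200_000, seed=1):
    """Monte Carlo reliability of 5 series sets of 5 parallel stacks.

    Stack lifetimes are Weibull(shape, scale); the system survives to time t
    if every series set still has at least m_required functional stacks.
    The m_required threshold is an assumption for illustration.
    """
    rng = np.random.default_rng(seed)
    lifetimes = scale * rng.weibull(shape, size=(n_samples, n_sets, n_par))
    ok_per_set = (lifetimes > t).sum(axis=2) >= m_required
    return ok_per_set.all(axis=1).mean()

# illustrative numbers: shape > 1 gives wear-out behaviour, scale in hours
print(system_reliability(t=20_000.0, shape=2.0, scale=60_000.0))
```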
NASA Astrophysics Data System (ADS)
Marhadi, Kun Saptohartyadi
Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging: it couples nonlinear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between the alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage-tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance on designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute-force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant: the alternate load paths must have a required minimum load capability. Robustness analysis of damage-tolerant optimum designs indicates that designs are tailored to the specified damage; a design optimized under one damage specification can be sensitive to other damage scenarios not considered. The effectiveness of existing load path definitions and characterizations was investigated for continuum structures. A load path definition using a relative compliance change measure (the U* field) was demonstrated to be the most useful measure of load path: it provides quantitative information on load path trajectories and qualitative information on the effectiveness of the load path. The use of the U* description of load paths in optimizing structures for effective load paths was investigated.
Numerical simulation of failure behavior of granular debris flows based on flume model tests.
Zhou, Jian; Li, Ye-xun; Jia, Min-cai; Li, Cui-na
2013-01-01
In this study, the failure behaviors of debris flows were studied by flume model tests with artificial rainfall and by numerical simulations (PFC3D). The model tests revealed that grain size distribution had profound effects on the failure mode: failure in slopes of medium sand started with cracks at the crest and took the form of retrogressive toe sliding, and as the proportion of fine particles in the soil increased, the failure mode of the slopes changed to fluidized flow. The discrete element method PFC3D overcomes the continuum hypothesis of traditional mechanics and accounts for the discrete character of the particles. Thus, a numerical simulation model using a coupled liquid-solid method was developed to simulate the debris flows. Compared with the experimental results, the numerical simulations indicated that the failure mode of the medium sand slope was retrogressive toe sliding, while the failure of the fine sand slope was fluidized sliding. The simulation results are consistent with the model tests and theoretical analysis, showing that grain size distribution causes the different failure behaviors of granular debris flows. This research should serve as a guide for exploring the theory of debris flows and for improving their prevention and mitigation.
Koziol, Mateusz; Figlus, Tomasz
2015-12-14
The work aimed to assess the failure progress in a glass fiber-reinforced polymer laminate with a 3D-woven and, as a comparison, a plain-woven reinforcement during static bending, using acoustic emission signals. An innovative method was applied that separates the signal coming from fiber fracture from that coming from matrix fracture, using the energy of each acoustic event as the criterion. The failure progress during static bending was also analyzed by evaluating the vibration signal, which made it possible to validate the acoustic emission results. Acoustic emission as well as vibration signal analysis proved to be good and effective tools for registering failure effects in composite laminates; vibration analysis is more complicated methodologically, yet more precise. The failure progress of the 3D laminate is "safer" and more beneficial than that of the plain-woven laminate: it exhibits less rapid load-capacity drops and a higher fiber effort contribution at the moment of the main laminate failure.
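The energy-criterion separation can be sketched as a simple threshold classifier; the threshold value and event energies below are made up, and in practice the split would be calibrated against the laminate being tested.

```python
import numpy as np

def classify_ae_events(energies, threshold):
    """Split acoustic-emission events into fiber- and matrix-fracture classes
    by event energy, following the separation criterion described above."""
    energies = np.asarray(energies, dtype=float)
    fibre = energies[energies >= threshold]   # high-energy: fiber fracture
    matrix = energies[energies < threshold]   # low-energy: matrix cracking
    return fibre, matrix

events = np.array([0.2, 5.1, 0.7, 12.4, 0.3, 8.8])  # arbitrary energy units
fibre, matrix = classify_ae_events(events, threshold=2.0)
print(len(fibre), len(matrix))
```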
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J.A Forester; V.N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated timeline to identify the critical tasks (those whose failure results in a human failure event (HFE)), and an approach to quantification that is based on explanations of why the HFE might occur.
NASA Technical Reports Server (NTRS)
Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.
2009-01-01
This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated-function (vehicle sub-system), and vehicle "effective" k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion of both qualitative and quantitative improvements to the heuristic method and its potential benefits for ISS supportability engineering analysis.
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance-standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
Failure detection and fault management techniques for flush airdata sensing systems
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Moes, Timothy R.; Leondes, Cornelius T.
1992-01-01
Methods based on chi-squared analysis are presented for detecting system and individual-port failures in the high-angle-of-attack flush airdata sensing (HI-FADS) system on the NASA F-18 High Alpha Research Vehicle. The HI-FADS hardware is introduced, and the aerodynamic model describes measured pressure in terms of dynamic pressure, angle of attack, angle of sideslip, and static pressure. Chi-squared analysis is described in presenting the concept for failure detection and fault management, which includes nominal, iteration, and fault-management modes. A matrix of pressure orifices arranged in concentric circles on the nose of the aircraft provides the measurements to which the regression algorithms are applied. The sensing techniques are applied to F-18 flight data, and two examples are given of the computed angle-of-attack time histories. The failure-detection and fault-management techniques permit the matrix to be multiply redundant, and the chi-squared analysis is shown to be useful in the detection of failures.
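A sketch of the chi-squared screening idea: normalize the pressure residuals by their no-fault scatter and compare the sum of squares against a chi-squared quantile. This shows only the generic test; the HI-FADS implementation wraps it in the nominal, iteration, and fault-management modes described above, and the numbers below are invented.

```python
import numpy as np
from scipy.stats import chi2

def port_failure_check(residuals, sigma, alpha=1e-3):
    """Chi-squared screen for airdata-port failures.

    residuals : measured-minus-model pressures for each orifice
    sigma     : expected (no-fault) standard deviation of each residual
    Flags the system when the normalized residual sum of squares exceeds
    the chi-squared threshold at significance level alpha.
    """
    r = np.asarray(residuals, dtype=float) / sigma
    stat = float(r @ r)
    limit = chi2.ppf(1.0 - alpha, df=len(r))
    return stat, limit, stat > limit

stat, limit, failed = port_failure_check(
    residuals=[12.0, -8.0, 150.0, 5.0], sigma=10.0)
print(stat, limit, failed)  # the large third residual trips the test
```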
Reliability analysis of the F-8 digital fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goodman, H. A.
1981-01-01
The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems giving aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
Fracture Prediction in Plane Elasto-Plastic Problems by the Finite Element Method.
1978-01-01
Fatigue analysis and testing became an integral part of aircraft design. Fatigue analysis frequently took the form of a damage accumulation theory, and design practice dictated that any cracking was to be considered a failure. The loss of a U.S. Air Force F-111 in 1969 initiated a rethinking of airframe design and analysis concepts: failure in this aircraft was traced to a small manufacturing flaw in a wing pivot fitting, not to design-induced fatigue.
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
Top-down and bottom-up definitions of human failure events in human reliability analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald Laurids
2014-10-01
In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches would tend to look first at opportunities for human errors in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down (defined as a subset of the PRA), whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up (derived from a task analysis conducted by human factors experts). The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.
Tonin, Fernanda S; Piazza, Thais; Wiens, Astrid; Fernandez-Llimos, Fernando; Pontarolo, Roberto
2015-12-01
Objective: We aimed to gather evidence of the discontinuation rates owing to adverse events or treatment failure for four recently approved antipsychotics (asenapine, blonanserin, iloperidone, and lurasidone). Methods: A systematic review followed by pairwise meta-analysis and mixed treatment comparison meta-analysis (MTC) was performed, including randomized controlled trials (RCTs) that compared the use of the above-mentioned drugs versus placebo in patients with schizophrenia. An electronic search was conducted in PubMed, Scopus, Science Direct, Scielo, the Cochrane Library, and International Pharmaceutical Abstracts (January 2015). The included trials were at least single blinded. The main outcome measures extracted were discontinuation owing to adverse events and discontinuation owing to treatment failure. Results: Fifteen RCTs were identified (n = 5400 participants) and 13 of them were amenable for use in our meta-analyses. No significant differences were observed between any of the four drugs and placebo as regards discontinuation owing to adverse events, whether in pairwise meta-analysis or in MTC. All drugs presented a better profile than placebo on discontinuation owing to treatment failure, both in pairwise meta-analysis and MTC. Asenapine was found to be the best therapy in terms of tolerability owing to failure, while lurasidone was the worst treatment in terms of adverse events. The evidence around blonanserin is weak. Conclusion: MTCs allowed the creation of two different rank orders of these four antipsychotic drugs in two outcome measures. This evidence-generating method allows direct and indirect comparisons, supporting approval and pricing decisions when lacking sufficient, direct, head-to-head trials.
NASA Technical Reports Server (NTRS)
Vanschalkwyk, Christiaan Mauritz
1991-01-01
Many applications require that a control system be tolerant to the failure of its components. This is especially true for large space-based systems that must work unattended and with long periods between maintenance. Fault tolerance can be obtained by detecting the failure of a control system component, determining which component has failed, and reconfiguring the system so that the failed component is isolated from the controller. Component failure detection experiments that were conducted on an experimental space structure, the NASA Langley Mini-Mast, are presented. Two methodologies for failure detection and isolation (FDI) exist that do not require the specification of failure modes and are applicable to both actuators and sensors: the Failure Detection Filter and the method of Generalized Parity Relations. The latter method was applied to three different sensor types on the Mini-Mast. Failures were simulated in input-output data that were recorded during operation of the Mini-Mast. Both single and double sensor parity relations were tested, and the effect of several design parameters on the performance of these relations is discussed. The detection of actuator failures is also treated. It is shown that in all cases it is possible to identify the parity relations directly from input-output data. Frequency domain analysis is used to explain the behavior of the parity relations.
Failure mode and effect analysis: improving intensive care unit risk management processes.
Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh
2017-04-18
Purpose: Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner, and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach: This was a before-after study conducted between March 2013 and December 2014. By forming an FMEA team, all potential hazards associated with ICU services, together with their frequency and severity, were identified. A risk priority number was then calculated for each activity as an indicator representing high-priority areas that need special attention and resource allocation. Findings: Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury, and deep vein thrombosis, were selected for improvement. Findings affirmed that the improvement strategies were generally satisfactory and significantly decreased total failures. Practical implications: Application of FMEA in ICUs proved to be effective in proactively decreasing the risk of failures and corrected the control measures up to acceptable levels in all eight areas of function. Originality/value: Using a prospective risk assessment approach such as FMEA could be beneficial in dealing with potential failures through proposing preventive actions in a proactive manner. The method could be used as a tool for continuous quality improvement in healthcare, since it identifies both systemic and human errors and offers practical advice for dealing effectively with them.
NiCd cell reliability in the mission environment
NASA Technical Reports Server (NTRS)
Denson, William K.; Klein, Glenn C.
1993-01-01
This paper summarizes an effort by Gates Aerospace Batteries (GAB) and the Reliability Analysis Center (RAC) to analyze survivability data for both General Electric and GAB NiCd cells utilized in various spacecraft. For simplicity's sake, all mission environments are described as either low Earth orbit (LEO) or geosynchronous Earth orbit (GEO). Extreme value statistical methods are applied to this database because of the longevity of the numerous missions while encountering relatively few failures. Every attempt was made to include all known instances of cell-induced failures of the battery and to exclude battery-induced failures of the cell. While this distinction may be somewhat limited due to the availability of in-flight data, we have accepted the learned opinion of the specific customer contacts to ensure the integrity of the common databases. This paper advances the preliminary analysis reported at the 1991 NASA Battery Workshop. That prior analysis was concerned with an estimated 278 million cell-hours of operation encompassing 183 satellites and cited 'no reported failures to date.' The present analysis reports on 428 million cell-hours of operation encompassing 212 satellites, and also reports seven 'cell-induced failures.'
Methods for improved forewarning of critical events across multiple data channels
Hively, Lee M [Philadelphia, TN]
2007-04-24
This disclosed invention concerns improvements in the forewarning of critical events via phase-space dissimilarity analysis of data from mechanical devices, electrical devices, biomedical data, and other physical processes. First, a single channel of process-indicative data is selected that can be used in place of multiple data channels without sacrificing consistent forewarning of critical events. Second, the method discards data of inadequate quality via statistical analysis of the raw data, because analysis of poor-quality data always yields inferior results. Third, two separate filtering operations are used in sequence to remove both high-frequency and low-frequency artifacts using a zero-phase quadratic filter. Fourth, the method constructs phase-space dissimilarity measures (PSDM) by combining multi-channel time-serial data into a multi-channel time-delay phase-space reconstruction. Fifth, the method uses a composite measure of dissimilarity (C_i) to provide a forewarning of failure and an indicator of failure onset.
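The two-pass zero-phase filtering step can be sketched with Savitzky-Golay filters, which are zero-phase polynomial (here quadratic) smoothers: one long window estimates and removes the low-frequency drift, and a short window then suppresses high-frequency noise. The window lengths are assumptions, and the patent's exact filter construction may differ.

```python
import numpy as np
from scipy.signal import savgol_filter

def two_stage_zero_phase_filter(x, low_window=501, high_window=21):
    """Sequential removal of low- and high-frequency artifacts with
    zero-phase quadratic (Savitzky-Golay) filters; window lengths are
    illustrative, not values from the patent."""
    trend = savgol_filter(x, low_window, polyorder=2)   # low-frequency drift
    detrended = x - trend
    return savgol_filter(detrended, high_window, polyorder=2)  # smooth noise

t = np.linspace(0, 10, 5000)
raw = (np.sin(2 * np.pi * 3 * t) + 0.5 * t
       + 0.2 * np.random.default_rng(2).normal(size=t.size))
clean = two_stage_zero_phase_filter(raw)
```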
Reliability and cost analysis methods
NASA Technical Reports Server (NTRS)
Suich, Ronald C.
1991-01-01
In the design phase of a system, how does a design engineer or manager choose between a subsystem with 0.990 reliability and a more costly subsystem with 0.995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds, since the expected cost due to subsystem failure is not the only cost involved; the subsystem itself may be very costly. We should not consider the cost of the subsystem or the expected cost due to subsystem failure separately, but should minimize the total of the two: the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods, which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
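The subsystem-choice question from the opening of this abstract reduces to comparing totals; here is a minimal sketch with invented numbers (not CARRAC data or models):

```python
def total_cost(subsystem_cost, reliability, failure_cost):
    """Expected total cost = subsystem cost + (1 - R) * cost of failure."""
    return subsystem_cost + (1.0 - reliability) * failure_cost

# Is the costlier 0.995 subsystem worth it? Depends on the cost of failure.
cheap = total_cost(subsystem_cost=1.0e6, reliability=0.990, failure_cost=5.0e8)
dear = total_cost(subsystem_cost=2.5e6, reliability=0.995, failure_cost=5.0e8)
print(cheap, dear)  # 6.0e6 vs 5.0e6: here the extra reliability pays off
```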
Nelson, Stacy; English, Shawn; Briggs, Timothy
2016-05-06
Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as of methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper are to demonstrate the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures; the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior, as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process, as the described flexural characterization was used for model validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheong, S-K; Kim, J
Purpose: The aim of the study is the application of a Failure Modes and Effects Analysis (FMEA) to assess the risks for patients undergoing a Low Dose Rate (LDR) prostate brachytherapy treatment. Methods: FMEA was applied to identify all the sub-processes involved in the stages of patient identification, source handling, treatment preparation, treatment delivery, and post-treatment. These processes characterize the radiation treatment associated with LDR prostate brachytherapy. The potential failure modes, together with their causes and effects, were identified and ranked in order of their importance. Three indexes were assigned to each failure mode: the occurrence rating (O), the severity rating (S), and the detection rating (D). A ten-point scale was used to score each category, with ten indicating the most severe, most frequent, and least detectable failure mode, respectively. The risk priority number (RPN) was calculated as the product of the three attributes: RPN = O × S × D. The analysis was carried out by a working group (WG) at UPMC. Results: A total of 56 failure modes were identified, including 32 modes before the treatment, 13 modes during the treatment, and 11 modes after the treatment. In addition to the protocols already adopted in clinical practice, prioritized risk management will be implemented for the high-risk procedures on the basis of RPN score. Conclusion: The effectiveness of the FMEA method was established. The FMEA methodology provides a structured and detailed assessment method for the risk analysis of the LDR prostate brachytherapy procedure and can be applied to other radiation treatment modes.
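The RPN computation itself is a one-liner; the sketch below ranks a few hypothetical failure modes (not drawn from the 56 identified by the working group):

```python
def rpn(occurrence, severity, detection):
    """Risk priority number RPN = O x S x D, each rated on a 1-10 scale
    (10 = most frequent, most severe, least detectable)."""
    assert all(1 <= v <= 10 for v in (occurrence, severity, detection))
    return occurrence * severity * detection

# hypothetical failure modes with (O, S, D) scores, for illustration only
modes = {"seed misloaded in applicator": (3, 9, 4),
         "wrong patient chart opened": (2, 10, 2),
         "post-implant CT mislabeled": (4, 6, 5)}
for name, scores in sorted(modes.items(), key=lambda kv: rpn(*kv[1]),
                           reverse=True):
    print(name, rpn(*scores))
```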
A New Method for FMECA Based on Fuzzy Theory and Expert System
NASA Astrophysics Data System (ADS)
Byeon, Yoong-Tae; Kim, Dong-Jin; Kim, Jin-O.
2008-10-01
Failure Mode, Effects and Criticality Analysis (FMECA) is one of the most widely used methods in modern engineering to investigate potential failure modes and their severity upon the system. FMECA evaluates the criticality and severity of each failure mode and visualizes the risk level in a matrix whose column and row variables are those indices, respectively. Generally, these indices are determined subjectively by experts and operators; however, this process inevitably involves uncertainty. In this paper, a method for eliciting expert opinions that accounts for their uncertainty is proposed to evaluate criticality and severity. In addition, a fuzzy expert system is constructed in order to determine the crisp value of the risk level for each failure mode. Finally, an illustrative example system is analyzed in the case study. The results are worth considering when deciding the proper policies for each component of the system.
Local Failure in Resected N1 Lung Cancer: Implications for Adjuvant Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higgins, Kristin A., E-mail: kristin.higgins@duke.edu; Chino, Junzo P.; Berry, Mark
2012-06-01
Purpose: To evaluate actuarial rates of local failure in patients with pathologic N1 non-small-cell lung cancer and to identify clinical and pathologic factors associated with an increased risk of local failure after resection. Methods and Materials: All patients who underwent surgery for non-small-cell lung cancer with pathologically confirmed N1 disease at Duke University Medical Center from 1995-2008 were identified. Patients receiving any preoperative therapy or postoperative radiotherapy or with positive surgical margins were excluded. Local failure was defined as disease recurrence within the ipsilateral hilum, mediastinum, or bronchial stump/staple line. Actuarial rates of local failure were calculated with the Kaplan-Meier method. A Cox multivariate analysis was used to identify factors independently associated with a higher risk of local recurrence. Results: Among 1,559 patients who underwent surgery during the time interval, 198 met the inclusion criteria. Of these patients, 50 (25%) received adjuvant chemotherapy. Actuarial (5-year) rates of local failure, distant failure, and overall survival were 40%, 55%, and 33%, respectively. On multivariate analysis, factors associated with an increased risk of local failure included a video-assisted thoracoscopic surgery approach (hazard ratio [HR], 2.5; p = 0.01), visceral pleural invasion (HR, 2.1; p = 0.04), and increasing number of positive N1 lymph nodes (HR, 1.3 per involved lymph node; p = 0.02). Chemotherapy was associated with a trend toward decreased risk of local failure that was not statistically significant (HR, 0.61; p = 0.2). Conclusions: Actuarial rates of local failure in pN1 disease are high. Further investigation of conformal postoperative radiotherapy may be warranted.
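A product-limit (Kaplan-Meier) estimate of freedom from local failure can be sketched in a few lines; the follow-up data below are invented, and a real analysis like this one would also handle tied event times and report confidence intervals.

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of freedom from local failure.

    times  : follow-up in months; events : 1 = local failure, 0 = censored.
    Returns (time, survival) pairs, from which an actuarial 5-year
    (60-month) rate can be read off.
    """
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    at_risk, surv, curve = len(t), 1.0, []
    for i in range(len(t)):
        if e[i] == 1:                      # step down only at event times
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
        curve.append((t[i], surv))
    return curve

# made-up follow-up data, not the Duke cohort
print(kaplan_meier(times=[14, 22, 31, 44, 60, 72],
                   events=[1, 0, 1, 0, 1, 0]))
```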
Compression After Impact on Honeycomb Core Sandwich Panels with Thin Facesheets, Part 2: Analysis
NASA Technical Reports Server (NTRS)
Mcquigg, Thomas D.; Kapania, Rakesh K.; Scotti, Stephen J.; Walker, Sandra P.
2012-01-01
A two-part research study has been completed on the topic of compression after impact (CAI) of thin-facesheet honeycomb core sandwich panels. The research has focused on both experiments and analysis in an effort to establish and validate a new understanding of the damage tolerance of these materials. Part 2, the subject of the current paper, is focused on the analysis corresponding to the CAI tests described in Part 1. Of interest are sandwich panels with aerospace applications, which consist of very thin, woven S2-fiberglass (with MTM45-1 epoxy) facesheets adhered to a Nomex honeycomb core. Two sets of materials, identical with the exception of the density of the honeycomb core, were tested in Part 1; the results highlighted the need for analysis methods that take into account multiple failure modes. A finite element model (FEM) is developed here in Part 2. A commercial implementation of the Multicontinuum Failure Theory (MCT) for progressive failure analysis (PFA) in composite laminates, Helius:MCT, is included in this model. The inclusion of PFA in the present model provides a new, unique ability to account for multiple failure modes. In addition, significant impact damage detail is included in the model. A sensitivity study, used to assess the effect of each damage parameter on overall analysis results, is included in an appendix. Analysis results are compared to the experimental results for each of the 32 CAI sandwich panel specimens tested to failure. The failure of each specimen is predicted using the high-fidelity, physics-based analysis model developed here, and the results highlight key improvements in the understanding of honeycomb core sandwich panel CAI failure. Finally, a parametric study highlights the strength benefits compared to the mass penalty for various core densities.
Finite element based damage assessment of composite tidal turbine blades
NASA Astrophysics Data System (ADS)
Fagan, Edward M.; Leen, Sean B.; Kennedy, Ciaran R.; Goggins, Jamie
2015-07-01
With significant interest growing in the ocean renewables sector, horizontal axis tidal current turbines are in a position to dominate the marketplace. The test devices that have been placed in operation so far have suffered from premature failures, caused by difficulties with structural strength prediction. The goal of this work is to develop methods of predicting the damage level in tidal turbines under their maximum operating tidal velocity. The analysis was conducted using the finite element software package Abaqus; shell models of three representative tidal turbine blades are produced. Different construction methods will affect the damage level in the blade and for this study models were developed with varying hydrofoil profiles. In order to determine the risk of failure, a user material subroutine (UMAT) was created. The UMAT uses the failure criteria designed by Alfred Puck to calculate the risk of fibre and inter-fibre failure in the blades. The results show that degradation of the stiffness is predicted for the operating conditions, having an effect on the overall tip deflection. The failure criteria applied via the UMAT form a useful tool for analysis of high risk regions within the blade designs investigated.
Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis
NASA Technical Reports Server (NTRS)
Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.
2005-01-01
NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads at least 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times the design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.
Why Students Fail at Volumetric Analysis.
ERIC Educational Resources Information Center
Pickering, Miles
1979-01-01
Investigates the reasons for students' failure in an introductory volumetric analysis course by analyzing test papers and judging them against a hypothetical ideal method of grading laboratory techniques. (GA)
NASA Astrophysics Data System (ADS)
Han, Ru
This thesis focuses on the analysis of dispersed-phase reinforced composite materials with perfect as well as imperfect interfaces using the Boundary Element Method (BEM). Two problems of interest are considered: determining the limitations in the use of effective properties, and analyzing failure progression at the inclusion-matrix interface. The effective moduli (effective Young's modulus, effective Poisson's ratio, effective shear modulus, and effective bulk modulus) of composite materials can be determined at the mesoscopic level using three-dimensional parallel BEM simulations. By comparing the mesoscopic BEM results and the macroscopic results based on effective properties, limitations in the effective-property approach can be determined. Decohesion is an important failure mode associated with fiber-reinforced composite materials. Analysis of failure progression at the fiber-matrix interface in fiber-reinforced composite materials is considered using a softening decohesion model consistent with thermodynamic concepts. In this model, the initiation of failure is given directly by a failure criterion, and damage is interpreted as the development of a displacement discontinuity. The formulation describing the potential development of damage is governed by a discrete decohesive constitutive equation. Numerical simulations are performed using the direct boundary element method. Incremental decohesion simulations illustrate the progressive evolution of debonding zones and the propagation of cracks along the interfaces. The effect of decohesion on the macroscopic response of composite materials is also investigated.
Quantitative Approach to Failure Mode and Effect Analysis for Linear Accelerator Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Daniel, Jennifer C., E-mail: jennifer.odaniel@duke.edu; Yin, Fang-Fang
Purpose: To determine clinic-specific linear accelerator quality assurance (QA) TG-142 test frequencies, to maximize physicist time efficiency and patient treatment quality. Methods and Materials: A novel quantitative approach to failure mode and effect analysis is proposed. Nine linear accelerator-years of QA records provided data on failure occurrence rates. The severity of test failure was modeled by introducing corresponding errors into head and neck intensity modulated radiation therapy treatment plans. The relative risk of daily linear accelerator QA was calculated as a function of frequency of test performance. Results: Although the failure severity was greatest for daily imaging QA (imaging vs treatment isocenter and imaging positioning/repositioning), the failure occurrence rate was greatest for output and laser testing. The composite ranking results suggest that performing output and lasers tests daily, imaging versus treatment isocenter and imaging positioning/repositioning tests weekly, and optical distance indicator and jaws versus light field tests biweekly would be acceptable for non-stereotactic radiosurgery/stereotactic body radiation therapy linear accelerators. Conclusions: Failure mode and effect analysis is a useful tool to determine the relative importance of QA tests from TG-142. Because there are practical time limitations on how many QA tests can be performed, this analysis highlights which tests are the most important and suggests the frequency of testing based on each test's risk priority number.
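The core of such an analysis is the risk priority number bookkeeping. Below is a minimal sketch of that step, assuming the usual RPN = occurrence x severity x detection scoring on 1-10 scales; the test names follow the abstract, but every score is invented for illustration and none is taken from the study.

```python
# Minimal sketch of TG-142-style FMEA bookkeeping: each QA test gets
# occurrence (O), severity (S), and detection (D) scores, and the risk
# priority number RPN = O * S * D ranks tests for frequency assignment.
# All scores below are invented for illustration, not from the study.

tests = {
    # name: (occurrence, severity, detection), each scored 1-10
    "output":                  (7, 6, 4),
    "lasers":                  (6, 4, 5),
    "imaging vs tx isocenter": (3, 8, 5),
    "positioning/reposition":  (3, 7, 5),
    "optical distance indicator": (2, 4, 6),
    "jaws vs light field":     (2, 3, 6),
}

def rpn(scores):
    o, s, d = scores
    return o * s * d

ranked = sorted(tests.items(), key=lambda kv: rpn(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name:28s} RPN = {rpn(scores):3d}")
# Higher-RPN tests would be assigned more frequent (e.g., daily) checks.
```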
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L; Symons, Christopher T
2011-01-01
Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.
Acoustic emission and nondestructive evaluation of biomaterials and tissues.
Kohn, D H
1995-01-01
Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.
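Much of the signal analysis described above reduces to hit detection in the time domain and spectral features in the frequency domain. The sketch below illustrates both on a synthetic decaying burst; the sampling rate, threshold, and burst parameters are all assumptions, not values from the review.

```python
# Sketch of basic AE signal analysis: threshold-based hit detection in the
# time domain plus a peak-frequency estimate from the FFT. The waveform is
# synthetic; real AE work uses calibrated sensors and preamplifiers.
import numpy as np

fs = 1_000_000                      # 1 MHz sampling rate (assumed)
t = np.arange(0, 0.002, 1 / fs)
# Synthetic AE burst: decaying 150 kHz sinusoid buried in noise
burst = np.exp(-t * 4000) * np.sin(2 * np.pi * 150e3 * t)
signal = burst + 0.02 * np.random.randn(t.size)

threshold = 0.1                     # detection threshold, volts (assumed)
hits = np.abs(signal) > threshold
if hits.any():
    onset = t[np.argmax(hits)]                     # first threshold crossing
    duration = t[np.where(hits)[0][-1]] - onset    # last crossing minus onset
    amplitude = np.abs(signal).max()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    peak_freq = freqs[np.argmax(spectrum)]
    print(f"onset {onset*1e6:.1f} us, duration {duration*1e6:.1f} us, "
          f"amplitude {amplitude:.2f} V, peak frequency {peak_freq/1e3:.0f} kHz")
```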
Dynamically induced cascading failures in power grids.
Schäfer, Benjamin; Witthaut, Dirk; Timme, Marc; Latora, Vito
2018-05-17
Reliable functioning of infrastructure networks is essential for our modern society. Cascading failures are the cause of most large-scale network outages. Although cascading failures often exhibit dynamical transients, the modeling of cascades has so far mainly focused on the analysis of sequences of steady states. In this article, we focus on electrical transmission networks and introduce a framework that takes into account both the event-based nature of cascades and the essentials of the network dynamics. We find that transients of the order of seconds in the flows of a power grid play a crucial role in the emergence of collective behaviors. We finally propose a forecasting method to identify critical lines and components in advance or during operation. Overall, our work highlights the relevance of dynamically induced failures on the synchronization dynamics of national power grids of different European countries and provides methods to predict and model cascading failures.
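A minimal way to see how second-scale transients can trigger trips that a steady-state cascade model would miss is to integrate the swing equations on a toy grid and trip any line whose flow exceeds its capacity. The sketch below does that on a four-node fully connected test network; the coupling, damping, and capacity values are arbitrary assumptions and the model is far simpler than the framework in the article.

```python
# Toy illustration of dynamically induced cascading: swing equations on a
# small fully connected test grid, with one line tripped at t=0 and
# overload-based tripping of further lines. Parameters are arbitrary.
import numpy as np

n = 4
K = np.ones((n, n)) - np.eye(n)          # coupling strengths, p.u. (assumed)
P = np.array([1.0, -1.0, 1.0, -1.0])     # injected power (generators/loads)
alpha, dt, cap = 0.5, 1e-3, 0.7          # damping, time step, line capacity

theta = np.zeros(n)
omega = np.zeros(n)
K[0, 1] = K[1, 0] = 0.0                  # initial failure: trip line (0,1)

for step in range(20000):
    flow = K * np.sin(theta[:, None] - theta[None, :])
    domega = P - alpha * omega - flow.sum(axis=1)   # swing equation, M = 1
    omega += dt * domega
    theta += dt * omega
    over = np.abs(flow) > cap            # overloaded lines trip
    if over.any():
        K[over] = 0.0
        print(f"t={step*dt:.3f}s: tripped lines "
              f"{np.argwhere(np.triu(over)).tolist()}")
```

With these parameters the redistributed power after the initial trip overloads a neighbouring line, and the printout traces the resulting sequence of secondary trips.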
Radiographic methods of wear analysis in total hip arthroplasty.
Rahman, Luthfur; Cobb, Justin; Muirhead-Allwood, Sarah
2012-12-01
Polyethylene wear is an important factor in failure of total hip arthroplasty (THA). With increasing numbers of THAs being performed worldwide, particularly in younger patients, the burden of failure and revision arthroplasty is increasing, as well, along with associated costs and workload. Various radiographic methods of measuring polyethylene wear have been developed to assist in deciding when to monitor patients more closely and when to consider revision surgery. Radiographic methods that have been developed to measure polyethylene wear include manual and computer-assisted plain radiography, two- and three-dimensional techniques, and radiostereometric analysis. Some of these methods are important in both clinical and research settings. CT has the potential to provide additional information on component orientation and enables assessment of periprosthetic osteolysis, which is an important consequence of polyethylene wear.
Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R E
2005-04-01
Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities.
Platek, Mary E.; Reid, Mary E.; Wilding, Gregory E.; Jaggernauth, Wainwright; Rigual, Nestor R.; Hicks, Wesley L.; Popat, Saurin R.; Warren, Graham W.; Sullivan, Maureen; Thorstad, Wade L.; Khan, Mohamed K.; Loree, Thom R.; Singh, Anurag K.
2015-01-01
Background This study was carried out to determine if markers of nutritional status predict for locoregional failure following intensity-modulated radiation therapy (IMRT) with concurrent chemoradiotherapy (CCRT) for squamous cell carcinoma of the head and neck (SCCHN). Methods We performed a retrospective chart review of 78 patients with SCCHN who received definitive CCRT. We compared patient factors, tumor characteristics, and nutritional status indicators between patients with and without locoregional failure. Results Fifteen of 78 patients (19%) experienced locoregional failure. Median follow-up for live patients was 38 months. On univariate analysis, pretreatment percentage of ideal body weight (%IBW) (p < .01), pretreatment hemoglobin (p = .04), and treatment duration (p < .01) were significant predictors of failure. On multivariate analysis, pretreatment %IBW (p = .04) and treatment time (p < .01) remained statistically significant. Conclusions Although treatment time is an accepted risk factor for failure, differences in outcome for patients with head and neck cancer undergoing definitive CCRT based on pre-treatment %IBW should be examined further. PMID:21990220
Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders
NASA Technical Reports Server (NTRS)
Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)
2002-01-01
A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
Electrical failure debug using interlayer profiling method
NASA Astrophysics Data System (ADS)
Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh
2017-03-01
It is well known that as technology nodes shrink, the number of design rules increases, design structures become more regular, and the number of manufacturing process steps grows as well. Normal inspection tools can only monitor hard failures on a single layer; electrical failures caused by inter-layer misalignment can be detected only through testing. This paper presents a working flow that uses pattern-analysis interlayer profiling techniques to turn multi-layer physical information into grouped, linked parameter values. Using this data analysis flow combined with an electrical model allows us to find critical regions on a layout for yield learning.
Failure analysis of thick composite cylinders under external pressure
NASA Technical Reports Server (NTRS)
Caiazzo, A.; Rosen, B. W.
1992-01-01
Failure of thick section composites due to local compression strength and overall structural instability is treated. Effects of material nonlinearity, imperfect fiber architecture, and structural imperfections upon anticipated failure stresses are determined. Comparisons with experimental data for a series of test cylinders are described. Predicting the failure strength of composite structures requires consideration of stability and material strength modes of failure using linear and nonlinear analysis techniques. Material strength prediction requires the accurate definition of the local multiaxial stress state in the material. An elasticity solution for the linear static analysis of thick anisotropic cylinders and rings is used herein to predict the axisymmetric stress state in the cylinders. Asymmetric nonlinear behavior due to initial cylinder out of roundness and the effects of end closure structure are treated using finite element methods. It is assumed that local fiber or ply waviness is an important factor in the initiation of material failure. An analytical model for the prediction of compression failure of fiber composites, which includes the effects of fiber misalignments, matrix inelasticity, and multiaxial applied stresses is used for material strength calculations. Analytical results are compared to experimental data for a series of glass and carbon fiber reinforced epoxy cylinders subjected to external pressure. Recommendations for pretest characterization and other experimental issues are presented. Implications for material and structural design are discussed.
Application of failure mode and effect analysis in an assisted reproduction technology laboratory.
Intra, Giulia; Alteri, Alessandra; Corti, Laura; Rabellotti, Elisa; Papaleo, Enrico; Restelli, Liliana; Biondo, Stefania; Garancini, Maria Paola; Candiani, Massimo; Viganò, Paola
2016-08-01
Assisted reproduction technology laboratories have a very high degree of complexity. Mismatches of gametes or embryos can occur, with catastrophic consequences for patients. To minimize the risk of error, a multi-institutional working group applied failure mode and effects analysis (FMEA) to each critical activity/step as a method of risk assessment. This analysis led to the identification of the potential failure modes, together with their causes and effects, using the risk priority number (RPN) scoring system. In total, 11 individual steps and 68 different potential failure modes were identified. The highest ranked failure modes, with an RPN score of 25, encompassed 17 failures and pertained to "patient mismatch" and "biological sample mismatch". The maximum reduction in risk, with RPN reduced from 25 to 5, was mostly related to the introduction of witnessing. The critical failure modes in sample processing were improved by 50% in the RPN by focusing on staff training. Three indicators of FMEA success, based on technical skill, competence and traceability, have been evaluated after FMEA implementation. Witnessing by a second human operator should be introduced in the laboratory to avoid sample mix-ups. These findings confirm that FMEA can effectively reduce errors in assisted reproduction technology laboratories. Copyright © 2016 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Application of failure mode and effect analysis in a radiology department.
Thornton, Eavan; Brook, Olga R; Mendiratta-Lala, Mishal; Hallett, Donna T; Kruskal, Jonathan B
2011-01-01
With increasing deployment, complexity, and sophistication of equipment and related processes within the clinical imaging environment, system failures are more likely to occur. These failures may have varying effects on the patient, ranging from no harm to devastating harm. Failure mode and effect analysis (FMEA) is a tool that permits the proactive identification of possible failures in complex processes and provides a basis for continuous improvement. This overview of the basic principles and methodology of FMEA provides an explanation of how FMEA can be applied to clinical operations in a radiology department to reduce, predict, or prevent errors. The six sequential steps in the FMEA process are explained, and clinical magnetic resonance imaging services are used as an example for which FMEA is particularly applicable. A modified version of traditional FMEA called Healthcare Failure Mode and Effect Analysis, which was introduced by the U.S. Department of Veterans Affairs National Center for Patient Safety, is briefly reviewed. In conclusion, FMEA is an effective and reliable method to proactively examine complex processes in the radiology department. FMEA can be used to highlight the high-risk subprocesses and allows these to be targeted to minimize the future occurrence of failures, thus improving patient safety and streamlining the efficiency of the radiology department. RSNA, 2010
Small vulnerable sets determine large network cascades in power grids
Yang, Yang; Nishikawa, Takashi; Motter, Adilson E.
2017-11-17
The understanding of cascading failures in complex systems has been hindered by the lack of realistic large-scale modeling and analysis that can account for variable system conditions. By using the North American power grid, we identified, quantified, and analyzed the set of network components that are vulnerable to cascading failures under any out of multiple conditions. We show that the vulnerable set consists of a small but topologically central portion of the network and that large cascades are disproportionately more likely to be triggered by initial failures close to this set. These results elucidate aspects of the origins and causes of cascading failures relevant for grid design and operation and demonstrate vulnerability analysis methods that are applicable to a wider class of cascade-prone networks.
A Selection of Composites Simulation Practices at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.
2007-01-01
One of the major areas of study at NASA Langley Research Center is the development of technologies that support the use of advanced composite materials in aerospace applications. Amongst the supporting technologies are analysis tools used to simulate the behavior of these materials. This presentation will discuss a number of examples of analysis tools and simulation practices conducted at NASA Langley. The presentation will include examples of damage tolerance analyses for both interlaminar and intralaminar failure modes. Tools for modeling interlaminar failure modes include fracture mechanics and cohesive methods, whilst tools for modeling intralaminar failure involve the development of various progressive failure analyses. Other examples of analyses developed at NASA Langley include a thermo-mechanical model of an orthotropic material and the simulation of delamination growth in z-pin reinforced laminates.
Pushover analysis of reinforced concrete frames considering shear failure at beam-column joints
NASA Astrophysics Data System (ADS)
Sung, Y. C.; Lin, T. K.; Hsiao, C. C.; Lai, M. C.
2013-09-01
Since most current seismic capacity evaluations of reinforced concrete (RC) frame structures are implemented by either static pushover analysis (PA) or dynamic time history analysis, with diverse settings of the plastic hinges (PHs) on such main structural components as columns, beams and walls, the complex behavior of shear failure at beam-column joints (BCJs) during major earthquakes is commonly neglected. This study proposes new nonlinear PA procedures that consider shear failure at BCJs and seek to assess the actual damage to RC structures. Based on the specifications of FEMA-356, a simplified joint model composed of two nonlinear cross struts placed diagonally over the location of the plastic hinge is established, allowing a sophisticated PA to be performed. To verify the validity of this method, the analytical results for the capacity curves and the failure mechanism derived from three different full-size RC frames are compared with the experimental measurements. By considering shear failure at BCJs, the proposed nonlinear analytical procedures can be used to estimate the structural behavior of RC frames, including seismic capacity and the progressive failure sequence of joints, in a precise and effective manner.
Small sample estimation of the reliability function for technical products
NASA Astrophysics Data System (ADS)
Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.
2017-12-01
It is demonstrated that, in the absence of large statistical samples obtained from testing complex technical products to failure, statistical estimation of the reliability function of initial elements can be made by the method of moments. A formal description of the method of moments is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for implementing the method of moments using only the moments at which failures of initial elements occur.
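As a concrete illustration of the plain (uncensored) moments method, the sketch below matches the first two sample moments of a small failure-time sample to a gamma time-to-failure model and evaluates the reliability function; the gamma assumption and the failure times are invented, and the paper's modified algorithm additionally handles censored observations.

```python
# Plain method-of-moments sketch for a small, complete failure-time sample,
# assuming a gamma time-to-failure model. Times are invented.
import numpy as np
from scipy.stats import gamma

times = np.array([152.0, 210.0, 275.0, 333.0, 401.0, 512.0])  # hours

m, v = times.mean(), times.var(ddof=1)
k = m**2 / v          # gamma shape from the first two sample moments
theta = v / m         # gamma scale

for t in (100, 250, 500):
    R = gamma.sf(t, a=k, scale=theta)   # reliability function R(t) = P(T > t)
    print(f"R({t} h) = {R:.3f}")
```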
rpsftm: An R Package for Rank Preserving Structural Failure Time Models
Allison, Annabel; White, Ian R; Bond, Simon
2018-01-01
Treatment switching in a randomised controlled trial occurs when participants change from their randomised treatment to the other trial treatment during the study. Failure to account for treatment switching in the analysis (i.e. by performing a standard intention-to-treat analysis) can lead to biased estimates of treatment efficacy. The rank preserving structural failure time model (RPSFTM) is a method used to adjust for treatment switching in trials with survival outcomes. The RPSFTM is due to Robins and Tsiatis (1991) and has been developed by White et al. (1997, 1999). The method is randomisation based and uses only the randomised treatment group, observed event times, and treatment history in order to estimate a causal treatment effect. The treatment effect, ψ, is estimated by balancing counter-factual event times (that would be observed if no treatment were received) between treatment groups. G-estimation is used to find the value of ψ such that a test statistic Z(ψ) = 0. This is usually the test statistic used in the intention-to-treat analysis, for example, the log rank test statistic. We present an R package that implements the method of rpsftm. PMID:29564164
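The G-estimation step can be illustrated in a few lines: counterfactual times U_i(ψ) = T_off,i + exp(ψ)·T_on,i are recomputed over a grid of ψ and the value that balances the arms is kept. The sketch below uses a difference in group means as Z(ψ) instead of the log-rank statistic and ignores censoring, so it is only a caricature of what the rpsftm package implements; all data are simulated.

```python
# Caricature of RPSFTM G-estimation: find psi such that counterfactual
# off-treatment times are balanced between randomised arms. Z(psi) here is
# a mean difference, not the log-rank statistic; censoring is ignored.
import numpy as np

rng = np.random.default_rng(1)
n = 200
arm = rng.integers(0, 2, n)                 # randomised group (0/1)
true_psi = -0.3                             # treatment stretches time by e^0.3
u = rng.exponential(10.0, n)                # counterfactual untreated lifetimes
frac_on = np.where(arm == 1, 0.9, 0.2)      # switching: controls get some drug
t_on = np.exp(-true_psi) * frac_on * u      # observed time on treatment
t_off = (1.0 - frac_on) * u                 # observed time off treatment

def z(psi):
    # counterfactual times implied by a candidate psi; compare the arms
    u_psi = t_off + np.exp(psi) * t_on
    return u_psi[arm == 1].mean() - u_psi[arm == 0].mean()

grid = np.linspace(-1.0, 1.0, 2001)
zs = np.array([z(p) for p in grid])
psi_hat = grid[np.argmin(np.abs(zs))]       # Z(psi) = 0 near the true value
print(f"estimated psi = {psi_hat:.2f} (true psi = {true_psi})")
```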
Recent advances in computational structural reliability analysis methods
NASA Astrophysics Data System (ADS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonate vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
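The probabilistic alternative to safety factors described above can be shown with a direct Monte Carlo estimate of the failure probability for a single limit state g = R - S (capacity minus load). The distributions below are illustrative assumptions, not taken from any referenced structure.

```python
# Monte Carlo sketch of structural reliability for one failure mode:
# sample resistance R and load effect S, count how often g = R - S < 0.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000
R = rng.lognormal(mean=np.log(60.0), sigma=0.10, size=n)   # resistance, ksi
S = rng.normal(loc=40.0, scale=5.0, size=n)                # load effect, ksi

g = R - S                      # limit state: g < 0 means failure
pf = (g < 0).mean()
beta = -norm.ppf(pf)           # equivalent reliability index
print(f"P(failure) = {pf:.4%}, reliability index beta = {beta:.2f}")
```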
NASA Astrophysics Data System (ADS)
Liu, Hu-Chen; Liu, Long; Li, Ping
2014-10-01
Failure mode and effects analysis (FMEA) has shown its effectiveness in examining potential failures in products, process, designs or services and has been extensively used for safety and reliability analysis in a wide range of industries. However, its approach to prioritise failure modes through a crisp risk priority number (RPN) has been criticised as having several shortcomings. The aim of this paper is to develop an efficient and comprehensive risk assessment methodology using intuitionistic fuzzy hybrid weighted Euclidean distance (IFHWED) operator to overcome the limitations and improve the effectiveness of the traditional FMEA. The diversified and uncertain assessments given by FMEA team members are treated as linguistic terms expressed in intuitionistic fuzzy numbers (IFNs). Intuitionistic fuzzy weighted averaging (IFWA) operator is used to aggregate the FMEA team members' individual assessments into a group assessment. IFHWED operator is applied thereafter to the prioritisation and selection of failure modes. Particularly, both subjective and objective weights of risk factors are considered during the risk evaluation process. A numerical example for risk assessment is given to illustrate the proposed method finally.
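The IFWA aggregation step has a closed form: for IFNs a_i = (μ_i, ν_i) and weights w_i summing to one, the group membership is μ = 1 - Π(1 - μ_i)^{w_i} and the non-membership is ν = Π ν_i^{w_i}. A minimal sketch follows, with an invented linguistic scale and member weights.

```python
# Sketch of the intuitionistic fuzzy weighted averaging (IFWA) step: team
# members' linguistic ratings are mapped to intuitionistic fuzzy numbers
# (membership mu, non-membership nu) and aggregated per failure mode.
# The linguistic scale and the weights are illustrative assumptions.
import numpy as np

# IFN scale for linguistic terms (mu, nu); hesitancy pi = 1 - mu - nu
scale = {"low": (0.25, 0.60), "medium": (0.50, 0.40), "high": (0.75, 0.10)}

def ifwa(ratings, weights):
    """Aggregate IFNs a_i = (mu_i, nu_i) with weights w_i summing to 1."""
    mu = np.array([scale[r][0] for r in ratings])
    nu = np.array([scale[r][1] for r in ratings])
    w = np.asarray(weights)
    agg_mu = 1.0 - np.prod((1.0 - mu) ** w)
    agg_nu = np.prod(nu ** w)
    return agg_mu, agg_nu

# Three team members (weights reflecting expertise) rate one failure mode
mu, nu = ifwa(["high", "medium", "high"], [0.4, 0.35, 0.25])
print(f"group assessment: mu={mu:.3f}, nu={nu:.3f}, pi={1 - mu - nu:.3f}")
```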
Intergranular degradation assessment via random grain boundary network analysis
Kumar, Mukul; Schwartz, Adam J.; King, Wayne E.
2002-01-01
A method is disclosed for determining the resistance of polycrystalline materials to intergranular degradation or failure (IGDF) by analyzing the random grain boundary network connectivity (RGBNC) microstructure. Analysis of the disruption of the RGBNC microstructure may be used to assess the effectiveness of materials processing in increasing IGDF resistance. Comparison of the RGBNC microstructures of materials exposed to extreme operating conditions to unexposed materials may be used to diagnose and predict possible onset of material failure due to
High Speed Dynamics in Brittle Materials
NASA Astrophysics Data System (ADS)
Hiermaier, Stefan
2015-06-01
Brittle Materials under High Speed and Shock loading provide a continuous challenge in experimental physics, analysis and numerical modelling, and consequently for engineering design. The dependence of damage and fracture processes on material-inherent length and time scales, the influence of defects, rate-dependent material properties and inertia effects on different scales make their understanding a true multi-scale problem. In addition, it is not uncommon that materials show a transition from ductile to brittle behavior when the loading rate is increased. A particular case is spallation, a brittle tensile failure induced by the interaction of stress waves leading to a sudden change from compressive to tensile loading states that can be invoked in various materials. This contribution highlights typical phenomena occurring when brittle materials are exposed to high loading rates in applications such as blast and impact on protective structures, or meteorite impact on geological materials. A short review on experimental methods that are used for dynamic characterization of brittle materials will be given. A close interaction of experimental analysis and numerical simulation has turned out to be very helpful in analyzing experimental results. For this purpose, adequate numerical methods are required. Cohesive zone models are one possible method for the analysis of brittle failure as long as some degree of tension is present. Their recent successful application for meso-mechanical simulations of concrete in Hopkinson-type spallation tests provides new insight into the dynamic failure process. Failure under compressive loading is a particular challenge for numerical simulations as it involves crushing of material which in turn influences stress states in other parts of a structure. On a continuum scale, it can be modeled using more or less complex plasticity models combined with failure surfaces, as will be demonstrated for ceramics. Models which take microstructural cracking directly into account may provide a more physics-based approach for compressive failure in the future.
SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harry, T; University of California, San Diego, La Jolla, CA; Manger, R
Purpose: To compare the Veteran Affairs Healthcare Failure Modes and Effect Analysis (HFMEA) and the AAPM Task Group 100 failure mode and effects analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: HFMEA risk assessment analysis was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA as described by the Veteran Affairs provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth details but at the cost of elevated effort.
NASA Astrophysics Data System (ADS)
Anggraeni, Novia Antika
2015-04-01
The test of eruption time prediction is an effort to support volcanic disaster mitigation, especially on inhabited volcano slopes such as those of Merapi Volcano. The test can be conducted by observing increases in volcanic activity, such as seismicity, deformation, and SO2 gas emission. One method that can be used to predict the time of eruption is the Materials Failure Forecast Method (FFM), a predictive method for determining the time of volcanic eruption introduced by Voight (1988). This method requires an increase in the rate of change, or acceleration, of the observed volcanic activity parameters. The parameter used in this study is the seismic energy value of Merapi Volcano from 1990 to 2012. The data were plotted as graphs of the inverse seismic energy rate versus time, and the FFM graphical technique was applied using simple linear regression. Data quality control to improve the timing precision employed the correlation coefficient of the inverse seismic energy rate versus time. From the graph analysis, the predicted times deviate from the actual eruption times by between -2.86 and 5.49 days.
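The FFM graphical technique itself is a one-line regression: fit the inverse rate against time and extrapolate to zero. A minimal sketch on a synthetic accelerating series (all values invented) follows.

```python
# Sketch of the FFM graphical technique: fit a straight line to the inverse
# rate (1/energy-rate) versus time and extrapolate to zero, which gives the
# predicted failure (eruption) time. The daily rate series is synthetic,
# following the accelerating precursor pattern that FFM assumes.
import numpy as np

t = np.arange(0.0, 9.0)                    # days of observation
t_fail = 10.0                              # true failure time (synthetic)
rate = 1.0 / (0.05 * (t_fail - t))         # accelerating precursor rate
inv = 1.0 / rate                           # inverse rate, linear in time

slope, intercept = np.polyfit(t, inv, 1)   # simple linear regression
t_pred = -intercept / slope                # where the fitted 1/rate hits zero
r = np.corrcoef(t, inv)[0, 1]              # correlation coefficient, used for QC
print(f"predicted eruption at t = {t_pred:.2f} days (r = {r:.3f})")
```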
Design and implementation of a novel mechanical testing system for cellular solids.
Nazarian, Ara; Stauber, Martin; Müller, Ralph
2005-05-01
Cellular solids constitute an important class of engineering materials encompassing both man-made and natural constructs. Materials such as wood, cork, coral, and cancellous bone are examples of cellular solids. The structural analysis of cellular solid failure has been limited to 2D sections to illustrate global fracture patterns. Due to the inherent destructiveness of 2D methods, dynamic assessment of fracture progression has not been possible. Image-guided failure assessment (IGFA), a noninvasive technique to analyze 3D progressive bone failure, has been developed utilizing stepwise microcompression in combination with time-lapsed microcomputed tomographic imaging (microCT). This method allows for the assessment of fracture progression in the plastic region, where much of the structural deformation/energy absorption is encountered in a cellular solid. Therefore, the goal of this project was to design and fabricate a novel micromechanical testing system to validate the effectiveness of the stepwise IGFA technique compared to classical continuous mechanical testing, using a variety of engineered and natural cellular solids. In our analysis, we found stepwise compression to be a valid approach for IGFA with high precision and accuracy comparable to classical continuous testing. Therefore, this approach complements the conventional mechanical testing methods by providing visual insight into the failure propagation mechanisms of cellular solids. (c) 2005 Wiley Periodicals, Inc.
Landolina, Maurizio; Marzegalli, Maurizio; Lunati, Maurizio; Perego, Giovanni B; Guenzati, Giuseppe; Curnis, Antonio; Valsecchi, Sergio; Borghetti, Francesca; Borghi, Gabriella; Masella, Cristina
2013-01-01
Background Heart failure patients with implantable defibrillators place a significant burden on health care systems. Remote monitoring allows assessment of device function and heart failure parameters, and may represent a safe, effective, and cost-saving method compared to conventional in-office follow-up. Objective We hypothesized that remote device monitoring represents a cost-effective approach. This paper summarizes the economic evaluation of the Evolution of Management Strategies of Heart Failure Patients With Implantable Defibrillators (EVOLVO) study, a multicenter clinical trial aimed at measuring the benefits of remote monitoring for heart failure patients with implantable defibrillators. Methods Two hundred patients implanted with a wireless transmission–enabled implantable defibrillator were randomized to receive either remote monitoring or the conventional method of in-person evaluations. Patients were followed for 16 months with a protocol of scheduled in-office and remote follow-ups. The economic evaluation of the intervention was conducted from the perspectives of the health care system and the patient. A cost-utility analysis was performed to measure whether the intervention was cost-effective in terms of cost per quality-adjusted life year (QALY) gained. Results Overall, remote monitoring did not show significant annual cost savings for the health care system (€1962.78 versus €2130.01; P=.80). There was a significant reduction of the annual cost for the patients in the remote arm in comparison to the standard arm (€291.36 versus €381.34; P=.01). Cost-utility analysis was performed for 180 patients for whom QALYs were available. The patients in the remote arm gained 0.065 QALYs more than those in the standard arm over 16 months, with a cost savings of €888.10 per patient. Results from the cost-utility analysis of the EVOLVO study show that remote monitoring is a cost-effective and dominant solution. Conclusions Remote management of heart failure patients with implantable defibrillators appears to be cost-effective compared to the conventional method of in-person evaluations. Trial Registration ClinicalTrials.gov NCT00873899; http://clinicaltrials.gov/show/NCT00873899 (Archived by WebCite at http://www.webcitation.org/6H0BOA29f). PMID:23722666
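The dominance conclusion follows directly from the reported incremental figures, as the small check below shows; the only inputs are the per-patient cost saving and QALY gain quoted in the abstract.

```python
# Cost-utility check using the figures reported for the EVOLVO trial: the
# remote arm gains QALYs *and* saves money, so it dominates the standard arm
# and no ICER threshold comparison is needed.
delta_cost = -888.10      # euros per patient over 16 months (a saving)
delta_qaly = 0.065        # QALYs gained per patient over 16 months

if delta_cost <= 0 and delta_qaly >= 0:
    print("remote monitoring dominates (cheaper and more effective)")
else:
    icer = delta_cost / delta_qaly   # incremental cost-effectiveness ratio
    print(f"ICER = {icer:.0f} euros per QALY gained")
```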
Crack propagation of brittle rock under high geostress
NASA Astrophysics Data System (ADS)
Liu, Ning; Chu, Weijiang; Chen, Pingzhi
2018-03-01
Based on fracture mechanics and numerical methods, the characteristics and failure criteria of wall-rock cracks, including initiation, propagation, and coalescence, are analyzed systematically under different conditions. To account for the interaction among cracks, a sliding multi-crack model is adopted to simulate the splitting failure of rock under axial compression. Bolt and shotcrete support of the rock mass can effectively control crack propagation. Both theoretical analysis and numerical simulation are used to study the mechanism by which the support controls propagation, and the optimal bolt installation angle is calculated. ANSYS is then used to simulate the crack-arrest effect of bolts, and the influence of different factors on the stress intensity factor is analyzed. The method offers a more scientific and rational criterion for evaluating the splitting failure of underground engineering under high geostress.
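The basic fracture-mechanics test underlying such criteria is the comparison of a mode-I stress intensity factor with the rock's fracture toughness. A minimal sketch, with every value an illustrative assumption rather than a number from the study:

```python
# Minimal fracture-mechanics check: the mode-I stress intensity factor
# K_I = Y * sigma * sqrt(pi * a) is compared with the rock's fracture
# toughness K_IC to decide whether a crack propagates. Values are invented.
import math

sigma = 25e6        # stress around the opening, Pa (assumed)
a = 0.02            # crack half-length, m (assumed)
Y = 1.12            # geometry factor for a surface crack (assumed)
K_IC = 1.5e6        # fracture toughness of the rock, Pa*sqrt(m) (assumed)

K_I = Y * sigma * math.sqrt(math.pi * a)
status = "propagates" if K_I >= K_IC else "arrested"
print(f"K_I = {K_I/1e6:.2f} MPa*sqrt(m) vs K_IC = {K_IC/1e6:.2f} -> {status}")
```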
Analysis of a Memory Device Failure
NASA Technical Reports Server (NTRS)
Nicolas, David P.; Devaney, John; Gores, Mark; Dicken, Howard
1998-01-01
The recent failure of a vintage memory device presented a unique challenge to failure analysts. Normally device layouts, fabrication parameters and other technical information were available to assist the analyst in the analysis. However, this device was out of production for many years and the manufacturer was no longer in business, so the information was not available. To further complicate this analysis, the package leads were all but removed making additional electrical testing difficult. Under these conditions, new and innovative methods were used to analyze the failure. The external visual exam, radiography, PIND, and leak testing were performed with nominal results. Since electrical testing was precluded by the short lead lengths, the device was delidded to expose the internal structures for microscopic examination. No failure mechanism was identified. The available electrical data suggested an ESD or low level EOS type mechanism which left no visible surface damage. Due to parallel electrical paths, electrical probing on the chip failed to locate the failure site. Two non-destructive Scanning Electron Microscopy techniques, CIVA (Charge Induced Voltage Alteration) and EBIC (Electron Beam Induced Current), and a liquid crystal decoration technique which detects localized heating were employed to aid in the analysis. CIVA and EBIC isolated two faults in the input circuitry, and the liquid crystal technique further localized two hot spots in regions on two input gates. Removal of the glassivation and metallization revealed multiple failure sites located in the gate oxide of two input transistors suggesting machine (testing) induced damage.
NASA Astrophysics Data System (ADS)
Meng, Xiaocheng; Che, Renfei; Gao, Shi; He, Juntao
2018-04-01
With the advent of the big data age, power system research has entered a new stage. At present, the main application of big data in power systems is early-warning analysis for power equipment: by collecting relevant historical fault data, system security is improved through prediction of warnings and failure rates for different kinds of equipment under given relational factors. In this paper, a method for line failure rate warning is proposed. First, fuzzy dynamic clustering is carried out on the collected historical information; to account for the imbalance between attributes, coefficient-of-variation weights are assigned so that the weighted fuzzy clustering handles the data more effectively. Then, after analyzing the basic idea and properties of relational analysis model theory, the grey relational model is improved by combining the slope and the Deng model, and the incremental composition of the two sequences is also incorporated to obtain the grey relational degree between samples. The failure rate is predicted according to a weighting principle. Finally, the procedure is illustrated with an example, and the validity and superiority of the proposed method are verified.
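The Deng-style grey relational degree at the heart of the method has a compact form. The sketch below computes weighted relational degrees of historical cases against a reference line; the condition factors, weights, and ρ = 0.5 are illustrative assumptions, and the paper's slope-based refinement is not reproduced.

```python
# Sketch of a weighted Deng grey relational degree: each historical case
# (comparison sequence) is scored against a reference sequence of normalised
# condition factors; rho is the usual distinguishing coefficient.
import numpy as np

def grey_relational_degree(reference, samples, weights, rho=0.5):
    """Weighted grey relational degree of each sample to the reference."""
    ref = np.asarray(reference, float)
    X = np.asarray(samples, float)
    delta = np.abs(X - ref)                           # absolute differences
    dmin, dmax = delta.min(), delta.max()
    xi = (dmin + rho * dmax) / (delta + rho * dmax)   # relational coefficients
    return xi @ np.asarray(weights)                   # weighted degree per sample

# normalised condition factors: load level, age, weather severity, defects
reference = [1.0, 1.0, 1.0, 1.0]                      # the line being assessed
history = [[0.9, 0.8, 1.0, 0.7],                      # past failure cases
           [0.3, 0.2, 0.4, 0.1],
           [0.8, 0.9, 0.9, 0.8]]
w = [0.3, 0.2, 0.3, 0.2]                              # coefficient-of-variation weights
print(grey_relational_degree(reference, history, w))
# Failure rates of the closest historical cases get the largest weights.
```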
A Case Study on Engineering Failure Analysis of Link Chain
Lee, Seong-Beom; Lee, Hong-Chul
2010-01-01
Objectives The objective of this study was to investigate the effect of chain installation condition on the stress distribution that could eventually cause disastrous failure through sudden deformation and geometric rupture. Methods Fractographic examination of the failed chain indicated that overstress was the root cause of failure. 3D modeling and finite element analysis of the chain, used in a crane hook, were performed with a three-dimensional interactive application program, CATIA, and commercial finite element analysis and computational fluid dynamics software, ANSYS. Results The results showed that the state of stress changed depending on the initial position of the chain as installed in the hook. In particular, the magnitude of the stress was strongly affected by bending forces, which are 2.5 times greater (under the simulation condition currently investigated) than those from the plain tensile load. It was also noted that the change of load state is strongly related to the failure of parts: the chain can hold an ultimate load of about 8 tons with only the tensile load acting on it. Conclusion The conclusions of this research clearly showed that losses from similar incidents can be reduced when an operator properly handles the installation of the chain. PMID:22953162
Zanaboni, Paolo; Landolina, Maurizio; Marzegalli, Maurizio; Lunati, Maurizio; Perego, Giovanni B; Guenzati, Giuseppe; Curnis, Antonio; Valsecchi, Sergio; Borghetti, Francesca; Borghi, Gabriella; Masella, Cristina
2013-05-30
Heart failure patients with implantable defibrillators place a significant burden on health care systems. Remote monitoring allows assessment of device function and heart failure parameters, and may represent a safe, effective, and cost-saving method compared to conventional in-office follow-up. We hypothesized that remote device monitoring represents a cost-effective approach. This paper summarizes the economic evaluation of the Evolution of Management Strategies of Heart Failure Patients With Implantable Defibrillators (EVOLVO) study, a multicenter clinical trial aimed at measuring the benefits of remote monitoring for heart failure patients with implantable defibrillators. Two hundred patients implanted with a wireless transmission-enabled implantable defibrillator were randomized to receive either remote monitoring or the conventional method of in-person evaluations. Patients were followed for 16 months with a protocol of scheduled in-office and remote follow-ups. The economic evaluation of the intervention was conducted from the perspectives of the health care system and the patient. A cost-utility analysis was performed to measure whether the intervention was cost-effective in terms of cost per quality-adjusted life year (QALY) gained. Overall, remote monitoring did not show significant annual cost savings for the health care system (€1962.78 versus €2130.01; P=.80). There was a significant reduction of the annual cost for the patients in the remote arm in comparison to the standard arm (€291.36 versus €381.34; P=.01). Cost-utility analysis was performed for 180 patients for whom QALYs were available. The patients in the remote arm gained 0.065 QALYs more than those in the standard arm over 16 months, with a cost savings of €888.10 per patient. Results from the cost-utility analysis of the EVOLVO study show that remote monitoring is a cost-effective and dominant solution. Remote management of heart failure patients with implantable defibrillators appears to be cost-effective compared to the conventional method of in-person evaluations. ClinicalTrials.gov NCT00873899; http://clinicaltrials.gov/show/NCT00873899 (Archived by WebCite at http://www.webcitation.org/6H0BOA29f).
Frequency Spectrum Method-Based Stress Analysis for Oil Pipelines in Earthquake Disaster Areas
Wu, Xiaonan; Lu, Hongfang; Huang, Kun; Wu, Shijuan; Qiao, Weibiao
2015-01-01
When a long distance oil pipeline crosses an earthquake disaster area, inertial force and strong ground motion can cause the pipeline stress to exceed the failure limit, resulting in bending and deformation failure. To date, researchers have performed limited safety analyses of oil pipelines in earthquake disaster areas that include stress analysis. Therefore, using the spectrum method and theory of one-dimensional beam units, CAESAR II is used to perform a dynamic earthquake analysis for an oil pipeline in the XX earthquake disaster area. This software is used to determine if the displacement and stress of the pipeline meet the standards when subjected to a strong earthquake. After performing the numerical analysis, the primary seismic action axial, longitudinal and horizontal displacement directions and the critical section of the pipeline can be located. Feasible project enhancement suggestions based on the analysis results are proposed. The designer is able to utilize this stress analysis method to perform an ultimate design for an oil pipeline in earthquake disaster areas; therefore, improving the safe operation of the pipeline. PMID:25692790
NASA Astrophysics Data System (ADS)
Huang, W.-S.; Lin, M.-L.; Liu, H.-C.; Lin, H.-H.
2012-04-01
On April 25, 2010, without rainfall or an earthquake as a trigger, a massive landslide (200,000 m3) covered a 200 m stretch of Taiwan's National Freeway No. 3, killing 4 people, burying three cars, and destroying a bridge. The failure mode appears to be a dip-slope failure on a rock-anchored cut slope. The Tertiary sedimentary strata strike northeast-southwest and dip 15° toward the southeast. Based on the investigations of the Taiwan Geotechnical Society, three possible factors contributed to the failure mechanism: (1) toe excavation during construction in 1998 daylighted the sliding layer and induced a strength reduction within it, and also caused the anchor loads to increase rapidly and approach their ultimate capacity; (2) although the excavated area was soon stabilized with rock anchors and backfill, weathering and groundwater infiltration reduced the strength of the overlying rock mass; (3) possible corrosion and aging of the ground anchors deteriorated their load capacity. Considering that the strength of the sliding layer had reduced from peak to residual values because of the excavation disturbance, limit equilibrium method (LEM) back analysis was applied first. The results showed that the stability of the slope approached the critical state (F.S. ≈ 1). The efficiency reduction of the rock anchors and the strength reduction of the overlying sandstone were considered in subsequent analyses, which indicated an unstable condition (F.S. < 1). This research also used laboratory test results, the geological strength index (GSI), and the finite difference method (FDM, FLAC 5.0) to examine the failure process under the interaction of toe-excavation disturbance, rock mass weathering, groundwater infiltration, and anchor efficiency reduction. The analysis indicated that the incremental anchor loads follow a trend similar to the monitoring records during the toe-excavation stages, showing that the strength of the sliding layer was significantly influenced by the toe excavation. The numerical model calibrated against the excavation-stage monitoring records was then used to examine the failure process after backfilling; the results showed how the different factors interact in the failure process. Keywords: dip slope failure, rock anchor, LEM, FDM, GSI, back analysis
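For a planar dip-slope block, the LEM back analysis reduces to a single factor-of-safety expression balancing driving and resisting forces along the sliding plane. The sketch below includes an anchor force term; all values (block weight, plane area, residual strength, anchor force and angle) are illustrative assumptions, chosen so that removing the anchor force drops F.S. below 1 in the spirit of the described failure.

```python
# Minimal planar (dip-slope) limit-equilibrium check with an anchor force.
# W = block weight, alpha = dip of the sliding plane, c/phi = residual
# strength of the sliding layer, T = total anchor force at angle beta to
# the sliding plane. Every value is an illustrative assumption.
import math

W = 5.0e8                          # block weight, N (assumed)
alpha = math.radians(15.0)         # dip of the sliding plane
A = 1.0e4                          # sliding plane area, m^2 (assumed)
c, phi = 2.0e3, math.radians(10.0) # residual cohesion (Pa) and friction angle
T = 6.0e7                          # anchor force, N; set to 0.0 for failed state
beta = math.radians(15.0)          # anchor inclination to the sliding plane

driving = W * math.sin(alpha) - T * math.cos(beta)
resisting = c * A + (W * math.cos(alpha) + T * math.sin(beta)) * math.tan(phi)
print(f"F.S. = {resisting / driving:.2f}")
```

With these values F.S. is about 1.5 while the anchors act and about 0.8 with T = 0, mirroring the reported progression from a stabilized slope to an unstable one as anchor efficiency degrades.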
Methodological pitfalls in the analysis of contraceptive failure.
Trussell, J
1991-02-01
Although the literature on contraceptive failure is vast and is expanding rapidly, our understanding of the relative efficacy of methods is quite limited because of defects in the research design and in the analytical tools used by investigators. Errors in the literature range from simple arithmetical mistakes to outright fraud. In many studies the proportion of the original sample lost to follow-up is so large that the published results have little meaning. Investigators do not routinely use life table techniques to control for duration of exposure; many employ the Pearl index, which suffers from the same problem as does the crude death rate as a measure of mortality. Investigators routinely calculate 'method' failure rates by eliminating 'user' failures from the numerator (pregnancies) but fail to eliminate 'imperfect' use from the denominator (exposure); as a consequence, these 'method' rates are biased downward. This paper explores these and other common biases that snare investigators and establishes methodological guidelines for future research.
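The duration-dependence problem with the Pearl index is easy to demonstrate by simulation: with a hazard that declines over time, the same method yields different Pearl indices for 12-month and 60-month studies, while a 12-month life-table rate is unaffected. All data below are synthetic.

```python
# Why the Pearl index misleads: it divides pregnancies by total woman-years,
# so it depends on follow-up length when the hazard declines with duration
# of use, while a life-table (actuarial) rate does not.
import numpy as np

rng = np.random.default_rng(42)

def monthly_p(m):
    # synthetic hazard: careless users and early discontinuers fail first
    return 0.03 if m < 6 else 0.005

def study(n_women, months):
    pregnancies, exposure = 0, 0
    events = np.zeros(months)
    at_risk = np.zeros(months)
    for _ in range(n_women):
        for m in range(months):
            at_risk[m] += 1
            exposure += 1
            if rng.random() < monthly_p(m):
                pregnancies += 1
                events[m] += 1
                break                       # pregnant women leave the study
    pearl = 1200 * pregnancies / exposure   # per 100 woman-years of exposure
    surv12 = np.prod(1 - events[:12] / at_risk[:12])  # 12-month life table
    return pearl, 100 * (1 - surv12)

for months in (12, 60):
    pearl, lt12 = study(2000, months)
    print(f"{months:2d}-month study: Pearl = {pearl:4.1f}, "
          f"12-month life-table failure = {lt12:4.1f}%")
```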
Anomaly Monitoring Method for Key Components of Satellite
Fan, Linjun; Xiao, Weidong; Tang, Jun
2014-01-01
This paper presents a fault diagnosis method for key components of satellites, called the Anomaly Monitoring Method (AMM), which is made up of state estimation based on Multivariate State Estimation Techniques (MSET) and anomaly detection based on the Sequential Probability Ratio Test (SPRT). On the basis of a failure analysis of lithium-ion batteries (LIBs), we divided LIB failures into internal failure, external failure, and thermal runaway, and selected the electrolyte resistance (Re) and the charge transfer resistance (Rct) as the key parameters for state estimation. Then, using actual in-orbit telemetry data for the key LIB parameters, we obtained the actual residual value (RX) and healthy residual value (RL) of the LIBs from the MSET state estimation, and from these residual values we detected anomalous states with SPRT-based anomaly detection. Lastly, we conducted an example of AMM for LIBs and, according to the results, validated the feasibility and effectiveness of AMM by comparison with the results of a threshold detection method (TDM). PMID:24587703
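The SPRT stage of such a scheme is compact enough to sketch: Wald's cumulative log-likelihood ratio for a mean-shift hypothesis against the healthy hypothesis is compared with thresholds set by the target error rates. The residuals, shift size, and error rates below are invented stand-ins, not values from the paper.

```python
# Sketch of SPRT anomaly detection on residuals: the cumulative
# log-likelihood ratio of N(shift, sigma) versus N(0, sigma) is compared
# with Wald's thresholds. Residuals are synthetic stand-ins for the
# MSET-stage residual values.
import numpy as np

rng = np.random.default_rng(7)
sigma, shift = 1.0, 1.5                 # healthy std and assumed fault shift
alpha, beta = 0.01, 0.01                # false-alarm / missed-alarm targets
upper = np.log((1 - beta) / alpha)      # declare "faulty"
lower = np.log(beta / (1 - alpha))      # declare "healthy"

residuals = np.concatenate([rng.normal(0, sigma, 150),     # healthy phase
                            rng.normal(shift, sigma, 50)]) # degradation

llr = 0.0
for i, r in enumerate(residuals):
    # log-likelihood ratio increment for a Gaussian mean shift
    llr += (shift / sigma**2) * (r - shift / 2)
    if llr >= upper:
        print(f"anomaly declared at sample {i}")
        break
    llr = max(llr, lower)   # restart at the lower bound for continuous monitoring
```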
Analysis on IGBT and Diode Failures in Distribution Electronic Power Transformers
NASA Astrophysics Data System (ADS)
Wang, Si-cong; Sang, Zi-xia; Yan, Jiong; Du, Zhi; Huang, Jia-qi; Chen, Zhu
2018-02-01
Fault characteristics of power electronic components are of great importance for any power electronic device, and especially for devices applied in power systems. The topology and control method of the Distribution Electronic Power Transformer (D-EPT) are introduced, and the fault types and fault characteristics of IGBT and diode failures are explored. Analysis and simulation of the fault characteristics for the different fault types lead to a fault location scheme for the D-EPT.
NASA Astrophysics Data System (ADS)
Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal
2013-07-01
The various process parameters affecting the quality characteristics of the shock absorber during manufacture were identified using the Ishikawa diagram and failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Though the defects are reasonably minimized by the Taguchi method, a genetic algorithm is applied to the Taguchi-optimized parameters in order to achieve zero defects during the processes.
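A core step in any Taguchi optimization is scoring each parameter setting with a signal-to-noise (S/N) ratio. The sketch below computes the standard smaller-the-better S/N ratio for hypothetical defect measurements at each experimental run; the run labels and data are invented, not taken from the paper:

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi S/N ratio for a smaller-the-better response:
    SN = -10 * log10(mean(y^2)). A larger SN means better quality."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

# Hypothetical defect counts for three runs of an orthogonal array:
runs = {
    "run1 (low temp, thin coat)":   [4, 5, 3],
    "run2 (mid temp, mid coat)":    [2, 2, 3],
    "run3 (high temp, thick coat)": [6, 7, 5],
}
for name, y in runs.items():
    print(f"{name}: S/N = {sn_smaller_the_better(y):.2f} dB")
# The run with the highest S/N ratio is preferred.
```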
Challenges in Resolution for IC Failure Analysis
NASA Astrophysics Data System (ADS)
Martinez, Nick
1999-10-01
Resolution is becoming more and more of a challenge in the world of Failure Analysis of integrated circuits, a result of the ongoing size reduction in microelectronics. Determining the cause of a failure depends upon being able to find the responsible defect, and the time it takes to locate a given defect is extremely important so that proper corrective actions can be taken. The limits of current microscopy tools are being pushed. With sub-micron feature sizes and even smaller killing defects, optical microscopes are becoming obsolete. With scanning electron microscopy (SEM), the resolution is high, but the voltages involved can make these small defects transparent due to the large mean free path of the incident electrons. In this presentation, I will give an overview of the use of inspection methods in Failure Analysis and show example studies from my work as an intern student at Texas Instruments. 1. Work at Texas Instruments, Stafford, TX, was supported by TI. 2. Work at Texas Tech University was supported by NSF Grant DMR9705498.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, D. D., E-mail: dandan.wang@globalfoundries.com; Huang, Y. M.; Tan, P. K.
2015-12-15
Presently, two major limiting factors hinder failure analysis (FA) development during semiconductor manufacturing process and technology improvement: (1) manual polishing is impractical on edge dies because the layers are prone to peeling off; (2) there is abundant demand for multi-location FA, especially focusing on different levels of layers simultaneously. Aiming to resolve these limitations, we demonstrate two unique high-precision polishing methods using the focused ion beam (FIB) technique: vertical top-down chemical etching at the aimed location, and planar top-down slicing. Using the FIB for delayering not only solves the problems mentioned above, but also offers significant advantages over physical planar polishing methods: (1) better control of the delayering progress, (2) precise milling at a region of interest, (3) prevention of over-delayering, and (4) the capability to capture images at the region of interest simultaneously and cut into the die directly to expose the exact failure without damaging other sections of the specimen.
Evaluation of marginal failures of dental composite restorations by acoustic emission analysis.
Gu, Ja-Uk; Choi, Nak-Sam
2013-01-01
In this study, a nondestructive method based on acoustic emission (AE) analysis was developed to evaluate the marginal failure states of dental composite restorations. Three types of ring-shaped substrates, modeled after a Class I cavity, were prepared from polymethyl methacrylate, stainless steel, and human molar teeth. A bonding agent and a composite resin were applied to the ring-shaped substrates and cured by light exposure. At each time-interval measurement, the tooth substrate presented a higher number of AE hits than the polymethyl methacrylate and steel substrates. Marginal disintegration estimates derived from cumulative AE hits and cumulative AE energy parameters showed that a significant portion of marginal gap formation was already realized within 1 min of the initial light-curing stage. Estimation based on cumulative AE energy gave a higher level of marginal failure than that based on AE hits. It was concluded that the AE analysis method developed in this study is a viable approach for predicting the clinical survival of dental composite restorations efficiently within a short test period.
Task Decomposition in Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald Laurids; Joe, Jeffrey Clark
2014-06-01
In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches would tend to look first at opportunities for human errors in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.
Kersten, Daniel J; Yi, Jinju; Feldman, Alyssa M; Brahmbhatt, Kunal; Asheld, Wilbur J; Germano, Joseph; Islam, Shahidul; Cohen, Todd J
2016-12-01
The purpose of this study was to determine if implantation of multiple recalled defibrillator leads is associated with an increased risk of lead failure. The authors of the Pacemaker and Implantable Defibrillator Leads Survival Study ("PAIDLESS") have previously reported a relationship between recalled lead status, lead failure, and patient mortality. This substudy analyzes the relationship in a smaller subset of patients who received more than one recalled lead. The specific effects of having one or more recalled leads have not been previously examined. This study analyzed lead failure and mortality of 3802 patients in PAIDLESS and compared outcomes with respect to the number of recalled leads received. PAIDLESS includes all patients at Winthrop University Hospital who underwent defibrillator lead implantation between February 1, 1996 and December 31, 2011. Patients with no recalled ICD leads, one recalled ICD lead, and two recalled ICD leads were compared using the Kaplan-Meier method and log-rank test. Sidak adjustment method was used to correct for multiple comparisons. All calculations were performed using SAS 9.4. P-values <.05 were considered statistically significant. This study included 4078 total ICD leads implanted during the trial period. There were 2400 leads (59%) in the no recalled leads category, 1620 leads (40%) in the one recalled lead category, and 58 leads (1%) in the two recalled leads category. No patient received more than two recalled leads. Of the leads categorized in the two recalled leads group, 12 experienced lead failures (21%), which was significantly higher (P<.001) than in the no recalled leads group (60 failures, 2.5%) and one recalled lead group (81 failures; 5%). Multivariable Cox's regression analysis found a total of six significant predictive variables for lead failure including the number of recalled leads (P<.001 for one and two recalled leads group). The number of recalled leads is highly predictive of lead failure. Lead-based multivariable Cox's regression analysis produced a total of six predictive variable categories for lead failure, one of which was the number of recalled leads. Kaplan-Meier analysis showed that the leads in the two recalled leads category failed faster than both the no recalled lead and one recalled lead groups. The greater the number of recalled leads to which patients are exposed, the greater the risk of lead failure.
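To illustrate the Kaplan-Meier and log-rank machinery used in this kind of lead-survival comparison, here is a minimal sketch with the lifelines library; the follow-up times below are synthetic, not PAIDLESS data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(42)

# Synthetic follow-up times (months, censored at 192) and failure indicators.
t_no_recall = rng.exponential(scale=180, size=100).clip(max=192)
e_no_recall = (t_no_recall < 192).astype(int)
t_recalled = rng.exponential(scale=90, size=100).clip(max=192)
e_recalled = (t_recalled < 192).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(t_no_recall, event_observed=e_no_recall, label="no recalled leads")
print(kmf.median_survival_time_)

result = logrank_test(t_no_recall, t_recalled,
                      event_observed_A=e_no_recall,
                      event_observed_B=e_recalled)
print(f"log-rank p-value: {result.p_value:.4f}")
```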
Shahrbabaki, Parvin Mangolian; Nouhi, Esmat; Kazemi, Majid; Ahmadi, Fazlollah
2016-01-01
Background Heart failure as a chronic disease poses many challenges for a patient in his or her everyday life. Support in various aspects of life positively affects coping strategies and influences the well-being and health outcomes of heart failure patients. Inadequate support may lead to a worsening of symptoms, increased hospital readmissions, psychological disorders, and a reduced quality of life. Objective This study explored obstacles to coping related to support for heart failure patients as viewed by the patients themselves and their family members and caregivers. Design This qualitative study was conducted using content analysis. The 20 Iranian participants included 11 patients with heart failure, three cardiologists, three nurses, and three family members of heart failure patients selected through purposive sampling. Data were collected through semi-structured interviews and analyzed using the Lundman and Graneheim qualitative content analysis method. Results During data analysis, ‘defective support network’ developed as the main theme along with four other categories of ‘inadequate family performance’, ‘inadequate support by the healthcare team’, ‘distorted societal social support’, and ‘inadequate welfare support’. Conclusion The findings of the current study can assist health authorities and planners in identifying the needs of patients with heart failure so as to focus and plan on facilitating their coping as much as possible by obviating the existing obstacles. PMID:27041539
Foo, Jonathan; Ilic, Dragan; Rivers, George; Evans, Darrell J R; Walsh, Kieran; Haines, Terry P; Paynter, Sophie; Morgan, Prue; Maloney, Stephen
2017-12-07
Student failure creates additional economic costs. Knowing the cost of failure helps to frame its economic burden relative to other educational issues, providing an evidence-base to guide priority setting and allocation of resources. The Ingredients Method is a cost-analysis approach which has been previously applied to health professions education research. In this study, the Ingredients Method is introduced, and applied to a case study, investigating the cost of pre-clinical student failure. The four step Ingredients Method was introduced and applied: (1) identify and specify resource items, (2) measure volume of resources in natural units, (3) assign monetary prices to resource items, and (4) analyze and report costs. Calculations were based on a physiotherapy program at an Australian university. The cost of failure was £5991 per failing student, distributed across students (70%), the government (21%), and the university (8%). If the cost of failure and attrition is distributed among the remaining continuing cohort, the cost per continuing student educated increases from £9923 to £11,391 per semester. The economics of health professions education is complex. Researchers should consider both accuracy and feasibility in their costing approach, toward the goal of better informing cost-conscious decision-making.
Dewan, Michael C; Lim, Jaims; Shannon, Chevis N; Wellons, John C
2017-05-01
OBJECTIVE Up to one-third of patients with a posterior fossa brain tumor (PFBT) will experience persistent hydrocephalus mandating permanent CSF diversion. The optimal hydrocephalus treatment modality is unknown; the authors sought to compare the durability between endoscopic third ventriculostomy (ETV) and ventriculoperitoneal shunt (VPS) therapy in the pediatric population. METHODS The authors conducted a systematic review of articles indexed in PubMed between 1986 and 2016 describing ETV and/or VPS treatment success/failure and time-to-failure rate in patients < 19 years of age with hydrocephalus related to a PFBT. Additionally, the authors conducted a retrospective review of their institutional series of PFBT patients requiring CSF diversion. Patient data from the systematic review and from the institutional series were aggregated and a time-to-failure analysis was performed comparing ETV and VPS using the Kaplan-Meier method. RESULTS A total of 408 patients were included from 12 studies and the authors' institutional series: 284 who underwent ETV and 124 who underwent VPS placement. The analysis included uncontrolled studies with variable method and timing of CSF diversion and was subject to surgeon bias. No significant differences between cohorts were observed with regard to age, sex, tumor grade or histology, metastatic status, or extent of resection. The cumulative failure rate of ETV was 21%, whereas that of VPS surgery was 29% (p = 0.105). The median time to failure was earlier for ETV than for VPS surgery (0.82 [IQR 0.2-1.8] vs 4.7 months [IQR 0.3-5.7], p = 0.03). Initially the ETV survival curve dropped sharply and then stabilized around 2 months. The VPS curve fell gradually but eventually crossed below the ETV curve at 5.7 months. Overall, a significant survival advantage was not demonstrated for one procedure over the other (p = 0.21, log-rank). However, postoperative complications were higher following VPS (31%) than ETV (17%) (p = 0.012). CONCLUSIONS ETV failure occurred sooner than VPS failure, but long-term treatment durability may be higher for ETV. Complications occurred more commonly with VPS than with ETV. Limited clinical conclusions can be drawn using this methodology; the optimal treatment for PFBT-related hydrocephalus warrants investigation through prospective studies.
NASA Astrophysics Data System (ADS)
Makarova, A. N.; Makarov, E. I.; Zakharov, N. S.
2018-03-01
This article considers the correction of engineering servicing regularity on the basis of actual dependability data for cars in operation. The purpose of the research is to increase the dependability of transport-technological machines by correcting the engineering servicing regularity. The subject of the research is the mechanism by which engineering servicing regularity influences the reliability measure. Based on an analysis of previous research, a method of nonparametric estimation of the car failure rate from actual time-to-failure data was chosen. The possibility of describing the dependence of the failure rate on engineering servicing regularity with various mathematical models is considered, and the exponential model is shown to be the most appropriate. The results can be used as a stand-alone method of correcting engineering servicing regularity under given operational conditions, as well as to improve the technical-economical and economical-stochastic methods. Thus, a method of correcting the engineering servicing regularity of transport-technological machines during operation was developed; its use will reduce the number of failures.
Health assessment of cooling fan bearings using wavelet-based filtering.
Miao, Qiang; Tang, Chao; Liang, Wei; Pecht, Michael
2012-12-24
As commonly used forced convection air cooling devices in electronics, cooling fans are crucial for guaranteeing the reliability of electronic systems. In a cooling fan assembly, fan bearing failure is a major failure mode that causes excessive vibration, noise, reduction in rotation speed, locked rotor, failure to start, and other problems; therefore, it is necessary to conduct research on the health assessment of cooling fan bearings. This paper presents a vibration-based fan bearing health evaluation method using comblet filtering and exponentially weighted moving average. A new health condition indicator (HCI) for fan bearing degradation assessment is proposed. In order to collect the vibration data for validation of the proposed method, a cooling fan accelerated life test was conducted to simulate the lubricant starvation of fan bearings. A comparison between the proposed method and methods in previous studies (i.e., root mean square, kurtosis, and fault growth parameter) was carried out to assess the performance of the HCI. The analysis results suggest that the HCI can identify incipient fan bearing failures and describe the bearing degradation process. Overall, the work presented in this paper provides a promising method for fan bearing health evaluation and prognosis.
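The exponentially weighted moving average step of such a health indicator is easy to sketch. Below is a generic EWMA smoother applied to a synthetic vibration feature; the feature values, smoothing constant, and alarm limit are all invented, and the paper's "comblet filtering" stage is not reproduced here:

```python
import numpy as np

def ewma(x, lam=0.2):
    """Exponentially weighted moving average: z[t] = lam*x[t] + (1-lam)*z[t-1]."""
    z = np.empty_like(x, dtype=float)
    z[0] = x[0]
    for t in range(1, len(x)):
        z[t] = lam * x[t] + (1 - lam) * z[t - 1]
    return z

rng = np.random.default_rng(1)
# Synthetic bearing feature: stable early life, then gradual degradation.
feature = np.concatenate([rng.normal(1.0, 0.1, 300),
                          rng.normal(1.0, 0.1, 200) + np.linspace(0, 1.5, 200)])
hci = ewma(feature)
limit = 1.0 + 4 * 0.1  # hypothetical alarm limit: baseline + 4 sigma
alarm = np.argmax(hci > limit)  # index of the first limit crossing
print(f"EWMA first exceeds the limit at sample {alarm}")
```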
Progressive Failure And Life Prediction of Ceramic and Textile Composites
NASA Technical Reports Server (NTRS)
Xue, David Y.; Shi, Yucheng; Katikala, Madhu; Johnston, William M., Jr.; Card, Michael F.
1998-01-01
An engineering approach to predict the fatigue life and progressive failure of multilayered composite and textile laminates is presented. Analytical models which account for matrix cracking, statistical fiber failures, and nonlinear stress-strain behavior have been developed for both composites and textiles. The analysis method is based on a combined micromechanics, fracture mechanics, and failure statistics analysis. Experimentally derived empirical coefficients are used to account for the fiber-matrix interface, fiber strength, and fiber-matrix stiffness reductions. Similar approaches were applied to textiles using Repeating Unit Cells. In the composite fatigue analysis, Walker's equation is applied for matrix fatigue cracking and Heywood's formulation is used for fiber strength fatigue degradation. The analysis has been compared with experiment with good agreement. Comparisons were made with graphite-epoxy, C/SiC, and Nicalon/CAS composite materials. For textile materials, comparisons were made with triaxially braided and plain weave materials under biaxial or uniaxial tension. Fatigue predictions were compared with test data obtained from plain weave C/SiC materials tested at AS&M. Computer codes were developed to perform the analysis: composite progressive failure analysis for laminates is contained in the code CPFail, and micromechanics analysis for textile composites is contained in the code MicroTex. Both codes were adapted to run as subroutines for the finite element code ABAQUS, as CPFail-ABAQUS and MicroTex-ABAQUS, and a graphical user interface (GUI) was developed to connect CPFail and MicroTex with ABAQUS.
NASA Technical Reports Server (NTRS)
Kradinov, V.; Madenci, E.; Ambur, D. R.
2004-01-01
Although two-dimensional methods provide accurate predictions of contact stresses and bolt load distribution in bolted composite joints with multiple bolts, they fail to capture the effect of thickness on the strength prediction. Typically, the plies close to the interface of laminates are expected to be the most highly loaded, due to bolt deformation, and they are usually the first to fail. This study presents an analysis method to account for the variation of stresses in the thickness direction by augmenting a two-dimensional analysis with a one-dimensional through-the-thickness analysis. The two-dimensional in-plane solution method, based on the combined complex potential and variational formulation, satisfies the equilibrium equations exactly, and satisfies the boundary conditions and constraints by minimizing the total potential. Under general loading conditions, this method addresses multiple bolt configurations without requiring symmetry conditions while accounting for the contact phenomenon and the interaction among the bolts explicitly. The through-the-thickness analysis is based on a model of a beam on an elastic foundation: the bolt, represented as a short beam that accounts for bending and shear deformations, rests on springs whose coefficients represent the resistance of the composite laminate to bolt deformation. The combined in-plane and through-the-thickness analysis produces the bolt/hole displacement in the thickness direction, as well as the stress state in each ply. The initial ply failure, predicted by applying the average stress criterion, is followed by a simple progressive failure analysis. Application of the model is demonstrated by considering single- and double-lap joints of metal plates bolted to composite laminates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Manger, R; Kim, G
Purpose: To examine the ability of traditional Failure mode and effects analysis (FMEA) and a light version of Healthcare FMEA (HFMEA), called Scenario analysis of FMEA (SAFER), by comparing their outputs in terms of the risks identified and their severity rankings. Methods: We applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation are based on the risk priority number (RPN), the product of three indices: occurrence, severity, and detectability. The SAFER approach utilized two indices, frequency and severity, which were defined by a multidisciplinary team. A criticality matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. Results: The two methods were independently compared to determine whether the results and rated risks matched. Our results showed an agreement of 67% between the FMEA and SAFER approaches for the 15 riskiest SIG-specific failure modes. The main differences between the two approaches were the distribution of the values and the fact that failure modes (No. 52, 54, 154) with high SAFER scores do not necessarily have high FMEA RPN scores. In our results, there were additional risks identified by both methods with little correspondence. In SAFER, when the risk score is determined, the basis of the established decision tree or the failure mode should be investigated further. Conclusion: The FMEA method takes into account the probability that an error passes without being detected. SAFER is inductive, because it requires identifying consequences from causes, and semi-quantitative, since it allows the prioritization of risks and mitigation measures, and thus is perfectly applicable to clinical parts of radiotherapy.
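The RPN arithmetic at the heart of FMEA is simple enough to show directly. The sketch below ranks a few invented failure modes by RPN = occurrence × severity × detectability; the failure modes and scores are placeholders, not those of the study:

```python
# Each failure mode: (name, occurrence, severity, detectability), all on 1-10.
failure_modes = [
    ("wrong isocenter captured", 3, 9, 4),
    ("camera occlusion during delivery", 5, 7, 2),
    ("stale reference surface used", 4, 8, 5),
]

# Rank failure modes from highest to lowest risk priority number.
ranked = sorted(
    ((o * s * d, name) for name, o, s, d in failure_modes),
    reverse=True,
)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
```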
Failure mode and effects analysis of witnessing protocols for ensuring traceability during IVF.
Rienzi, Laura; Bariani, Fiorenza; Dalla Zorza, Michela; Romano, Stefania; Scarica, Catello; Maggiulli, Roberta; Nanni Costa, Alessandro; Ubaldi, Filippo Maria
2015-10-01
Traceability of cells during IVF is a fundamental aspect of treatment, and involves witnessing protocols. Failure mode and effects analysis (FMEA) is a method of identifying real or potential breakdowns in processes, and allows strategies to mitigate risks to be developed. To examine the risks associated with witnessing protocols, an FMEA was carried out in a busy IVF centre, before and after implementation of an electronic witnessing system (EWS). A multidisciplinary team was formed and moderated by human factors specialists. Possible causes of failures, and their potential effects, were identified and risk priority number (RPN) for each failure calculated. A second FMEA analysis was carried out after implementation of an EWS. The IVF team identified seven main process phases, 19 associated process steps and 32 possible failure modes. The highest RPN was 30, confirming the relatively low risk that mismatches may occur in IVF when a manual witnessing system is used. The introduction of the EWS allowed a reduction in the moderate-risk failure mode by two-thirds (highest RPN = 10). In our experience, FMEA is effective in supporting multidisciplinary IVF groups to understand the witnessing process, identifying critical steps and planning changes in practice to enable safety to be enhanced.
Reliability analysis based on the losses from failures.
Todinov, M T
2006-04-01
The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the early-life failures region and the expected losses given failure characterizing the corresponding time intervals. For complex systems whose components are not logically arranged in series, discrete simulation algorithms and software have been created for determining the losses from failures in terms of expected lost production time, cost of intervention, and cost of replacement. Different system topologies are assessed to determine the effect of modifications of the system topology on the expected losses from failures. It is argued that the reliability allocation in a production system should be done to maximize the profit/value associated with the system. Consequently, a method for setting reliability requirements and reliability allocation maximizing the profit by minimizing the total cost has been developed. Reliability allocation that maximizes the profit in the case of a system consisting of blocks arranged in series is achieved by determining, for each block individually, the reliabilities of the components in the block that minimize the sum of the capital costs, operation costs, and the expected losses from failures. A Monte Carlo simulation based net present value (NPV) cash-flow model has also been proposed, which has significant advantages over cash-flow models based on the expected value of the losses from failures per time interval. Unlike these models, the proposed model has the capability to reveal the variation of the NPV due to the different numbers of failures occurring during a specified time interval (e.g., during one year). The model also permits tracking the impact of the distribution pattern of failure occurrences and the time dependence of the losses from failures.
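The decomposition of expected losses over mutually exclusive failure modes described above is straightforward to sketch; the failure modes, probabilities, and costs below are invented:

```python
# Mutually exclusive failure modes: P(mode initiates failure | failure) and
# the expected loss (intervention + repair + lost production) for each mode.
modes = {
    "seal leak":       {"p_given_failure": 0.55, "expected_loss": 12_000.0},
    "bearing seizure": {"p_given_failure": 0.30, "expected_loss": 48_000.0},
    "shaft fracture":  {"p_given_failure": 0.15, "expected_loss": 150_000.0},
}

# Expected loss given failure: linear combination over the failure modes.
loss_given_failure = sum(m["p_given_failure"] * m["expected_loss"]
                         for m in modes.values())

failures_per_year = 0.8  # hypothetical expected number of failures per year
print(f"Expected loss given failure: {loss_given_failure:,.0f}")
print(f"Expected annual losses: {failures_per_year * loss_given_failure:,.0f}")
```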
2010-01-01
Introduction Sterilization and re-use of tumour bone for reconstruction after tumour resection is now gaining popularity in the East. This recycled tumour bone needs to be sterilized in order to eradicate the tumour cells before re-implantation in limb salvage procedures. The effect of some of these treatments on the integrity and sterility of the bone has been published, but there has not yet been a direct comparison between the various methods of sterilization to determine which method gives the best tumour kill without compromising the bone's structural integrity. Method This study evaluated the effect of several sterilization methods on the mechanical behavior of human cortical bone graft, together with a histopathology evaluation of tumour bone samples processed with four different methods of sterilization. Fresh human cortical tumour bone harvested from the diaphyseal region of the tumour bone was sterilized by autoclaving (n = 10); boiling (n = 10); pasteurization (n = 10); and irradiation (n = 10). There were also 10 control specimens that did not receive any sterilization treatment. The biomechanical tests conducted were stress to failure, modulus, and strain to failure, determined from axial compression testing. Statistical analysis (ANOVA) was performed on these results. Significance level (α) and power (β) were set to 0.05 and 0.90, respectively. Results ANOVA of 'failure stress', 'modulus' and 'strain to failure' demonstrated significant differences (p < 0.05) between treated cortical bone and untreated specimens under mechanical loading. 'Stress to failure' was significantly reduced in boiled, autoclaved and irradiated cortical bone samples (p < 0.05). 'Modulus' showed significant differences in the boiled, autoclaved and pasteurized specimens compared to controls (p < 0.05). 'Strain to failure' was reduced by irradiation (p < 0.05) but not by the other three treatments. The histopathology study revealed no viable tumour cells in any of the four treatment groups compared to the untreated control group. Conclusions Sterilization of cortical bone samples by pasteurization and, to a lesser extent, irradiation does not significantly alter the mechanical properties compared with untreated samples. Mechanical properties degrade with the use of high temperature for sterilization (boiling). All methods of sterilization gave rise to 100 percent tumour kill. PMID:20831801
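A one-way ANOVA of this kind can be reproduced with scipy; the failure-stress values below are invented placeholders, not the study's measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical stress-to-failure values (MPa) per treatment group, n = 10.
rng = np.random.default_rng(7)
groups = {
    "control":        rng.normal(160, 12, 10),
    "autoclave":      rng.normal(120, 12, 10),
    "boiling":        rng.normal(115, 12, 10),
    "pasteurization": rng.normal(155, 12, 10),
    "irradiation":    rng.normal(135, 12, 10),
}

f_stat, p_value = stats.f_oneway(*groups.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")
# p < 0.05 would indicate a significant difference among the group means;
# pairwise post-hoc tests would then identify which treatments differ.
```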
Mechanistic Considerations Used in the Development of the PROFIT PCI Failure Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pankaskie, P. J.
A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) Interactions (PCI) failure model for estimating the probability of failure in transient increases in power (PROFIT) was developed. PROFIT is based on (1) standard statistical methods applied to available PCI fuel failure data and (2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate-dependent strain energy absorption to failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding. Assuming that the power ramping rate is the operating corollary of strain rate in the Zircaloy cladding, the variables of first-order importance in the PCI fuel failure phenomenon are postulated to be: 1. pre-transient fuel rod power, P_I, 2. transient increase in fuel rod power, ΔP, 3. fuel burnup, Bu, and 4. the constitutive material property of the Zircaloy cladding, SEAF.
Failure probability analysis of optical grid
NASA Astrophysics Data System (ADS)
Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng
2008-11-01
Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based analysis method for the application failure probability in optical grids. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. In an optical grid, when an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submission. Differentiated services can thus be achieved in optical grid.
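Under a simple independence assumption, where the application fails if any of its DAG tasks fails, the task-based failure probability composes multiplicatively. The sketch below is a generic illustration of that idea with invented per-task probabilities, not the MDSA algorithm itself:

```python
from math import prod

# Per-task failure probabilities: computational-resource and network parts.
tasks = {
    "stage_in":  {"p_compute": 0.010, "p_network": 0.005},
    "transform": {"p_compute": 0.020, "p_network": 0.002},
    "reduce":    {"p_compute": 0.015, "p_network": 0.004},
}

def task_success(t):
    return (1 - t["p_compute"]) * (1 - t["p_network"])

def app_failure(tasks, backup_factor=1.0):
    """Application fails if any task fails (independence assumed).
    backup_factor < 1 crudely models a backup strategy scaling down
    each task's failure probability (hypothetical parameterization)."""
    p_ok = prod(1 - backup_factor * (1 - task_success(t)) for t in tasks.values())
    return 1 - p_ok

print(f"no backup:   {app_failure(tasks):.4f}")
print(f"with backup: {app_failure(tasks, backup_factor=0.3):.4f}")
```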
Mesh Deformation Based on Fully Stressed Design: The Method and Two-Dimensional Examples
NASA Technical Reports Server (NTRS)
Hsu, Su-Yuen; Chang, Chau-Lyan
2007-01-01
Mesh deformation in response to redefined boundary geometry is a frequently encountered task in shape optimization and analysis of fluid-structure interaction. We propose a simple and concise method for deforming meshes defined with three-node triangular or four-node tetrahedral elements. The mesh deformation method is suitable for large boundary movement. The approach requires two consecutive linear elastic finite-element analyses of an isotropic continuum using a prescribed displacement at the mesh boundaries. The first analysis is performed with homogeneous elastic property and the second with inhomogeneous elastic property. The fully stressed design is employed with a vanishing Poisson's ratio and a proposed form of equivalent strain (modified Tresca equivalent strain) to calculate, from the strain result of the first analysis, the element-specific Young's modulus for the second analysis. The theoretical aspect of the proposed method, its convenient numerical implementation using a typical linear elastic finite-element code in conjunction with very minor extra coding for data processing, and results for examples of large deformation of two-dimensional meshes are presented in this paper. KEY WORDS: Mesh deformation, shape optimization, fluid-structure interaction, fully stressed design, finite-element analysis, linear elasticity, strain failure, equivalent strain, Tresca failure criterion
NASA Technical Reports Server (NTRS)
Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl
2017-01-01
Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
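A brute-force version of link-failure-based all-terminal reliability is easy to write for small networks and makes the idea concrete. This enumerates every link-failure state rather than using the patent's efficient cut-set search, and the toy topology and link reliabilities are invented:

```python
from itertools import product

import networkx as nx

# Toy nonhierarchical network: (link, reliability) pairs; nodes never fail,
# matching the link-failure basis of the method described above.
links = {("a", "b"): 0.99, ("b", "c"): 0.95, ("a", "c"): 0.90,
         ("c", "d"): 0.98, ("b", "d"): 0.97}

def all_terminal_reliability(links):
    """Probability that all nodes remain mutually connected when links
    fail independently (exhaustive enumeration, feasible for small nets)."""
    total = 0.0
    for state in product([True, False], repeat=len(links)):
        p = 1.0
        g = nx.Graph()
        g.add_nodes_from({n for edge in links for n in edge})
        for (edge, rel), up in zip(links.items(), state):
            p *= rel if up else 1 - rel
            if up:
                g.add_edge(*edge)
        if nx.is_connected(g):
            total += p
    return total

print(f"all-terminal reliability: {all_terminal_reliability(links):.6f}")
```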
SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; James Knudsen
As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven phases. The mission requires that the propulsion system operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and of how they are linked into the event tree. The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
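For independent, identical thruster assemblies, an N-out-of-M gate reduces to a binomial sum, and phases in series multiply. A minimal sketch of that logic; the per-assembly failure probabilities and per-phase success criteria are invented, not the benchmark's values:

```python
from math import comb

def k_out_of_n_success(k, n, p_fail):
    """P(at least k of n identical, independent assemblies operate)."""
    p_ok = 1 - p_fail
    return sum(comb(n, j) * p_ok**j * p_fail**(n - j) for j in range(k, n + 1))

# Hypothetical per-phase success criteria for the 5 thruster assemblies:
# (phase name, assemblies required, per-assembly failure probability).
phases = [("cruise-1", 3, 0.02), ("flyby", 4, 0.05), ("cruise-2", 3, 0.02)]

p_mission_ok = 1.0
for name, k_needed, p_fail in phases:
    p_phase = k_out_of_n_success(k_needed, 5, p_fail)
    p_mission_ok *= p_phase  # phases in series, independence assumed
    print(f"{name}: P(phase success) = {p_phase:.6f}")
print(f"P(mission failure) = {1 - p_mission_ok:.6f}")
```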
Stress Transfer and Structural Failure of Bilayered Material Systems
NASA Astrophysics Data System (ADS)
Prieto-Munoz, Pablo Arthur
Bilayered material systems are common in naturally formed or artificially engineered structures. Understanding how loads transfer within these structural systems is necessary to predict failure and develop effective designs. Existing methods for evaluating the stress transfer in bilayered materials are limited to overly simplified models or require experimental calibration. As a result, these methods have failed to accurately account for such structural failures as the creep-induced roofing panel collapse of Boston's I-90 connector tunnel, which was supported by adhesive anchors. The one-dimensional stress analyses currently used for adhesive anchor design cannot account for viscoelastic creep failure, and consequently result in dangerously under-designed structural systems. In this dissertation, a method for determining the two-dimensional stress and displacement fields for a generalized bilayered material system is developed, and a closed-form analytical solution is proposed. A general linear-elastic solution is first obtained by decoupling the elastic governing equations from one another through the so-called plane assumption. Based on this general solution, an axisymmetric problem and a plane strain problem are formulated. These are applied to common bilayered material systems such as: (1) concrete adhesive anchors, (2) material coatings, (3) asphalt pavements, and (4) layered sedimentary rocks. The stress and displacement fields determined by this analytical analysis are validated through the use of finite element models. Through the correspondence principle, the linear-elastic solution is extended to consider time-dependent viscoelastic material properties, thus facilitating the analysis of adhesive anchors and asphalt pavements while incorporating their viscoelastic material behavior. Furthermore, the elastic stress analysis can explain the fracturing phenomenon of material coatings, pavements, and layered rocks, successfully predicting their fracture saturation ratio, which is the ratio of fracture spacing to the thickness of the weak layer at which an increase in load will not cause any new fractures to form. Moreover, these specific material systems are examined in the context of existing and novel experimental results, further demonstrating the advantage of the proposed stress transfer analysis. This research provides a closed-form stress solution for various structural systems that is applied to different failure analyses. The versatility of this method lies in the flexibility and ease with which the stress and displacement field results can be applied to existing stress- or displacement-based structural failure criteria. As presented, this analysis can be directly used to: (1) design adhesive anchoring systems for long-term creep loading, (2) evaluate the fracture mechanics behind bilayered material coatings and pavement overlay systems, and (3) determine the fracture spacing to layer thickness ratio of layered sedimentary rocks. As is shown in the four material systems presented, this general solution has far-reaching applications in facilitating the design and analysis of typical bilayered structural systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoisak, J; Manger, R; Dragojevic, I
Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode's probability of occurrence (O), severity (S) and lack of detectability (D). The FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, and standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery (Fig. 1). The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high-risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.
FAILURE OF RADIOACTIVE IODINE IN TREATMENT OF HYPERTHYROIDISM
Schneider, David F.; Sonderman, Philip E.; Jones, Michaela F.; Ojomo, Kristin A.; Chen, Herbert; Jaume, Juan C.; Elson, Diane F.; Perlman, Scott B.; Sippel, Rebecca S.
2015-01-01
Introduction Persistent or recurrent hyperthyroidism after treatment with radioactive iodine (RAI) is common, and many patients require either additional doses or surgery before they are cured. The purpose of this study was to identify patterns and predictors of failure of RAI in patients with hyperthyroidism. Methods We conducted a retrospective review of patients treated with RAI from 2007–2010. Failure of RAI was defined as receipt of additional dose(s) and/or total thyroidectomy. Using a Cox proportional hazards model, we conducted univariate analysis to identify factors associated with failure of RAI. A final multivariate model was then constructed with significant (p < 0.05) variables from the univariate analysis. Results Of the 325 patients analyzed, 74 patients (22.8%) failed initial RAI treatment. 53 (71.6%) received additional RAI, 13 (17.6%) received additional RAI followed by surgery, and the remaining 8 (10.8%) were cured after thyroidectomy. The percentage of patients who failed decreased in a step-wise fashion as RAI dose increased. Similarly, the incidence of failure increased as the presenting T3 level increased. Sensitivity analysis revealed that RAI doses < 12.5 mCi were associated with failure while initial T3 and free T4 levels of at least 4.5 pg/mL and 2.3 ng/dL, respectively, were associated with failure. In the final multivariate analysis, higher T4 (HR 1.13, 95% CI 1.02–1.26, p=0.02) and methimazole treatment (HR 2.55, 95% CI 1.22–5.33, p=0.01) were associated with failure. Conclusions Laboratory values at presentation can predict which patients with hyperthyroidism are at risk for failing RAI treatment. Higher doses of RAI or surgical referral may prevent the need for repeat RAI in selected patients. PMID:25001092
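A Cox proportional hazards screen like the one described can be sketched with the lifelines library; the covariates and values below are synthetic stand-ins, not the study data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 300
# Synthetic cohort: time to RAI failure (months, censored at 36), failure
# indicator, presenting free T4 (ng/dL), and methimazole treatment (0/1).
free_t4 = rng.normal(2.0, 0.8, n).clip(min=0.5)
methimazole = rng.integers(0, 2, n)
hazard = 0.02 * np.exp(0.12 * free_t4 + 0.9 * methimazole)
time = rng.exponential(1 / hazard)
event = (time < 36).astype(int)
df = pd.DataFrame({"time": np.minimum(time, 36), "event": event,
                   "free_t4": free_t4, "methimazole": methimazole})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # hazard ratios (exp(coef)) per covariate
```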
Bidirectional Cardio-Respiratory Interactions in Heart Failure.
Radovanović, Nikola N; Pavlović, Siniša U; Milašinović, Goran; Kirćanski, Bratislav; Platiša, Mirjana M
2018-01-01
We investigated cardio-respiratory coupling in patients with heart failure by quantifying bidirectional interactions between cardiac (RR interval) and respiratory signals with complementary measures of time series analysis. Heart failure patients were divided into three groups of twenty age- and gender-matched subjects: with sinus rhythm (HF-Sin), with sinus rhythm and ventricular extrasystoles (HF-VES), and with permanent atrial fibrillation (HF-AF). We included patients with an indication for implantation of an implantable cardioverter defibrillator or a cardiac resynchronization therapy device. ECG and respiratory signals were simultaneously acquired during 20 min in the supine position at spontaneous breathing frequency in 20 healthy control subjects and in the patients before device implantation. We used coherence, Granger causality, and cross-sample entropy analysis as complementary measures of the bidirectional interactions between RR intervals and the respiratory rhythm. In heart failure patients with arrhythmias (HF-VES and HF-AF) there is no coherence between the signals (p < 0.01), while in HF-Sin it is reduced (p < 0.05) compared with control subjects. In all heart failure groups the causality between the signals is diminished, but with significantly stronger causality from the RR signal to the respiratory signal in HF-VES. Cross-sample entropy analysis revealed the strongest synchrony between the respiratory and RR signals in the HF-VES group. Besides respiratory sinus arrhythmia there is another type of cardio-respiratory interaction, based on the synchrony between the cardiac and respiratory rhythms; both are altered in heart failure patients. Respiratory sinus arrhythmia is reduced in HF-Sin patients and vanishes in heart failure patients with arrhythmias. On the contrary, in the HF-Sin and HF-VES groups synchrony increased, probably as a consequence of dominant neural compensatory mechanisms. The coupling of cardiac and respiratory rhythm in heart failure patients thus varies depending on the presence of atrial/ventricular arrhythmias, and it can be revealed by complementary methods of time series analysis.
Kremen, Arie; Tsompanakis, Yiannis
2010-04-01
The slope stability of a proposed vertical extension of a balefill was investigated in the present study, in an attempt to determine a geotechnically conservative design, compliant with New Jersey Department of Environmental Protection regulations, that maximizes the utilization of unclaimed disposal capacity. Conventional geotechnical analytical methods are generally limited to well-defined failure modes, which may not occur in landfills or balefills due to the presence of preferential slip surfaces. In addition, these models assume an a priori stress distribution to solve essentially indeterminate problems. In this work, a different approach was applied, which avoids several of the drawbacks of conventional methods. Specifically, the analysis was performed in a two-stage process: (a) calculation of the stress distribution, and (b) application of an optimization technique to identify the most probable failure surface. The stress analysis was performed using a finite element formulation, and the most probable failure surface was located by a dynamic programming optimization method. A sensitivity analysis was performed to evaluate the effect of the various waste strength parameters of the underlying mathematical model on the results, namely the factor of safety of the landfill. Although this study focuses on the stability investigation of an expanded balefill, the methodology presented can easily be applied to general geotechnical investigations.
Regression analysis of informative current status data with the additive hazards model.
Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo
2015-04-01
This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
NASA Astrophysics Data System (ADS)
Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.
2017-12-01
The aim of this paper is to analyze experimental data on the dynamic response of a marble specimen in uniaxial compression. For this purpose, we use methods of mathematical statistics. The evolution of the lateral surface velocity obtained by a laser Doppler vibrometer provides the data for analysis. The registered data were regarded as a time series that reflects the deformation evolution of the specimen loaded up to failure. The revealed changes in statistical parameters were considered precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects states of dynamic chaos and self-organized criticality.
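One common way to look for such precursors is to track the lag-1 autocorrelation of the signal in sliding windows, rising autocorrelation being a classic early-warning indicator. Below is a generic sketch on a synthetic velocity series; the window length and signal are invented, not the vibrometer data:

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a 1-D array."""
    x = x - x.mean()
    denom = (x**2).sum()
    return (x[:-1] * x[1:]).sum() / denom if denom else 0.0

rng = np.random.default_rng(5)
n = 4000
# Synthetic series: white noise that becomes increasingly correlated (AR(1))
# toward "failure" at the end of the record.
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    rho = 0.95 * t / n  # correlation grows as failure approaches
    x[t] = rho * x[t - 1] + rng.normal()

window = 500
for start in range(0, n - window + 1, window):
    r1 = lag1_autocorr(x[start:start + window])
    print(f"samples {start:4d}-{start + window - 1}: lag-1 r = {r1:.2f}")
```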
Operational modes, health, and status monitoring
NASA Astrophysics Data System (ADS)
Taljaard, Corrie
2016-08-01
System Engineers must fully understand the system, its support system and operational environment to optimise the design. Operations and Support Managers must also identify the correct metrics to measure the performance and to manage the operations and support organisation. Reliability Engineering and Support Analysis provide methods to design a Support System and to optimise the Availability of a complex system. Availability modelling and Failure Analysis during the design is intended to influence the design and to develop an optimum maintenance plan for a system. The remote site locations of the SKA Telescopes place emphasis on availability, failure identification and fault isolation. This paper discusses the use of Failure Analysis and a Support Database to design a Support and Maintenance plan for the SKA Telescopes. It also describes the use of modelling to develop an availability dashboard and performance metrics.
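Steady-state availability figures of the sort shown on such a dashboard follow directly from MTBF and MTTR. A minimal sketch, assuming series logic and invented MTBF/MTTR values for a few hypothetical subsystems:

```python
# Inherent availability A = MTBF / (MTBF + MTTR), in hours; all values are
# invented placeholders, not SKA figures.
subsystems = {
    "dish drive":   {"mtbf": 8000.0,  "mttr": 12.0},
    "receiver":     {"mtbf": 20000.0, "mttr": 6.0},
    "signal chain": {"mtbf": 5000.0,  "mttr": 4.0},
}

system_a = 1.0
for name, s in subsystems.items():
    a = s["mtbf"] / (s["mtbf"] + s["mttr"])
    system_a *= a  # series assumption: every subsystem is required
    print(f"{name}: A = {a:.5f}")
print(f"system availability (series): {system_a:.5f}")
```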
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, F; Cao, N; Young, L
2014-06-15
Purpose: Though FMEA (Failure Mode and Effects Analysis) is becoming more widely adopted for risk assessment in radiation therapy, to our knowledge it has never been validated against actual incident learning data. The objective of this study was to perform an FMEA analysis of an SBRT (Stereotactic Body Radiation Therapy) treatment planning process and validate this against data recorded within an incident learning system. Methods: FMEA on the SBRT treatment planning process was carried out by a multidisciplinary group including radiation oncologists, medical physicists, and dosimetrists. Potential failure modes were identified through a systematic review of the workflow process. Failure modes were rated for severity, occurrence, and detectability on a scale of 1 to 10 and RPN (Risk Priority Number) was computed. Failure modes were then compared with historical reports identified as relevant to SBRT planning within a departmental incident learning system that had been active for two years. Differences were identified. Results: FMEA identified 63 failure modes. RPN values for the top 25% of failure modes ranged from 60 to 336. Analysis of the incident learning database identified 33 reported near-miss events related to SBRT planning. FMEA failed to anticipate 13 of these events, among which 3 were registered with severity ratings of severe or critical in the incident learning system. Combining both methods yielded a total of 76 failure modes, and when scored for RPN the 13 events missed by FMEA ranked within the middle half of all failure modes. Conclusion: FMEA, though valuable, is subject to certain limitations, among them the limited ability to anticipate all potential errors for a given process. This FMEA exercise failed to identify a significant number of possible errors (17%). Integration of FMEA with retrospective incident data may be able to render an improved overview of risks within a process.
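For readers unfamiliar with the scoring, the RPN bookkeeping and the cross-check against incident reports reduce to something like the following sketch; the failure modes, scores, and incident names are hypothetical, not taken from the study.

```python
# Toy FMEA bookkeeping: RPN = severity x occurrence x detectability,
# each scored 1-10, then cross-checked against incident-learning reports.
fmea = {
    "wrong CT dataset selected":      (8, 3, 4),
    "coordinate transcription error": (9, 4, 5),
    "incorrect dose fractionation":   (10, 2, 3),
}
incidents = ["coordinate transcription error", "couch index mismatch"]

rpn = {mode: s * o * d for mode, (s, o, d) in fmea.items()}
for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"RPN {score:3d}  {mode}")

# Events the prospective FMEA did not anticipate:
missed = [ev for ev in incidents if ev not in fmea]
print("missed by FMEA:", missed)
```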
Differential reliability : probabilistic engineering applied to wood members in bending-tension
Stanley K. Suddarth; Frank E. Woeste; William L. Galligan
1978-01-01
Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
NASA Astrophysics Data System (ADS)
Zuo, Ye; Sun, Guangjun; Li, Hongjing
2018-01-01
Under the action of near-fault ground motions, curved bridges are prone to pounding, local damage of bridge components and even unseating. A multi-scale fine finite element model of a typical three-span curved bridge is established by considering the elastic-plastic behavior of piers and the pounding effect of adjacent girders. The nonlinear time-history method is used to study the seismic response of the curved bridge equipped with an unseating failure control system under near-fault ground motion. An in-depth analysis is carried out to evaluate the control effect of the proposed unseating failure control system. The research results indicate that under near-fault ground motion, the seismic response of the curved bridge is strong. The unseating failure control system performs effectively to reduce the pounding force of the adjacent girders and the probability of deck unseating.
Using pattern analysis methods to do fast detection of manufacturing pattern failures
NASA Astrophysics Data System (ADS)
Zhao, Evan; Wang, Jessie; Sun, Mason; Wang, Jeff; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
At advanced technology nodes, logic design has become extremely complex and is getting more challenging as pattern geometry sizes decrease. Small layout patterns are becoming very sensitive to process variations. Meanwhile, the pressure to ramp yield quickly is constant due to time-to-market competition. The company that achieves patterning maturity earlier than others will have a great advantage and a better chance to realize maximum profit margins. For debugging silicon failures, DFT diagnostics can identify which nets or cells caused the yield loss, but identifying which failures are due to one common layout pattern or structure normally takes a long time and many resources. This paper presents a new yield diagnostic flow, based on preliminary EFA results, to show how pattern analysis can more efficiently detect pattern-related systematic defects. Increased visibility into design pattern related failures also allows more precise yield loss estimation.
Full-field local displacement analysis of two-sided paperboard
J.M. Considine; D.W. Vahey
2007-01-01
This report describes a method to examine full-field displacements of both sides of paperboard during tensile testing. Analysis showed out-of-plane shear behavior near the failure zones. The method was reliably used to examine out-of-plane shear in double notch shear specimens. Differences in shear behavior of machine direction and cross-machine direction specimens...
Managing heart failure in the long-term care setting: nurses' experiences in Ontario, Canada.
Strachan, Patricia H; Kaasalainen, Sharon; Horton, Amy; Jarman, Hellen; D'Elia, Teresa; Van Der Horst, Mary-Lou; Newhouse, Ian; Kelley, Mary Lou; McAiney, Carrie; McKelvie, Robert; Heckman, George A
2014-01-01
Implementation of heart failure guidelines in long-term care (LTC) settings is challenging. Understanding the conditions of nursing practice can improve management, reduce suffering, and prevent hospital admission of LTC residents living with heart failure. The aim of the study was to understand the experiences of LTC nurses managing care for residents with heart failure. This was a descriptive qualitative study nested in Phase 2 of a three-phase mixed methods project designed to investigate barriers and solutions to implementing the Canadian Cardiovascular Society heart failure guidelines into LTC homes. Five focus groups totaling 33 nurses working in LTC settings in Ontario, Canada, were audiorecorded, then transcribed verbatim, and entered into NVivo9. A complex adaptive systems framework informed this analysis. Thematic content analysis was conducted by the research team. Triangulation, rigorous discussion, and a search for negative cases were conducted. Data were collected between May and July 2010. Nurses characterized their experiences managing heart failure in relation to many influences on their capacity for decision-making in LTC settings: (a) a reactive versus proactive approach to chronic illness; (b) ability to interpret heart failure signs, symptoms, and acuity; (c) compromised information flow; (d) access to resources; and (e) moral distress. Heart failure guideline implementation reflects multiple dynamic influences. Leadership that addresses these factors is required to optimize the conditions of heart failure care and related nursing practice.
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lengths of life in many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate, and it is a special case of the Weibull family of distributions. In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. The model describes the likelihood function, followed by the description of the posterior function and the estimation of the point, interval, hazard function, and reliability. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
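For independent exponential risks, the net and crude probabilities mentioned above have simple closed forms: with total rate λ = Σλⱼ, the net probability of failing from risk j alone by time t is 1 − e^(−λⱼt), while the crude probability of risk j striking first is (λⱼ/λ)(1 − e^(−λt)). Below is a minimal sketch of the frequentist versions of these quantities, with hypothetical rates; the paper's Bayesian treatment with a non-informative prior is not reproduced.

```python
import numpy as np

lam = np.array([0.002, 0.005, 0.001])   # hypothetical failure rates per hour
lam_tot = lam.sum()
t = 200.0                               # mission time, hours

reliability = np.exp(-lam_tot * t)                    # survives all risks
net = 1 - np.exp(-lam * t)                            # only risk j present
crude = (lam / lam_tot) * (1 - np.exp(-lam_tot * t))  # risk j strikes first

print(f"R({t:.0f} h) = {reliability:.4f}")
for j, (n_j, c_j) in enumerate(zip(net, crude)):
    print(f"risk {j}: net P = {n_j:.4f}, crude P = {c_j:.4f}")
```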
Intelligent data analysis: the best approach for chronic heart failure (CHF) follow up management.
Mohammadzadeh, Niloofar; Safdari, Reza; Baraani, Alireza; Mohammadzadeh, Farshid
2014-08-01
Intelligent data analysis can represent complex relations between symptoms and diseases and between treatments and their consequences, and it has a significant role in improving the follow-up management of chronic heart failure (CHF) patients: increasing speed and accuracy in diagnosis and treatment, reducing costs, and supporting the design and implementation of clinical guidelines. The aim of this article is to describe intelligent data analysis methods to improve patient monitoring in the follow-up and treatment of chronic heart failure patients as the best approach for CHF follow-up management. Minimum data set (MDS) requirements for monitoring and follow-up of CHF patients were designed as a checklist with six main parts. All CHF patients discharged in 2013 from the Tehran Heart Center were selected. The MDS for monitoring CHF patient status was collected over 5 months at three different follow-up times. The gathered data were imported into RAPIDMINER 5 software. Modeling was based on decision tree methods such as C4.5, CHAID, and ID3, and the k-Nearest Neighbors algorithm (K-NN) with k=1. The final analysis was based on a voting method. The decision trees and K-NN were evaluated by cross-validation. Creating and using standard terminologies, and databases consistent with those terminologies, helps meet the challenges related to collecting data from various places and applying the data in intelligent data analysis. It should be noted that intelligent analysis of health data and intelligent systems can never replace cardiologists; they can only act as a helpful tool for the cardiologist's decision making.
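A minimal sketch of the voting ensemble described above, using scikit-learn on synthetic data: scikit-learn has no separate C4.5/CHAID/ID3 classes, so differently configured CART trees stand in for them here, and the generated data set is a placeholder for the MDS checklist data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Placeholder for MDS follow-up records labeled by patient status.
X, y = make_classification(n_samples=300, n_features=12, random_state=0)

ensemble = VotingClassifier([
    ("tree_gini",    DecisionTreeClassifier(criterion="gini", random_state=0)),
    ("tree_entropy", DecisionTreeClassifier(criterion="entropy", random_state=0)),
    ("knn1",         KNeighborsClassifier(n_neighbors=1)),   # k = 1, as above
], voting="hard")                                            # majority voting

scores = cross_val_score(ensemble, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```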
An experimental and analytical investigation on the response of GR/EP composite I-frames
NASA Technical Reports Server (NTRS)
Moas, E., Jr.; Boitnott, R. L.; Griffin, O. H., Jr.
1991-01-01
Six-foot diameter, semicircular graphite/epoxy specimens representative of generic aircraft frames were loaded quasi-statically to determine their load response and failure mechanisms for large deflections that occur in an airplane crash. These frame-skin specimens consisted of a cylindrical skin section cocured with a semicircular I-frame. Various frame laminate stacking sequences and geometries were evaluated by statically loading the specimen until multiple failures occurred. Two analytical methods were compared for modeling the frame-skin specimens: a two-dimensional branched-shell finite element analysis and a one-dimensional, closed-form, curved beam solution derived using an energy method. Excellent correlation was obtained between experimental results and the finite element predictions of the linear response of the frames prior to the initial failure. The beam solution was used for rapid parameter and design studies, and was found to be stiff in comparison with the finite element analysis. The specimens were found to be useful for evaluating composite frame designs.
Estimation of submarine mass failure probability from a sequence of deposits with age dates
Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.
2013-01-01
The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
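Stripped of age-dating uncertainty and open intervals (which the paper's likelihood methods do handle), the model comparison amounts to fitting candidate renewal distributions to inter-event times and comparing information criteria. A sketch with hypothetical intervals:

```python
import numpy as np
from scipy import stats

# Hypothetical inter-event times (kyr) between dated mass-transport deposits.
intervals = np.array([12.0, 30.0, 7.0, 22.0, 55.0, 18.0, 9.0, 41.0])

# Exponential (Poisson-process) fit: MLE of the mean return time.
mean_rt = intervals.mean()
ll_exp = stats.expon(scale=mean_rt).logpdf(intervals).sum()
aic_exp = 2 * 1 - 2 * ll_exp            # one free parameter

# A quasiperiodic alternative: lognormal renewal model.
shape, loc, scale = stats.lognorm.fit(intervals, floc=0)
ll_ln = stats.lognorm(shape, loc, scale).logpdf(intervals).sum()
aic_ln = 2 * 2 - 2 * ll_ln              # two free parameters

print(f"exponential: mean return {mean_rt:.1f} kyr, AIC {aic_exp:.1f}")
print(f"lognormal:   AIC {aic_ln:.1f}  (lower AIC preferred)")
```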
NASA Astrophysics Data System (ADS)
Li, Jianfeng; Xiao, Mingqing; Liang, Yajun; Tang, Xilang; Li, Chao
2018-01-01
The solenoid valve is a basic automation component applied widely. Analyzing and predicting its degradation and failure mechanisms is important for improving solenoid valve reliability and prolonging service life. In this paper, a three-dimensional finite element analysis model of a solenoid valve is established based on ANSYS Workbench software. A sequential coupling method for calculating the temperature field and mechanical stress field of the solenoid valve is put forward. The simulation results show that the sequential coupling method can calculate and analyze the temperature and stress distribution of the solenoid valve accurately, which has been verified through an accelerated life test. A Kalman filtering algorithm is introduced into the data processing, which can effectively reduce measurement deviation and recover more accurate data. Based on different driving currents, a failure mechanism that can easily cause the degradation of coils is identified, and an optimization design scheme for the electro-insulating rubbers is also proposed. The high temperature generated by the driving current and the thermal stress resulting from thermal expansion can easily cause the degradation of coil wires, which will reduce the electrical resistance of the coils and result in the eventual failure of the solenoid valve. The finite element analysis method can be applied to fault diagnosis and prognostics of various solenoid valves and improve the reliability of solenoid valve health management.
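The role of the Kalman filter in this workflow, smoothing noisy measurements before degradation analysis, can be illustrated with a minimal scalar random-walk filter; the temperature drift and noise levels below are hypothetical.

```python
import numpy as np

def kalman_1d(z, q, r, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter: state x_k = x_{k-1} + w,
    measurement z_k = x_k + v, with process/measurement variances q, r."""
    x, p = x0, p0
    out = np.empty(len(z))
    for k, zk in enumerate(z):
        p = p + q                  # predict
        gain = p / (p + r)         # update
        x = x + gain * (zk - x)
        p = (1 - gain) * p
        out[k] = x
    return out

rng = np.random.default_rng(1)
truth = np.linspace(20.0, 140.0, 400)            # coil temperature drift, deg C
meas = truth + rng.normal(0, 3.0, truth.size)    # noisy sensor readings
est = kalman_1d(meas, q=0.5, r=9.0)
print(f"raw RMS error {np.sqrt(np.mean((meas - truth)**2)):.2f} deg C, "
      f"filtered {np.sqrt(np.mean((est - truth)**2)):.2f} deg C")
```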
J series thruster isolator failure analysis
NASA Technical Reports Server (NTRS)
Campbell, J. W.; Bechtel, R. T.; Brophy, J. R.
1982-01-01
Three Hg propellant isolators (two cathode and one main) failed during testing in the Mission Profile Life Test. These failures involved contamination of the surface of the alumina insulating body which resulted in heating of the vaporizer by leakage current from the high voltage supply, with subsequent loss of propellant flow rate control. Failure analysis of the isolators showed the surface resistance was temperature dependent and that the alumina could be restored to its original insulating state by grit blasting the surface. The contaminant was identified as carbon and the most likely sources identified as ambient facility hydrocarbons, directed back-sputtered facility materials, and outgassing from organic insulating materials within the thruster envelope. Methods to eliminate contamination from each of these sources are described.
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factors in attaining a high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
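A heavily simplified version of such a failure-process simulation, with hypothetical thruster counts and failure rates and none of the paper's retargeting logic, shows why component failure rates dominate mission reliability:

```python
import numpy as np

rng = np.random.default_rng(42)

def mission_success(n_thrusters=8, n_required=6, rate=2e-5,
                    burn_hours=10_000, n_trials=100_000):
    """Crude Monte Carlo: the mission succeeds if at least n_required of
    n_thrusters survive the total burn time (constant failure rate)."""
    p_survive = np.exp(-rate * burn_hours)
    alive = rng.random((n_trials, n_thrusters)) < p_survive
    return np.mean(alive.sum(axis=1) >= n_required)

for rate in (1e-5, 2e-5, 5e-5):
    print(f"failure rate {rate:.0e}/hr -> P(mission success) = "
          f"{mission_success(rate=rate):.3f}")
```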
Availability Analysis of Dual Mode Systems
DOT National Transportation Integrated Search
1974-04-01
The analytical procedures presented define a method of evaluating the effects of failures in a complex dual-mode system based on a worst case steady-state analysis. The computed result is an availability figure of merit and not an absolute prediction...
Basic failure mechanisms in advanced composites
NASA Technical Reports Server (NTRS)
Mullin, J. V.; Mazzio, V. F.; Mehan, R. L.
1972-01-01
Failure mechanisms in carbon-epoxy composites are identified as a basis for more reliable prediction of the performance of these materials. The approach involves both the study of local fracture events in model specimens containing small groups of filaments and fractographic examination of high fiber content engineering composites. Emphasis is placed on the correlation of model specimen observations with gross fracture modes. The effects of fiber surface treatment, resin modification and fiber content are studied and acoustic emission methods are applied. Some effort is devoted to analysis of the failure process in composite/metal specimens.
Bonin, Christiani Decker Batista; dos Santos, Rafaella Zulianello; Ghisi, Gabriela Lima de Melo; Vieira, Ariany Marques; Amboni, Ricardo; Benetti, Magnus
2014-01-01
Background The lack of tools to measure heart failure patients' knowledge about their syndrome when participating in rehabilitation programs demonstrates the need for specific recommendations regarding the amount or content of information required. Objectives To develop and validate a questionnaire to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. Methods The tool was developed based on the Coronary Artery Disease Education Questionnaire and applied to 96 patients with heart failure, with a mean age of 60.22 ± 11.6 years, 64% being men. Reproducibility was obtained via the intraclass correlation coefficient, using the test-retest method. Internal consistency was assessed by use of Cronbach's alpha, and construct validity, by use of exploratory factor analysis. Results The final version of the tool had 19 questions arranged in ten areas of importance for patient education. The proposed questionnaire had a clarity index of 8.94 ± 0.83. The intraclass correlation coefficient was 0.856, and Cronbach's alpha, 0.749. Factor analysis revealed five factors associated with the knowledge areas. Comparing the final scores with the characteristics of the population showed that low educational level and low income are significantly associated with low levels of knowledge. Conclusion The instrument has satisfactory clarity and validity indices, and can be used to assess heart failure patients' knowledge about their syndrome when participating in cardiac rehabilitation programs. PMID:24652054
Failure Mode and Effect Analysis for Delivery of Lung Stereotactic Body Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perks, Julian R., E-mail: julian.perks@ucdmc.ucdavis.edu; Stanic, Sinisa; Stern, Robin L.
2012-07-15
Purpose: To improve the quality and safety of our practice of stereotactic body radiation therapy (SBRT), we analyzed the process following the failure mode and effects analysis (FMEA) method. Methods: The FMEA was performed by a multidisciplinary team. For each step in the SBRT delivery process, a potential failure occurrence was derived and three factors were assessed: the probability of each occurrence, the severity if the event occurs, and the probability of detection by the treatment team. A rank of 1 to 10 was assigned to each factor, and then the multiplied ranks yielded the relative risks (risk priority numbers). The failure modes with the highest risk priority numbers were then considered to implement process improvement measures. Results: A total of 28 occurrences were derived, of which nine events scored with significantly high risk priority numbers. The risk priority numbers of the highest ranked events ranged from 20 to 80. These included transcription errors of the stereotactic coordinates and machine failures. Conclusion: Several areas of our SBRT delivery were reconsidered in terms of process improvement, and safety measures, including treatment checklists and a surgical time-out, were added for our practice of gantry-based image-guided SBRT. This study serves as a guide for other users of SBRT to perform FMEA of their own practice.
EHR Improvement Using Incident Reports.
Teame, Tesfay; Stålhane, Tor; Nytrø, Øystein
2017-01-01
This paper discusses reactive improvement of clinical software using methods for incident analysis. We used the "Five Whys" method because we had only descriptive data and depended on a domain expert for the analysis. The analysis showed that there are two major root causes for EHR software failure, and that they are related to human and organizational errors. A main identified improvement is allocating more resources to system maintenance and user training.
Patel, Teresa; Fisher, Stanley P.
2016-01-01
Objective This study aimed to utilize failure modes and effects analysis (FMEA) to transform clinical insights into a risk mitigation plan for intrathecal (IT) drug delivery in pain management. Methods The FMEA methodology, which has been used for quality improvement, was adapted to assess risks (i.e., failure modes) associated with IT therapy. Ten experienced pain physicians scored 37 failure modes in the following categories: patient selection for therapy initiation (efficacy and safety concerns), patient safety during IT therapy, and product selection for IT therapy. Participants assigned severity, probability, and detection scores for each failure mode, from which a risk priority number (RPN) was calculated. Failure modes with the highest RPNs (i.e., most problematic) were discussed, and strategies were proposed to mitigate risks. Results Strategic discussions focused on 17 failure modes with the most severe outcomes, the highest probabilities of occurrence, and the most challenging detection. The topic of the highest‐ranked failure mode (RPN = 144) was manufactured monotherapy versus compounded combination products. Addressing failure modes associated with appropriate patient and product selection was predicted to be clinically important for the success of IT therapy. Conclusions The methodology of FMEA offers a systematic approach to prioritizing risks in a complex environment such as IT therapy. Unmet needs and information gaps are highlighted through the process. Risk mitigation and strategic planning to prevent and manage critical failure modes can contribute to therapeutic success. PMID:27477689
ERIC Educational Resources Information Center
Markle, Gail
2017-01-01
Undergraduate social science research methods courses tend to have higher than average rates of failure and withdrawal. Lack of success in these courses impedes students' progression through their degree programs and negatively impacts institutional retention and graduation rates. Grounded in adult learning theory, this mixed methods study…
Reliability Assessment for Low-cost Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Freeman, Paul Michael
Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.
A streamlined failure mode and effects analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie
Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number (RPN) was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
Reliability-based management of buried pipelines considering external corrosion defects
NASA Astrophysics Data System (ADS)
Miran, Seyedeh Azadeh
Corrosion is one of the main deteriorating mechanisms that degrade energy pipeline integrity, because pipelines transfer corrosive fluids or gas and interact with a corrosive environment. Corrosion defects are usually detected by periodic inspections using in-line inspection (ILI) methods. In order to ensure pipeline safety, this study develops a cost-effective maintenance strategy that consists of three aspects: corrosion growth model development using ILI data, time-dependent performance evaluation, and optimal inspection interval determination. In particular, the proposed study is applied to a cathodically protected buried steel pipeline located in Mexico. First, a time-dependent power-law formulation is adopted to probabilistically characterize the growth of the maximum depth and length of external corrosion defects. Dependency between defect depth and length is considered in the model development, and the generation of corrosion defects over time is characterized by a homogeneous Poisson process. The growth models' unknown parameters are evaluated from the ILI data through Bayesian updating with the Markov Chain Monte Carlo (MCMC) simulation technique. The proposed corrosion growth models can be used when either matched or non-matched defects are available, and have the ability to consider defects newly generated since the last inspection. Results of this part of the study show that both the depth and length growth models predict damage quantities reasonably well, and a strong correlation between defect depth and length is found. Next, time-dependent system failure probabilities are evaluated using the developed corrosion growth models considering the prevailing uncertainties, where three failure modes, namely small leak, large leak, and rupture, are considered. Performance of the pipeline is evaluated through the failure probability per km (each kilometer is treated as a sub-system), where each sub-system is considered as a series system of the detected and newly generated defects within that sub-system. A sensitivity analysis is also performed to determine the growth-model parameters to which the reliability of the studied pipeline is most sensitive. The reliability analysis results suggest that newly generated defects should be considered in calculating the failure probability, especially for prediction of the long-term performance of the pipeline, and that the impact of statistical uncertainty in the model parameters is significant and should be considered in the reliability analysis. Finally, with the evaluated time-dependent failure probabilities, a life-cycle cost analysis is conducted to determine the optimal inspection interval of the studied pipeline. The expected total life-cycle cost consists of the construction cost and the expected costs of inspections, repair, and failure. A repair is conducted when the failure probability for any described failure mode exceeds a pre-defined probability threshold after an inspection. Moreover, this study also investigates the impact of repair threshold values and unit costs of inspection and failure on the expected total life-cycle cost and optimal inspection interval through a parametric study. The analysis suggests that a smaller inspection interval leads to higher inspection costs but can lower the failure cost, and that the repair cost is less significant compared to the inspection and failure costs.
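The trade-off driving the optimal inspection interval can be sketched with a toy cost model: shorter intervals inflate inspection and repair spend, longer intervals inflate expected failure cost. All unit costs and the failure-probability growth rule below are hypothetical placeholders for the study's reliability-based values.

```python
import numpy as np

def expected_life_cycle_cost(interval_yr, horizon_yr=50.0, c_insp=2e5,
                             c_repair=5e5, c_fail=5e7, base_fail_rate=1e-4):
    """Toy life-cycle cost: more frequent inspection raises inspection and
    repair spend but suppresses the annual failure probability."""
    n_insp = horizon_yr / interval_yr
    p_fail_annual = base_fail_rate * interval_yr   # crude growth with interval
    cost_insp = n_insp * c_insp
    cost_repair = n_insp * 0.3 * c_repair          # assume 30% of inspections trigger repair
    cost_fail = horizon_yr * p_fail_annual * c_fail
    return cost_insp + cost_repair + cost_fail

intervals = np.arange(1.0, 15.0, 0.5)
costs = [expected_life_cycle_cost(i) for i in intervals]
best = intervals[int(np.argmin(costs))]
print(f"optimal inspection interval ~{best:.1f} years")
```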
NASA Astrophysics Data System (ADS)
Rahman, P. A.
2018-05-01
This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method offered by the author for calculating the stationary availability factor of two-level backbone computer networks is discussed; it is based on Markov reliability models for a set of independent repairable elements with given failure and repair rates, combined with methods of discrete mathematics. A specialized algorithm offered by the author for analyzing network connectivity, taking into account different kinds of network equipment failures, is also described. Finally, this paper presents an example of calculating the stationary availability factor for a backbone computer network with a given topology.
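A minimal sketch of the two ingredients, per-element stationary availability from the Markov model (A = μ/(λ + μ)) and a connectivity check over surviving links, using a hypothetical four-node backbone and a Monte Carlo stand-in for the author's discrete-mathematics algorithm:

```python
import numpy as np

def stationary_availability(lam, mu):
    """Markov result for one repairable element with failure rate lam and
    repair rate mu: A = mu / (lam + mu) = MTBF / (MTBF + MTTR)."""
    return mu / (lam + mu)

# Hypothetical backbone: nodes 0-3, links tagged (failure, repair) rates per hour.
links = {(0, 1): (1e-4, 0.1), (1, 2): (2e-4, 0.1),
         (0, 2): (1e-4, 0.05), (2, 3): (3e-4, 0.1)}
avail = {e: stationary_availability(*rates) for e, rates in links.items()}

def connected(up_edges, a=0, b=3, n_nodes=4):
    """Graph search over the links that are currently up."""
    adj = {i: [] for i in range(n_nodes)}
    for u, v in up_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {a}, [a]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return b in seen

rng = np.random.default_rng(7)
trials = 50_000
hits = sum(connected([e for e in links if rng.random() < avail[e]])
           for _ in range(trials))
print(f"estimated stationary availability of the 0-3 connection: {hits/trials:.5f}")
```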
NASA Technical Reports Server (NTRS)
Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary d.; Goldberg, Robert K.
2008-01-01
In previous work, the ballistic impact resistance of triaxial braided carbon/epoxy composites made with large flat tows (12k and 24k) was examined by impacting 2 x 2 x 0.125 in. composite panels with gelatin projectiles. Several high strength, intermediate modulus carbon fibers were used in combination with both untoughened and toughened matrix materials. A wide range of penetration thresholds were measured for the various fiber/matrix combinations. However, there was no clear relationship between the penetration threshold and the properties of the constituents. During some of these experiments high speed cameras were used to view the failure process, and full-field strain measurements were made to determine the strain at the onset of failure. However, these experiments provided only limited insight into the microscopic failure processes responsible for the wide range of impact resistance observed. In order to investigate potential microscopic failure processes in more detail, quasi-static tests were performed in tension, compression, and shear. Full-field strain measurement techniques were used to identify local regions of high strain resulting from microscopic failures. Microscopic failure events near the specimen surface, such as splitting of fiber bundles in surface plies, were easily identified. Subsurface damage, such as fiber fracture or fiber bundle splitting, could be identified by its effect on in-plane surface strains. Subsurface delamination could be detected as an out-of-plane deflection at the surface. Using this data, failure criteria could be established at the fiber tow level for use in analysis. An analytical formulation was developed to allow the microscopic failure criteria to be used in place of macroscopic properties as input to simulations performed using the commercial explicit finite element code, LS-DYNA. The test methods developed to investigate microscopic failure will be presented along with methods for determining local failure criteria that can be used in analysis. Results of simulations performed using LS-DYNA will be presented to illustrate the capabilities and limitations for simulating failure during quasi-static deformation and during ballistic impact of large unit cell size triaxial braid composites.
Altstein, L.; Li, G.
2012-01-01
Summary This paper studies a semiparametric accelerated failure time mixture model for estimation of a biological treatment effect on a latent subgroup of interest with a time-to-event outcome in randomized clinical trials. Latency is induced because membership is observable in one arm of the trial and unidentified in the other. This method is useful in randomized clinical trials with all-or-none noncompliance when patients in the control arm have no access to active treatment and in, for example, oncology trials when a biopsy used to identify the latent subgroup is performed only on subjects randomized to active treatment. We derive a computational method to estimate model parameters by iterating between an expectation step and a weighted Buckley-James optimization step. The bootstrap method is used for variance estimation, and the performance of our method is corroborated in simulation. We illustrate our method through an analysis of a multicenter selective lymphadenectomy trial for melanoma. PMID:23383608
Wright, David A; Nam, Diane; Whyne, Cari M
2012-08-31
In attempting to develop non-invasive image-based measures for determining the biomechanical integrity of healing fractures, traditional μCT-based measurements have been limited. This study presents the development and evaluation of a tool for assessing fracture callus mechanical properties through determination of the geometric characteristics of the fracture callus, specifically along the surface of failure identified during destructive mechanical testing. Fractures were created in the tibias of ten male mice and subjected to μCT imaging and biomechanical torsion testing. Failure surface analysis, along with previously described image-based measures, was calculated using the μCT image data and correlated with mechanical strength and stiffness. Three-dimensional measures along the surface of failure, specifically the surface area and torsional rigidity of bone, were shown to be significantly correlated with mechanical strength and stiffness. It was also shown that the surface area of bone along the failure surface exhibits stronger correlations with both strength and stiffness than measures of the average and minimum torsional rigidity of the entire callus. Failure surfaces observed in this study were generally oriented at 45° to the long axis of the bone and were not contained exclusively within the callus. This work represents a proof-of-concept study and shows the potential utility of failure surface analysis in the assessment of fracture callus stability.
Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia
2015-04-26
The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
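One way to attach a classical confidence interval to a RISMC-style failure probability is the exact (Clopper-Pearson) binomial interval; whether this matches either of the paper's two proposed methods is not stated in the abstract, so treat the sketch below, with hypothetical load and capacity distributions, as illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def rismc_failure_probability(n_sims, alpha=0.05):
    """Sample load and capacity, count load >= capacity, and attach a
    Clopper-Pearson (exact binomial) confidence interval to the estimate."""
    load = rng.normal(500.0, 60.0, n_sims)       # e.g., peak temperature demand
    capacity = rng.normal(700.0, 40.0, n_sims)   # e.g., failure threshold
    k = int(np.sum(load >= capacity))
    lo = stats.beta.ppf(alpha / 2, k, n_sims - k + 1) if k > 0 else 0.0
    hi = stats.beta.ppf(1 - alpha / 2, k + 1, n_sims - k)
    return k / n_sims, lo, hi

for n in (1_000, 10_000, 100_000):
    p, lo, hi = rismc_failure_probability(n)
    print(f"n = {n:>6}: p_f = {p:.4f}, 95% CI [{lo:.4f}, {hi:.4f}]")
```

Running the three sample sizes makes the paper's point directly: the point estimate barely moves, while the interval tightens as simulations are added.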
Common cause evaluations in applied risk analysis of nuclear power plants. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taniguchi, T.; Ligon, D.; Stamatelatos, M.
1983-04-01
Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
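For reference, the original Beta Factor model that the new method extends can be stated in a few lines: a hypothetical per-demand failure probability is split into an independent part and a common-cause part, and a one-out-of-three redundant system fails only if all three trains fail.

```python
def one_out_of_three_unavailability(q_total, beta):
    """Beta-factor model: a component failure is common-cause with
    probability beta, independent otherwise. The 1-out-of-3 system fails
    if all three trains fail independently, or by a single common cause."""
    q_ind = (1 - beta) * q_total
    return q_ind**3 + beta * q_total

q = 1e-2   # hypothetical per-demand failure probability of one train
for beta in (0.0, 0.05, 0.10, 0.20):
    print(f"beta = {beta:.2f} -> system unavailability = "
          f"{one_out_of_three_unavailability(q, beta):.2e}")
```

Even a modest beta swamps the redundancy benefit, which is why CCF treatment dominates results for highly redundant systems and why refinements of the simple beta factor were sought.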
Wagner, Mathilde; Corcuera-Solano, Idoia; Lo, Grace; Esses, Steven; Liao, Joseph; Besa, Cecilia; Chen, Nelson; Abraham, Ginu; Fung, Maggie; Babb, James S; Ehman, Richard L; Taouli, Bachir
2017-08-01
Purpose To assess the determinants of technical failure of magnetic resonance (MR) elastography of the liver in a large single-center study. Materials and Methods This retrospective study was approved by the institutional review board. Seven hundred eighty-one MR elastography examinations performed in 691 consecutive patients (mean age, 58 years; male patients, 434 [62.8%]) in a single center between June 2013 and August 2014 were retrospectively evaluated. MR elastography was performed at 3.0 T (n = 443) or 1.5 T (n = 338) by using a gradient-recalled-echo pulse sequence. MR elastography and anatomic image analysis were performed by two observers. Additional observers measured liver T2* and fat fraction. Technical failure was defined as no pixel value with a confidence index higher than 95% and/or no apparent shear waves imaged. Logistic regression analysis was performed to assess potential predictive factors of technical failure of MR elastography. Results The technical failure rate of MR elastography at 1.5 T was 3.5% (12 of 338), while it was higher, 15.3% (68 of 443), at 3.0 T. On the basis of univariate analysis, body mass index, liver iron deposition, massive ascites, use of 3.0 T, presence of cirrhosis, and alcoholic liver disease were all significantly associated with failure of MR elastography (P < .004); but on the basis of multivariable analysis, only body mass index, liver iron deposition, massive ascites, and use of 3.0 T were significantly associated with failure of MR elastography (P < .004). Conclusion The technical failure rate of MR elastography with a gradient-recalled-echo pulse sequence was low at 1.5 T but substantially higher at 3.0 T. Massive ascites, iron deposition, and high body mass index were additional independent factors associated with failure of MR elastography of the liver with a two-dimensional gradient-recalled-echo pulse sequence.
Shielding of substations against direct lightning strokes by shield wires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhuri, P.
1994-01-01
A new analysis for shielding outdoor substations against direct lightning strokes by shield wires is proposed. The basic assumption of this proposed method is that any lightning stroke which penetrates the shields will cause damage. The second assumption is that a certain level of risk of failure must be accepted, such as one or two failures per 100 years. The proposed method, using electrogeometric model, was applied to design shield wires for two outdoor substations: (1) 161-kV/69-kV station, and (2) 500-kV/161-kV station. The results of the proposed method were also compared with the shielding data of two other substations.
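Electrogeometric models hinge on a striking-distance relation between stroke current and the distance at which the stroke attaches. The paper's exact relation is not given in the abstract; the sketch below uses Love's commonly cited form r = 10·I^0.65 (r in meters, I in kA) purely as an illustration of how the accepted-risk design current maps to geometry.

```python
# Electrogeometric-model sketch (not the paper's full design procedure):
# strokes whose striking distance reaches exposed equipment before a
# shield wire can penetrate the shielding; weaker strokes have shorter
# striking distances and are the ones a design must accept some risk of.
def striking_distance_m(current_ka, k=10.0, exponent=0.65):
    """Love's relation r = k * I**exponent, r in meters, I in kA."""
    return k * current_ka**exponent

for i_ka in (2, 5, 10, 20):
    print(f"stroke current {i_ka:>2} kA -> striking distance "
          f"{striking_distance_m(i_ka):5.1f} m")
```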
Cantilever testing of sintered-silver interconnects
Wereszczak, Andrew A.; Chen, Branndon R.; Jadaan, Osama M.; ...
2017-10-19
Cantilever testing is an underutilized test method from which results and interpretations promote greater understanding of the tensile and shear failure responses of interconnects, metallizations, or bonded joints. The use and analysis of this method were pursued through the mechanical testing of sintered-silver interconnects that joined Ni/Au-plated copper pillars or Ti/Ni/Ag-plated silicon pillars to Ag-plated direct bonded copper substrates. Sintered-silver was chosen as the interconnect test medium because of its high electrical and thermal conductivities and high-temperature capability, attractive characteristics for a candidate interconnect in power electronic components and other devices. Deep beam theory was used to improve upon the estimations of the tensile and shear stresses calculated from classical beam theory. The failure stresses of the sintered-silver interconnects were observed to depend on the test conditions and the material system tested. In conclusion, the experimental simplicity of cantilever testing, and the ability to analytically calculate tensile and shear stresses at failure, make it an attractive mechanical test method for evaluating the failure response of interconnects.
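The classical-beam starting point that deep beam theory refines is compact enough to sketch. The pillar geometry and load below are hypothetical, not the study's specimens.

```python
def cantilever_root_stresses(force_n, length_m, width_m, height_m):
    """Classical (Euler-Bernoulli) estimates for a rectangular cantilever:
    bending stress sigma = 6*F*L/(b*h^2) at the root, and average
    transverse shear tau = F/(b*h). Deep-beam theory corrects these
    values for stubby pillars where shear deformation matters."""
    sigma = 6 * force_n * length_m / (width_m * height_m**2)
    tau = force_n / (width_m * height_m)
    return sigma, tau

# Hypothetical pillar-on-substrate joint: 3 mm tall pillar, 2 x 2 mm section.
sigma, tau = cantilever_root_stresses(force_n=40.0, length_m=3e-3,
                                      width_m=2e-3, height_m=2e-3)
print(f"bending stress ~{sigma/1e6:.1f} MPa, average shear ~{tau/1e6:.1f} MPa")
```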
Ventilatory support in critically ill hematology patients with respiratory failure
2012-01-01
Introduction Hematology patients admitted to the ICU frequently experience respiratory failure and require mechanical ventilation. Noninvasive mechanical ventilation (NIMV) may decrease the risk of intubation, but NIMV failure poses its own risks. Methods To establish the impact of ventilatory management and NIMV failure on outcome, data from a prospective, multicenter, observational study were analyzed. All hematology patients admitted to one of the 34 participating ICUs in a 17-month period were followed up. Data on demographics, diagnosis, severity, organ failure, and supportive therapies were recorded. A logistic regression analysis was done to evaluate the risk factors associated with death and NIMV failure. Results Of 450 patients, 300 required ventilatory support. A diagnosis of congestive heart failure and the initial use of NIMV significantly improved survival, whereas APACHE II score, allogeneic transplantation, and NIMV failure increased the risk of death. The risk factors associated with NIMV success were age, congestive heart failure, and bacteremia. Patients with NIMV failure experienced a more severe respiratory impairment than did those electively intubated. Conclusions NIMV improves the outcome of hematology patients with respiratory insufficiency, but NIMV failure may have the opposite effect. A careful selection of patients with rapidly reversible causes of respiratory failure may increase NIMV success. PMID:22827955
Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erin V.
2007-01-01
A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for a 2-bearing shaft assembly in each body flap actuator established a reliability level of 99.6 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
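A minimal two-parameter Weibull fit in the spirit of the median-rank (Weibull-Johnson) approach, using Benard's approximation and hypothetical complete (no-suspension) life data rather than the shuttle test data:

```python
import numpy as np

# Hypothetical bearing lives in "missions" (all units failed, no suspensions).
lives = np.sort(np.array([18.0, 25.0, 31.0, 38.0, 44.0, 57.0]))
n = len(lives)

# Benard's median-rank approximation, then a linear fit in Weibull space:
# ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
x, y = np.log(lives), np.log(-np.log(1 - ranks))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)

t = 12.0   # mission count of interest
reliability = np.exp(-(t / eta) ** beta)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f} missions")
print(f"R({t:.0f} missions) = {reliability:.4f}")
```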
Fuzzy Bayesian Network-Bow-Tie Analysis of Gas Leakage during Biomass Gasification
Yan, Fang; Xu, Kaili; Yao, Xiwen; Li, Yang
2016-01-01
Biomass gasification technology has developed rapidly in recent years, but fire and poisoning accidents caused by gas leakage restrict its development and promotion. Therefore, probabilistic safety assessment (PSA) is necessary for biomass gasification systems. Accordingly, Bayesian network-bow-tie (BN-bow-tie) analysis was proposed by mapping bow-tie analysis into a Bayesian network (BN). Causes of gas leakage and the accidents triggered by gas leakage can be obtained by bow-tie analysis, and the BN was used to identify the critical nodes of accidents by introducing three corresponding importance measures. Meanwhile, the occurrence probability of each failure is needed in PSA. In view of the insufficient failure data for biomass gasification, the occurrence probabilities of failures that cannot be obtained from standard reliability data sources were determined by fuzzy methods based on expert judgment. An improved approach that uses expert weighting to aggregate fuzzy numbers, including triangular and trapezoidal numbers, was proposed, and the occurrence probabilities of failure were obtained. Finally, safety measures were indicated based on the identified critical nodes. The theoretical occurrence probabilities in one year of gas leakage and the accidents caused by it were reduced to 1/10.3 of the original values by these safety measures. PMID:27463975
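The standard weighted aggregation of triangular fuzzy numbers with centroid defuzzification, which the paper's improved approach builds on (the paper's exact scheme differs in detail and also handles trapezoidal numbers), looks like the following; expert opinions and weights are hypothetical.

```python
import numpy as np

def weighted_triangular_aggregate(opinions, weights):
    """Aggregate expert opinions given as triangular fuzzy numbers (a, m, b)
    by a weighted average, then defuzzify with the centroid (a + m + b)/3."""
    opinions = np.asarray(opinions, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    agg = weights @ opinions          # weighted average of each vertex
    return agg, agg.sum() / 3.0

# Hypothetical expert estimates of a basic-event failure probability.
opinions = [(1e-4, 5e-4, 1e-3),       # senior engineer
            (2e-4, 6e-4, 2e-3),       # operator
            (5e-5, 3e-4, 8e-4)]       # designer
weights = [0.5, 0.3, 0.2]             # expert weighting, e.g. by experience

agg, p_crisp = weighted_triangular_aggregate(opinions, weights)
print("aggregated TFN (a, m, b):", [f"{v:.2e}" for v in agg])
print(f"crisp occurrence probability ~ {p_crisp:.2e}")
```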
Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings
NASA Technical Reports Server (NTRS)
Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erwin V.
2008-01-01
A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for the four actuators on one shuttle, each with a 2-bearing shaft assembly, established a reliability level of 96.9 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
NASA Technical Reports Server (NTRS)
Johnson, W. S. (Editor)
1989-01-01
The present conference discusses the tension and compression testing of MMCs, the measurement of advanced composites' thermal expansion, plasticity theory for fiber-reinforced composites, a deformation analysis of boron/aluminum specimens by moire interferometry, strength prediction methods for MMCs, and the analysis of notched MMCs under tensile loading. Also discussed are techniques for the mechanical and thermal testing of Ti3Al/SCS-6 MMCs, damage initiation and growth in fiber-reinforced MMCs, the shear testing of MMCs, the crack growth and fracture of continuous fiber-reinforced MMCs in view of analytical and experimental results, and MMC fiber-matrix interface failures.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
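The abstract does not specify the author's model, so as an illustration the sketch below fits a Goel-Okumoto-style NHPP, m(t) = a(1 − e^(−bt)), to a hypothetical testing log and reads off the expected number of fixes remaining, one common way to ground a stop-testing criterion.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """NHPP mean value function m(t) = a * (1 - exp(-b t)):
    a = expected total failures, b = per-failure detection rate."""
    return a * (1 - np.exp(-b * t))

# Hypothetical test log: cumulative failures observed week by week.
weeks = np.arange(1, 13, dtype=float)
cum_failures = np.array([4, 9, 13, 17, 20, 22, 24, 25, 26, 27, 27, 28],
                        dtype=float)

(a, b), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(30.0, 0.2))
remaining = a - goel_okumoto(weeks[-1], a, b)
print(f"estimated total failures a = {a:.1f}, detection rate b = {b:.3f}/week")
print(f"expected failures (fixes) remaining: {remaining:.1f}")
```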
Kahrass, Hannes; Strech, Daniel; Mertz, Marcel
2016-01-01
Background When treating patients with kidney failure, unavoidable ethical issues often arise. Current clinical practice guidelines address some of them, but lack comprehensive information about the full range of relevant ethical issues in kidney failure. A systematic literature review of such ethical issues supports medical professionalism in nephrology, and offers a solid evidential base for efforts that aim to improve ethical conduct in health care. Aim To identify the full spectrum of clinical ethical issues that can arise for patients with kidney failure in a systematic and transparent manner. Method A systematic review in Medline (publications in English or German between 2000 and 2014) and Google Books (with no restrictions) was conducted. Ethical issues were identified by qualitative text analysis and normative analysis. Results The literature review retrieved 106 references that together mentioned 27 ethical issues in clinical care of kidney failure. This set of ethical issues was structured into a matrix consisting of seven major categories and further first and second-order categories. Conclusions The systematically-derived matrix helps raise awareness and understanding of the complexity of ethical issues in kidney failure. It can be used to identify ethical issues that should be addressed in specific training programs for clinicians, clinical practice guidelines, or other types of policies dealing with kidney failure. PMID:26938863
Patil, Vijay M.; Noronha, Vanita; Joshi, Amit; Pinninti, Rakesh; Dhumal, Sachin; Bhattacharjee, Atanu; Prabhash, Kumar
2015-01-01
Purpose: Oral cancer patients with platinum-resistant disease and/or early failures have limited treatment options. This analysis was planned to study the efficacy of metronomic chemotherapy in this group of patients. Materials and Methods: This was a retrospective analysis of oral cancer patients who had squamous cell carcinoma and had an early failure and/or platinum-insensitive failure. Early failure was defined as a failure either within 1 month of adjuvant radiotherapy or within 6 months of chemoradiation (CTRT). A sample size of 100 patients was selected for this study. If ≥39 of 100 patients survived at 6 months with metronomic chemotherapy, then additional studies would be warranted. Results: The ECOG PS was 0-1 in 92 patients and 2 in 8 patients. The subsite of primary was buccal mucosa in 38 patients (38%), anterior two-thirds of the tongue (oral tongue) in 51 patients (51%), and alveolus in 11 patients (11%). The median estimated overall survival was 110 days (95% confidence interval [CI]: 85-134 days). The proportion of patients surviving at 6 months was 26.4% (95% CI: 17.9-35.6). Conclusion: The metronomic combination of methotrexate and celecoxib failed to meet its prespecified efficacy limit and should not be used routinely in these patients. PMID:26855524
Torque Limits for Fasteners in Composites
NASA Technical Reports Server (NTRS)
Zhao, Yi
2002-01-01
The two major classes of laminate joints are bonded and bolted; often the two are combined as bonded-bolted joints. Several characteristics of fiber-reinforced composite materials render them more susceptible to joint problems than conventional metals. These characteristics include weakness in in-plane shear, transverse tension/compression, interlaminar shear, and bearing strength relative to the strength and stiffness in the fiber direction. Studies on bolted joints of composite materials have focused on joint assemblies subjected to in-plane loads. Modes of failure under these loading conditions include net-tension failure, cleavage tension failure, shear-out failure, and bearing failure. Although studies of torque load can be found in the literature, they mainly discuss the effect of the torque load on in-plane strength. Existing methods for calculating the torque limit for a mechanical fastener do not consider the connecting members. The concern that a composite member could be crushed by a preload motivated this study. The purpose is to develop a fundamental knowledge base on how to determine a torque limit when a composite member is taken into account. Two simplified analytical models were used: a stress failure analysis model based on the maximum stress criterion, and a strain failure analysis model based on the maximum strain criterion.
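As a rough illustration of how a preload-limited torque might be estimated once the composite member is included, the sketch below checks the through-thickness crushing limit under the washer and converts the allowable preload to a torque with the short-form T = K F d relation. Every numerical value (nut factor K, washer dimensions, laminate strength, safety factor) is a hypothetical placeholder, not a value from this study.

```python
import numpy as np

# Hypothetical joint values; replace with measured properties of the actual joint.
K = 0.2                       # nut factor (dimensionless, typical dry-thread assumption)
d = 6.35e-3                   # fastener nominal diameter, m (1/4 in)
D_w, D_h = 12.7e-3, 6.8e-3    # washer outer diameter and hole diameter, m
S_c = 250e6                   # through-thickness compressive strength of laminate, Pa
FS = 2.0                      # factor of safety against crushing

A_bearing = np.pi / 4.0 * (D_w**2 - D_h**2)   # annular area under the washer
F_max = S_c * A_bearing / FS                  # preload limit set by the composite
T_max = K * F_max * d                         # short-form torque-preload relation

print(f"Bearing area under washer: {A_bearing * 1e6:.1f} mm^2")
print(f"Allowable preload: {F_max / 1e3:.1f} kN, torque limit: {T_max:.1f} N*m")
```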
Failure mode and effects analysis drastically reduced potential risks in clinical trial conduct
Baik, Jungmi; Kim, Hyunjung; Kim, Rachel
2017-01-01
Background Failure mode and effects analysis (FMEA) is a risk management tool used to proactively identify and assess the causes and effects of potential failures in a system, thereby preventing them from happening. The objective of this study was to evaluate the effectiveness of FMEA applied to an academic clinical trial center in a tertiary care setting. Methods A multidisciplinary FMEA focus group at the Seoul National University Hospital Clinical Trials Center selected 6 core clinical trial processes, for which potential failure modes were identified and their risk priority number (RPN) was assessed. Remedial action plans for high-risk failure modes (RPN >160) were devised, and a follow-up RPN scoring was conducted a year later. Results A total of 114 failure modes were identified, with RPN scores ranging from 3 to 378, driven mainly by the severity score. Fourteen failure modes were of high risk, 11 of which were addressed by remedial actions. Rescoring showed a dramatic improvement attributed to reductions in the occurrence and detection scores by >3 and >2 points, respectively. Conclusions FMEA is a powerful tool to improve quality in clinical trials. The Seoul National University Hospital Clinical Trials Center is expanding its FMEA capability to other core clinical trial processes. PMID:29089745
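The RPN bookkeeping described above can be sketched in a few lines; the failure modes and scores below are hypothetical placeholders, not the study's actual modes or values. Each mode's severity, occurrence, and lack-of-detectability scores are multiplied and compared against the RPN > 160 high-risk threshold used in the study.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int       # 1-10
    occurrence: int     # 1-10
    detectability: int  # 1-10, where 10 means hardest to detect

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

# Hypothetical failure modes, for illustration only.
modes = [
    FailureMode("Consent form version mismatch", 7, 4, 5),
    FailureMode("Wrong drug kit dispensed", 9, 3, 7),
    FailureMode("Missed adverse event reporting window", 8, 2, 4),
]

THRESHOLD = 160  # high-risk cutoff used in the study
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    flag = "HIGH RISK" if fm.rpn > THRESHOLD else ""
    print(f"{fm.name:40s} RPN={fm.rpn:4d} {flag}")
```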
Fractography, NDE, and fracture mechanics applications in failure analysis studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morin, C.R.; Shipley, R.J.; Wilkinson, J.A.
1994-10-01
While identification of the precise mode of a failure can lead logically to the underlying cause, a thorough failure investigation requires much more than just the identification of a specific metallurgical mechanism, for example, fatigue, creep, stress corrosion cracking, etc. Failures involving fracture provide good illustrations of this concept. An initial step in characterizing fracture surfaces is often the identification of an origin or origins. However, the analysis should not stop there. If the origin is associated with a discontinuity, the manner in which it was formed must also be addressed. The stresses that would have existed at the origin must be determined and compared with material properties to determine whether or not a crack should have initiated and propagated during normal operation. Many critical components are inspected throughout their lives by nondestructive methods. When a crack progresses to failure, its nondetection at earlier inspections must also be understood. Careful study of the fracture surface combined with crack growth analysis based on fracture mechanics can provide an estimate of the crack length at the times of previous inspections. An important issue often overlooked in such studies is how processing of parts during manufacture or rework affects the probability of detection of such cracks. The ultimate goal is to understand thoroughly the progression of the failure, to understand the root cause(s), and to design appropriate corrective action(s) to minimize recurrence.
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-01-01
Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
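The classification step can be sketched with scikit-learn, assuming it is available: a small committee of probability-output SVMs is trained on parameter vectors labeled by run success, and the committee-averaged failure probabilities are scored with ROC AUC on a held-out ensemble. The 18-parameter data below are synthetic stand-ins with a toy failure rule, not the POP2 ensemble.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Stand-in for the UQ ensemble: 500 runs, 18 scaled parameters each.
X = rng.uniform(0.0, 1.0, size=(500, 18))
y = ((X[:, 0] + X[:, 3] > 1.4) | (X[:, 7] < 0.08)).astype(int)  # toy failure rule

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

# Committee of SVM classifiers differing in regularization strength.
committee = [SVC(C=c, kernel="rbf", probability=True, random_state=1).fit(X_tr, y_tr)
             for c in (0.5, 1.0, 5.0)]

# Average the committee's predicted failure probabilities, then score them.
p_fail = np.mean([clf.predict_proba(X_val)[:, 1] for clf in committee], axis=0)
print(f"Validation ROC AUC: {roc_auc_score(y_val, p_fail):.3f}")
```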
EEMD-based multiscale ICA method for slewing bearing fault detection and diagnosis
NASA Astrophysics Data System (ADS)
Žvokelj, Matej; Zupan, Samo; Prebil, Ivan
2016-05-01
A novel multivariate and multiscale statistical process monitoring method is proposed with the aim of detecting incipient failures in large slewing bearings, where subjective influence plays a minor role. The proposed method integrates the strengths of the Independent Component Analysis (ICA) multivariate monitoring approach with the benefits of Ensemble Empirical Mode Decomposition (EEMD), which adaptively decomposes signals into different time scales and can thus cope with multiscale system dynamics. The method, which was named EEMD-based multiscale ICA (EEMD-MSICA), not only enables bearing fault detection but also offers a mechanism of multivariate signal denoising and, in combination with the Envelope Analysis (EA), a diagnostic tool. The multiscale nature of the proposed approach makes the method convenient to cope with data which emanate from bearings in complex real-world rotating machinery and frequently represent the cumulative effect of many underlying phenomena occupying different regions in the time-frequency plane. The efficiency of the proposed method was tested on simulated as well as real vibration and Acoustic Emission (AE) signals obtained through conducting an accelerated run-to-failure lifetime experiment on a purpose-built laboratory slewing bearing test stand. The ability to detect and locate the early-stage rolling-sliding contact fatigue failure of the bearing indicates that AE and vibration signals carry sufficient information on the bearing condition and that the developed EEMD-MSICA method is able to effectively extract it, thereby representing a reliable bearing fault detection and diagnosis strategy.
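A minimal sketch of the EEMD-then-ICA idea follows, assuming the PyEMD package (installed as EMD-signal) and scikit-learn are available; the actual EEMD-MSICA algorithm is considerably more involved. Each channel is decomposed into intrinsic mode functions (IMFs), then at each common scale ICA is run across channels, and the kurtosis of the independent components serves as a crude impulsiveness indicator for fault-related bursts.

```python
import numpy as np
from PyEMD import EEMD                      # pip install EMD-signal (assumed)
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / 2000.0)         # 2 s at 2 kHz

# Two synthetic sensor channels: shaft tone + sparse fault impacts + noise.
impacts = np.sin(2 * np.pi * 87 * t) * (np.sin(2 * np.pi * 4 * t) > 0.99)
ch1 = np.sin(2 * np.pi * 25 * t) + 0.3 * impacts + 0.2 * rng.standard_normal(t.size)
ch2 = 0.8 * np.sin(2 * np.pi * 25 * t) + 0.5 * impacts + 0.2 * rng.standard_normal(t.size)

# Step 1: EEMD splits each channel into IMFs (fine-to-coarse time scales).
eemd = EEMD(trials=50)
imfs1 = eemd.eemd(ch1, t)
imfs2 = eemd.eemd(ch2, t)

# Step 2: at each common scale, run ICA across channels to separate sources.
for k in range(min(len(imfs1), len(imfs2))):
    X = np.column_stack([imfs1[k], imfs2[k]])
    sources = FastICA(n_components=2, random_state=0).fit_transform(X)
    # Excess kurtosis flags impulsive (potentially fault-related) components.
    centered = sources - sources.mean(axis=0)
    kurt = (centered**4).mean(axis=0) / sources.var(axis=0)**2 - 3.0
    print(f"scale {k}: IC excess kurtosis = {np.round(kurt, 2)}")
```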
DATMAN: A reliability data analysis program using Bayesian updating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, M.; Feltus, M.A.
1996-12-31
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular of system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on PM techniques by introducing a set of guidelines by which to evaluate system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. System reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing the tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits the distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
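The core Bayesian update that such a code automates can be illustrated with the standard conjugate gamma-Poisson pair for a constant failure rate; the prior parameters and plant evidence below are hypothetical, and this is a generic sketch rather than DATMAN's actual procedure.

```python
import numpy as np
from scipy.stats import gamma

# Gamma prior on the component failure rate lambda (per hour): shape a0, rate b0.
a0, b0 = 2.0, 4000.0        # hypothetical prior, mean rate a0/b0 = 5e-4 per hour

# New plant evidence: k failures observed over T cumulative operating hours.
k, T = 3, 12000.0

# Conjugate gamma-Poisson update: posterior shape and rate are simple sums.
a1, b1 = a0 + k, b0 + T
post = gamma(a=a1, scale=1.0 / b1)

print(f"Posterior mean rate: {post.mean():.2e} per hour")
print(f"90% credible interval: {post.ppf(0.05):.2e} to {post.ppf(0.95):.2e} per hour")
print(f"Reliability over 1000 h at the posterior mean rate: "
      f"{np.exp(-post.mean() * 1000.0):.3f}")
```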
Dai, Dao-Fu; Hsieh, Edward J.; Liu, Yonggang; Chen, Tony; Beyer, Richard P.; Chin, Michael T.; MacCoss, Michael J.; Rabinovitch, Peter S.
2012-01-01
Aims We investigate the role of mitochondrial oxidative stress in mitochondrial proteome remodelling using mouse models of heart failure induced by pressure overload. Methods and results We demonstrate that mice overexpressing catalase targeted to mitochondria (mCAT) attenuate pressure overload-induced heart failure. An improved method of label-free unbiased analysis of the mitochondrial proteome was applied to the mouse model of heart failure induced by transverse aortic constriction (TAC). A total of 425 mitochondrial proteins were compared between wild-type and mCAT mice receiving TAC or sham surgery. The changes in the mitochondrial proteome in heart failure included decreased abundance of proteins involved in fatty acid metabolism, an increased abundance of proteins in glycolysis, apoptosis, mitochondrial unfolded protein response and proteolysis, transcription and translational control, and developmental processes as well as responses to stimuli. Overexpression of mCAT better preserved proteins involved in fatty acid metabolism and attenuated the increases in apoptotic and proteolytic enzymes. Interestingly, gene ontology analysis also showed that monosaccharide metabolic processes and protein folding/proteolysis were only overrepresented in mCAT but not in wild-type mice in response to TAC. Conclusion This is the first study to demonstrate that scavenging mitochondrial reactive oxygen species (ROS) by mCAT not only attenuates most of the mitochondrial proteome changes in heart failure, but also induces a subset of unique alterations. These changes represent processes that are adaptive to the increased work and metabolic requirements of pressure overload, but which are normally inhibited by overproduction of mitochondrial ROS. PMID:22012956
Analysis of micro-failure behaviors in artificial muscles based on fishing line and sewing thread
NASA Astrophysics Data System (ADS)
Xu, J. B.; Cheng, K. F.; Tu, S. L.; He, X. M.; Ma, C.; Jin, Y. Z.; Kang, X. N.; Sun, T.; Zhang, Y.
2017-06-01
The aim of the present study was to develop a new and effective method for testing artificial muscles based on the analysis of micro-failure behaviors. Thermo-mechanical actuators based on fishing line and sewing thread respond to ambient temperature variations by producing a large shrinkage ratio, that is, a variation in longitudinal length. The minimum measured micro-failure value was 0.02 μm and the maximum was 1.72 μm for the nylon twist pattern. The proposed testing approach characterizes micro-failure, rupture, slippage, and related behaviors in polymeric-fiber specimens. The specimens show large deformations when heated, together with warping performance in terms of shrinkage of energy and densities. To provide useful analysis data for further technology applications, micrometre-sized artificial muscles were also tested; the method is readily accessible and can be applied to other polymeric fibers. Effective use of this technique relies on rotation speed, temperature, and tensile direction. The tensile testing results highlight important issues related to the response of the micro-structure, twisted polymeric fibers, and shrinkage ratio.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Stacy; English, Shawn; Briggs, Timothy
Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as of methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper are to demonstrate the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures; the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior, as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process, as the described flexural characterization was used for model validation.
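A minimal sketch of the one-at-a-time flavor of such a parameter study follows, with a cheap stand-in function replacing the finite element flexure simulation; the parameter names and values are illustrative placeholders, not those of the paper.

```python
import numpy as np

# Stand-in for the four-point flexure simulation: peak load as a function of
# hypothetical material parameters. A real study would call the FE model here.
def peak_load(E11, E22, G12, S_t):
    return 0.8 * E11**0.5 + 0.1 * E22**0.3 + 0.05 * G12**0.3 + 2.0 * S_t**0.9

baseline = {"E11": 165e9, "E22": 9e9, "G12": 5e9, "S_t": 2.5e9}
f0 = peak_load(**baseline)

# One-at-a-time sensitivity: perturb each parameter by +/-10 percent and
# record the normalized change in the simulated outcome.
for name in baseline:
    hi = dict(baseline); hi[name] *= 1.10
    lo = dict(baseline); lo[name] *= 0.90
    sens = (peak_load(**hi) - peak_load(**lo)) / (0.2 * f0)
    print(f"{name:4s} normalized sensitivity: {sens:+.3f}")
```

Parameters with near-zero sensitivity can then be frozen at nominal values, which is the streamlining of the material definition process that the paper argues for.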
Stability analysis of chalk sea cliffs using UAV photogrammetry
NASA Astrophysics Data System (ADS)
Barlow, John; Gilham, Jamie
2017-04-01
Cliff erosion and instability pose a significant hazard to communities and infrastructure located in coastal areas. We use point cloud and spectral data derived from close-range digital photogrammetry to assess the stability of chalk sea cliffs located at Telscombe, UK. Data captured from an unmanned aerial vehicle (UAV) were used to generate dense point clouds for a 712 m section of cliff face which ranges from 20 to 49 m in height. Generated models fitted our ground control network within a standard error of 0.03 m. Structural features such as joints, bedding planes, and faults were manually mapped and are consistent with results from other studies that have been conducted using direct measurement in the field. Kinematic analysis of these data was used to identify the primary modes of failure at the site. Our results indicate that wedge failure is by far the most likely mode of slope instability. An analysis of sequential surveys taken from the summer of 2016 to the winter of 2017 indicates that several large failures have occurred at the site. We establish the volume of failure through change detection between sequential data sets and use back analysis to determine the strength of shear surfaces for each failure. Our results show that data capture through UAV photogrammetry can provide useful information for slope stability analysis over long sections of cliff. The use of this technology offers significant benefits in equipment costs and field time over existing methods.
Analysis of asteroid (216) Kleopatra using dynamical and structural constraints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirabayashi, Masatoshi; Scheeres, Daniel J., E-mail: masatoshi.hirabayashi@colorado.edu
This paper evaluates a dynamically and structurally stable size for Asteroid (216) Kleopatra. In particular, we investigate two different failure modes: material shedding from the surface and structural failure of the internal body. We construct zero-velocity curves in the vicinity of this asteroid to determine surface shedding, while we utilize a limit analysis to calculate the lower and upper bounds of structural failure under the zero-cohesion assumption. Surface shedding does not occur at the current spin period (5.385 hr) and cannot directly initiate the formation of the satellites. On the other hand, this body may be close to structural failure; in particular, the neck may be situated near a plastic state. In addition, the neck's sensitivity to structural failure changes as the body size varies. We conclude that plastic deformation has probably occurred around the neck part in the past. If the true size of this body is established through additional measurements, this method will provide strong constraints on the current friction angle for the body.
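The surface-shedding check has a simple point-mass caricature: a particle at radius r leaves the surface when centrifugal acceleration exceeds gravitational attraction, giving a critical spin rate of sqrt(GM/r^3). The mass and tip radius below are rough order-of-magnitude values assumed for illustration, and this point-mass estimate ignores the body's elongated shape, unlike the zero-velocity-curve analysis of the paper.

```python
import numpy as np

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 4.64e18        # approximate mass of (216) Kleopatra, kg (assumed value)
r_tip = 110e3      # rough half-length of the elongated body, m (assumed value)

# A surface mass point at radius r sheds when omega^2 r > G M / r^2,
# i.e. when the spin rate exceeds omega_crit = sqrt(G M / r^3).
omega_crit = np.sqrt(G * M / r_tip**3)
P_crit_h = 2.0 * np.pi / omega_crit / 3600.0
P_now_h = 5.385

print(f"Critical spin period at the tips: {P_crit_h:.2f} h")
verdict = "shedding possible" if P_now_h < P_crit_h else "no surface shedding"
print(f"Current period {P_now_h} h vs critical {P_crit_h:.2f} h -> {verdict}")
```

With these assumed values the critical period comes out near 3.6 h, longer spin periods being stable, which is consistent with the paper's conclusion that no shedding occurs at 5.385 h.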
Bourassa, Dominic; Gauthier, François; Abdul-Nour, Georges
2016-01-01
Accidental events in manufacturing industries can be caused by many factors, including work methods, lack of training, equipment design, maintenance and reliability. This study aimed to determine the contribution of failures of commonly used industrial equipment, such as machines, tools and material handling equipment, to the chain of causality of industrial accidents and incidents. Based on a case study analyzing an existing pulp and paper company's accident database, this paper examines the number, type and gravity of the failures involved in these events and their causes. Results from this study show that equipment failures had a major effect on the number and severity of accidents accounted for in the database: 272 out of 773 accidental events were related to equipment failure, 13 of which had direct human consequences. Failures that contributed directly or indirectly to these events are analyzed.
One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald Laurids Boring, PhD
2014-09-01
In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure that a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.
Lithographic chip identification: meeting the failure analysis challenge
NASA Astrophysics Data System (ADS)
Perkins, Lynn; Riddell, Kevin G.; Flack, Warren W.
1992-06-01
This paper describes a novel method using stepper photolithography to uniquely identify individual chips for permanent traceability. A commercially available 1X stepper is used to mark chips with an identifier or 'serial number' which can be encoded with relevant information for the integrated circuit manufacturer. The permanent identification of individual chips can improve current methods of quality control, failure analysis, and inventory control. The need for this technology is escalating as manufacturers seek to provide six-sigma quality control for their products and trace fabrication problems to their source. This need is especially acute for parts that fail after packaging and are returned to the manufacturer for analysis. Using this novel approach, failure analysis data can be tied back to a particular batch, wafer, or even a position within a wafer. Process control can be enhanced by identifying the root cause of chip failures. Chip identification also addresses manufacturers' concerns about the increasing incidence of chip theft. Since chips currently carry no identification other than the manufacturer's name and part number, recovery efforts are hampered by the inability to determine the sales history of a specific packaged chip. A definitive identifier or serial number for each chip would address this concern. The results of chip identification (patent pending) are easily viewed through a low-power microscope. Batch number, wafer number, exposure step, and chip location within the exposure step can be recorded, as can dates and other items of interest. The chip identification procedure and processing requirements are described. Experimental testing and results are presented, and potential applications are discussed.
Missau, Taiane; De Carlo Bello, Mariana; Michelon, Carina; Mastella Lang, Pauline; Kalil Pereira, Gabriel; Baldissara, Paolo; Valandro, Luiz Felipe; Souza Bier, Carlos Alexandre; Pivetta Rippe, Marília
2017-12-01
This study evaluated the effects of endodontic treatment and retreatment on the fatigue failure load, numbers of cycles for failure, and survival rates of canine teeth. Sixty extracted canine teeth, each with a single root canal, were selected and randomly divided into 4 groups (n = 15): untreated, teeth without endodontic intervention; prepared, teeth subjected only to rotary instrumentation; filled, teeth receiving complete endodontic treatment; and retreated, teeth retreated endodontically. After the different endodontic interventions, the specimens were subjected to fatigue testing by the stepwise method: 200 N (× 5000 load pulses), 300 N, 400 N, 500 N, 600 N, 800 N, and 900 N at a maximum of 30,000 load pulses each or the occurrence of fracture. Data from load to failure and numbers of cycles for fracture were recorded and subjected to Kaplan-Meier and Log Rank tests (P < .05), in addition to Weibull analysis. The fractures of the specimens were classified as repairable or catastrophic. The retreated, filled, and untreated groups presented statistically significantly higher fatigue failure loads and numbers of cycles for failure than did the prepared group. Weibull analysis showed no statistically significant difference among the treatments for characteristic load to failure and characteristic number of cycles for failure, although, for number of cycles, a higher Weibull modulus was observed in filled and retreated conditions. The predominant mode of failure was catastrophic. Teeth subjected to complete endodontic treatment and retreatment behaved similarly in terms of fatigue failure load and number of cycles to failure when compared with untreated teeth. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Microseismic Signature of Magma Failure: Testing Failure Forecast in Heterogeneous Material
NASA Astrophysics Data System (ADS)
Vasseur, J.; Lavallee, Y.; Hess, K.; Wassermann, J. M.; Dingwell, D. B.
2012-12-01
Volcanoes exhibit a range of seismic precursors prior to eruptions. These signals derive from different processes which, if quantified, may tell us when and how a volcano will erupt: effusively or explosively. This quantification can be performed in the laboratory. Here we investigated the signals associated with the deformation and failure of single-phase silicate liquids compared to multi-phase magmas containing pores and crystals as heterogeneities. For the past decades, magmas have been simplified as viscoelastic fluids with grossly predictable failure, following an analysis of the stress and strain rate conditions in volcanic conduits. Yet it is clear that the way magmas fail is not unique, and evidence increasingly illustrates the role of heterogeneities in the process of magmatic fragmentation. In such multi-phase magmas, failure cannot be predicted using current rheological laws. Microseismicity, as detected in the laboratory by analogous Acoustic Emission (AE), can be used to monitor fracture initiation and propagation, and thus provides invaluable information to characterise the process of brittle failure underlying explosive eruptions. Triaxial press experiments on different synthesised and natural glass samples were performed to investigate the acoustic signature of failure. We observed that the failure of single-phase liquids occurs without much strain and is preceded by the constant nucleation, propagation and coalescence of cracks, as demonstrated by the monitored AE. In contrast, the failure of multi-phase magmas depends on the applied stress and is strain dependent. The path dependence of magma failure is nonetheless accompanied by a supra-exponential acceleration in released AEs. Analysis of the released AEs using the material Failure Forecast Method (FFM) suggests that the predictability of failure is enhanced by the presence of heterogeneities in magmas. We discuss our observations in terms of volcanic scenarios.
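The Failure Forecast Method referenced above is commonly applied in its inverse-rate form: with the usual rate exponent alpha = 2, the inverse of the accelerating AE rate decays linearly in time and extrapolates to zero at the failure time. A minimal sketch with made-up AE rates, not data from these experiments:

```python
import numpy as np

# Hypothetical AE hit rates (hits/s) recorded while loading a sample to failure.
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])  # seconds
rate = np.array([2.0, 2.5, 3.3, 4.4, 6.2, 9.0, 14.0, 25.0, 60.0])

# FFM with exponent alpha = 2: the inverse rate decays linearly and reaches
# zero at the failure time, so a straight-line fit extrapolates t_f.
inv_rate = 1.0 / rate
slope, intercept = np.polyfit(t, inv_rate, 1)
t_f = -intercept / slope  # time at which the fitted inverse rate hits zero

print(f"Forecast failure time: {t_f:.1f} s (last observation at {t[-1]:.0f} s)")
```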
Fault detection and fault tolerance in robotics
NASA Technical Reports Server (NTRS)
Visinsky, Monica; Walker, Ian D.; Cavallaro, Joseph R.
1992-01-01
Robots are used in inaccessible or hazardous environments in order to alleviate some of the time, cost and risk involved in preparing men to endure these conditions. In order to perform their expected tasks, the robots are often quite complex, thus increasing their potential for failures. If men must be sent into these environments to repair each component failure in the robot, the advantages of using the robot are quickly lost. Fault tolerant robots are needed which can effectively cope with failures and continue their tasks until repairs can be realistically scheduled. Before fault tolerant capabilities can be created, methods of detecting and pinpointing failures must be perfected. This paper develops a basic fault tree analysis of a robot in order to obtain a better understanding of where failures can occur and how they contribute to other failures in the robot. The resulting failure flow chart can also be used to analyze the resiliency of the robot in the presence of specific faults. By simulating robot failures and fault detection schemes, the problems involved in detecting failures for robots are explored in more depth.
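For independent basic events, the quantitative side of such a fault tree reduces to simple AND/OR gate algebra. The sketch below evaluates a toy robot-joint tree with hypothetical component failure probabilities; the tree structure and numbers are illustrative, not taken from the paper.

```python
# Minimal fault-tree evaluation assuming independent basic events.
def AND(*p):
    """All inputs must fail for the gate to fail."""
    out = 1.0
    for x in p:
        out *= x
    return out

def OR(*p):
    """Any single input failing fails the gate."""
    out = 1.0
    for x in p:
        out *= (1.0 - x)
    return 1.0 - out

# Hypothetical per-mission failure probabilities.
p_encoder, p_motor, p_driver = 1e-3, 5e-4, 2e-4
p_sensor_a, p_sensor_b = 2e-3, 2e-3

joint_actuation = OR(p_motor, p_driver)        # either component loss stops the joint
position_sense  = AND(p_sensor_a, p_sensor_b)  # redundant sensors: both must fail
top_event       = OR(joint_actuation, position_sense, p_encoder)

print(f"P(joint failure) = {top_event:.2e}")
```

The same structure doubles as a resiliency check: setting one basic event to 1.0 and re-evaluating shows how far a single fault propagates, which mirrors the failure flow chart analysis described above.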
Pramanik, Brahmananda; Tadepalli, Tezeswi; Mantena, P. Raju
2012-01-01
In this study, the fractal dimensions of failure surfaces of vinyl ester based nanocomposites are estimated using two classical methods, Vertical Section Method (VSM) and Slit Island Method (SIM), based on the processing of 3D digital microscopic images. Self-affine fractal geometry has been observed in the experimentally obtained failure surfaces of graphite platelet reinforced nanocomposites subjected to quasi-static uniaxial tensile and low velocity punch-shear loading. Fracture energy and fracture toughness are estimated analytically from the surface fractal dimensionality. Sensitivity studies show an exponential dependency of fracture energy and fracture toughness on the fractal dimensionality. Contribution of fracture energy to the total energy absorption of these nanoparticle reinforced composites is demonstrated. For the graphite platelet reinforced nanocomposites investigated, surface fractal analysis has depicted the probable ductile or brittle fracture propagation mechanism, depending upon the rate of loading. PMID:28817017
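The paper's VSM and SIM estimates operate on section profiles and island boundaries extracted from 3D microscope images; a simpler cousin, box counting on a binary image, conveys the general idea of extracting a fractal dimension from image data. The sketch below is a generic illustration, not the authors' procedure.

```python
import numpy as np

def box_counting_dimension(img, scales=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a square binary image by box counting.

    `scales` gives the number of boxes per side; occupied-box counts grow
    roughly like scale**D for a fractal set of dimension D.
    """
    n = img.shape[0]
    counts = []
    for s in scales:
        b = n // s  # box edge length in pixels
        # Count boxes that contain at least one foreground pixel.
        blocks = img[:b * s, :b * s].reshape(s, b, s, b).any(axis=(1, 3))
        counts.append(blocks.sum())
    # Slope of log(count) versus log(scale) estimates the dimension.
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return slope

# Sanity check on a filled square: the estimate should be close to 2.
img = np.ones((256, 256), dtype=bool)
print(f"Filled square: D = {box_counting_dimension(img):.2f}")
```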
Markov modeling and reliability analysis of urea synthesis system of a fertilizer plant
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram; Garg, Tarun Kr.
2015-12-01
This paper deals with the Markov modeling and reliability analysis of the urea synthesis system of a fertilizer plant. The system was modeled using a Markov birth-death process with the assumption that the failure and repair rates of each subsystem follow exponential distributions. The first-order Chapman-Kolmogorov differential equations are developed using the mnemonic rule, and these equations are solved with the fourth-order Runge-Kutta method. The long-run availability, reliability and mean time between failures are computed for various choices of failure and repair rates of the subsystems of the system. The findings of the paper are discussed with the plant personnel to adopt and practice suitable maintenance policies/strategies to enhance the performance of the urea synthesis system of the fertilizer plant.
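The modeling pattern described (exponential failure and repair rates, Chapman-Kolmogorov equations, Runge-Kutta integration) can be sketched on a three-state birth-death toy model; the rates below are hypothetical, and scipy's RK45 integrator stands in for the fourth-order Runge-Kutta scheme.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Birth-death model of a subsystem: state 0 = healthy, 1 = degraded, 2 = failed.
lam1, lam2 = 0.01, 0.02   # failure rates, per hour (hypothetical)
mu1,  mu2  = 0.10, 0.05   # repair rates, per hour (hypothetical)

# Generator matrix Q (rows sum to zero); dP/dt = P Q for the row vector P.
Q = np.array([[-lam1,          lam1,   0.0],
              [  mu1, -(mu1 + lam2),  lam2],
              [  0.0,           mu2,  -mu2]])

def kolmogorov(t, P):
    # First-order Chapman-Kolmogorov differential equations.
    return P @ Q

sol = solve_ivp(kolmogorov, (0.0, 500.0), [1.0, 0.0, 0.0], method="RK45")
P_end = sol.y[:, -1]
availability = P_end[0] + P_end[1]  # system is available unless fully failed

print(f"State probabilities at t = 500 h: {np.round(P_end, 4)}")
print(f"Long-run availability ~ {availability:.4f}")
```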
Research on Application of FMECA in Missile Equipment Maintenance Decision
NASA Astrophysics Data System (ADS)
Kun, Wang
2018-03-01
Failure mode, effects and criticality analysis (FMECA) is a method widely used in engineering. Studying the application of FMECA in military equipment maintenance decision-making can help build a better equipment maintenance support system and increase the operational efficiency of weapons and equipment. Through FMECA of equipment, known and potential failure modes and their causes are identified, and their influence on equipment performance, mission success, and personnel safety is determined. Furthermore, according to the combined effect of the severity of effects and the failure probability, possible measures for prevention and correction are put forward. By replacing or adjusting the corresponding parts, a corresponding maintenance strategy is decided for the preventive maintenance of equipment, which helps improve equipment reliability.
Integrating FMEA in a Model-Driven Methodology
NASA Astrophysics Data System (ADS)
Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno
2016-08-01
Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
Analytical Prediction of Damage Growth in Notched Composite Panels Loaded in Axial Compression
NASA Technical Reports Server (NTRS)
Ambur, Damodar R.; McGowan, David M.; Davila, Carlos G.
1999-01-01
A progressive failure analysis method based on shell elements is developed for the computation of damage initiation and growth in stiffened thick-skin stitched graphite-epoxy panels loaded in axial compression. The analysis method involves a step-by-step simulation of material degradation based on ply-level failure mechanisms. High computational efficiency is derived from the use of superposed layers of shell elements to model each ply orientation in the laminate. Multiple integration points through the thickness are used to obtain the correct bending effects without the need for ply-by-ply evaluations of the state of the material. The analysis results are compared with experimental results for three stiffened panels with notches oriented at 0, 15 and 30 degrees to the panel width dimension. A parametric study is performed to investigate the damage growth retardation characteristics of the Kevlar stitch lines in the panels.
Effect of Geometrical Imperfection on Buckling Failure of ITER VVPSS Tank
NASA Astrophysics Data System (ADS)
Jha, Saroj Kumar; Gupta, Girish Kumar; Pandey, Manish Kumar; Bhattacharya, Avik; Jogi, Gaurav; Bhardwaj, Anil Kumar
2017-04-01
The ‘Vacuum Vessel Pressure Suppression System’ (VVPSS) is part of the ITER machine and is designed to protect the ITER Vacuum Vessel and its connected systems from an over-pressure situation. It comprises a partially evacuated stainless steel tank, approximately 46 m long and 6 m in diameter with a wall thickness of 30 mm, that holds approximately 675 tonnes of water at room temperature to condense the steam resulting from adverse water leakage into the Vacuum Vessel chamber. For any vacuum vessel, geometrical imperfection has a significant effect on buckling failure and structural integrity. The major geometrical imperfection in the VVPSS tank depends on form tolerances. To study the effect of geometrical imperfection on buckling failure of the VVPSS tank, finite element analysis (FEA) has been performed in line with ASME Section VIII, Division 2, Part 5 [1], the ‘design by analysis’ method. Linear buckling analysis has been performed to obtain the buckled shape and displacement. Geometrical imperfection due to form tolerance is incorporated in the FEA model of the VVPSS tank by scaling the resulting buckled shape by a factor of 60. This buckled-shape model is used as the input geometry for plastic collapse and buckling failure assessment, which is performed using the elastic-plastic analysis method for different values of form tolerance. The results of the analysis show that displacement and the load proportionality factor (LPF) vary inversely with form tolerance: for higher values of form tolerance, the LPF reduces significantly while displacement grows large.
Pottecher, Pierre; Engelke, Klaus; Duchemin, Laure; Museyko, Oleg; Moser, Thomas; Mitton, David; Vicaut, Eric; Adams, Judith; Skalli, Wafa; Laredo, Jean Denis; Bousson, Valérie
2016-09-01
Purpose To evaluate the performance of three imaging methods (radiography, dual-energy x-ray absorptiometry [DXA], and quantitative computed tomography [CT]) and that of a numerical analysis with finite element modeling (FEM) in the prediction of failure load of the proximal femur, and to identify the best densitometric or geometric predictors of hip failure load. Materials and Methods Institutional review board approval was obtained. A total of 40 pairs of excised cadaver femurs (mean patient age at time of death, 82 years ± 12 [standard deviation]) were examined with (a) radiography to measure geometric parameters (lengths, angles, and cortical thicknesses), (b) DXA (reference standard) to determine areal bone mineral densities (BMDs), (c) quantitative CT with dedicated three-dimensional analysis software to determine volumetric BMDs and geometric parameters (neck axis length, cortical thicknesses, volumes, and moments of inertia), and (d) quantitative CT-based FEM to calculate a numerical value of failure load. The 80 femurs were fractured via mechanical testing, with random assignment of one femur from each pair to the single-limb stance configuration (hereafter, stance configuration) and assignment of the paired femur to the sideways fall configuration (hereafter, side configuration). Descriptive statistics, univariate correlations, and stepwise regression models were obtained for each imaging method and for FEM to enable us to predict failure load in both configurations. Results Statistics reported are for stance and side configurations, respectively. For radiography, the strongest correlation with mechanical failure load was obtained by using a geometric parameter combined with a cortical thickness (r² = 0.66, P < .001; r² = 0.65, P < .001). For DXA, the strongest correlation with mechanical failure load was obtained by using total BMD (r² = 0.73, P < .001) and trochanteric BMD (r² = 0.80, P < .001). For quantitative CT, in both configurations, the best model combined volumetric BMD and a moment of inertia (r² = 0.78, P < .001; r² = 0.85, P < .001). FEM explained 87% (P < .001) and 83% (P < .001) of bone strength, respectively. By combining (a) radiography and DXA and (b) quantitative CT and DXA, correlations with mechanical failure load increased to 0.82 (P < .001) and 0.84 (P < .001), respectively, for radiography and DXA, and to 0.80 (P < .001) and 0.86 (P < .001), respectively, for quantitative CT and DXA. Conclusion Quantitative CT-based FEM was the best method with which to predict the experimental failure load; however, combining quantitative CT and DXA yielded performance as good as that attained with FEM. The quantitative CT-DXA combination may be easier to use in fracture prediction, provided standardized software is developed. These findings also highlight the major influence on femoral failure load, particularly in the trochanteric region, of a densitometric parameter combined with a geometric parameter. (©) RSNA, 2016. Online supplemental material is available for this article.
Phased-mission system analysis using Boolean algebraic methods
NASA Technical Reports Server (NTRS)
Somani, Arun K.; Trivedi, Kishor S.
1993-01-01
Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase, and the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state space explosion that commonly plagues Markov chain-based analysis. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the use of our technique by means of an example and present numerical results to show the effects of mission phases on system reliability.
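A two-component, two-phase toy mission shows why phase-dependent success criteria matter. The exact answer below factors neatly only because phase 1's series criterion pins both components alive at the phase boundary and exponential lifetimes are memoryless; in general, the phase algebra must carry the cross-phase dependence that this special case sidesteps. Rates and durations are illustrative, not from the paper.

```python
import numpy as np

# Two components with exponential lifetimes (hypothetical rates, per hour).
lam_a, lam_b = 2e-4, 5e-4
T1, T2 = 100.0, 400.0  # phase durations, hours

# Phase 1 success criterion: A AND B (series). Both must survive to T1.
R_phase1 = np.exp(-(lam_a + lam_b) * T1)

# Phase 2 success criterion: A OR B (parallel). Both components are known to
# be alive at T1 (phase 1 requires it), and memorylessness makes phase 2 a
# fresh parallel system over T2.
q_a = 1.0 - np.exp(-lam_a * T2)
q_b = 1.0 - np.exp(-lam_b * T2)
R_phase2 = 1.0 - q_a * q_b

print(f"Phase 1 reliability:             {R_phase1:.4f}")
print(f"Phase 2 conditional reliability: {R_phase2:.4f}")
print(f"Mission reliability:             {R_phase1 * R_phase2:.4f}")
```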
Multiscale Static Analysis of Notched and Unnotched Laminates Using the Generalized Method of Cells
NASA Technical Reports Server (NTRS)
Naghipour Ghezeljeh, Paria; Arnold, Steven M.; Pineda, Evan J.; Stier, Bertram; Hansen, Lucas; Bednarcyk, Brett A.; Waas, Anthony M.
2016-01-01
The generalized method of cells (GMC) is demonstrated to be a viable micromechanics tool for predicting the deformation and failure response of laminated composites, with and without notches, subjected to tensile and compressive static loading. Given the axial [0], transverse [90], and shear [+45/-45] response of a carbon/epoxy (IM7/977-3) system, the unnotched and notched behavior of three multidirectional layups (Layup 1: [0,45,90,-45](sub 2S), Layup 2: [0,60,0](sub 3S), and Layup 3: [30,60,90,-30,-60](sub 2S)) are predicted under both tensile and compressive static loading. Matrix nonlinearity is modeled in two ways. The first assumes all nonlinearity is due to anisotropic progressive damage of the matrix only, which is modeled using the multiaxial mixed-mode continuum damage model (MMCDM) within GMC. The second utilizes matrix plasticity coupled with brittle final failure based on the maximum principal strain criterion to account for matrix nonlinearity and failure within the Finite Element Analysis--Micromechanics Analysis Code (FEAMAC) software multiscale framework. Both the MMCDM and plasticity models incorporate brittle strain- and stress-based failure criteria for the fiber. Upon satisfaction of these criteria, the fiber properties are immediately reduced to a nominal value. The constitutive response for each constituent (fiber and matrix) is characterized using a combination of vendor data and the axial, transverse, and shear responses of unnotched laminates. The capability of the multiscale methodology is then assessed by performing blind predictions of the response of the aforementioned notched and unnotched composite laminates under tensile and compressive loading. Tabulated data along with detailed results (i.e., stress-strain curves as well as damage evolution states at various ratios of strain to failure) for all laminates are presented.
Failure modes and effects analysis for ocular brachytherapy.
Lee, Yongsook C; Kim, Yongbok; Huynh, Jason Wei-Yeong; Hamilton, Russell J
The aim of the study was to identify potential failure modes (FMs) having a high risk and to improve our current quality management (QM) program in Collaborative Ocular Melanoma Study (COMS) ocular brachytherapy by undertaking a failure modes and effects analysis (FMEA) and a fault tree analysis (FTA). Process mapping and FMEA were performed for COMS ocular brachytherapy. For all FMs identified in FMEA, risk priority numbers (RPNs) were determined by assigning and multiplying occurrence, severity, and lack of detectability values, each ranging from 1 to 10. FTA was performed for the major process that had the highest ranked FM. Twelve major processes, 121 sub-process steps, 188 potential FMs, and 209 possible causes were identified. For 188 FMs, RPN scores ranged from 1.0 to 236.1. The plaque assembly process had the highest ranked FM. The majority of FMs were attributable to human failure (85.6%), and medical physicist-related failures were the most numerous (58.9% of all causes). After FMEA, additional QM methods were included for the top 10 FMs and 6 FMs with severity values > 9.0. As a result, for these 16 FMs and the 5 major processes involved, quality control steps were increased from 8 (50%) to 15 (93.8%), and major processes having quality assurance steps were increased from 2 to 4. To reduce high risk in current clinical practice, we proposed QM methods. They mainly include a check or verification of procedures/steps and the use of checklists for both ophthalmology and radiation oncology staff, and intraoperative ultrasound-guided plaque positioning for ophthalmology staff. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
Molecular cytogenetic analysis of Xq critical regions in premature ovarian failure
2013-01-01
Background One of the frequent reasons for unsuccessful conception is premature ovarian failure/primary ovarian insufficiency (POF/POI), defined as the loss of functional follicles below the age of 40 years. Among the genetic causes, the most common ones involve the X chromosome, as in Turner syndrome, partial X deletion, and X-autosome translocations. Here we report the case of a 27-year-old female patient referred for genetic counselling because of premature ovarian failure. The aim of this case study was to perform molecular genetic and cytogenetic analyses in order to identify the exact genetic background of the pathogenic phenotype. Results For premature ovarian failure diagnostics, we analysed the Fragile mental retardation 1 gene using the Southern blot technique and Repeat Primed PCR in order to assess the relationship between Fragile mental retardation 1 gene premutation status and premature ovarian failure. In this patient with early-onset premature ovarian failure, we detected one normal allele of the Fragile mental retardation 1 gene and could not verify the methylated allele; therefore, we performed cytogenetic analyses using G-banding and fluorescence in situ hybridization methods, as well as a high-resolution molecular cytogenetic method, the array comparative genomic hybridization technique. Using G-banding, we identified a large deletion on the X chromosome at the critical region (ChrX q21.31-q28) which is associated with the premature ovarian failure phenotype. In order to detect the exact breakpoints, we used a special cytogenetic array (ISCA plus CGH array) and verified a 67.355 Mb loss at the critical region, which includes 795 genes in total. Conclusions We conclude from this case study that karyotyping is definitely helpful in the evaluation of premature ovarian failure patients to identify non-submicroscopic chromosomal rearrangements, and that the array CGH technique contributes to the most efficient detection and mapping of the exact deletion breakpoints of the deleted Xq region. PMID:24359613
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ciocca, Mario, E-mail: mario.ciocca@cnao.it; Cantone, Marie-Claire; Veronese, Ivan
2012-02-01
Purpose: Failure mode and effects analysis (FMEA) represents a prospective approach for risk assessment. A multidisciplinary working group of the Italian Association for Medical Physics applied FMEA to electron beam intraoperative radiation therapy (IORT) delivered using mobile linear accelerators, aiming at preventing accidental exposures to the patient. Methods and Materials: FMEA was applied to the IORT process for the stages of treatment delivery and verification, and consisted of three steps: 1) identification of the involved subprocesses; 2) identification and ranking of the potential failure modes, together with their causes and effects, using the risk probability number (RPN) scoring system, based on the product of three parameters (severity, frequency of occurrence and detectability, each ranging from 1 to 10); 3) identification of additional safety measures to be proposed for process quality and safety improvement. The RPN upper threshold for little concern of risk was set at 125. Results: Twenty-four subprocesses were identified. Ten potential failure modes were found and scored, in terms of RPN, in the range of 42-216. The most critical failure modes consisted of internal shield misalignment, wrong Monitor Unit calculation, and incorrect data entry at the treatment console. Potential causes of failure included shield displacement; human errors, such as underestimation of CTV extension, mainly because of lack of adequate training and time pressures; failure in the communication between operators; and machine malfunctioning. The main effects of failure were represented by CTV underdose, wrong dose distribution and/or delivery, and unintended normal tissue irradiation. As additional safety measures, the utilization of dedicated staff for IORT, double-checking of MU calculation and data entry, and implementation of in vivo dosimetry were suggested. Conclusions: FMEA appeared to be a useful tool for prospective evaluation of patient safety in radiotherapy. The application of this method to IORT led to the identification of three safety measures for risk mitigation.
ERIC Educational Resources Information Center
Seyyedrezaie, Zari Sadat; Ghonsooly, Behzad; Shahriari, Hesamoddin; Fatemi, Hazar Hosseini
2016-01-01
This study investigated the effect of the writing process in the Google Docs environment on Iranian EFL learners' writing performance. It also examined students' perceptions of the effects of Google Docs and their perceived causes of success or failure in writing performance. In this regard, 48 EFL students were chosen based on their IELTS writing…
Statistical Models and Inference Procedures for Structural and Materials Reliability
1990-12-01
Some general stress-strength models were also developed and applied to the failure of systems subject to cyclic loading. Involved in the failure of ... process control ideas and sequential design and analysis methods. Finally, smooth nonparametric quantile function estimators were studied.
Application of MCT Failure Criterion using EFM
2010-03-26
The MCT failure criterion has been applied using the Element-Failure Method (EFM) in ABAQUS. The EFM-MCT has been implemented in ABAQUS through a user-defined element subroutine (EFM), in part because HELIUS:MCT™ does not facilitate this directly; attempts have also been made to use the ABAQUS native thermal expansion model in addition to Helius-MCT. Comparisons have been made between the analysis results obtained using the EFM-MCT code and the HELIUS:MCT™ code.
Strength determination of brittle materials as curved monolithic structures.
Hooi, P; Addison, O; Fleming, G J P
2014-04-01
The dental literature is replete with "crunch the crown" monotonic load-to-failure studies of all-ceramic materials despite fracture behavior being dominated by the indenter contact surface. Load-to-failure data provide no information on stress patterns, and comparisons among studies are impossible owing to variable testing protocols. We investigated the influence of nonplanar geometries on the maximum principal stress of curved discs tested in biaxial flexure in the absence of analytical solutions. Radii of curvature analogous to elements of complex dental geometries and a finite element analysis method were integrated with experimental testing as a surrogate solution to calculate the maximum principal stress at failure. We employed soda-lime glass discs, a planar control (group P, n = 20), with curvature applied to the remaining discs by slump forming to different radii of curvature (30, 20, 15, and 10 mm; groups R30-R10). The mean deflection (group P) and radii of curvature obtained on slumping (groups R30-R10) were determined by profilometry before and after annealing and surface treatment protocols. Finite element analysis used the biaxial flexure load-to-failure data to determine the maximum principal stress at failure. Mean maximum principal stresses and load to failure were analyzed with one-way analyses of variance and post hoc Tukey tests (α = 0.05). The measured radii of curvature differed significantly among groups, and the radii of curvature were not influenced by annealing. Significant increases in the mean load to failure were observed as the radius of curvature was reduced. The maximum principal stress did not demonstrate sensitivity to radius of curvature. The findings highlight the sensitivity of failure load to specimen shape. The data also support the synergistic use of bespoke computational analysis with conventional mechanical testing and highlight a solution to complications with complex specimen geometries.
Hively, Lee M.
2014-09-16
Data collected from devices and from the human condition may be used to forewarn of critical events, such as machine or structural failure, or biomedical events such as stroke forewarned from brain or heart wave data. By monitoring the data and determining what values are indicative of a failure forewarning, one can provide adequate notice of the impending failure in order to take preventive measures. This disclosure teaches a computer-based method to convert dynamical numeric data representing physical objects (unstructured data) into discrete-phase-space states, and hence into a graph (structured data), for extraction of condition change.
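The numeric-to-graph conversion can be caricatured as follows: symbolize the time series by binning, form delay-vector states, and count state-to-state transitions; condition change then surfaces as graph dissimilarity, for example states never seen in the baseline. The sketch below is a generic illustration of this idea, not the patented method.

```python
import numpy as np
from collections import Counter

def phase_space_graph(x, n_bins=4, dim=3):
    """Symbolize a time series and build a graph of discrete phase-space states.

    Each state is a tuple of `dim` consecutive bin symbols; edge counts record
    the observed transitions between successive states.
    """
    # Equiprobable binning of the raw samples into integer symbols.
    cut_points = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    symbols = np.digitize(x, cut_points)
    states = [tuple(symbols[i:i + dim]) for i in range(len(symbols) - dim + 1)]
    transitions = Counter(zip(states[:-1], states[1:]))
    return set(states), transitions

rng = np.random.default_rng(0)
baseline = np.sin(np.linspace(0, 60, 3000)) + 0.1 * rng.standard_normal(3000)
test = np.sin(np.linspace(0, 60, 3000))**3 + 0.1 * rng.standard_normal(3000)

nodes_base, _ = phase_space_graph(baseline)
nodes_test, _ = phase_space_graph(test)
# Condition change shows up as states unseen in the baseline graph.
novel = nodes_test - nodes_base
print(f"baseline states: {len(nodes_base)}, novel states in test: {len(novel)}")
```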
The influence of the compression interface on the failure behavior and size effect of concrete
NASA Astrophysics Data System (ADS)
Kampmann, Raphael
The failure behavior of concrete materials is not completely understood because conventional test methods fail to assess the material response independently of sample size and shape. To study the influence of strength- and strain-affecting test conditions, four typical concrete sample types were experimentally evaluated in uniaxial compression and analyzed for strength, deformational behavior, crack initiation/propagation, and fracture patterns under varying boundary conditions. Both low-friction and conventional compression interfaces were assessed. High-speed video technology was used to monitor macrocracking. Inferential data analysis showed reliably lower strength results for reduced surface friction at the compression interfaces, regardless of sample shape. Reciprocal comparisons revealed statistically significant strength differences between most sample shapes. Crack initiation and propagation were found to differ for dissimilar compression interfaces. The principal stress and strain distributions were analyzed; the strain domain was found to resemble the experimental results, whereas the stress analysis failed to explain failure for reduced end confinement. Neither stresses nor strains indicated strength reductions due to reduced friction, and therefore buckling effects were considered. The high-speed video analysis revealed localized buckling phenomena, regardless of end confinement. Slender elements were the result of low friction, and stocky fragments developed under conventional confinement; the critical buckling load increased accordingly. The research showed that current test methods do not reflect the "true" compressive strength and that concrete failure is strain driven. Ultimate collapse results from buckling preceded by unstable cracking.
NASA Astrophysics Data System (ADS)
Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying
2017-09-01
As the modern power system is expected to develop into a more intelligent and efficient version, i.e. the smart grid, or to become the central backbone of the energy internet for free energy interactions, security concerns related to cascading failures have been raised in view of their potentially catastrophic results. Research on topological analysis based on complex networks has made great contributions to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, built on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of the operational state, and so cannot give meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failures and gives a quantitative evaluation integrating both perspectives. For structural analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient. Furthermore, the operational state along a cascading path is described by its loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated by considering these two factors. The maximum cascading gradient over all possible cascading paths can be used as an overall metric to evaluate the entire power grid with respect to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.
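A toy structural cascade in the spirit of betweenness-based analysis can be sketched with networkx, using plain edge betweenness as a stand-in for the paper's extended betweenness and a uniform capacity margin as a crude loading level; the grid graph and parameters are illustrative, not an IEEE test system.

```python
import networkx as nx

# Each line's initial load is its edge betweenness; capacity adds a margin.
alpha = 0.2                  # capacity margin (loading-level surrogate)
G = nx.grid_2d_graph(5, 5)   # 25-node stand-in topology

load0 = nx.edge_betweenness_centrality(G)
cap = {frozenset(e): (1.0 + alpha) * l for e, l in load0.items()}

# Trigger the cascade by tripping the single most-loaded line.
G.remove_edge(*max(load0, key=load0.get))

tripped = 1
while True:
    load = nx.edge_betweenness_centrality(G)  # loads redistribute structurally
    over = [e for e, l in load.items() if l > cap[frozenset(e)]]
    if not over:
        break
    G.remove_edges_from(over)                 # all overloaded lines trip together
    tripped += len(over)

islands = sorted((len(c) for c in nx.connected_components(G)), reverse=True)
print(f"lines tripped: {tripped}, largest island: {islands[0]} of 25 buses")
```

A larger alpha plays the role of lighter loading: with more margin, the same structural disturbance produces a shorter cascade, which is the structure-versus-operating-state interplay the paper quantifies.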
Stiffness and strength of fiber reinforced polymer composite bridge deck systems
NASA Astrophysics Data System (ADS)
Zhou, Aixi
This research investigates two principal characteristics that are of primary importance in Fiber Reinforced Polymer (FRP) bridge deck applications: STIFFNESS and STRENGTH. The research was undertaken by investigating the stiffness and strength characteristics of multi-cellular FRP bridge deck systems consisting of pultruded FRP shapes. A systematic analysis procedure was developed for the stiffness analysis of multi-cellular FRP deck systems. This procedure uses the Method of Elastic Equivalence to model the cellular deck as an equivalent orthotropic plate, providing a practical way to predict the equivalent orthotropic plate properties of cellular FRP decks. Analytical solutions for the bending analysis of single-span decks were developed using classical laminated plate theory. The analysis procedure can be extended to continuous FRP decks and can be further developed using higher-order plate theories. Several failure modes of the cellular FRP deck systems were recorded and analyzed through laboratory and field tests and Finite Element Analysis (FEA). Two loading patches were used in the laboratory tests: a steel patch made according to AASHTO's bridge testing specifications, and a tire patch made from a real truck tire reinforced with silicone rubber. The tire patch was specially designed to simulate service loading conditions by modifying the real contact loading from a tire. Our research shows that the effects of the stiffness and contact conditions of loading patches are significant in the stiffness and strength testing of FRP decks. Due to the localization of load, the simulated tire patch yields larger deflection than the steel patch under the same loading level. The tire patch also produces a significantly different failure mode compared to the steel patch: a local bending mode with less damage for the tire patch, and a local punching-shear mode for the steel patch. A deck failure function method is proposed for predicting the failure of FRP decks. Using developed laminated composite theories and FEA techniques, a strength analysis procedure containing ply-level information was proposed and detailed for FRP deck systems. The behavior of the deck's unsupported (free) edges was also investigated using ply-level FEA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohr, C.L.; Pankaskie, P.J.; Heasler, P.G.
Reactor fuel failure data sets in the form of initial power (P_i), final power (P_f), transient increase in power (ΔP), and burnup (Bu) were obtained for pressurized heavy water reactors (PHWRs), boiling water reactors (BWRs), and pressurized water reactors (PWRs). These data sets were evaluated and used as the basis for developing two predictive fuel failure models, a graphical concept called the PCI-OGRAM, and a nonlinear regression based model called PROFIT. The PCI-OGRAM is an extension of the FUELOGRAM developed by AECL. It is based on a critical threshold concept for stress dependent stress corrosion cracking. The PROFIT model, developed at Pacific Northwest Laboratory, is the result of applying standard statistical regression methods to the available PCI fuel failure data and an analysis of the environmental and strain rate dependent stress-strain properties of the Zircaloy cladding.
Translation and validation of the Self-care of Heart Failure Index into Persian.
Siabani, Soraya; Leeder, Stephen R; Davidson, Patricia M; Najafi, Farid; Hamzeh, Behrooz; Solimani, Akram; Siahbani, Sara; Driscoll, Tim
2014-01-01
Chronic heart failure (CHF) is a common burdensome health problem worldwide. Self-care improves outcomes in patients with CHF. The Self-care of Heart Failure Index (SCHFI) is a well-known scale for assessing self-care. A reliable, valid, and culturally acceptable instrument is needed to develop and test self-care interventions in Iran. We sought to translate and validate the Persian version of SCHFI v 6.2 (pSCHFI). We translated the SCHFI into Persian (pSCHFI) using standardized methods. The reliability was evaluated by assessing Cronbach's α coefficient. Expert opinion, discussion with patients, and confirmatory factor analysis were used to assess face validity, content validity, and construct validity, respectively. The analysis, using 184 participants, showed acceptable internal consistency and construct validity for the 3 subscales of pSCHFI-self-care maintenance, self-care management, and self-care self-confidence. The pSCHFI is a valid instrument with an acceptable reliability for evaluating self-care in Persian patients with heart failure.
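The reliability statistic named in this abstract is standard, so a minimal sketch is easy to give; this is not the authors' code, and the Likert responses below are made up.

```python
# Cronbach's alpha from a respondents-by-items score matrix:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = scale items."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses from six participants on a 4-item subscale.
demo = np.array([[3, 4, 3, 4],
                 [2, 2, 3, 2],
                 [4, 4, 4, 3],
                 [1, 2, 1, 2],
                 [3, 3, 4, 4],
                 [2, 3, 2, 2]])
print(f"alpha = {cronbach_alpha(demo):.2f}")
```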
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aggarwal, R.K.; Litton, R.W.; Cornell, C.A.
1996-12-31
The performance of more than 3,000 offshore platforms in the Gulf of Mexico was observed during the passage of Hurricane Andrew in August 1992. This event provided an opportunity to test the procedures used for platform analysis and design. A global bias was inferred for overall platform capacity and loads in the Andrew Joint Industry Project (JIP) Phase 1. It was predicted that the pile foundations of several platforms should have failed, but they did not. These results indicated that the biases specific to foundation failure modes may be higher than those of jacket failure modes. The biases in predictions of foundation failure modes were therefore investigated further in this study. The work included capacity analysis and calibration of predictions with the observed behavior for 3 jacket platforms and 3 caissons using Bayesian updating. Bias factors for two foundation failure modes, lateral shear and overturning, were determined for each structure. Foundation capacity estimates using conventional methods were found to be conservatively biased overall.
Spacecraft and propulsion technician error
NASA Astrophysics Data System (ADS)
Schultz, Daniel Clyde
Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed-methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, with workmanship error accounting for 29%. With commercial space travel imminent, the increased potential for loss of human life demands changes to standardized procedures, training, and certification to reduce human error and failure rates. This study made several recommendations to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of space transportation passengers.
Product Quality Improvement Using FMEA for Electric Parking Brake (EPB)
NASA Astrophysics Data System (ADS)
Dumitrescu, C. D.; Gruber, G. C.; Tişcă, I. A.
2016-08-01
One of the most frequently used methods to improve product quality is the complex FMEA (Failure Modes and Effects Analyses). Various forms of FMEA are known in the literature, depending on the mode and on the targets; among these are the process Failure Modes and Effects Analysis and the Failure Mode, Effects and Criticality Analysis (FMECA). Whichever option the work team adopts, the goal of the method is the same: to optimize product design activities in research, design processes, and the implementation of manufacturing processes, and to optimize the product delivered to beneficiaries. According to a market survey of parts suppliers to vehicle manufacturers, the FMEA method is used by 75% of them. One purpose of the application is to detect any errors remaining after research and product development are considered complete; another is to initiate appropriate measures to avoid mistakes. Achieving these two goals allows errors to be avoided as early as the product design phase, thereby avoiding the emergence of additional costs in later stages of product manufacturing. The FMEA method is applied using standardized forms; with their help, the initial assemblies of the product structure are established, in which all components are initially assumed error-free. The work is an application of the FMEA method to optimize the quality of the components of the electric parking brake (Electric Parking Brake, EPB). This is a component attached to the braking system that replaces the conventional mechanical parking brake in automobiles while ensuring comfort, functionality, and durability, and saving space in the passenger compartment. The paper describes the levels addressed in applying FMEA, the working arrangements at the 4 distinct levels of analysis, and how the Risk Priority Number is determined; it also presents the analysis of risk factors and the measures the authors imposed to reduce or completely eliminate risk in exploiting this complex product.
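The RPN bookkeeping at the heart of FMEA is simple enough to show directly. The failure modes and scores below are hypothetical, not taken from the EPB study.

```python
# Each failure mode gets severity (S), occurrence (O) and detection (D)
# scores on a 1-10 scale; the Risk Priority Number is RPN = S * O * D,
# and corrective effort is focused on the highest-RPN modes.
failure_modes = [
    ("actuator motor stalls",        8, 3, 4),
    ("cable force sensor drifts",    6, 5, 3),
    ("ECU connector corrosion",      7, 2, 6),
    ("software misreads park state", 9, 2, 5),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  (S={s}, O={o}, D={d})  {name}")
```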
NASA Technical Reports Server (NTRS)
Packard, Michael H.
2002-01-01
Probabilistic Structural Analysis (PSA) is now commonly used for predicting the distribution of time/cycles to failure of turbine blades and other engine components. These distributions are typically based on fatigue/fracture and creep failure modes of these components. Additionally, reliability analysis is used for taking test data related to particular failure modes and calculating failure rate distributions of electronic and electromechanical components. How can these individual failure time distributions of structural, electronic and electromechanical component failure modes be effectively combined into a top level model for overall system evaluation of component upgrades, changes in maintenance intervals, or line replaceable unit (LRU) redesign? This paper shows an example of how various probabilistic failure predictions for turbine engine components can be evaluated and combined to show their effect on overall engine performance. A generic model of a turbofan engine was modeled using various Probabilistic Risk Assessment (PRA) tools (Quantitative Risk Assessment Software (QRAS) etc.). Hypothetical PSA results for a number of structural components along with mitigation factors that would restrict the failure mode from propagating to a Loss of Mission (LOM) failure were used in the models. The output of this program includes an overall failure distribution for LOM of the system. The rank and contribution to the overall Mission Success (MS) is also given for each failure mode and each subsystem. This application methodology demonstrates the effectiveness of PRA for assessing the performance of large turbine engines. Additionally, the effects of system changes and upgrades, the application of different maintenance intervals, inclusion of new sensor detection of faults and other upgrades were evaluated in determining overall turbine engine reliability.
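The combination step described in this abstract can be illustrated with a small Monte Carlo sketch. The Weibull parameters and mitigation probabilities below are hypothetical stand-ins, not values from the paper.

```python
# Combine per-failure-mode time-to-failure distributions into a system-level
# Loss of Mission (LOM) estimate: a mode contributes to LOM only if it fails
# within the mission and its mitigation fails to contain it.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000              # simulated missions
mission_hours = 5_000.0

# (name, Weibull shape, Weibull scale [h], P(failure propagates to LOM))
modes = [("blade fatigue",  2.5, 20_000.0, 0.10),
         ("disk creep",     3.0, 40_000.0, 0.30),
         ("sensor failure", 1.0, 15_000.0, 0.02)]

lom = np.zeros(n, dtype=bool)
for _, shape, scale, p_prop in modes:
    t_fail = scale * rng.weibull(shape, size=n)  # time to component failure
    propagates = rng.random(n) < p_prop          # mitigation did not catch it
    lom |= (t_fail < mission_hours) & propagates

print(f"Estimated P(LOM) over {mission_hours:.0f} h: {lom.mean():.4f}")
```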
NASA Technical Reports Server (NTRS)
Brinson, R. F.
1985-01-01
A method for lifetime or durability predictions for laminated fiber reinforced plastics is given. The procedure is similar to, but not the same as, the well-known time-temperature-superposition principle for polymers; it is better described as an analytical adaptation of time-stress-superposition methods. The analytical constitutive modeling is based upon a nonlinear viscoelastic constitutive model developed by Schapery. Time-dependent failure models are discussed and related to the constitutive models. Finally, results of an incremental lamination analysis using the constitutive and failure models are compared to experimental results. Favorable agreement between theory and prediction is shown using data from creep tests of about two months' duration.
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems; or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems, and evaluate the achievement of partial capabilities.
A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
Comparison between four dissimilar solar panel configurations
NASA Astrophysics Data System (ADS)
Suleiman, K.; Ali, U. A.; Yusuf, Ibrahim; Koko, A. D.; Bala, S. I.
2017-12-01
Several studies on photovoltaic systems have focused on how they operate and the energy required to operate them. Little attention has been paid to their configurations, the modeling of mean time to system failure, availability, cost-benefit analysis, and comparisons of parallel and series-parallel designs. In this research work, four system configurations were studied. Configuration I consists of two sub-components arranged in parallel with 24 V each; configuration II consists of four sub-components arranged logically in parallel with 12 V each; configuration III consists of four sub-components arranged in series-parallel with 8 V each; and configuration IV has six sub-components with 6 V each arranged in series-parallel. Comparative analysis was made using the Chapman-Kolmogorov method. Explicit expressions for the mean time to system failure and the steady-state availability were derived, and a cost-benefit analysis was performed on this basis. A ranking method was used to determine the optimal configuration of the systems. The analytical and numerical solutions for system availability and mean time to system failure show that configuration I is the optimal configuration.
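For a small Markov model, the Chapman-Kolmogorov machinery behind such mean-time-to-system-failure results reduces to one linear solve. The sketch below uses a generic two-unit parallel configuration with hypothetical failure and repair rates, not the paper's exact models.

```python
# Mean time to system failure (MTSF) for a two-unit parallel system with
# repair: write the generator matrix over the transient (working) states
# and solve (-Q_T) m = 1 for the expected times to absorption.
import numpy as np

lam, mu = 0.02, 0.5   # per-unit failure and repair rates (per hour)

# Transient states: 0 = both units up, 1 = one up / one under repair.
# State "both failed" is absorbing and therefore omitted from Q_T.
q_t = np.array([[-2 * lam,      2 * lam],
                [      mu, -(lam + mu)]])

mtsf = np.linalg.solve(-q_t, np.ones(2))
print(f"MTSF from 'both units up': {mtsf[0]:.1f} h")  # ~700 h for these rates
```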
Structural reliability analysis of laminated CMC components
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.; Gyekenyesi, John P.
1991-01-01
For laminated ceramic matrix composite (CMC) materials to realize their full potential in aerospace applications, design methods and protocols are a necessity. This work focuses on the time-independent failure response of these materials and presents a reliability analysis associated with the initiation of matrix cracking. A public domain computer algorithm is highlighted that was coupled with the laminate analysis of a finite element code and serves as a design aid for analyzing structural components made from laminated CMC materials. Issues relevant to the effect of component size are discussed, and a parameter estimation procedure is presented. The estimation procedure allows three parameters to be calculated from a failure population that has an underlying Weibull distribution.
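As a generic illustration of the last step (estimating three Weibull parameters from a failure population), the sketch below uses scipy's maximum-likelihood fit on synthetic strength data; it is not the paper's estimation procedure, which uses a least-squares method with known specimen geometry and grouped fracture data.

```python
# Fit a three-parameter Weibull distribution (shape, threshold/location,
# scale) to a synthetic strength sample.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
strengths = weibull_min.rvs(c=8.0, loc=120.0, scale=300.0,
                            size=60, random_state=rng)  # hypothetical MPa data

shape, loc, scale = weibull_min.fit(strengths)
print(f"Weibull modulus m = {shape:.2f}, "
      f"threshold = {loc:.1f} MPa, scale = {scale:.1f} MPa")
```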
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
An Approach for Reducing the Error Rate in Automated Lung Segmentation
Gill, Gurman; Beichel, Reinhard R.
2016-01-01
Robust lung segmentation is challenging, especially when tens of thousands of lung CT scans need to be processed, as required by large multi-center studies. The goal of this work was to develop and assess a method for the fusion of segmentation results from two different methods to generate lung segmentations that have a lower failure rate than individual input segmentations. As basis for the fusion approach, lung segmentations generated with a region growing and model-based approach were utilized. The fusion result was generated by comparing input segmentations and selectively combining them using a trained classification system. The method was evaluated on a diverse set of 204 CT scans of normal and diseased lungs. The fusion approach resulted in a Dice coefficient of 0.9855 ± 0.0106 and showed a statistically significant improvement compared to both input segmentation methods. In addition, the failure rate at different segmentation accuracy levels was assessed. For example, when requiring that lung segmentations must have a Dice coefficient of better than 0.97, the fusion approach had a failure rate of 6.13%. In contrast, the failure rate for region growing and model-based methods was 18.14% and 15.69%, respectively. Therefore, the proposed method improves the quality of the lung segmentations, which is important for subsequent quantitative analysis of lungs. Also, to enable a comparison with other methods, results on the LOLA11 challenge test set are reported. PMID:27447897
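The accuracy metric used throughout this abstract is the Dice coefficient; a minimal sketch on toy masks follows (not the authors' pipeline).

```python
# Dice coefficient between a binary segmentation mask and a reference mask:
# 2*|A & B| / (|A| + |B|), equal to 1.0 for a perfect match.
import numpy as np

def dice(seg: np.ndarray, ref: np.ndarray) -> float:
    seg, ref = seg.astype(bool), ref.astype(bool)
    overlap = np.logical_and(seg, ref).sum()
    return 2.0 * overlap / (seg.sum() + ref.sum())

# Toy 2-D masks standing in for lung segmentations.
a = np.zeros((8, 8), dtype=int); a[2:6, 2:6] = 1
b = np.zeros((8, 8), dtype=int); b[3:7, 2:6] = 1
print(f"Dice = {dice(a, b):.3f}")  # 0.750 for this toy overlap
```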
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C(exp 1) plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
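Of the criteria named here, the maximum strain criterion is the simplest to write down: each ply-level strain component is checked against its allowable. The sketch below uses hypothetical allowables and is not the paper's implementation.

```python
# Maximum strain criterion for an in-plane ply strain state: any component
# exceeding its allowable flags the corresponding failure mode.
def max_strain_failure(eps1, eps2, gamma12,
                       e1t=0.010, e1c=0.008, e2t=0.004, e2c=0.012, g12=0.015):
    """Return the list of violated modes (empty list = no predicted failure)."""
    failures = []
    if eps1 > e1t:
        failures.append("fiber tension")
    if eps1 < -e1c:
        failures.append("fiber compression")
    if eps2 > e2t:
        failures.append("matrix tension")
    if eps2 < -e2c:
        failures.append("matrix compression")
    if abs(gamma12) > g12:
        failures.append("in-plane shear")
    return failures

print(max_strain_failure(0.011, 0.002, 0.005))  # -> ['fiber tension']
```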
"If at first you don't succeed": using failure to improve teaching.
Pinsky, L E; Irby, D M
1997-11-01
The authors surveyed a group of distinguished clinical teachers regarding episodes of failure that had subsequently led to improvements in their teaching. Specifically, they examined how these teachers had used reflection on failed approaches as a tool for experiential learning. The respondents believed that failures were as important as successes in learning to be a good teacher. Using qualitative content analysis of the respondents' comments, the authors identified eight common types of failure associated with each of the three phases of teaching: planning, teaching, and reflection. Common failures associated with the planning stage were misjudging learners, lack of preparation, presenting too much content, lack of purpose, and difficulties with audiovisuals. The primary failure associated with actual teaching was inflexibly using a single teaching method. In the reflection phase, respondents said they most often realized that they had made one of two common errors: selecting the wrong teaching strategy or incorrectly implementing a sound strategy. For each identified failure, the respondents made recommendations for improvement. The deliberative process that had guided planning, teaching, and reflecting had helped all of them transform past failures into successes.
Baldewijns, Karolien; Bektas, Sema; Boyne, Josiane; Rohde, Carla; De Maesschalck, Lieven; De Bleser, Leentje; Brandenburg, Vincent; Knackstedt, Christian; Devillé, Aleidis; Sanders-Van Wijk, Sandra; Brunner La Rocca, Hans-Peter
2017-12-01
Heart failure is a complex disease with poor outcome. This complexity may prevent care providers from covering all aspects of care, which is relevant not only for individual patient care but also for care organisation. Disease management programmes applying a multidisciplinary approach are recommended to improve heart failure care. However, there is a scarcity of research on how disease management programmes perform, in what form they should be offered, and from what care and support patients and care providers would benefit most. Therefore, the Improving kNowledge Transfer to Efficaciously Raise the level of Contemporary Treatment in Heart Failure (INTERACT-in-HF) study aims to explore the current processes of heart failure care and to identify factors that may facilitate or hamper heart failure care and guideline adherence. Within a cross-sectional mixed-method design in three regions of the North-West part of Europe, patients (n = 88) and their care providers (n = 59) were interviewed. Prior to the in-depth interviews, patients were asked to complete three questionnaires: the Dutch Heart Failure Knowledge scale, the European Heart Failure Self-care Behaviour Scale, and the global health status and social economic status. In parallel, retrospective data based on records from these (n = 88) and additional patients (n = 82) are reviewed. All interviews were audiotaped and transcribed verbatim for analysis. PMID:29472989
Rezaei, Fatemeh; Yarmohammadian, Mohmmad H.; Haghshenas, Abbas; Fallah, Ali; Ferdosi, Masoud
2018-01-01
Background: The methodology of Failure Mode and Effects Analysis (FMEA) is known as an important risk assessment tool and an accreditation requirement of many organizations. For prioritizing failures, the “risk priority number (RPN)” index is used, largely because of the ease of its subjective evaluations of the occurrence, severity, and detectability of each failure. In this study, we have tried to make the FMEA model more compatible with health-care systems by redefining the RPN index to be closer to reality. Methods: We used a quantitative and qualitative approach in this research. In the qualitative domain, focus group discussions were used to collect data. A quantitative approach was used to calculate the RPN score. Results: We studied the patient's journey in the surgery ward from the holding area to the operating room. The highest-priority failures were determined based on (1) defining inclusion criteria as severity of incident (clinical effect, claim consequence, waste of time, and financial loss), occurrence of incident (time-unit occurrence and degree of exposure to risk), and preventability (degree of preventability and defensive barriers); then (2) risk priority criteria were quantified using the RPN index (361 for the highest-rated failure). Reassessment of the improved RPN scores by root cause analysis showed some variation. Conclusions: We concluded that standard criteria should be developed consistent with clinical language and specific scientific fields. Therefore, cooperation and partnership between technical and clinical groups are necessary to modify these models. PMID:29441184
Baptista Macaroff, W M; Castroman Espasandín, P
2007-01-01
The aim of this study was to assess the cumulative sum (cusum) method for evaluating the performance of our hospital's acute postoperative pain service. The period of analysis was 7 months. Analgesic failure was defined as a score of 3 points or more on a simple numerical scale. Acceptable failure (p0) was set at 20% of patients upon admission to the postanesthetic recovery unit and at 7% 24 hours after surgery. Unacceptable failure was set at double the p0 rate at each time (40% and 14%, respectively). The unit's patient records were used to generate a cusum graph for each evaluation. Nine hundred four records were included. The rate of failure was 31.6% upon admission to the unit and 12.1% at the 24-hour postoperative assessment. The curve rose rapidly to the value set for p0 at both evaluation times (n = 14 and n = 17, respectively), later leveled off, and began to fall after 721 and 521 cases, respectively. Our study shows the efficacy of the cusum method for monitoring a proposed quality standard. The graph also showed periods of suboptimal performance that would not have been evident from analyzing the data en bloc. Thus the cusum method would facilitate rapid detection of periods in which quality declines.
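A minimal sketch of such a monitor is a Bernoulli cusum over successive patients, each failure adding weight and each success subtracting it. The p0/p1 values follow the abstract's admission-time rates (20% acceptable, 40% unacceptable); the outcome sequence is simulated.

```python
# Bernoulli cusum for monitoring a failure rate: S_i = max(0, S_{i-1} + W_i)
# with log-likelihood-ratio weights for failure/success. Crossing a decision
# threshold h (tuned for a target false-alarm rate) signals drift toward p1.
import math
import random

p0, p1 = 0.20, 0.40                    # acceptable / unacceptable failure rates
w_fail = math.log(p1 / p0)             # positive weight for a failure
w_ok = math.log((1 - p1) / (1 - p0))   # negative weight for a success

random.seed(3)
s, trace = 0.0, []
for _ in range(200):                   # 200 simulated admissions, true rate 25%
    failed = random.random() < 0.25
    s = max(0.0, s + (w_fail if failed else w_ok))
    trace.append(s)

print(f"final cusum = {s:.2f}; peak = {max(trace):.2f}")
```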
Matasci, Battista; Stock, Greg M.; Jaboyedoff, Michael; Carrea, Dario; Collins, Brian D.; Guérin, Antoine; Matasci, G.; Ravanel, L.
2018-01-01
Rockfalls strongly influence the evolution of steep rocky landscapes and represent a significant hazard in mountainous areas. Defining the most probable future rockfall source areas is of primary importance for both geomorphological investigations and hazard assessment. Thus, a need exists to understand which areas of a steep cliff are more likely to be affected by a rockfall. An important analytical gap exists between regional rockfall susceptibility studies and block-specific geomechanical calculations. Here we present methods for quantifying rockfall susceptibility at the cliff scale, which is suitable for sub-regional hazard assessment (hundreds to thousands of square meters). Our methods use three-dimensional point clouds acquired by terrestrial laser scanning to quantify the fracture patterns and compute failure mechanisms for planar, wedge, and toppling failures on vertical and overhanging rock walls. As a part of this work, we developed a rockfall susceptibility index for each type of failure mechanism according to the interaction between the discontinuities and the local cliff orientation. The susceptibility for slope parallel exfoliation-type failures, which are generally hard to identify, is partly captured by planar and toppling susceptibility indexes. We tested the methods for detecting the most susceptible rockfall source areas on two famously steep landscapes, Yosemite Valley (California, USA) and the Drus in the Mont-Blanc massif (France). Our rockfall susceptibility models show good correspondence with active rockfall sources. The methods offer new tools for investigating rockfall hazard and improving our understanding of rockfall processes.
Peng, D; Wang, S P; Zhao, D H; Fan, Q C; Shu, J; Liu, J H
2018-05-08
Objective: To explore the effect of hyperuricemia on prognosis in patients with heart failure of coronary heart disease (CHD) after revascularization. Methods: A single-center retrospective study of all subjects who underwent percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG) as revascularization for CHD at Beijing Anzhen Hospital, Capital Medical University, between January 2005 and December 2014 was performed. Patients were divided into two groups according to the presence or absence of hyperuricemia. The average follow-up was 1,818 days. Results: Logistic regression analysis revealed that hyperuricemia was an independent risk factor for readmission for heart failure (P=0.018, OR=1.499, 95% CI 1.071-2.098). Cox regression analysis revealed that hyperuricemia was an independent risk factor for all-cause mortality (P=0.002, RR=1.520, 95% CI 1.166-1.982), cardiovascular (CV) mortality (P=0.001, RR=1.811, 95% CI 1.279-2.566), and heart failure mortality (P=0.006, RR=2.151, 95% CI 1.247-3.711). Conclusions: There is a negative correlation between the uric acid level and left ventricular ejection fraction (LVEF). Patients with heart failure of coronary heart disease complicated by hyperuricemia have a higher risk of readmission for heart failure, all-cause mortality, CV mortality, and heart failure mortality than patients with a normal uric acid level. Hyperuricemia is an independent risk factor for patients with heart failure of coronary heart disease after revascularization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younge, Kelly Cooper, E-mail: kyounge@med.umich.edu; Wang, Yizhen; Thompson, John
2015-04-01
Purpose: To improve the safety and efficiency of a new stereotactic radiosurgery program with the application of failure mode and effects analysis (FMEA) performed by a multidisciplinary team of health care professionals. Methods and Materials: Representatives included physicists, therapists, dosimetrists, oncologists, and administrators. A detailed process tree was created from an initial high-level process tree to facilitate the identification of possible failure modes. Group members were asked to determine failure modes that they considered to be the highest risk before scoring failure modes. Risk priority numbers (RPNs) were determined by each group member individually and then averaged. Results: A total of 99 failure modes were identified. The 5 failure modes with an RPN above 150 were further analyzed to attempt to reduce these RPNs. Only 1 of the initial items that the group presumed to be high-risk (magnetic resonance imaging laterality reversed) was ranked in these top 5 items. New process controls were put in place to reduce the severity, occurrence, and detectability scores for all of the top 5 failure modes. Conclusions: FMEA is a valuable team activity that can assist in the creation or restructuring of a quality assurance program with the aim of improved safety, quality, and efficiency. Performing the FMEA helped group members to see how they fit into the bigger picture of the program, and it served to reduce biases and preconceived notions about which elements of the program were the riskiest.
Commercial transport aircraft composite structures
NASA Technical Reports Server (NTRS)
Mccarty, J. E.
1983-01-01
The role that analysis plays in the development, production, and substantiation of aircraft structures is discussed. The types, elements, and applications of failure analysis that are used and needed; the current application of analysis methods to commercial aircraft advanced composite structures, along with a projection of future needs; and some personal thoughts on analysis development goals and the elements of an approach to analysis development are discussed.
Kilburn, Jeremy M.; Lester, Scott C.; Lucas, John T.; Soike, Michael H.; Blackstock, A. William; Kearns, William T.; Hinson, William H.; Miller, Antonius A.; Petty, William J.; Munley, Michael T.; Urbanic, James J.
2014-01-01
Purpose/Objective(s) Regional failures occur in up to 15% of patients treated with stereotactic body radiotherapy (SBRT) for stage I/II lung cancer. This report focuses on the management of the unique scenario of isolated regional failures. Methods Patients treated initially with SBRT or accelerated hypofractionated radiotherapy were screened for curative-intent treatment of isolated mediastinal failures (IMFs). Local control, regional control, progression-free survival, and distant control were estimated from the date of salvage treatment using the Kaplan–Meier method. Results Among 160 patients treated from 2002 to 2012, 12 suffered IMF and were amenable to salvage treatment. The median interval between treatments was 16 months (2–57 mo). Median salvage dose was 66 Gy (60–70 Gy). With a median follow-up of 10 months, the median overall survival was 15 months (95% confidence interval, 5.8–37 mo). When estimated from original treatment, the median overall survival was 38 months (95% confidence interval, 17–71 mo). No subsequent regional failures occurred. Distant failure was the predominant mode of relapse following salvage for IMF, with a 2-year distant control rate of 38%. At the time of this analysis, three patients have died without recurrence while four are alive with no evidence of disease. High-grade toxicity was uncommon. Conclusions To our knowledge, this is the first analysis of salvage mediastinal radiation after SBRT or accelerated hypofractionated radiotherapy in lung cancer. Outcomes appear similar to stage III disease at presentation. Distant failures were common, suggesting a role for concurrent or sequential chemotherapy. A standard full course of external beam radiotherapy is advisable in this unique clinical scenario. PMID:24736084
Holbrook, Christopher M.; Perry, Russell W.; Brandes, Patricia L.; Adams, Noah S.
2013-01-01
In telemetry studies, premature tag failure causes negative bias in fish survival estimates because tag failure is interpreted as fish mortality. We used mark-recapture modeling to adjust estimates of fish survival for a previous study where premature tag failure was documented. High rates of tag failure occurred during the Vernalis Adaptive Management Plan’s (VAMP) 2008 study to estimate survival of fall-run Chinook salmon (Oncorhynchus tshawytscha) during migration through the San Joaquin River and Sacramento-San Joaquin Delta, California. Due to a high rate of tag failure, the observed travel time distribution was likely negatively biased, resulting in an underestimate of tag survival probability in this study. Consequently, the bias-adjustment method resulted in only a small increase in estimated fish survival when the observed travel time distribution was used to estimate the probability of tag survival. Since the bias-adjustment failed to remove bias, we used historical travel time data and conducted a sensitivity analysis to examine how fish survival might have varied across a range of tag survival probabilities. Our analysis suggested that fish survival estimates were low (95% confidence bounds range from 0.052 to 0.227) over a wide range of plausible tag survival probabilities (0.48–1.00), and this finding is consistent with other studies in this system. When tags fail at a high rate, available methods to adjust for the bias may perform poorly. Our example highlights the importance of evaluating the tag life assumption during survival studies, and presents a simple framework for evaluating adjusted survival estimates when auxiliary travel time data are available.
Human versus automation in responding to failures: an expected-value analysis
NASA Technical Reports Server (NTRS)
Sheridan, T. B.; Parasuraman, R.
2000-01-01
A simple analytical criterion is provided for deciding whether a human or automation is best for a failure detection task. The method is based on expected-value decision theory in much the same way as is signal detection. It requires specification of the probabilities of misses (false negatives) and false alarms (false positives) for both human and automation being considered, as well as factors independent of the choice--namely, costs and benefits of incorrect and correct decisions as well as the prior probability of failure. The method can also serve as a basis for comparing different modes of automation. Some limiting cases of application are discussed, as are some decision criteria other than expected value. Actual or potential applications include the design and evaluation of any system in which either humans or automation are being considered.
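The expected-value criterion can be written out directly. The probabilities, costs, and benefits below are hypothetical inputs, since the paper leaves them application-specific.

```python
# Expected value of assigning the failure-detection task to an agent with
# given miss and false-alarm probabilities; the payoffs and the prior
# failure probability are independent of the choice.
def expected_value(p_miss, p_fa, p_fail=0.01,
                   v_hit=100.0, c_miss=-10_000.0, c_fa=-200.0, v_cr=0.0):
    return (p_fail * ((1 - p_miss) * v_hit + p_miss * c_miss)
            + (1 - p_fail) * (p_fa * c_fa + (1 - p_fa) * v_cr))

ev_human = expected_value(p_miss=0.10, p_fa=0.02)
ev_auto = expected_value(p_miss=0.03, p_fa=0.15)
print(f"EV human = {ev_human:.2f}, EV automation = {ev_auto:.2f}")
print("prefer:", "human" if ev_human > ev_auto else "automation")
```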
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Chen, Frank Xiaoxiao; Pebay, Philippe Pierre
2010-06-01
Effective failure prediction and mitigation strategies in high-performance computing systems could provide huge gains in resilience of tightly coupled large-scale scientific codes. These gains would come from prediction-directed process migration and resource servicing, intelligent resource allocation, and checkpointing driven by failure predictors rather than at regular intervals based on nominal mean time to failure. Given probabilistic associations of outlier behavior in hardware-related metrics with eventual failure in hardware, system software, and/or applications, this paper explores approaches for quantifying the effects of prediction and mitigation strategies and demonstrates these using actual production system data. We describe context-relevant methodologies for determining the accuracy and cost-benefit of predictors. While many research studies have quantified the expected impact of growing system size, and the associated shortened mean time to failure (MTTF), on application performance in large-scale high-performance computing (HPC) platforms, there has been little if any work to quantify the possible gains from predicting system resource failures with significant but imperfect accuracy. This possibly stems from HPC system complexity and the fact that, to date, no one has established any good predictors of failure in these systems. Our work in the OVIS project aims to discover these predictors via a variety of data collection techniques and statistical analysis methods that yield probabilistic predictions. The question then is, 'How good or useful are these predictions?' We investigate methods for answering this question in a general setting, and illustrate them using a specific failure predictor discovered on a production system at Sandia.
A Critical Analysis of the Conventionally Employed Creep Lifing Methods
Abdallah, Zakaria; Gray, Veronica; Whittaker, Mark; Perkins, Karen
2014-01-01
The deformation of structural alloys presents problems for power plants and aerospace applications due to the demand for elevated temperatures for higher efficiencies and reductions in greenhouse gas emissions. The materials used in such applications experience harsh environments which may lead to deformation and failure of critical components. To avoid such catastrophic failures and also increase efficiency, future designs must utilise novel/improved alloy systems with enhanced temperature capability. In recognising this issue, a detailed understanding of creep is essential for the success of these designs by ensuring components do not experience excessive deformation which may ultimately lead to failure. To achieve this, a variety of parametric methods have been developed to quantify creep and creep fracture in high temperature applications. This study reviews a number of well-known, traditionally employed creep lifing methods, with some more recent approaches also included. The first section of this paper focuses on predicting the long-term creep rupture properties, which is an area of interest for the power generation sector. The second section looks at pre-defined strains and the reproduction of full creep curves based on available data, which is pertinent to the aerospace industry where components are replaced before failure. PMID:28788623
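As a representative example of the parametric methods this review covers, the classical Larson-Miller approach collapses rupture data across temperatures with P = T(C + log10 t_r). The alloy data point below is hypothetical, and C = 20 is the conventional default.

```python
# Larson-Miller parameter: P = T * (C + log10(t_r)), with T in kelvin and
# t_r the rupture time in hours; equal P implies equivalent creep exposure,
# so a short test at high temperature extrapolates to service conditions.
import math

C = 20.0

def lmp(temp_k: float, t_rupture_h: float) -> float:
    return temp_k * (C + math.log10(t_rupture_h))

def rupture_time(temp_k: float, p: float) -> float:
    return 10.0 ** (p / temp_k - C)

p = lmp(temp_k=873.0, t_rupture_h=1_000.0)            # 1,000 h test at 600 C
print(f"predicted life at 550 C: {rupture_time(823.0, p):,.0f} h")
```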
Reliability computation using fault tree analysis
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
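For the simplest case of independent basic events and no shared failure paths, fault tree evaluation reduces to products over gates, as in the hypothetical tree below; the paper's method additionally handles standby redundancy and repeated basic events via conditional probabilities.

```python
# Top-event probability for a small fault tree with independent basic
# events: an OR gate fails if any input fails, an AND gate (redundancy)
# fails only if all inputs fail.
def p_or(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

p_pump_a, p_pump_b = 1e-3, 1e-3   # redundant pump pair -> AND gate
p_valve, p_power = 5e-4, 2e-4     # single points of failure -> OR gate

p_top = p_or(p_and(p_pump_a, p_pump_b), p_valve, p_power)
print(f"P(top event) = {p_top:.3e}")
```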
2013-01-01
Background A multidisciplinary and multi-institutional working group applied the Failure Mode and Effects Analysis (FMEA) approach to the actively scanned proton beam radiotherapy process implemented at CNAO (Centro Nazionale di Adroterapia Oncologica), aiming at preventing accidental exposures to the patient. Methods FMEA was applied to the treatment planning stage and consisted of three steps: i) identification of the involved sub-processes; ii) identification and ranking of the potential failure modes, together with their causes and effects, using the risk priority number (RPN) scoring system; iii) identification of additional safety measures to be proposed for process quality and safety improvement. The RPN upper threshold for little concern of risk was set at 125. Results Thirty-four sub-processes were identified; twenty-two of them were judged to be potentially prone to one or more failure modes. A total of forty-four failure modes were recognized, 52% of them characterized by an RPN score of 80 or higher. The threshold of 125 for RPN was exceeded in only five cases. The most critical sub-process appeared to be the delineation and correction of artefacts in planning CT data. Failures associated with that sub-process were inaccurate delineation of the artefacts and incorrect proton stopping power assignment to body regions. Other significant failure modes consisted of an outdated representation of the patient anatomy, an improper selection of beam direction, and an improper choice of the physical beam model or dose calculation grid. The main effects of these failures were wrong dose distributions (i.e. deviating from the planned ones) delivered to the patient. Additional strategies for risk mitigation, easily and immediately applicable, consisted of systematically collecting information about any known implanted prosthesis directly from each patient and enforcing a short time interval between CT scan and treatment start. Moreover, (i) the investigation of dedicated CT image reconstruction algorithms, (ii) further evaluation of treatment plan robustness, and (iii) implementation of independent methods for dose calculation (such as Monte Carlo simulations) may represent novel solutions to increase patient safety. Conclusions FMEA is a useful tool for prospective evaluation of patient safety in proton beam radiotherapy. The application of this method to the treatment planning stage led to the identification of strategies for risk mitigation in addition to the safety measures already adopted in clinical practice. PMID:23705626
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study evaluating the usefulness of Weibull methods for samples with a very small number of failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, along with the simulation program and the techniques it uses. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Chao Chung, E-mail: ho919@pchome.com.tw; Liao, Ching-Jong
Highlights: > This study is based on a real case in a regional teaching hospital in Taiwan. > We use Failure mode and effects analysis (FMEA) as the evaluation method. > We successfully identify the risk factors of infectious waste disposal. > We propose plans for the detection of exceptional cases of infectious waste. - Abstract: In recent times, the quality of medical care has been continuously improving in medical institutions wherein patient-centred care has been emphasized. Failure mode and effects analysis (FMEA) has also been promoted as a method of basic risk management and as part of total quality management (TQM) for improving the quality of medical care and preventing mistakes. Therefore, a study was conducted using FMEA to evaluate the potential risk causes in the process of infectious medical waste disposal, devise standard procedures concerning the waste, and propose feasible plans for facilitating the detection of exceptional cases of infectious waste. The analysis revealed the following results regarding medical institutions: (a) FMEA can be used to identify the risk factors of infectious waste disposal. (b) During the infectious waste disposal process, six items were scored over 100 in the assessment of uncontrolled risks: erroneous discarding of infectious waste by patients and their families, erroneous discarding by nursing staff, erroneous discarding by medical staff, cleaning drivers pierced by sharp articles, cleaning staff pierced by sharp articles, and unmarked output units. Therefore, the study concluded that it was necessary to (1) provide education and training about waste classification to the medical staff, patients and their families, nursing staff, and cleaning staff; (2) clarify the signs of caution; and (3) evaluate the failure mode and strengthen the effects.
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
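The volume-flaw model named above, a two-parameter Weibull distribution combined with the principle of independent action (PIA) for polyaxial stress states, has a standard textbook form, reproduced here for orientation; the notation may differ from SCARE's own documentation.

```latex
% Two-parameter Weibull volume-flaw failure probability with PIA:
% only tensile principal stresses contribute.
P_f = 1 - \exp\!\left[ -\int_V \sum_{i=1}^{3}
      \left( \frac{\langle \sigma_i \rangle}{\sigma_0} \right)^{m} dV \right],
\qquad \langle \sigma_i \rangle = \max(\sigma_i, 0),
```

where m is the Weibull modulus, sigma_0 the scale parameter, and sigma_i the principal stresses.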
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Binienda, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material was examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates is examined using ballistic impact tests on 61- by 61- by 3.2-mm (24- by 24- by 0.125-in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements that are needed in test and analysis methods for better predictive capability are examined.
Design Criteria for X-CRV Honeycomb Panels: A Preliminary Study
NASA Technical Reports Server (NTRS)
Caccese, Vincent; Verinder, Irene
1997-01-01
The objective of this project is to perform the first step in developing structural design criteria for composite sandwich panels that are to be used in the aeroshell of the crew return vehicle (X-CRV). The preliminary concept includes a simplified method for assessing the allowable strength of the laminate material. Ultimately, it is intended that the design criteria be extended to address the global response of the vehicle. This task will require execution of a test program as outlined in the recommendation section of this report. The aeroshell of the X-CRV is comprised of composite sandwich panels consisting of Fiberite face sheets and a phenolic honeycomb core. The function of the crew return vehicle is to enable the safe return of injured or ill crewpersons from the space station, the evacuation of crew in case of emergency, or the return of crew if an orbiter is not available. A significant objective of the X-CRV project is to demonstrate that this vehicle can be designed, built, and operated at lower cost and with a significantly faster development time. Development time can be reduced by driving out issues in both structural design and manufacturing concurrently; that is, structural design and analysis progress in conjunction with manufacturing and testing. Preliminary test results on laminate coupons are presented in the report. Based on these results, a method for detecting failure in the material is presented. In the long term, extrapolation of coupon data to large-scale structures may be inadequate. Test coupons used to develop failure criteria at the material scale are typically small when compared to the overall structure. Their small size indicates that the material failure criteria can be used to predict localized failure of the structure; however, they cannot be used to predict all failure modes. Some failure modes occur only when the structure or one of its sub-components is studied as a whole. Conversely, localized failure may not indicate failure of the structure as a whole, and the amount of reserve capacity, if any, should be assessed. To develop complete design criteria, experimental studies of the sandwich panel are needed. Only then can conservative and accurate design criteria be developed. These criteria should include the effects of flaws and defects, and environmental factors such as temperature and moisture. Preliminary results presented in this report suggest that a simplified analysis can be used to predict the strength of a laminate. Testing for environmental effects has yet to be included in this work. The so-called 'rogue flaw test' appears to be a promising method for assessing the effect of a defect in a laminate. This method fits in quite well with the philosophy of achieving a damage-tolerant design.
Live load test and failure analysis for the steel deck truss bridge over the New River in Virginia.
DOT National Transportation Integrated Search
2009-01-01
This report presents the methods used to model a steel deck truss bridge over the New River in Hillsville, Virginia. These methods were evaluated by comparing analytical results with data recorded from 14 members during live load testing. The researc...
Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements
NASA Technical Reports Server (NTRS)
Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.
1988-01-01
The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.
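For readers unfamiliar with second-moment reliability methods, the Python sketch below illustrates the general idea with a mean-value first-order second-moment (FOSM) estimate for a simple fracture limit state. The limit state, the toughness value, and the input statistics are illustrative assumptions, not quantities from the paper, and the PFEM itself involves considerably more machinery.

```python
import numpy as np
from scipy.stats import norm

# Mean-value FOSM sketch for a fracture limit state
# g = K_Ic - sigma * sqrt(pi * a): failure when g < 0.
K_IC = 35.0                             # toughness [MPa*sqrt(m)], assumed
mu = np.array([50.0, 0.10])             # means of (stress [MPa], crack length [m])
sd = np.array([5.0, 0.01])              # standard deviations, assumed independent

def g(x):
    sigma, a = x
    return K_IC - sigma * np.sqrt(np.pi * a)

# Numerical gradient at the mean point for the first-order expansion.
eps = 1e-6
grad = np.array([(g(mu + eps * e) - g(mu - eps * e)) / (2 * eps)
                 for e in np.eye(2)])

mu_g = g(mu)                                 # first moment of g
sd_g = np.sqrt(np.sum((grad * sd) ** 2))     # second moment (independent inputs)
beta = mu_g / sd_g                           # reliability index
print(f"beta = {beta:.2f}, P(failure) ~ {norm.cdf(-beta):.2e}")
```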
A new method of converter transformer protection without commutation failure
NASA Astrophysics Data System (ADS)
Zhang, Jiayu; Kong, Bo; Liu, Mingchang; Zhang, Jun; Guo, Jianhong; Jing, Xu
2018-01-01
With the development of AC/DC hybrid transmission technology, converter transformers serve as the nodes of AC and DC conversion in HVDC systems, and their reliable, safe, and stable operation plays an important role in DC transmission. Commutation failure, a common problem in DC transmission, poses a serious threat to the safe and stable operation of the power grid. Based on the commutation relation between the AC bus voltage of the converter station and the output DC voltage of the converter, a generalized transformation ratio is defined, and a new converter transformer protection method based on this ratio is put forward. The method uses the generalized ratio to monitor faulty or abnormal commutation components on line, uses the current characteristics of the valve-side bushing CT to identify converter transformer faults accurately, and is not influenced by the presence of commutation failure. Fault analysis and EMTDC/PSCAD simulation show that the protection operates correctly under various converter fault conditions.
Investigation of fatigue crack growth in acrylic bone cement using the acoustic emission technique.
Roques, A; Browne, M; Thompson, J; Rowland, C; Taylor, A
2004-02-01
Failure of the bone cement mantle has been implicated in the loosening process of cemented hip stems. Current methods of investigating degradation of the cement mantle in vitro often require sectioning of the sample to confirm failure paths. The present research investigates acoustic emission (AE) as a passive experimental method for the assessment of bone cement failure. Damage in bone cement was monitored during four-point bending fatigue tests through an analysis of the peak amplitude, duration, rise time (RT) and energy of the events emitted from the damaged sections. A difference in AE trends was observed during failure for specimens aged and tested in (i) air and (ii) Ringer's solution at 37 degrees C. It was noted that the acoustic behaviour varied according to applied load level; events of higher duration and RT were emitted during fatigue at lower stresses. A good correlation was observed between crack location and the source of acoustic emission, and the acoustic parameters best suited to bone cement failure characterisation were identified. The methodology employed in this study could potentially be used as a pre-clinical assessment tool for the integrity of cemented load-bearing implants.
Subcritical crack growth in SiNx thin-film barriers studied by electro-mechanical two-point bending
NASA Astrophysics Data System (ADS)
Guan, Qingling; Laven, Jozua; Bouten, Piet C. P.; de With, Gijsbertus
2013-06-01
Mechanical failure resulting from subcritical crack growth in a SiNx inorganic barrier layer applied on a flexible multilayer structure was studied by an electro-mechanical two-point bending method. A 10 nm conducting tin-doped indium oxide layer was sputtered on as an electrical probe to monitor subcritical crack growth in the 150 nm dielectric SiNx layer carried by a polyethylene naphthalate substrate. In the electro-mechanical two-point bending test, dynamic and static loads were applied to investigate crack propagation in the barrier layer. As a consequence of using two loading modes, both the characteristic failure strain and the failure time could be determined. The failure probability distribution of strain and lifetime under each loading condition was described by Weibull statistics. Results from the dynamic and static loading modes were linked by a power-law description to determine critical failure over a range of conditions. The fatigue parameter n from the power law decreases markedly, from 70 to 31, upon correcting for internal strain. The testing method and analysis tools described in the paper can be used to understand the limits of thin-film barriers in terms of their mechanical properties.
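The Weibull treatment of failure statistics mentioned above can be illustrated with a short Python sketch; the failure strains below are synthetic stand-ins, and the two-parameter fit is a generic approach rather than the authors' exact procedure.

```python
import numpy as np
from scipy.stats import weibull_min

# Hypothetical failure strains (%) from repeated two-point bending tests.
strains = np.array([1.21, 1.35, 1.18, 1.42, 1.30, 1.27, 1.38, 1.24,
                    1.33, 1.29, 1.45, 1.22, 1.36, 1.31, 1.26, 1.40])

# Two-parameter Weibull fit (location fixed at zero), as commonly used
# for brittle thin-film failure statistics.
m, loc, eta = weibull_min.fit(strains, floc=0)
print(f"Weibull modulus m = {m:.1f}, scale eta = {eta:.3f} %")

# Failure probability at a given applied strain: F = 1 - exp(-(x/eta)^m).
x = 1.30
print(f"P(failure at {x}%) = {weibull_min.cdf(x, m, scale=eta):.2f}")
```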
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vile, D; Zhang, L; Cuttino, L
2016-06-15
Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each team member. From the process map, potential failure modes were determined, along with any controls currently in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with the failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with the failure modes associated with the highest risk addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
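The RPN arithmetic at the heart of the FMEA is simple enough to show directly. The Python sketch below ranks a few hypothetical failure modes (not those identified in the study) by occurrence, severity, and detectability grades:

```python
# Minimal FMEA risk-ranking sketch: occurrence (O), severity (S), and
# detectability (D) are graded 1-10; RPN = O * S * D. The failure modes
# listed are illustrative, not those from the study.
failure_modes = [
    ("Dose vial mislabeled",            3,  9, 4),
    ("Catheter position not verified",  2, 10, 3),
    ("Activity calculation error",      4,  8, 5),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                reverse=True)
for name, o, s, d in ranked:
    print(f"RPN {o * s * d:4d}  {name}")
```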
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics databases, the enlargement of databases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls, yet these sources often dominate component-level reliability or the probability of failure. While the consequences of failure are often well understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiCostanzo, D; Ayan, A; Woollard, J
Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is exacerbated by a lack of in-house clinical engineering support. Preventative maintenance attempts to reduce downtime, but it is often ineffective at preemptively catching failure modes such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor, among other things. To attempt to alleviate downtime, software was developed in house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, this data is stored in a SQL database. Microsoft Power BI is used to plot the average maximum error of each day for each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs have been written into the database and plotted in Power BI. Throughout the course of the analysis, MLC motors have been replaced on three machines due to the early warning of the trajectory log analysis. The service engineers have also been alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to flag a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.
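A minimal Python sketch of the early-warning idea follows; the data, the trend test, and the 0.05 mm action threshold are all assumptions for illustration, not the authors' actual tolerances or the Varian log schema.

```python
import numpy as np

# Synthetic stand-in for the daily average of the maximum per-axis error
# taken from trajectory logs; flag an axis whose projected error exceeds
# an assumed tolerance.
rng = np.random.default_rng(0)
days = np.arange(180)
mlc_error = 0.02 + 0.0001 * days + rng.normal(0, 0.002, days.size)  # [mm]

slope, intercept = np.polyfit(days, mlc_error, 1)
projected = slope * 365 + intercept        # error projected one year out

TOLERANCE = 0.05                           # assumed action threshold [mm]
if projected > TOLERANCE:
    print(f"MLC axis drifting: projected max error {projected:.3f} mm "
          f"exceeds {TOLERANCE} mm; schedule motor inspection")
```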
Model-Based Method for Sensor Validation
NASA Technical Reports Server (NTRS)
Vatan, Farrokh
2012-01-01
Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. As a result, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
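A toy Python example of the ARR idea is sketched below, under the assumption of a trivially simple system model with three sensors; the actual method involves a far more general mathematical formulation.

```python
# Toy analytical-redundancy-relation (ARR) check: three sensors measure
# quantities linked by known model equations. A violated relation
# implicates the sensors appearing in it; intersecting the suspect sets
# isolates the fault logically, with no failure probabilities required.
readings = {"s1": 10.2, "s2": 5.1, "s3": 15.9}   # hypothetical values
TOL = 0.5

# Assumed model: s1 = 2 * s2, and s3 = s1 + s2.
arrs = {
    "r1": (abs(readings["s1"] - 2 * readings["s2"]), {"s1", "s2"}),
    "r2": (abs(readings["s3"] - readings["s1"] - readings["s2"]),
           {"s1", "s2", "s3"}),
}

suspects = set(readings)
for name, (residual, involved) in arrs.items():
    if residual > TOL:
        suspects &= involved        # the fault must involve these sensors
    else:
        suspects -= involved        # these sensors look mutually consistent

print("faulty sensor candidates:", suspects or "none")
```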
Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Min, J. B.; Xue, D.; Shi, Y.
2013-01-01
A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric-reinforced composite structures, especially brittle ceramic matrix composites. A repeating unit cell concept of fabric-reinforced composites was used to represent the global composite structure; the thermal and mechanical properties of the repeating unit cell were taken to be the same as those of the global composite structure. The three-phase micromechanics, shear-lag, and continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damage and fatigue life of the composite structures. Global structural failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The methodology is demonstrated by comparing analysis results with experimental tests performed on carbon fiber reinforced silicon carbide matrix plain weave composite specimens.
Control methods for aiding a pilot during STOL engine failure transients
NASA Technical Reports Server (NTRS)
Nelson, E. R.; Debra, D. B.
1976-01-01
Candidate autopilot control laws were defined that control engine-failure transient sink rates, demonstrating the engineering application of modern state-variable control theory. The results of approximate modal analysis were compared to those derived from full-state analyses provided by computer design solutions. The aircraft was described, and a state-variable model of its longitudinal dynamic motion due to engine and control variations was defined. The classical fast and slow modes were assumed to be sufficiently different to define reduced-order approximations of the aircraft motion amenable to hand-analysis control definition methods. The original state equations of motion were also applied to a large-scale state-variable control design program, in particular OPTSYS. The resulting control laws were compared with respect to their relative responses, ease of application, and attainment of the desired performance objectives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Brien, M.H.; Coon, D.M.
Time-dependent failure at elevated temperatures currently governs the service life of oxynitride glass-joined silicon nitride. Creep, devitrification, stress-aided oxidation-controlled slow crack growth, and viscous cavitation-controlled failure are examined as possible controlling mechanisms. Creep deformation failure is observed above 1000 degrees C. Fractographic evidence indicates cavity formation and growth below 1000 degrees C. Auger electron spectroscopy verified that the oxidation rate of the joining glass is governed by the oxygen supply rate. Time-to-failure data are compared with those predicted using the Tsai and Raj, and Raj and Dang, viscous cavitation models. It is concluded that viscous relaxation and isolated cavity growth control the rate of failure in oxynitride glass-filled silicon nitride joints below 1000 degrees C. Several possible methods are also proposed for increasing the service lives of these joints.
PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURES ON THE ΔB METHOD
NASA Astrophysics Data System (ADS)
Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao
Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes it easier to understand stability. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization, based on an example analysis.
Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on chip
NASA Astrophysics Data System (ADS)
Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang
2016-09-01
Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. In order to evaluate the system-on-chip's reliability and soft errors, the fault tree analysis method was used in this work. The system fault tree was constructed based on the Xilinx Zynq-7010 All Programmable SoC, and the soft error rates of different components in the Zynq-7010 SoC were tested using an americium-241 alpha radiation source. Furthermore, parameters used to evaluate the system's reliability and safety, such as failure rate, unavailability, and mean time to failure (MTTF), were calculated using Isograph Reliability Workbench 11.0. Based on the fault tree analysis of the system-on-chip, the critical blocks and overall system reliability were evaluated through qualitative and quantitative analysis.
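For readers unfamiliar with quantitative fault-tree calculations, the Python sketch below shows how basic-event failure rates combine through AND/OR gates into a top-event probability and how MTTF follows from an exponential failure model; the tree structure and rates are illustrative, not the measured Zynq-7010 values.

```python
import numpy as np

# Simple quantitative fault-tree sketch: the top event occurs if the
# processing block fails OR both redundant memory blocks fail.
lam_proc, lam_mem = 2e-6, 5e-6            # failure rates [1/h], assumed
t = np.logspace(0, 5, 6)                  # mission times [h]

p_proc = 1 - np.exp(-lam_proc * t)        # basic-event unreliability
p_mem = 1 - np.exp(-lam_mem * t)
p_top = 1 - (1 - p_proc) * (1 - p_mem**2) # OR gate over proc and mem-AND

mttf_proc = 1 / lam_proc                  # exponential model: MTTF = 1/lambda
print(f"processing-block MTTF = {mttf_proc:.0f} h")
for ti, pi in zip(t, p_top):
    print(f"t = {ti:8.0f} h  P(top event) = {pi:.3e}")
```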
NASA Astrophysics Data System (ADS)
Zhang, Ding; Zhang, Yingjie
2017-09-01
A framework for reliability and maintenance analysis of job shop manufacturing systems is proposed in this paper. An efficient preventive maintenance (PM) policy in terms of failure effects analysis (FEA) is proposed. Subsequently, reliability evaluation and component importance measures based on FEA are performed under the PM policy. A job shop manufacturing system is used to validate the reliability evaluation and dynamic maintenance policy. The results are compared with existing methods and the effectiveness is validated. Several previously vague issues, such as network modelling, vulnerability identification, evaluation criteria for repairable systems, and the PM policy used during manufacturing system reliability analysis, are elaborated. This framework can support reliability optimisation and rational allocation of maintenance resources in job shop manufacturing systems.
Hao, Shengwang; Liu, Chao; Lu, Chunsheng; Elsworth, Derek
2016-06-16
A theoretical explanation of a time-to-failure relation is presented, and this relationship is then used to describe the failure of materials. It provides the potential to predict the time remaining before failure (tf - t) by extrapolating the trajectory as it asymptotes to zero, with no need to fit the unknown exponents previously proposed in critical power-law behaviors. This generalized relation is verified by comparison with approaches to criticality for volcanic eruptions and creep failure. A new relation based on changes with stress is proposed as an alternative expression of Voight's relation, which is widely used to describe the accelerating precursory signals before material failure and is broadly applied to volcanic eruptions, landslides, and other phenomena. The new generalized relation reduces to Voight's relation if stress is limited to increase at a constant rate with time. This implies that the time derivatives in Voight's analysis may be a subset of a more general expression connecting stress derivatives, and thus provides a potential method for forecasting these events.
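The practical content of such relations can be illustrated with the classic inverse-rate extrapolation: for Voight's relation with exponent alpha = 2, the inverse of the precursor rate decays linearly in time, and its extrapolated zero crossing estimates the failure time. The Python sketch below demonstrates this on synthetic data; it illustrates the established failure-forecast method, not the paper's generalized relation.

```python
import numpy as np

# Synthetic precursor rate accelerating as rate ~ 1/(tf - t), so the
# inverse rate decays linearly and its zero crossing estimates tf.
tf_true = 100.0
t = np.linspace(0.0, 90.0, 200)
rate = 1.0 / (tf_true - t)                      # synthetic precursor rate
inv_rate = 1.0 / rate

slope, intercept = np.polyfit(t, inv_rate, 1)   # linear in t for alpha = 2
tf_est = -intercept / slope                     # zero crossing of 1/rate
print(f"estimated failure time: {tf_est:.1f} (true: {tf_true})")
```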
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
Kissinger, Patricia; White, Scott; Manhart, Lisa E.; Schwebke, Jane; Taylor, Stephanie N; Mena, Leandro; Khosropour, Christine M; Wilcox, Larissa; Schmidt, Norine; Martin, David H
2016-01-01
Background: Three recent prospective studies have suggested that the 1 g dose of azithromycin for Chlamydia trachomatis (Ct) was less effective than expected, reporting a wide range of treatment failure rates (5.8%–22.6%). The disparate results could be attributed to geographic or methodological differences. The purpose of this study was to re-examine the studies and attempt to harmonize methodologies to reduce misclassification caused by false positives from early test-of-cure (TOC) or by reinfection from sexual exposure rather than true treatment failure. Methods: Men who have sex with women who received 1 g azithromycin under directly observed therapy (DOT) for presumptive treatment of nongonococcal urethritis (NGU) with confirmed Ct were included. Baseline screening was performed on urethral swabs or urine, and TOC screening was performed on urine using nucleic acid amplification tests (NAAT). Post-treatment vaginal sexual exposure was elicited at TOC. Data from the three studies were obtained and re-analyzed. Rates of positive Ct retests were examined for all cases, and a sensitivity analysis was conducted to either reclassify potential false positives/reinfections as negative or remove them from the analysis. Results: The crude treatment failure rate was 12.8% (31/242). The rate was 6.2% (15/242) when potential false positives/reinfections were reclassified as negative, and 10.9% (15/138) when they were excluded from the analysis. Conclusion: In these samples of men who have sex with women with Ct-related NGU, azithromycin treatment failure was between 6.2% and 12.8%. This range of failure is lower than previously published but higher than the World Health Organization's target chlamydia treatment failure rate of < 5%. PMID:27631353
Plasma process control with optical emission spectroscopy
NASA Astrophysics Data System (ADS)
Ward, P. P.
Plasma processes for cleaning, etching, and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. The non-uniformity of most plasma processes and their sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often mean that a batch is substandard and the resulting parts unusable. Both of these methods are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control is explored, with a discussion of the technique as it applies to process control, failure analysis, and endpoint determination. Methods for identifying process failures and for monitoring the progress and end of etch-back and desmear processes are discussed.
Acoustic method of damage sensing in composite materials
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Walker, James; Lansing, Matthew
1994-01-01
The use of acoustic emission and acousto-ultrasonics to characterize impact damage in composite structures is being performed on both graphite/epoxy and Kevlar bottles. Further development of the acoustic emission methodology to include neural net analysis and/or other multivariate techniques will enhance the capability of the technique to identify failure mechanisms during fracture. The acousto-ultrasonics technique will be investigated to determine its ability to predict regions prone to failure prior to the burst tests. The combination of the two methods will allow simple nondestructive tests to predict the performance of a composite structure prior to being placed in service and during service.
Simulating Fatigue Crack Growth in Spiral Bevel Pinion
NASA Technical Reports Server (NTRS)
Ural, Ani; Wawrzynek, Paul A.; Ingraffea, Anthony R.
2003-01-01
This project investigates computational modeling of fatigue crack growth in spiral bevel gears. The current work is a continuation of previous efforts to use the Boundary Element Method (BEM) to simulate tooth-bending fatigue failure in spiral bevel gears. This report summarizes new results predicting crack trajectory and fatigue life for a spiral bevel pinion using the Finite Element Method (FEM). Predicting crack trajectories is important in determining the failure mode of a gear. Cracks propagating through the rim may result in catastrophic failure, whereas the gear may remain intact if one tooth fails, which may allow for early detection of failure. Being able to predict crack trajectories is insightful for the designer. However, predicting the growth of three-dimensional arbitrary cracks is complicated by the difficulty of creating three-dimensional models, the computing power required, and the absence of closed-form solutions to the problem. Another focus of this project was performing three-dimensional contact analysis of a spiral bevel gear set incorporating cracks. These analyses were significant in determining the influence of the change in tooth flexibility due to crack growth on the magnitude and location of contact loads. This is an important concern since a change in contact loads might lead to differences in stress intensity factors (SIFs) and therefore alter the crack trajectory. Contact analyses performed in this report showed the expected trend of decreasing tooth loads carried by the cracked tooth with increasing crack length. The decrease in tooth loads led to differences between SIFs extracted from finite element contact analysis and finite element analysis with Hertz contact loads. This effect became more pronounced as the crack grew.
Can complexity decrease in congestive heart failure?
NASA Astrophysics Data System (ADS)
Mukherjee, Sayan; Palit, Sanjay Kumar; Banerjee, Santo; Ariffin, M. R. K.; Rondoni, Lamberto; Bhattacharya, D. K.
2015-12-01
The complexity of a signal can be measured by the recurrence period density entropy (RPDE) of the reconstructed phase space. We have chosen a window-based RPDE method for the classification of signals, as RPDE is an average entropic measure of the whole phase space. We have observed the changes in complexity in cardiac signals of normal healthy persons (NHP) and congestive heart failure patients (CHFP). The results show that the cardiac dynamics of a healthy subject are more complex and random compared with those of a heart failure patient, whose dynamics are more deterministic. We have constructed a general threshold to distinguish the borderline between healthy and congestive heart failure dynamics. The results may be useful for a wide range of physiological and biomedical analyses.
Experiences with Probabilistic Analysis Applied to Controlled Systems
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Giesy, Daniel P.
2004-01-01
This paper presents a semi-analytic method for computing frequency dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or with multiple highly correlated uncertain parameters. The approach will be shown to not suffer from the same computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency domain performance of an optimal feed-forward disturbance rejection scheme.
The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.
2010-01-01
HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failure in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedural, equipment design and need for automation.
Holden, Richard J; Kulanthaivel, Anand; Purkayastha, Saptarshi; Goggins, Kathryn M; Kripalani, Sunil
2017-12-01
Personas are a canonical user-centered design method increasingly used in health informatics research. Personas (empirically derived user archetypes) can be used by eHealth designers to gain a robust understanding of their target end users, such as patients. The objective was to develop biopsychosocial personas of older patients with heart failure using quantitative analysis of survey data. Data were collected using standardized surveys and medical record abstraction from 32 older adults with heart failure recently hospitalized for acute heart failure exacerbation. Hierarchical cluster analysis was performed on a final dataset of n=30. Nonparametric analyses were used to identify differences between clusters on 30 clustering variables and seven outcome variables. Six clusters were produced, ranging in size from two to eight patients per cluster. Clusters differed significantly on these biopsychosocial domains and subdomains: demographics (age, sex); medical status (comorbid diabetes); functional status (exhaustion, household work ability, hygiene care ability, physical ability); psychological status (depression, health literacy, numeracy); technology (Internet availability); healthcare system (visit by home healthcare, trust in providers); social context (informal caregiver support, cohabitation, marital status); and economic context (employment status). Tabular and narrative persona descriptions provide an easy reference guide for informatics designers. Persona development using approaches such as clustering of structured survey data is an important tool for health informatics professionals. We describe insights from our study of patients with heart failure, then recommend a generic ten-step persona development process. Methodological strengths and limitations of the study, and of persona development generally, are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
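A minimal Python sketch of the clustering step is shown below, using Ward-linkage hierarchical clustering cut into six clusters to mirror the six personas; the data are synthetic stand-ins for the standardized survey variables.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Rows are patients, columns are standardized biopsychosocial survey
# variables (synthetic stand-ins for the 30 clustering variables).
rng = np.random.default_rng(0)
data = rng.normal(size=(30, 10))          # n = 30 patients, 10 variables

# Ward linkage, then cut the tree into six clusters to mirror the
# six personas reported in the study.
Z = linkage(data, method="ward")
labels = fcluster(Z, t=6, criterion="maxclust")

for k in range(1, 7):
    print(f"persona {k}: {np.sum(labels == k)} patients")
```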
Nonlinear and progressive failure aspects of transport composite fuselage damage tolerance
NASA Technical Reports Server (NTRS)
Walker, Tom; Ilcewicz, L.; Murphy, Dan; Dopker, Bernhard
1993-01-01
The purpose is to provide an end-user's perspective on the state of the art in life prediction and failure analysis by focusing on subsonic transport fuselage issues being addressed in the NASA/Boeing Advanced Technology Composite Aircraft Structure (ATCAS) contract and a related task-order contract. First, some discrepancies between the ATCAS tension-fracture test database and classical prediction methods are discussed, followed by an overview of material modeling work aimed at explaining some of these discrepancies. Finally, analysis efforts associated with a pressure-box test fixture are addressed as an illustration of the modeling complexities required to model and interpret tests.
Application of micropolar plasticity to post failure analysis in geomechanics
NASA Astrophysics Data System (ADS)
Manzari, Majid T.
2004-08-01
A micropolar elastoplastic model for soils is formulated, and a series of finite element analyses are employed to demonstrate the use of a micropolar continuum in overcoming the numerical difficulties encountered in applying the finite element method to a standard Cauchy-Boltzmann continuum. Three examples of failure analysis involving a deep excavation, a shallow foundation, and a retaining wall are presented. In all these cases, it is observed that the length scale introduced by the polar continuum regularizes the incremental boundary value problem and allows the numerical simulation to be continued until a clear collapse mechanism is achieved. The issue of grain size effect is also discussed.
Micromechanics Based Failure Analysis of Heterogeneous Materials
NASA Astrophysics Data System (ADS)
Sertse, Hamsasew M.
In recent decades, heterogeneous materials have been extensively used in industries such as aerospace, defense, and automotive due to their desirable specific properties and excellent capability of accumulating damage. Despite their wide use, there are numerous challenges associated with the application of these materials. One of the main challenges is the lack of accurate tools to predict the initiation, progression, and final failure of these materials under various thermomechanical loading conditions. Although failure is usually treated at the macro- and meso-scale level, the initiation and growth of failure is a complex phenomenon acting across multiple scales. The objective of this work is to enable the mechanics of structure genome (MSG) and its companion code SwiftComp to analyze the initial failure (also called static failure), progressive failure, and fatigue failure of heterogeneous materials using a micromechanics approach. The initial failure is evaluated at each numerical integration point using pointwise and nonlocal approaches for each constituent of the heterogeneous material. The effects of imperfect interfaces among the constituents are also investigated using a linear traction-displacement model. Moreover, the progressive and fatigue damage analyses are conducted using a continuum damage mechanics (CDM) approach. Various failure criteria are applied at a material point to analyze progressive damage in each constituent. The constitutive equation of a damaged material is formulated based on a consistent irreversible thermodynamics approach. The overall tangent modulus of uncoupled elastoplastic damage for negligible back stress effect is derived. The initiation of plasticity and damage in each constituent is evaluated at each numerical integration point using a nonlocal approach. The accumulated plastic strain and anisotropic damage evolution variables are iteratively solved using an incremental algorithm. The damage analyses are performed both for brittle failure/high cycle fatigue (HCF) with negligible plastic strain and for ductile failure/low cycle fatigue (LCF) with large plastic strain. The proposed approach is incorporated in SwiftComp and used to predict the initial failure envelope, stress-strain curves for various loading conditions, and fatigue life of heterogeneous materials. The combined effects of strain hardening and progressive fatigue damage on the effective properties of heterogeneous materials are also studied. The capability of the current approach is validated using several representative examples of heterogeneous materials, including binary composites, continuous fiber-reinforced composites, particle-reinforced composites, discontinuous fiber-reinforced composites, and woven composites. The predictions of MSG are also compared with those obtained using various micromechanics approaches such as the Generalized Method of Cells (GMC), Mori-Tanaka (MT), and Double Inclusion (DI) methods and Representative Volume Element (RVE) analysis (referred to as 3D finite element analysis (3D FEA) in this document). This study demonstrates that a micromechanics-based failure analysis has great potential to rigorously and more accurately analyze the initiation and progression of damage in heterogeneous materials. However, the approach requires material properties specific to damage analysis, which need to be independently calibrated for each constituent.
Del Mazo-Barbara, Anna; Mirabel, Clémentine; Nieto, Valentín; Reyes, Blanca; García-López, Joan; Oliver-Vila, Irene; Vives, Joaquim
2016-09-01
Computerized systems (CS) are essential in the development and manufacture of cell-based medicines and must comply with good manufacturing practice, thus pushing academic developers to implement methods that are typically found within pharmaceutical industry environments. Qualitative and quantitative risk analyses were performed by Ishikawa and Failure Mode and Effects Analysis, respectively. A process for qualification of a CS that keeps track of environmental conditions was designed and executed. The simplicity of the Ishikawa analysis permitted the identification of critical parameters that were subsequently quantified by Failure Mode and Effects Analysis, resulting in a list of tests included in the qualification protocols. The approach presented here contributes to simplifying and streamlining the qualification of CS in compliance with pharmaceutical quality standards.
Seismic performance evaluation of RC frame-shear wall structures using nonlinear analysis methods
NASA Astrophysics Data System (ADS)
Shi, Jialiang; Wang, Qiuwei
To further understand the seismic performance of reinforced concrete (RC) frame-shear wall structures, a 1/8-scale model was derived from a main factory structure with seven stories and seven bays. The four-story, two-bay model was pseudo-dynamically tested under six earthquake actions whose peak ground accelerations (PGA) varied from 50 gal to 400 gal. The damage process and failure patterns were investigated. Furthermore, nonlinear dynamic analysis (NDA) and the capacity spectrum method (CSM) were adopted to evaluate the seismic behavior of the model structure. The top displacement curve, story drift curve, and distribution of hinges were obtained and discussed. It is shown that the model structure exhibited a beam-hinge failure mechanism and that the two methods can be used to evaluate the seismic behavior of RC frame-shear wall structures well. Moreover, CSM can to some extent replace NDA for the seismic performance evaluation of RC structures.
Methods for improved forewarning of condition changes in monitoring physical processes
Hively, Lee M.
2013-04-09
This invention teaches further improvements in methods for forewarning of critical events via phase-space dissimilarity analysis of data from biomedical equipment, mechanical devices, and other physical processes. One improvement involves objective determination of a forewarning threshold (U.sub.FW), together with a failure-onset threshold (U.sub.FAIL) corresponding to a normalized value of a composite measure (C) of dissimilarity, and providing a visual or audible indication of failure forewarning and/or failure onset to a human observer. Another improvement relates to symbolization of the data according to binary numbers representing the slope between adjacent data points. Another improvement relates to adding measures of dissimilarity based on state-to-state dynamical changes of the system. And still another improvement relates to using Shannon entropy as the measure of condition change in lieu of a connected or unconnected phase space.
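The slope-based symbolization and entropy measure can be sketched compactly in Python. The word length, windowing, and test signals below are illustrative assumptions, not the patent's exact construction:

```python
import numpy as np

def symbol_entropy(x, word_len=4):
    # Symbolize by slope sign: 1 if the signal rose between samples.
    bits = (np.diff(x) > 0).astype(int)
    # Slide a short window to form binary words, then histogram them.
    words = np.array([bits[i:i + word_len]
                      for i in range(len(bits) - word_len + 1)])
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))   # Shannon entropy of the word distribution

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
baseline = np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=t.size)
degraded = np.sin(2 * np.pi * t) + 0.8 * rng.normal(size=t.size)
print(f"entropy (baseline): {symbol_entropy(baseline):.2f} bits")
print(f"entropy (degraded): {symbol_entropy(degraded):.2f} bits")
```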
NASA Technical Reports Server (NTRS)
Smart, Christian
1998-01-01
During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in the modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
Masterson Creber, Ruth; Patey, Megan; Dickson, Victoria Vaughan; DeCesaris, Marissa; Riegel, Barbara
2015-03-01
Lack of engagement in self-care is common among patients needing to follow a complex treatment regimen, especially patients with heart failure, who are affected by comorbidity, disability, and the side effects of polypharmacy. The purpose of Motivational Interviewing Tailored Interventions for Heart Failure (MITI-HF) is to test the feasibility and comparative efficacy of an MI intervention on self-care, acute heart failure physical symptoms, and quality of life. We are conducting a brief, nurse-led motivational interviewing randomized controlled trial to address behavioral and motivational issues related to heart failure self-care. Participants in the intervention group receive home- and phone-based motivational interviewing sessions over 90 days, and those in the control group receive care as usual. Participants in both groups receive patient education materials. The primary study outcome is change in self-care maintenance from baseline to 90 days. This article presents the study design, methods, plans for statistical analysis, and descriptive characteristics of the study sample for MITI-HF. Study findings will contribute to the literature on the efficacy of motivational interviewing to promote heart failure self-care. We anticipate that using an MI approach can help patients with heart failure focus on their internal motivation to change in a non-confrontational, patient-centered, and collaborative way. It also affirms their ability to practice competent self-care relevant to their personal health goals. Copyright © 2015 Elsevier Inc. All rights reserved.
Homer, Michael V.; Charo, Lindsey M.; Natarajan, Loki; Haunschild, Carolyn; Chung, Karine; Mao, Jun J.; DeMichele, Angela M.; Su, H. Irene
2016-01-01
Objective: To determine if inter-individual genetic variation in single nucleotide polymorphisms related to age at natural menopause is associated with risk of ovarian failure in breast cancer survivors. Methods: A prospective cohort of 169 premenopausal breast cancer survivors recruited at diagnosis with Stages 0 to III disease were followed longitudinally for menstrual pattern via self-reported daily menstrual diaries. Participants were genotyped for 13 single nucleotide polymorphisms (SNPs) previously found to be associated with age at natural menopause: EXO1, TLK1, HELQ, UIMC1, PRIM1, POLG, TMEM224, BRSK1, and MCM8. A risk variable summed the total number of risk alleles in each participant. The association between individual genotypes, as well as the risk variable, and time to ovarian failure (> 12 months of amenorrhea) was tested using time-to-event methods. Results: Median age at enrollment was 40.5 years (range 20.6–46.1). The majority of participants were white (69%) and underwent chemotherapy (76%). Thirty-eight participants (22%) experienced ovarian failure. Neither the candidate SNPs nor the summary risk variable was significantly associated with time to ovarian failure. Sensitivity analyses restricted to whites or only to participants receiving chemotherapy yielded similar findings. Older age, chemotherapy exposure, and lower BMI were related to shorter time to ovarian failure. Conclusions: Thirteen genetic variants previously associated with time to natural menopause were not related to the timing of ovarian failure in breast cancer survivors. PMID:28118297
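As an illustration of the time-to-event methods referred to above, the Python sketch below computes a Kaplan-Meier survival estimate by hand on synthetic follow-up data; the study's actual analysis and data differ.

```python
import numpy as np

# Synthetic follow-up data: times are months to ovarian failure, with
# censoring (event = 0) for participants still menstruating at last
# follow-up.
times  = np.array([6, 9, 12, 12, 15, 18, 20, 24, 24, 30])
events = np.array([1, 0,  1,  1,  0,  1,  0,  1,  0,  0])  # 1 = failure

surv = 1.0
for t in np.unique(times[events == 1]):
    at_risk = np.sum(times >= t)
    d = np.sum((times == t) & (events == 1))
    surv *= 1 - d / at_risk              # Kaplan-Meier product-limit update
    print(f"t = {t:2d} months  S(t) = {surv:.3f}")
```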
Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.
2011-01-01
This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2011-01-01
This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are comprised of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks
NASA Astrophysics Data System (ADS)
Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.
2017-12-01
The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt this shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we used an image analysis tool, based on a two-dimensional continuous wavelet analysis, developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability to unravel the physical processes underlying catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.
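While the study uses a two-dimensional continuous wavelet analysis of fracture images, the underlying idea of detecting a scale transition can be illustrated in one dimension. The Python sketch below (using the PyWavelets package, with a synthetic signal standing in for a fracture-pattern transect) shows the dominant scale changing across an abrupt transition:

```python
import numpy as np
import pywt

# Synthetic signal whose dominant wavelength abruptly changes halfway,
# a rough 1D stand-in for a fracture pattern shifting from distributed
# microcracks to a localized macroscopic feature.
x = np.linspace(0, 1, 1024)
signal = np.where(x < 0.5, np.sin(2 * np.pi * 64 * x),
                  np.sin(2 * np.pi * 16 * x))

scales = np.arange(1, 64)
coefs, freqs = pywt.cwt(signal, scales, "morl")   # continuous wavelet transform

# The scale carrying maximum energy flips across the transition point.
energy = np.abs(coefs) ** 2
print("dominant scale, first half: ",
      scales[np.argmax(energy[:, :512].mean(axis=1))])
print("dominant scale, second half:",
      scales[np.argmax(energy[:, 512:].mean(axis=1))])
```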
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balboni, Tracy A.; Gaccione, Peter; Gobezie, Reuben
2007-04-01
Purpose: Radiation therapy (RT) is frequently administered to prevent heterotopic ossification (HO) after total hip arthroplasty (THA). The purpose of this study was to determine if there is an increased risk of HO after RT prophylaxis with shielding of the THA components. Methods and Materials: This is a retrospective analysis of THA patients undergoing RT prophylaxis of HO at Brigham and Women's Hospital between June 1994 and February 2004. Univariate and multivariate logistic regressions were used to assess the relationships of all variables to failure of RT prophylaxis. Results: A total of 137 patients were identified and 84 were eligible for analysis (61%). The median RT dose was 750 cGy in one fraction, and the median follow-up was 24 months. Eight of 40 unshielded patients (20%) developed any progression of HO compared with 21 of 44 shielded patients (48%) (p = 0.009). Brooker Grade III-IV HO developed in 5% of unshielded and 18% of shielded patients (p = 0.08). Multivariate analysis revealed shielding (p = 0.02) and THA for prosthesis infection (p = 0.03) to be significant predictors of RT failure, with a trend toward an increasing risk of HO progression with age (p = 0.07). There was no significant difference in prosthesis failure rates between shielded and unshielded patients. Conclusions: A significantly increased risk of failure of RT prophylaxis for HO was noted in those receiving shielding of the hip prosthesis. Shielding did not appear to reduce the risk of prosthesis failure.
Reliability Coupled Sensitivity Based Design Approach for Gravity Retaining Walls
NASA Astrophysics Data System (ADS)
Guha Ray, A.; Baidya, D. K.
2012-09-01
Sensitivity analysis involving different random variables and different potential failure modes of a gravity retaining wall highlights the fact that high sensitivity of a particular variable for a particular mode of failure does not necessarily imply a large contribution to the overall failure probability. The present paper aims at identifying a probabilistic risk factor (Rf) for each random variable based on the combined effects of the failure probability (Pf) of each mode of failure of a gravity retaining wall and the sensitivity of each random variable on these failure modes. Pf is calculated by Monte Carlo simulation, and the sensitivity analysis of each random variable is carried out by F-test analysis. The structure, redesigned by modifying the original random variables with the risk factors, is safe against all the variations of the random variables. It is observed that Rf for the friction angle of the backfill soil (φ1) increases, and that for the cohesion of the foundation soil (c2) decreases, with an increase in the variation of φ1, while Rf for the unit weights (γ1 and γ2) of both soils and for the friction angle of the foundation soil (φ2) remains almost constant under variation of the soil properties. The results compare well with some existing deterministic and probabilistic methods, and the approach was found to be cost-effective. It is seen that if the variation of φ1 remains within 5%, a significant reduction in cross-sectional area can be achieved, but if the variation is more than 7-8%, the structure needs to be modified. Finally, design guidelines for different wall dimensions, based on the present approach, are proposed.
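A minimal Python sketch of the Monte Carlo step is given below for a single failure mode (sliding); the wall geometry, load model, and input statistics are illustrative assumptions and not those of the paper, which treats several failure modes and couples them with F-test sensitivities.

```python
import numpy as np

# Monte Carlo failure-probability sketch for the sliding mode of a
# gravity retaining wall; geometry and statistics are illustrative.
rng = np.random.default_rng(1)
N = 100_000
H, W = 5.0, 150.0                      # wall height [m], weight [kN/m]

phi1 = rng.normal(32.0, 2.0, N)        # backfill friction angle [deg]
phi2 = rng.normal(28.0, 2.0, N)        # foundation friction angle [deg]
gam1 = rng.normal(18.0, 0.5, N)        # backfill unit weight [kN/m^3]

Ka = np.tan(np.radians(45 - phi1 / 2)) ** 2        # Rankine active coefficient
driving = 0.5 * Ka * gam1 * H**2                   # active thrust [kN/m]
resisting = W * np.tan(np.radians(phi2))           # base friction [kN/m]

FS = resisting / driving                           # factor of safety per sample
print(f"P(sliding failure) = {np.mean(FS < 1.0):.4f}")
```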
Failure Modes and Effects Analysis of bilateral same-day cataract surgery
Shorstein, Neal H.; Lucido, Carol; Carolan, James; Liu, Liyan; Slean, Geraldine; Herrinton, Lisa J.
2017-01-01
PURPOSE To systematically analyze potential process failures related to bilateral same-day cataract surgery toward the goal of improving patient safety. SETTING Twenty-one Kaiser Permanente surgery centers, Northern California, USA. DESIGN Retrospective cohort study. METHODS Quality experts performed a Failure Modes and Effects Analysis (FMEA) that included an evaluation of sterile processing, pharmaceuticals, perioperative clinic and surgical center visits, and biometry. Potential failures in human factors and communication (modes) were identified. Rates of endophthalmitis, toxic anterior segment syndrome (TASS), and unintended intraocular lens (IOL) implantation were assessed in eyes having bilateral same-day surgery from 2010 through 2014. RESULTS The study comprised 4754 eyes. The analysis identified 15 significant potential failure modes. These included lapses in instrument processing and compounding error of intracameral antibiotic that could lead to endophthalmitis or TASS and ambiguous documentation of IOL selection by surgeons, which could lead to unintended IOL implantation. Of the study sample, 1 eye developed endophthalmitis, 1 eye had unintended IOL implantation (rates, 2 per 10 000; 95% confidence intervals [CI] 0.1–12.0 per 10 000), and no eyes developed TASS (upper 95% CI, 8 per 10 000). Recommendations included improving oversight of cleaning and sterilization practices, separating lots of compounded drugs for each eye, and enhancing IOL verification procedures. CONCLUSIONS Potential failure modes and recommended actions in bilateral same-day cataract surgery were determined using a FMEA. These findings might help improve the reliability and safety of bilateral same-day cataract surgery based on current evidence and standards. PMID:28410711
SU-F-R-20: Image Texture Features Correlate with Time to Local Failure in Lung SBRT Patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, M; Abazeed, M; Woody, N
Purpose: To explore possible correlations between CT image-based texture and histogram features and time-to-local-failure in early-stage non-small cell lung cancer (NSCLC) patients treated with stereotactic body radiotherapy (SBRT). Methods and Materials: From an IRB-approved lung SBRT registry for patients treated between 2009 and 2013, we selected 48 patients (20 male, 28 female) with local failure. Median patient age was 72.3±10.3 years. Mean time to local failure was 15±7.1 months. Physician-contoured gross tumor volumes (GTV) on the planning CT images were processed, and 3D gray-level co-occurrence matrix (GLCM) based texture and histogram features were calculated in Matlab. Data were exported to R, and a multiple linear regression model was used to examine the relationship between texture features and time-to-local-failure. Results: Multiple linear regression revealed that entropy (p=0.0233, multiple R^2=0.60) from the GLCM-based texture analysis and the standard deviation (p=0.0194, multiple R^2=0.60) from the histogram-based features were statistically significantly correlated with time-to-local-failure. Conclusion: Image-based texture analysis can be used to predict certain aspects of treatment outcomes of NSCLC patients treated with SBRT. We found that entropy and standard deviation calculated for the GTV on the CT images displayed a statistically significant correlation with time-to-local-failure in lung SBRT patients.
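For readers unfamiliar with these features, the numpy-only sketch below computes GLCM entropy and the histogram standard deviation for a 2D patch. The quantization depth, the single offset, and the random stand-in "GTV" patch are illustrative assumptions; the study itself used 3D GLCMs computed in Matlab.

```python
import numpy as np

def glcm_entropy(img, levels=32, offset=(0, 1)):
    """Entropy of a gray-level co-occurrence matrix for one 2D offset."""
    # Quantize intensities into `levels` bins
    edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
    q = np.digitize(img, edges)
    dr, dc = offset
    a = q[:q.shape[0] - dr, :q.shape[1] - dc].ravel()
    b = q[dr:, dc:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1)   # count co-occurring gray-level pairs
    glcm += glcm.T               # symmetrize
    p = glcm / glcm.sum()        # normalize to joint probabilities
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

gtv = np.random.default_rng(1).normal(40.0, 12.0, (64, 64))  # stand-in for a CT GTV patch
print("GLCM entropy :", round(glcm_entropy(gtv), 3))
print("Histogram std:", round(gtv.std(), 3))
```

In practice these per-tumor feature values would be the regressors in the multiple linear regression against time-to-local-failure.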
Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.
2012-12-01
Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
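A minimal sketch of the classification step, using scikit-learn's SVC in place of whatever SVM implementation the authors used; the 18-dimensional parameter samples and the toy "crash" rule below are fabricated stand-ins for the POP2 ensemble.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, (1000, 18))      # 18 scaled ocean-model parameters
# Toy failure rule standing in for POP2 numerics: crashes in a corner of parameter space
y = ((X[:, 0] > 0.85) & (X[:, 3] < 0.20)).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)
p_fail = clf.predict_proba(X[:5])[:, 1]    # predicted probability of simulation failure
print("Predicted failure probabilities:", np.round(p_fail, 3))
```

The trained classifier can then be probed, for example with a variance-based global sensitivity analysis over the 18 inputs, to rank which parameters drive the failure probability, as the authors did to isolate the mixing and viscosity parameters.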
Vaughan, Gilberto; Forbi, Joseph C; Xia, Guo-Liang; Fonseca-Ford, Maureen; Vazquez, Roberto; Khudyakov, Yury E; Montiel, Sonia; Waterman, Steve; Alpuche, Celia; Gonçalves Rossi, Livia Maria; Luna, Norma
2014-02-01
Clinical infection by hepatitis A virus (HAV) is generally self-limited but in some cases can progress to liver failure. Here, an HAV outbreak investigation among children with acute liver failure in a highly endemic country is presented. In addition, a sensitive method for HAV whole genome amplification and sequencing suitable for analysis of clinical samples is described. In this setting, two fatal cases attributed to acute liver failure and two asymptomatic cases living in the same household were identified. In a second household, one HAV case was observed with jaundice which resolved spontaneously. Partial molecular characterization showed that both households were infected by HAV subtype IA; however, the infecting strains in the two households were different. The HAV outbreak strains recovered from all cases grouped together within cluster IA1, which contains closely related HAV strains from the United States commonly associated with international travelers. Full-genome HAV sequences obtained from the household with the acute liver failure cases were related (genetic distances ranging from 0.01% to 0.04%), indicating a common-source infection. Interestingly, the strain recovered from the asymptomatic household contact was nearly identical to the strain causing acute liver failure. The whole genome sequence from the case in the second household was distinctly different from the strains associated with acute liver failure. Thus, infection with almost identical HAV strains resulted in drastically different clinical outcomes. © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Brideau, Marc-André; Yan, Ming; Stead, Doug
2009-01-01
Rock slope failures are frequently controlled by a complex combination of discontinuities that facilitate kinematic release. These discontinuities are often associated with discrete folds, faults, and shear zones, and/or related tectonic damage. The authors, through detailed case studies, illustrate the importance of considering the influence of tectonic structures not only on three-dimensional kinematic release but also in the reduction of rock mass properties due to induced damage. The case studies selected reflect a wide range of rock mass conditions. In addition to active rock slope failures, they include two major historic failures: the Hope Slide, which occurred in British Columbia in 1965, and the Randa rockslides, which occurred in Switzerland in 1991. Detailed engineering geological mapping combined with rock testing, GIS data analysis and, for selected cases, numerical modelling has shown that specific rock slope failure mechanisms may be conveniently related to rock mass classifications such as the Geological Strength Index (GSI). The importance of brittle intact rock fracture in association with pre-existing rock mass damage is emphasized through a consideration of the processes involved in the progressive, time-dependent development not only of through-going failure surfaces but also of lateral and rear-release mechanisms. Preliminary modelling data are presented to illustrate the importance of intact rock fracture and step-path failure mechanisms, and the results are discussed with reference to selected field observations. The authors emphasize the importance of considering all forms of pre-existing rock mass damage when assessing potential or operative failure mechanisms. It is suggested that a rock slope rock mass damage assessment can provide an improved understanding of the potential failure mode, the likely hazard presented, and appropriate methods of both analysis and remedial treatment.
NASA Technical Reports Server (NTRS)
Ehret, R. M.
1974-01-01
The concepts explored in a state-of-the-art review of the engineering fracture mechanics topics considered most applicable to the space shuttle vehicle include fracture toughness, precritical flaw growth, failure mechanisms, inspection methods (including proof test logic), and crack growth predictive analysis techniques.
Experimental methods for identifying failure mechanisms
NASA Technical Reports Server (NTRS)
Daniel, I. M.
1983-01-01
Experimental methods for identifying failure mechanisms in fibrous composites are studied. Methods to identify failure in composite materials include interferometry, holography, fractography, and ultrasonics.
Characterization of emission microscopy and liquid crystal thermography in IC fault localization
NASA Astrophysics Data System (ADS)
Lau, C. K.; Sim, K. S.
2013-05-01
This paper characterizes two fault localization techniques, Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT), using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. The two techniques are often mistakenly assumed to be equivalent, both detecting hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization often leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. Therefore, this research was established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method, the backside LCT technique, was also proposed as an alternative. The research observed that both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach; on the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations, and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative to backside EMMI. The research confirmed that LCT detects heat generation while EMMI detects photon emission (recombination radiation). The analysis results also suggested that the two techniques complement each other in IC fault localization; it is necessary for a failure analyst to use both techniques when one of them produces no result.
Failure mode and effect analysis-based quality assurance for dynamic MLC tracking systems
Sawant, Amit; Dieterich, Sonja; Svatos, Michelle; Keall, Paul
2010-01-01
Purpose: To develop and implement a failure mode and effect analysis (FMEA)-based commissioning and quality assurance framework for dynamic multileaf collimator (DMLC) tumor tracking systems. Methods: A systematic failure mode and effect analysis was performed for a prototype real-time tumor tracking system that uses implanted electromagnetic transponders for tumor position monitoring and a DMLC for real-time beam adaptation. A detailed process tree of DMLC tracking delivery was created and potential tracking-specific failure modes were identified. For each failure mode, a risk probability number (RPN) was calculated from the product of the probability of occurrence, the severity of effect, and the detectability of the failure. Based on the insights obtained from the FMEA, commissioning and QA procedures were developed to check (i) the accuracy of coordinate system transformation, (ii) system latency, (iii) spatial and dosimetric delivery accuracy, (iv) delivery efficiency, and (v) accuracy and consistency of system response to error conditions. The frequency of testing for each failure mode was determined from the RPN value. Results: Failure modes with RPN≥125 were recommended to be tested monthly. Failure modes with RPN<125 were assigned to be tested during comprehensive evaluations, e.g., during commissioning, annual quality assurance, and after major software/hardware upgrades. System latency was determined to be ∼193 ms. The system showed consistent and accurate response to erroneous conditions. Tracking accuracy was within 3%–3 mm gamma (100% pass rate) for sinusoidal as well as a wide variety of patient-derived respiratory motions. The total time taken for monthly QA was ∼35 min, while that taken for comprehensive testing was ∼3.5 h. Conclusions: FMEA proved to be a powerful and flexible tool to develop and implement a quality management (QM) framework for DMLC tracking. The authors conclude that the use of FMEA-based QM ensures efficient allocation of clinical resources because the most critical failure modes receive the most attention. It is expected that the set of guidelines proposed here will serve as a living document that is updated with the accumulation of progressively more intrainstitutional and interinstitutional experience with DMLC tracking. PMID:21302802
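As a sketch of the RPN bookkeeping, the snippet below scores a few hypothetical failure modes and applies the paper's RPN ≥ 125 threshold for monthly testing; the mode names and the occurrence/severity/detectability scores are invented, not the authors' table.

```python
# Hypothetical DMLC-tracking failure modes: (occurrence, severity, detectability),
# each scored 1-10; a higher detectability score means the failure is harder to detect.
failure_modes = {
    "coordinate transform error": (5, 9, 4),
    "excess system latency":      (4, 7, 5),
    "leaf positioning error":     (3, 8, 4),
    "benign error-state trigger": (6, 3, 2),
}

for name, (occ, sev, det) in failure_modes.items():
    rpn = occ * sev * det
    schedule = "monthly QA" if rpn >= 125 else "commissioning / annual QA"
    print(f"{name:28s} RPN = {rpn:3d} -> {schedule}")
```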
Walter, Martin A; Briel, Matthias; Christ-Crain, Mirjam; Bonnema, Steen J; Connell, John; Cooper, David S; Bucher, Heiner C; Müller-Brand, Jan; Müller, Beat
2007-03-10
To determine the effect of adjunctive antithyroid drugs on the risk of treatment failure, hypothyroidism, and adverse events after radioiodine treatment. Design: Meta-analysis. Data sources: Electronic databases (Cochrane central register of controlled trials, Medline, Embase) searched to August 2006, and contact with experts. Review methods: Three reviewers independently assessed trial eligibility and quality. Pooled relative risks for treatment failure and hypothyroidism after radioiodine treatment with and without adjunctive antithyroid drugs were calculated with a random effects model. Results: We identified 14 relevant randomised controlled trials with a total of 1306 participants. Adjunctive antithyroid medication was associated with an increased risk of treatment failure (relative risk 1.28, 95% confidence interval 1.07 to 1.52; P=0.006) and a reduced risk of hypothyroidism (0.68, 0.53 to 0.87; P=0.006) after radioiodine treatment. We found no difference in summary estimates for the different antithyroid drugs or for whether antithyroid drugs were given before or after radioiodine treatment. Conclusions: Antithyroid drugs potentially increase rates of failure and reduce rates of hypothyroidism if they are given in the week before or after radioiodine treatment, respectively.
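For the mechanics of the pooling step, here is a minimal DerSimonian-Laird random-effects sketch in Python; the three trials' relative risks and confidence limits are fabricated inputs, not the 14 trials in this review.

```python
import numpy as np

def pooled_rr_random_effects(rr, lo, hi):
    """DerSimonian-Laird random-effects pooling of relative risks,
    given each trial's RR and 95% CI limits."""
    y = np.log(rr)
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)  # SE recovered from the CI width
    w = 1.0 / se**2
    ybar = np.sum(w * y) / np.sum(w)             # fixed-effect mean
    Q = np.sum(w * (y - ybar) ** 2)              # heterogeneity statistic
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / C)      # between-trial variance
    w_star = 1.0 / (se**2 + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)
    se_mu = 1.0 / np.sqrt(np.sum(w_star))
    return np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu])

# Three made-up trials of treatment failure after radioiodine
rr, ci_lo, ci_hi = pooled_rr_random_effects(
    np.array([1.4, 1.1, 1.3]), np.array([1.0, 0.8, 1.1]), np.array([1.9, 1.5, 1.6]))
print(f"pooled RR = {rr:.2f} (95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```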
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly. These sources often dominate component level risk. While consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to influence the determination whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system level test data or operational data. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.
A Diagnostic Approach for Electro-Mechanical Actuators in Aerospace Systems
NASA Technical Reports Server (NTRS)
Balaban, Edward; Saxena, Abhinav; Bansal, Prasun; Goebel, Kai Frank; Stoelting, Paul; Curran, Simon
2009-01-01
Electro-mechanical actuators (EMA) are finding increasing use in aerospace applications, especially with the trend towards all-electric aircraft and spacecraft designs. However, electro-mechanical actuators still lack the knowledge base accumulated for other fielded actuator types, particularly with regard to fault detection and characterization. This paper presents a thorough analysis of some of the critical failure modes documented for EMAs and describes experiments conducted on detecting and isolating a subset of them. The list of failures has been prepared through an extensive Failure Modes, Effects, and Criticality Analysis (FMECA) reference, literature review, and accessible industry experience. Methods for data acquisition and validation of algorithms on EMA test stands are described. A variety of condition indicators were developed that enabled detection, identification, and isolation among the various fault modes. A diagnostic algorithm based on an artificial neural network is shown to operate successfully using these condition indicators; furthermore, the robustness of these diagnostic routines to sensor faults is demonstrated by showing their ability to distinguish between sensor faults and component failures. The paper concludes with a roadmap leading from this effort towards developing successful prognostic algorithms for electro-mechanical actuators.
Dam break analysis and flood inundation map of Krisak dam for emergency action plan
NASA Astrophysics Data System (ADS)
Juliastuti, Setyandito, Oki
2017-11-01
Indonesian regulations, which follow the ICOLD (International Commission on Large Dams) guidelines, require an Emergency Action Plan (EAP) because dams have the potential to fail. The EAP guidelines include evacuation management, in which the inundation map is determined from flood modeling. The purpose of the EAP is to minimize the risk of loss of life and property downstream caused by dam failure. This paper describes the development of a flood model and inundation map for the Krisak dam using numerical methods, through a dam break analysis (DBA) performed with the Zhong Xing HY-21 hydraulic model. Dam failure is simulated for both overtopping and piping: the overtopping simulations consider quadrangular, triangular, and trapezoidal breaches, and the piping simulations are based on orifice-type cracks. Based on the DBA results, the hazard classification of the Krisak dam is very high. The nearest village affected by dam failure is Singodutan (1.45 km from the dam), with an inundation depth of 1.85 m. This result can be used by stakeholders, such as emergency responders and the community at risk, in formulating evacuation procedures.
A Meta-Analysis of Predictors of Offender Treatment Attrition and Its Relationship to Recidivism
ERIC Educational Resources Information Center
Olver, Mark E.; Stockdale, Keira C.; Wormith, J. Stephen
2011-01-01
Objective: The failure of offenders to complete psychological treatment can pose significant concerns, including increased risk for recidivism. Although a large literature identifying predictors of offender treatment attrition has accumulated, there has yet to be a comprehensive quantitative review. Method: A meta-analysis of the offender…
NASA Astrophysics Data System (ADS)
Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao
2017-01-01
Multi-scale, high-resolution modeling of the rock failure process is a powerful means in modern rock mechanics studies to reveal complex failure mechanisms and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation and damage to failure, places high demands on the design, implementation scheme, and computational capacity of the numerical software system. This study aims to develop a parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties and to represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of each representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows-Linux interactive platform. A numerical model is built to test the parallel performance of the FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, a field-scale net fracture spacing example, and an engineering-scale rock slope example. The simulation results indicate that relatively high speedup and computational efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In the laboratory-scale simulation, well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, are reproduced. In the field-scale simulation, the formation of net fracture spacing, from initiation and propagation to saturation, is revealed completely. In the engineering-scale simulation, the whole progressive failure process of the rock slope is modeled. It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from the meso- to the engineering scale.
Comparison of three methods of feeding colostrum to dairy calves.
Besser, T E; Gay, C C; Pritchett, L
1991-02-01
Absorption of colostral immunoglobulins by Holstein calves was studied in 3 herds in which 3 methods of colostrum feeding were used. Failure of passive transfer, as determined by calf serum immunoglobulin G1 (IgG1) concentration less than 10 mg/ml at 48 hours of age, was diagnosed in 61.4% of calves from a dairy in which calves were nursed by their dams, 19.3% of calves from a dairy using nipple-bottle feeding, and 10.8% of calves from a dairy using tube feeding. The management factor determined to have the greatest influence on the probability of failure of passive transfer in the herds using artificial methods of colostrum feeding (bottle feeding or tube feeding) was the volume of colostrum fed as it affected the amount of IgG1 received by the calf. In dairies that used artificial feeding methods, failure of passive transfer was infrequent in calves fed greater than or equal to 100 g IgG1 in the first colostrum feeding. In the dairy that allowed calves to suckle, prevalence of failure of passive transfer was greater than 50% even among calves nursed by cows with above-average colostral IgG1 concentration. Analysis of the effect of other management factors on calf immunoglobulin absorption revealed small negative effects associated with the use of previously frozen colostrum and the use of colostrum from cows with long nonlactating intervals.
Taheriyoun, Masoud; Moradinejad, Saber
2015-01-01
The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment failures, and operational failures. Thus, meeting the established reuse/discharge criteria requires an assessment of plant reliability. Among the many techniques developed for system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the reliability problem was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with violation of the allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods used were minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been used in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves insight into the causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
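To illustrate the two quantification routes named above, the sketch below evaluates a toy fault tree both by Monte Carlo simulation and from its minimal cut sets; the tree structure and basic-event probabilities are invented, not the Tehran plant's model.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1_000_000

# Assumed basic-event probabilities (illustrative only)
p = {"operator_error": 0.05, "physical_damage": 0.01,
     "design_problem": 0.02, "equipment_failure": 0.03}
draw = {k: rng.random(N) < v for k, v in p.items()}

# Toy tree: top event (effluent BOD violation) = operator error
#   OR physical damage OR (equipment failure AND design problem)
top = (draw["operator_error"] | draw["physical_damage"]
       | (draw["equipment_failure"] & draw["design_problem"]))
print("Top-event probability (Monte Carlo):", top.mean())

# Same tree via its minimal cut sets {OE}, {PD}, {EF, DP}; with disjoint,
# independent cut sets this product form is exact
cuts = [0.05, 0.01, 0.03 * 0.02]
print("Top-event probability (cut sets):   ", 1 - np.prod([1 - c for c in cuts]))
```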
ADM guidance-Ceramics: guidance to the use of fractography in failure analysis of brittle materials.
Scherrer, Susanne S; Lohbauer, Ulrich; Della Bona, Alvaro; Vichi, Alessandro; Tholey, Michael J; Kelly, J Robert; van Noort, Richard; Cesar, Paulo Francisco
2017-06-01
To provide background information and guidance as to how to use fractography accurately, a powerful tool for failure analysis of dental ceramic structures. An extended palette of qualitative and quantitative fractography is provided, both for in vivo and in vitro fracture surface analyses. As visual support, this guidance document provides micrographs of typical critical ceramic processing flaws, differentiating between pre- and post-sintering cracks, grinding-damage-related failures, occlusal contact wear origins, and failures due to surface degradation. The documentation emphasizes good labeling of crack features, precise indication of the direction of crack propagation (dcp), identification of the fracture origin, and the use of fractographic photomontage of critical flaws or flaw labeling on strength data graphics. A compilation of recommendations for specific applications of fractography in Dentistry is also provided. This guidance document will contribute to a more accurate use of fractography and help researchers to better identify, describe, and understand the causes of failure, for both clinical and laboratory-scale situations. If adequately performed at a large scale, fractography will assist in optimizing the methods of processing and designing restorative materials and components. Clinical failures may be better understood, and consequently reduced, by sending out the correct message regarding the fracture origin in clinical trials. Copyright © 2017 The Academy of Dental Materials. All rights reserved.
Parameter estimation in Cox models with missing failure indicators and the OPPERA study.
Brownstein, Naomi C; Cai, Jianwen; Slade, Gary D; Bair, Eric
2015-12-30
In a prospective cohort study, examining all participants for incidence of the condition of interest may be prohibitively expensive. For example, the "gold standard" for diagnosing temporomandibular disorder (TMD) is a physical examination by a trained clinician. In large studies, examining all participants in this manner is infeasible. Instead, it is common to use questionnaires to screen for incidence of TMD and perform the "gold standard" examination only on participants who screen positively. Unfortunately, some participants may leave the study before receiving the "gold standard" examination. Within the framework of survival analysis, this results in missing failure indicators. Motivated by the Orofacial Pain: Prospective Evaluation and Risk Assessment (OPPERA) study, a large cohort study of TMD, we propose a method for parameter estimation in survival models with missing failure indicators. We estimate the probability of being an incident case for those lacking a "gold standard" examination using logistic regression. These estimated probabilities are used to generate multiple imputations of case status for each missing examination that are combined with observed data in appropriate regression models. The variance introduced by the procedure is estimated using multiple imputation. The method can be used to estimate both regression coefficients in Cox proportional hazard models as well as incidence rates using Poisson regression. We simulate data with missing failure indicators and show that our method performs as well as or better than competing methods. Finally, we apply the proposed method to data from the OPPERA study. Copyright © 2015 John Wiley & Sons, Ltd.
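A minimal end-to-end sketch of the proposed procedure on simulated data is shown below; it uses scikit-learn for the screening (case-probability) model, lifelines' CoxPHFitter for the survival model, and Rubin's rules for pooling. The data-generating choices (censoring time, missingness rate, effect size, number of imputations) are all illustrative assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)                       # covariate of interest, true log-HR = 0.5
T = rng.exponential(np.exp(-0.5 * x))        # latent event times
E = (T < 2.0).astype(int)                    # administrative censoring at t = 2
T = np.minimum(T, 2.0)
screen = x + rng.normal(size=n)              # questionnaire score used for screening
miss = rng.random(n) < 0.25                  # 25% never get the gold-standard exam

# Step 1: model P(case) among verified participants using the screening score
obs = ~miss
logit = LogisticRegression().fit(np.c_[screen[obs], x[obs]], E[obs])
p_case = logit.predict_proba(np.c_[screen, x])[:, 1]

# Steps 2-3: multiply impute missing indicators, fit Cox, pool with Rubin's rules
M, betas, variances = 20, [], []
for _ in range(M):
    E_imp = np.where(miss, rng.random(n) < p_case, E).astype(int)
    df = pd.DataFrame({"T": T, "E": E_imp, "x": x})
    fit = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    betas.append(fit.params_["x"])
    variances.append(fit.standard_errors_["x"] ** 2)

beta = np.mean(betas)
var = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)  # Rubin's rules
print(f"log-HR = {beta:.3f} (SE = {np.sqrt(var):.3f})")
```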
Investigation of the Mechanism of Roof Caving in the Jinchuan Nickel Mine, China
NASA Astrophysics Data System (ADS)
Ding, Kuo; Ma, Fengshan; Guo, Jie; Zhao, Haijun; Lu, Rong; Liu, Feng
2018-04-01
On 13 March 2016, a sudden, violent roof caving event with a collapse area of nearly 11,000 m2 occurred in the Jinchuan Nickel Mine, accompanied by air blasts, loud noises, and ground vibrations. This collapse event coincided with conspicuous surface subsidence across an area of nearly 19,000 m2. This article aims to analyse this collapse event. In previous studies, various mining-induced collapses have been examined, but collapse accidents associated with the filling mining method are very rare and have not been thoroughly studied. The filling method has long been regarded as a safe mining method, so research on the associated collapse mechanisms is of considerable significance. In this study, a detailed field investigation of roadway damage was performed, and GPS monitoring results were used to analyse the surface failure. In addition, a numerical model was constructed based on the geometry of the ore body and a major fault. The analysis of the model revealed three failure mechanisms acting during different stages of destruction: double-sided embedded beam deformation, fault activation, and cantilever-articulated rock beam failure. Fault activation and the specific filling method are the key factors in this collapse event. To gain a better understanding of these factors, the shear stress and normal stress along the fault plane were monitored to determine the variation in stress at different failure stages. Discrete element models were established to study two filling methods and to analyse the stability of different filling structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaffer, Richard, E-mail: rickyshaffer@yahoo.co.u; Department of Clinical Oncology, Imperial College London National Health Service Trust, London; Pickles, Tom
Purpose: Prior studies have derived low values of the alpha-beta ratio (α/β) for prostate cancer, of approximately 1-2 Gy. These studies used poorly matched groups, differing definitions of biochemical failure, and insufficient follow-up. Methods and Materials: National Comprehensive Cancer Network low- or low-intermediate-risk prostate cancer patients, treated with external beam radiotherapy or permanent prostate brachytherapy, were matched for prostate-specific antigen, Gleason score, T-stage, percentage of positive cores, androgen deprivation therapy, and era, yielding 118 patient pairs. The Phoenix definition of biochemical failure was used. The best-fitting value for α/β was found for up to 90-month follow-up using maximum likelihood analysis, and the 95% confidence interval using the profile likelihood method. Linear quadratic formalism was applied with the radiobiological parameters of relative biological effectiveness = 1.0, potential doubling time = 45 days, and repair half-time = 1 hour. Bootstrap analysis was performed to estimate uncertainties in outcomes, and hence in α/β. Sensitivity analysis was performed by varying the values of the radiobiological parameters to extreme values. Results: The value of α/β best fitting the outcomes data was >30 Gy, with a lower 95% confidence limit of 5.2 Gy. This was confirmed by bootstrap analysis. Varying parameters to extreme values still yielded a best-fit α/β of >30 Gy, although the lower 95% confidence interval limit was reduced to 0.6 Gy. Conclusions: Using carefully matched groups, long follow-up, the Phoenix definition of biochemical failure, and well-established statistical methods, the best estimate of α/β for low and low-tier intermediate-risk prostate cancer is likely to be higher than that of normal tissues, although a low value cannot be excluded.
NASA Technical Reports Server (NTRS)
Moas, Eduardo; Boitnott, Richard L.; Griffin, O. Hayden, Jr.
1994-01-01
Six-foot-diameter, semicircular graphite/epoxy specimens representative of generic aircraft frames were loaded quasi-statically to determine their load response and failure mechanisms for the large deflections that occur in airplane crashes. These frame/skin specimens consisted of a cylindrical skin section co-cured with a semicircular I-frame. The skin provided the necessary lateral stiffness to keep deformations in the plane of the frame in order to realistically represent deformations as they occur in actual fuselage structures. Various frame laminate stacking sequences and geometries were evaluated by statically loading each specimen until multiple failures occurred. Two analytical methods were compared for modeling the frame/skin specimens: a two-dimensional shell finite element analysis and a one-dimensional, closed-form, curved beam solution derived using an energy method. Flange effectivities were included in the beam analysis to account for the curling phenomenon that occurs in thin flanges of curved beams. Good correlation was obtained between experimental results and the analytical predictions of the linear response of the frames prior to the initial failure. The specimens were found to be useful for evaluating composite frame designs.
Cycles till failure of silver-zinc cells with competing failure modes - Preliminary data analysis
NASA Technical Reports Server (NTRS)
Sidik, S. M.; Leibecki, H. F.; Bozek, J. M.
1980-01-01
The data analysis of cycles to failure of silver-zinc electrochemical cells with competing failure modes is presented. The test ran 129 cells through charge-discharge cycles until failure; preliminary data analysis consisted of a response surface estimate of life. Batteries fail through a low-voltage condition or an internal shorting condition; a competing failure modes analysis was made using maximum likelihood estimation for the extreme value life distribution. Extensive residual plotting and probability plotting were used to verify data quality and the selection of the model.
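A minimal sketch of the competing-modes likelihood is given below: failures from each mode right-censor the other, and each mode's log-cycle life is fitted to a smallest-extreme-value distribution by maximum likelihood. The simulated location and scale parameters are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 129
# Smallest-extreme-value log-lives for two competing modes (parameters illustrative)
x_lv = 6.0 - 0.30 * rng.gumbel(size=n)   # low-voltage failure mode
x_sh = 6.2 - 0.25 * rng.gumbel(size=n)   # internal-short failure mode
x = np.minimum(x_lv, x_sh)               # observed log cycles-to-failure
is_lv = x_lv <= x_sh                     # which mode actually caused the failure

def negloglik(theta, x, failed):
    """Smallest-extreme-value likelihood; other-mode failures are right-censored."""
    mu, log_sigma = theta
    z = (x - mu) / np.exp(log_sigma)
    ll = np.where(failed, -log_sigma + z - np.exp(z), -np.exp(z))
    return -ll.sum()

for name, failed in [("low voltage", is_lv), ("internal short", ~is_lv)]:
    fit = minimize(negloglik, x0=[x.mean(), 0.0], args=(x, failed))
    print(f"{name:15s} mu = {fit.x[0]:.3f}, sigma = {np.exp(fit.x[1]):.3f}")
```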
NASA Astrophysics Data System (ADS)
Suffo, M.
2017-08-01
In this work, we present the real case of an industrial product that was placed prematurely on the market without the different stages of its life cycle having been checked. This type of product must be validated by numerical methods and by mechanical tests to verify its rheological behavior. In particular, the product consists of two small pieces in contact, one made of HDPE and the other of stainless steel. The polymeric piece supports the metal pressure under a constant static load over time. As a result of normal operation, the polymer experienced a "crazing" breakdown, which caused the failure to occur. In the study, design methods and computer-assisted analysis software (CAED) were used. These methods were complemented by scanning electron microscopy, which confirmed the initial failure hypothesis. Using the finite element method (FEM), a series of load scenarios was carried out, simulating the different load hypotheses the product must satisfy prior to its placement on the market. It is shown that the failure was initiated by stress concentration on one of the edges of the polymeric piece. The proposed solution to the problem, based on the analysis, focuses either on a simple redesign of the piece, whose edges should have been rounded, or on a reduction of the thickness of the metal piece. As a result of the alteration of its natural life cycle, the company assumed both monetary costs and the definitive loss of customer confidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.
Motivations and Predictors of Cheating in Pharmacy School
Nguyen, Kathy; Shah, Bijal M.; Doroudgar, Shadi; Bidwal, Monica K.
2016-01-01
Objective. To assess the prevalence, methods, and motivations for didactic cheating among pharmacy students and to determine predictive factors for cheating in pharmacy colleges and schools. Methods. A 45-item cross-sectional survey was conducted at all four doctor of pharmacy programs in Northern California. For data analysis, t test, Fisher exact test, and logistic regression were used. Results. Overall, 11.8% of students admitted to cheating in pharmacy school. Primary motivations for cheating included fear of failure, procrastination, and stress. In multivariate analysis, the only predictor for cheating in pharmacy school was a history of cheating in undergraduate studies. Conclusion. Cheating occurs in pharmacy schools and is motivated by fear of failure, procrastination, and stress. A history of past cheating predicts pharmacy school cheating. The information presented may help programs better understand their student population and lead to a reassessment of ethical culture, testing procedures, and prevention programs. PMID:27899829
SU-E-T-495: Neutron Induced Electronics Failure Rate Analysis for a Single Room Proton Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knutson, N; DeWees, T; Klein, E
2014-06-01
Purpose: To determine the failure rate as a function of neutron dose of the range modulator's servo motor controller system (SMCS), both shielded with borated polyethylene (BPE) and unshielded, in a single-room proton accelerator. Methods: Two experimental setups were constructed using two servo motor controllers and two motors. Each SMCS was placed 30 cm from the end of the plugged proton accelerator applicator. The motor was then turned on and observed from outside the vault while being irradiated to known neutron doses determined from bubble detector measurements. Any time the motor deviated from the programmed motion, a failure was recorded along with the delivered dose. The experiment was repeated using 9 cm of BPE shielding surrounding the SMCS. Results: Ten SMCS failures were recorded in each experiment. The dose per monitor unit was 0.0211 mSv/MU for the unshielded SMCS and 0.0144 mSv/MU for the shielded SMCS. The mean dose to produce a failure was 63.5 ± 58.3 mSv for the unshielded SMCS versus 17.0 ± 12.2 mSv for the shielded. The mean number of MUs between failures was 2297 ± 1891 MU for the unshielded SMCS and 2122 ± 1523 MU for the shielded. A Wilcoxon signed-rank test showed the doses between failures were significantly different (P value = 0.044) while the numbers of MUs between failures were not (P value = 1.000). Statistical analysis determined that an SMCS neutron dose of 5.3 mSv produces a 5% chance of failure. Depending on the workload and location of the SMCS, this failure rate could impede clinical workflow. Conclusion: BPE shielding was shown not to reduce the average failure rate of the SMCS, and relocation of the system outside the accelerator vault was required to lower the failure rate enough to avoid impeding clinical workflow.
Analysing malaria drug trials on a per-individual or per-clone basis: a comparison of methods.
Jaki, Thomas; Parry, Alice; Winter, Katherine; Hastings, Ian
2013-07-30
There are a variety of methods used to estimate the effectiveness of antimalarial drugs in clinical trials, invariably on a per-person basis. A person, however, may have more than one malaria infection present at the time of treatment. We evaluate currently used methods for analysing malaria trials on a per-individual basis and introduce a novel method to estimate the cure rate on a per-infection (clone) basis. We used simulated and real data to highlight the differences of the various methods. We give special attention to classifying outcomes as cured, recrudescent (infections that never fully cleared) or ambiguous on the basis of genetic markers at three loci. To estimate cure rates on a per-clone basis, we used the genetic information within an individual before treatment to determine the number of clones present. We used the genetic information obtained at the time of treatment failure to classify clones as recrudescence or new infections. On the per-individual level, we find that the most accurate methods of classification label an individual as newly infected if all alleles are different at the beginning and at the time of failure and as a recrudescence if all or some alleles were the same. The most appropriate analysis method is survival analysis or alternatively for complete data/per-protocol analysis a proportion estimate that treats new infections as successes. We show that the analysis of drug effectiveness on a per-clone basis estimates the cure rate accurately and allows more detailed evaluation of the performance of the treatment. Copyright © 2012 John Wiley & Sons, Ltd.
Chen, Yikai; Wang, Kai; Xu, Chengcheng; Shi, Qin; He, Jie; Li, Peiqing; Shi, Ting
2018-05-19
To overcome the limitations of previous highway alignment safety evaluation methods, this article presents a highway alignment safety evaluation method based on fault tree analysis (FTA) and the characteristics of vehicle safety boundaries, within the framework of dynamic modeling of the driver-vehicle-road system. Approaches for categorizing the vehicle failure modes while driving on highways, and the corresponding safety boundaries, were comprehensively investigated based on vehicle system dynamics theory. Then, an overall crash probability model was formulated based on FTA, considering the risks of 3 failure modes: losing steering capability, losing track-holding capability, and rear-end collision. The proposed method was implemented on a highway segment between Bengbu and Nanjing in China. A driver-vehicle-road multibody dynamics model was developed based on the 3D alignments of the Bengbu to Nanjing section of the Ning-Luo expressway using Carsim, and the dynamics indices, such as sideslip angle and yaw rate, were obtained. Then, the average crash probability of each road section was calculated with a fixed-length method. Finally, the average crash probability was validated against the crash frequency per kilometer to demonstrate the accuracy of the proposed method. The results of the regression analysis and correlation analysis indicated good consistency between the results of the safety evaluation and the crash data, and showed that the proposed method outperformed the safety evaluation methods used in previous studies. The proposed method has the potential to be used in practical engineering applications to identify crash-prone locations and alignment deficiencies on highways in the planning and design phases, as well as those in service.
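Under the independence assumption implied by the OR combination of modes, the overall crash probability per road section follows directly from the three mode probabilities; the per-section values below are invented for illustration.

```python
import numpy as np

def overall_crash_probability(p_steer, p_track, p_rear):
    """OR-gate combination of the three vehicle failure modes, assuming
    the modes are independent within a fixed-length road section."""
    return 1 - (1 - p_steer) * (1 - p_track) * (1 - p_rear)

# Illustrative per-section mode probabilities from the dynamics simulation
p_steer = np.array([0.002, 0.010, 0.004])   # losing steering capability
p_track = np.array([0.001, 0.015, 0.002])   # losing track-holding capability
p_rear  = np.array([0.005, 0.008, 0.003])   # rear-end collision
print(overall_crash_probability(p_steer, p_track, p_rear))
```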
Tiede, Michel; Dwinger, Sarah; Herbarth, Lutz; Härter, Martin; Dirmaier, Jörg
2017-09-01
Introduction: The health status of heart failure patients can be improved to some extent by disease self-management. One method of developing such skills is telephone-based health coaching. However, the effects of telephone-based health coaching remain inconclusive. The aim of this study was to evaluate the effects of telephone-based health coaching for people with heart failure. Methods: A total sample of 7186 patients with various chronic diseases was randomly assigned to either the coaching or the control group. Then 184 patients with heart failure were selected by International Classification of Diseases (ICD)-10 code for subgroup analysis. Data were collected at 24 and 48 months after the beginning of the coaching. The primary outcome was change in quality of life. Secondary outcomes were changes in depression and anxiety, health-related control beliefs, control preference, health risk behaviour, and health-related behaviours. Statistical analyses included a per-protocol evaluation, employing analysis of variance and analysis of covariance (ANCOVA) as well as Mann-Whitney U tests. Results: Participants' average age was 73 years (standard deviation (SD) = 9) and the majority were women (52.8%). In the ANCOVA analyses there were no significant differences between groups for the change in quality of life (QoL). However, the coaching group reported a significantly higher level of physical activity (p = 0.03), lower intake of non-prescribed drugs (p = 0.04), and lower levels of stress (p = 0.02) than the control group. Mann-Whitney U tests showed a different external locus of control (p = 0.014), and higher reductions in unhealthy nutrition (p = 0.019), physical inactivity (p = 0.004), and stress (p = 0.028). Discussion: Our results suggest that telephone-based health coaching has no effect on the QoL, anxiety, and depression of heart failure patients, but helps in improving certain risk behaviours and changes the locus of control to be more externalised.
Minimizing treatment planning errors in proton therapy using failure mode and effects analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Yuanshui, E-mail: yuanshui.zheng@okc.procure.com; Johnson, Randall; Larson, Gary
Purpose: Failure mode and effects analysis (FMEA) is a widely used tool to evaluate safety or reliability in conventional photon radiation therapy. However, reports about FMEA application in proton therapy are scarce. The purpose of this study is to apply FMEA to safety improvement of proton treatment planning at the authors' center. Methods: The authors performed an FMEA analysis of their proton therapy treatment planning process using uniform scanning proton beams. The authors identified possible failure modes in various planning processes, including image fusion, contouring, beam arrangement, dose calculation, plan export, documents, billing, and so on. For each error, the authors estimated the frequency of occurrence, the likelihood of being undetected, and the severity of the error if it went undetected, and calculated the risk priority number (RPN). The FMEA results were used to design their quality management program. In addition, the authors created a database to track the identified dosimetric errors. Periodically, the authors reevaluated the risk of errors by reviewing the internal error database and improved their quality assurance program as needed. Results: In total, the authors identified over 36 possible treatment-planning-related failure modes and estimated the associated occurrence, detectability, and severity to calculate the overall risk priority number. Based on the FMEA, the authors implemented various safety improvement procedures into their practice, such as education, peer review, and automatic check tools. The ongoing error tracking database provided realistic data on the frequency of occurrence with which to reevaluate the RPNs for various failure modes. Conclusions: The FMEA technique provides a systematic method for identifying and evaluating potential errors in proton treatment planning before they result in an error in patient dose delivery. The application of the FMEA framework and the implementation of an ongoing error tracking system at the authors' clinic have proven to be useful in error reduction in proton treatment planning, thus improving the effectiveness and safety of proton therapy.
Probabilistic finite elements for fatigue and fracture analysis
NASA Technical Reports Server (NTRS)
Belytschko, Ted; Liu, Wing Kam
1992-01-01
Attention is focused on the development of the Probabilistic Finite Element Method (PFEM), which combines the finite element method with statistics and reliability methods, and on its application to linear and nonlinear structural mechanics problems and fracture mechanics problems. A computational tool based on the Stochastic Boundary Element Method is also given for the reliability analysis of curvilinear fatigue crack growth. The existing PFEMs have been applied to solve two types of problems: (1) determination of the response uncertainty in terms of the means, variances, and correlation coefficients; and (2) determination of the probability of failure associated with prescribed limit states.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Gallart, X; Gomez, J C; Fernández-Valencia, J A; Combalía, A; Bori, G; García, S; Rios, J; Riba, J
2014-01-01
To evaluate the short-term results of an ultra-high-molecular-weight polyethylene retentive cup in patients at high risk of dislocation, in either primary or revision surgery. Retrospective review of 38 cases in order to determine the rate of survival and to analyze the failures of a constrained cemented cup, with a mean follow-up of 27 months. We studied demographic data and complications, especially re-dislocations of the prosthesis, and also analyzed the likely causes of system failure. Of the cases, 21.05% (8 cases) were primary surgery and 78.95% (30 cases) were revision surgery. The overall survival by the Kaplan-Meier method was 70.7 months. During follow-up, 3 patients died due to causes unrelated to surgery, and 2 infections occurred. Twelve hips had undergone at least two previous surgeries. There was no case of aseptic loosening. Four patients presented dislocation, all with a 22 mm head (P=.008). Our statistical analysis did not find a relationship between the cup abduction angle and implant failure (P=.22). The ultra-high-molecular-weight polyethylene retentive cup evaluated in this series has provided satisfactory short-term results in hip arthroplasty patients at high risk of dislocation. Copyright © 2014 SECOT. Published by Elsevier España. All rights reserved.
Covariate measurement error correction methods in mediation analysis with failure time data.
Zhao, Shanshan; Prentice, Ross L
2014-12-01
Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This article focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error, and error associated with temporal variation. The underlying model with the "true" mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling designs. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. © 2014, The International Biometric Society.
Failure Analysis of Discrete Damaged Tailored Extension-Shear-Coupled Stiffened Composite Panels
NASA Technical Reports Server (NTRS)
Baker, Donald J.
2005-01-01
The results of an analytical and experimental investigation of the failure of composite stiffener panels with extension-shear coupling are presented. This tailored concept, when used in the cover skins of a tiltrotor aircraft wing, has the potential for increasing the aeroelastic stability margins and improving aircraft productivity. The extension-shear coupling is achieved by using unbalanced 45-degree plies in the skin. The failure analysis of two tailored panel configurations that have the center stringer and adjacent skin severed is presented. Finite element analysis of the damaged panels was conducted using STAGS (STructural Analysis of General Shells), a general-purpose finite element program that includes a progressive failure capability for laminated composite structures based on point-stress analysis, traditional failure criteria, and ply discounting for material degradation. The progressive failure analysis predicted the path of the failure and the maximum load capability. There is less than a 12 percent difference between the predicted failure load and the experimental failure load, and a good match of the panel stiffness and strength between the progressive failure analysis and the experimental results. The results indicate that the tailored concept would be feasible to use in the wing skin of a tiltrotor aircraft.
Comparing Methods for Assessing Reliability Uncertainty Based on Pass/Fail Data Collected Over Time
Abes, Jeff I.; Hamada, Michael S.; Hills, Charles R.
2017-12-20
In this paper, we compare statistical methods for analyzing pass/fail data collected over time; some methods are traditional and one (the RADAR or Rationale for Assessing Degradation Arriving at Random) was recently developed. These methods are used to provide uncertainty bounds on reliability. We make observations about the methods' assumptions and properties. Finally, we illustrate the differences between two traditional methods, logistic regression and Weibull failure time analysis, and the RADAR method using a numerical example.
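As a rough illustration of the two traditional analyses named above (the RADAR method itself is not reproduced here), the sketch below fits a logistic regression of failure on age and a right-censored Weibull failure-time model to the same invented pass/fail data, assuming statsmodels and scipy.

import numpy as np
import statsmodels.api as sm
from scipy.optimize import minimize

age = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], float)
fail = np.array([0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 1])  # 1 = failed at test

# (a) Logistic regression: P(fail by age) as a smooth function of age.
logit = sm.Logit(fail, sm.add_constant(age)).fit(disp=0)
p10 = logit.predict([1.0, 10.0])[0]
print("logistic P(fail by age 10):", round(float(p10), 3))

# (b) Weibull failure-time analysis: failures treated as exact event
# times, passing units right-censored at their test age.
def negloglik(theta):
    beta, eta = np.exp(theta)                      # keep parameters positive
    h = (beta / eta) * (age / eta) ** (beta - 1)   # hazard
    H = (age / eta) ** beta                        # cumulative hazard
    return -(fail * np.log(h) - H).sum()

beta, eta = np.exp(minimize(negloglik, x0=[0.0, 2.0]).x)
print("Weibull P(fail by age 10):", round(1 - np.exp(-(10 / eta) ** beta), 3))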
ElHoussiney, Amr G; Zhang, He; Song, Jinlin; Ji, Ping; Wang, Lu; Yang, Sheng
2018-01-01
Purpose To compare the failure events and incidence of complications of different abutment materials in anterior and posterior regions. Failure was defined as complete loss of the abutment requiring replacement by a new abutment. Materials and methods Electronic searches using PubMed/Medline and Google Scholar complemented with manual searches were performed with specific search terms. Searches were restricted to publications in English between January 2006 and March 2016. Results A total of 863 and 1,264 implants were inserted in the anterior and posterior regions, respectively, in a total of 1,529 patients. No titanium abutments failed in anterior or posterior regions. On the other hand, 1.6% of zirconia abutments failed in the anterior region and 1.5% failed in the posterior region. Technical complications occurred mostly in the posterior region and mostly involved zirconia abutment. Meta-analysis was possible only for zirconia-abutment failure, due to considerable heterogeneity of studies and outcome variables. No significant difference in failure rate was found between anterior and posterior zirconia abutments (risk ratio 1.53, 95% CI 0.49–4.77; P=0.47). Conclusion This systematic review and meta-analysis showed similar outcomes of different abutment materials when used in anterior and posterior regions in terms of failure events and biological and aesthetic complications. The only significant finding was the increased incidence of technical complications in the posterior region, mostly involving zirconia abutments. Abutment-screw loosening was the most common technical complication. PMID:29520162
Secure Embedded System Design Methodologies for Military Cryptographic Systems
2016-03-31
Fault-Tree Analysis (FTA); Built-In Self-Test (BIST). Introduction: Secure access-control systems restrict operations to authorized users via methods... failures in the individual software/processor elements, the question of exactly how unlikely is difficult to answer. Fault-Tree Analysis (FTA) has a... Collins of Sandia National Laboratories for years of sharing his extensive knowledge of Fail-Safe Design Assurance and Fault-Tree Analysis
Thalanany, Mariamma M; Mugford, Miranda; Hibbert, Clare; Cooper, Nicola J; Truesdale, Ann; Robinson, Steven; Tiruvoipati, Ravindranath; Elbourne, Diana R; Peek, Giles J; Clemens, Felicity; Hardy, Polly; Wilson, Andrew
2008-01-01
Background Extracorporeal Membrane Oxygenation (ECMO) is a technology used in treatment of patients with severe but potentially reversible respiratory failure. A multi-centre randomised controlled trial (CESAR) was funded in the UK to compare care including ECMO with conventional intensive care management. The protocol and funding for the CESAR trial included plans for economic data collection and analysis. Given the high cost of treatment, ECMO is considered an expensive technology for many funding systems. However, conventional treatment for severe respiratory failure is also one of the more costly forms of care in any health system. Methods/Design The objectives of the economic evaluation are to compare the costs of a policy of referral for ECMO with those of conventional treatment; to assess cost-effectiveness and the cost-utility at 6 months follow-up; and to assess the cost-utility over a predicted lifetime. Resources used by patients in the trial are identified. Resource use data are collected from clinical report forms and through follow up interviews with patients. Unit costs of hospital intensive care resources are based on parallel research on cost functions in UK NHS intensive care units. Other unit costs are based on published NHS tariffs. Cost effectiveness analysis uses the outcome: survival without severe disability. Cost utility analysis is based on quality adjusted life years gained based on the Euroqol EQ-5D at 6 months. Sensitivity analysis is planned to vary assumptions about transport costs and method of costing intensive care. Uncertainty will also be expressed in analysis of individual patient data. Probabilities of cost effectiveness given different funding thresholds will be estimated. Discussion In our view it is important to record our methods in detail and present them before publication of the results of the trial so that a record of detail not normally found in the final trial reports can be made available in the public domain. Trial Registrations The CESAR trial registration number is ISRCTN47279827. PMID:18447931
Rosen, M. A.; Sampson, J. B.; Jackson, E. V.; Koka, R.; Chima, A. M.; Ogbuagu, O. U.; Marx, M. K.; Koroma, M.; Lee, B. H.
2014-01-01
Background Anaesthesia care in developed countries involves sophisticated technology and experienced providers. However, advanced machines may be inoperable or fail frequently when placed into the austere medical environment of a developing country. Failure mode and effects analysis (FMEA) is a method for engaging local staff in identifying real or potential breakdowns in processes or work systems and to develop strategies to mitigate risks. Methods Nurse anaesthetists from the two tertiary care hospitals in Freetown, Sierra Leone, participated in three sessions moderated by a human factors specialist and an anaesthesiologist. Sessions were audio recorded, and group discussion graphically mapped by the session facilitator for analysis and commentary. These sessions sought to identify potential barriers to implementing an anaesthesia machine designed for austere medical environments—the universal anaesthesia machine (UAM)—and also engaging local nurse anaesthetists in identifying potential solutions to these barriers. Results Participating Sierra Leonean clinicians identified five main categories of failure modes (resource availability, environmental issues, staff knowledge and attitudes, and workload and staffing issues) and four categories of mitigation strategies (resource management plans, engaging and educating stakeholders, peer support for new machine use, and collectively advocating for needed resources). Conclusions We identified factors that may limit the impact of a UAM and devised likely effective strategies for mitigating those risks. PMID:24833727
Le, Laetitia Minh Mai; Reitter, Delphine; He, Sophie; Bonle, Franck Té; Launois, Amélie; Martinez, Diane; Prognon, Patrice; Caudron, Eric
2017-12-01
Handling cytotoxic drugs is associated with chemical contamination of workplace surfaces. The potential mutagenic, teratogenic and oncogenic properties of those drugs create a risk of occupational exposure for healthcare workers, from the reception of starting materials to the preparation and administration of cytotoxic therapies. Failure Mode, Effects and Criticality Analysis (FMECA) was used as a proactive method to assess the risks involved in the chemotherapy compounding process. The FMECA was carried out by a multidisciplinary team from 2011 to 2016. Potential failure modes of the process were identified and ranked by the Risk Priority Number (RPN), which prioritizes corrective actions. Twenty-five potential failure modes were identified. Based on the RPN results, the corrective action plan was revised annually to reduce the risk of exposure and improve practices. Since 2011, 16 specific measures were implemented successively. In six years, a cumulative RPN reduction of 626 was observed, with a decrease from 912 to 286 (-69%), despite an increase in cytotoxic compounding activity of around 23.2%. In order to anticipate and prevent occupational exposure, FMECA is a valuable tool to identify, prioritize and eliminate potential failure modes in the cytotoxic drug preparation process before failures occur. Copyright © 2017 Elsevier B.V. All rights reserved.
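The RPN arithmetic underlying such an FMECA is simple to mechanize. A minimal sketch with invented failure modes and scores (only the 912-to-286 cumulative figures come from the abstract):

from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    occurrence: int   # 1 (rare) .. 10 (frequent)
    severity: int     # 1 (minor) .. 10 (catastrophic)
    detection: int    # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.occurrence * self.severity * self.detection

modes = [
    FailureMode("glove puncture during compounding", 5, 7, 4),
    FailureMode("surface contamination at reception", 6, 6, 7),
    FailureMode("spill during transport of preparation", 3, 8, 3),
]

# Prioritize corrective actions by descending RPN, as in an FMECA worksheet.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.rpn:4d}  {m.name}")

# Cumulative RPN before/after corrective actions quantifies risk reduction.
rpn_2011, rpn_2016 = 912, 286          # figures reported in the abstract
print(f"reduction: {rpn_2011 - rpn_2016} ({(1 - rpn_2016 / rpn_2011):.0%})")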
Capturing strain localization behind a geosynthetic-reinforced soil wall
NASA Astrophysics Data System (ADS)
Lai, Timothy Y.; Borja, Ronaldo I.; Duvernay, Blaise G.; Meehan, Richard L.
2003-04-01
This paper presents the results of finite element (FE) analyses of shear strain localization that occurred in cohesionless soils supported by a geosynthetic-reinforced retaining wall. The innovative aspects of the analyses include capturing of the localized deformation and the accompanying collapse mechanism using a recently developed embedded strong discontinuity model. The case study analysed, reported in previous publications, consists of a 3.5-m tall, full-scale reinforced wall model deforming in plane strain and loaded by surcharge at the surface to failure. Results of the analysis suggest strain localization developing from the toe of the wall and propagating upward to the ground surface, forming a curved failure surface. This is in agreement with a well-documented failure mechanism experienced by the physical wall model showing internal failure surfaces developing behind the wall as a result of the surface loading. Important features of the analyses include mesh sensitivity studies and a comparison of the localization properties predicted by different pre-localization constitutive models, including a family of three-invariant elastoplastic constitutive models appropriate for frictional/dilatant materials. Results of the analysis demonstrate the potential of the enhanced FE method for capturing a collapse mechanism characterized by the presence of a failure, or slip, surface through earthen materials.
Reliability evaluation methodology for NASA applications
NASA Technical Reports Server (NTRS)
Taneja, Vidya S.
1992-01-01
Liquid rocket engine technology has been characterized by the development of complex systems containing a large number of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have focused mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability in general has not been considered one of the system parameters, like cost or performance. Until now, quantification of reliability has not been a consideration during system design and development in the liquid rocket industry. Engineers and managers have long been aware that the reliability of a system increases during development, but no serious attempts have been made to quantify it. As a result, a method to quantify reliability during design and development is needed. This includes the application of probabilistic models which utilize both engineering analysis and test data. Classical methods require the use of operating data for reliability demonstration. In contrast, the method described in this paper is based on similarity, analysis, and testing, combined with Bayesian statistical analysis.
Characterization of the Failure Site Distribution in MIM Devices Using Zoomed Wavelet Analysis
NASA Astrophysics Data System (ADS)
Muñoz-Gorriz, J.; Monaghan, S.; Cherkaoui, K.; Suñé, J.; Hurley, P. K.; Miranda, E.
2018-05-01
The angular wavelet analysis is applied to the study of the spatial distribution of breakdown (BD) spots in Pt/HfO2/Pt capacitors with square and circular areas. The method was originally developed for rectangular areas, so a zoomed approach needs to be considered when the observation window does not coincide with the device area. The BD spots appear as a consequence of the application of electrical stress to the device. The stress generates defects within the dielectric film, a process that ends with the formation of a percolation path between the electrodes and the melting of the top metal layer because of the high release of energy. The BD spots have lateral sizes ranging from 1 μm to 3 μm and form a point pattern that can be studied using spatial statistics methods. In this paper, we report the application of the angular wavelet method as a complementary tool for the analysis of the distribution of failure sites in large-area metal-insulator-metal (MIM) devices. The differences between considering a continuous or a discrete wavelet and the role played by the number of BD spots are also investigated.
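A toy version of the underlying idea, assuming the PyWavelets package and synthetic spot coordinates (the paper's zoomed windowing is not reproduced): bin the breakdown-spot angles about the device centre and wavelet-transform the angular histogram, so that large coefficients flag angular clustering.

import numpy as np
import pywt

rng = np.random.default_rng(4)
theta = np.concatenate([rng.uniform(0, 2 * np.pi, 150),           # background
                        rng.normal(1.2, 0.1, 50) % (2 * np.pi)])  # a cluster

hist, _ = np.histogram(theta, bins=64, range=(0, 2 * np.pi))
coeffs = pywt.wavedec(hist.astype(float), "haar", level=3)
print([float(np.abs(c).max()) for c in coeffs])   # peaks flag the cluster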
Janssen, Wouter; Johansen, Monika Alise
2018-01-01
Background eHealth has an enormous potential to improve healthcare cost, effectiveness, and quality of care. However, there seems to be a gap between the foreseen benefits of research and clinical reality. Objective Our objective was to systematically review the factors influencing the outcome of eHealth interventions in terms of success and failure. Methods We searched the PubMed database for original peer-reviewed studies on implemented eHealth tools that reported on the factors for the success or failure, or both, of the intervention. We conducted the systematic review by following the patient, intervention, comparison, and outcome framework, with 2 of the authors independently reviewing the abstract and full text of the articles. We collected data using standardized forms that reflected the categorization model used in the qualitative analysis of the outcomes reported in the included articles. Results Among the 903 identified articles, a total of 221 studies complied with the inclusion criteria. The studies were heterogeneous by country, type of eHealth intervention, method of implementation, and reporting perspectives. The article frequency analysis did not show a significant discrepancy between the number of reports on failure (392/844, 46.5%) and on success (452/844, 53.6%). The qualitative analysis identified 27 categories that represented the factors for success or failure of eHealth interventions. A quantitative analysis of the results revealed the category quality of healthcare (n=55) as the most mentioned as contributing to the success of eHealth interventions, and the category costs (n=42) as the most mentioned as contributing to failure. For the category with the highest unique article frequency, workflow (n=51), we conducted a full-text review. The analysis of the 23 articles that met the inclusion criteria identified 6 barriers related to workflow: workload (n=12), role definition (n=7), undermining of face-to-face communication (n=6), workflow disruption (n=6), alignment with clinical processes (n=2), and staff turnover (n=1). Conclusions The reviewed literature suggested that, to increase the likelihood of success of eHealth interventions, future research must ensure a positive impact in the quality of care, with particular attention given to improved diagnosis, clinical management, and patient-centered care. There is a critical need to perform in-depth studies of the workflow(s) that the intervention will support and to perceive the clinical processes involved. PMID:29716883
An unjustified benefit: immortal time bias in the analysis of time-dependent events.
Gleiss, Andreas; Oberbauer, Rainer; Heinze, Georg
2018-02-01
Immortal time bias is a problem arising from methodologically flawed analyses of time-dependent events in survival analysis. We illustrate the problem with the analysis of a kidney transplantation study. Following patients from transplantation to death, groups defined by the occurrence or nonoccurrence of graft failure during follow-up seemingly had equal overall mortality. Such a naive analysis assumes that patients were assigned to the two groups at the time of transplantation, although group membership is actually determined by a time-dependent event occurring later during follow-up. We introduce landmark analysis as the method of choice to avoid immortal time bias. Landmark analysis splits the follow-up time at a common, prespecified time point, the so-called landmark. Groups are then defined by time-dependent events having occurred before the landmark, and outcome events are only considered if they occur after the landmark. Landmark analysis can be easily implemented with common statistical software. In our kidney transplantation example, landmark analyses with landmarks set at 30 and 60 months clearly identified graft failure as a risk factor for overall mortality. We give further typical examples from transplantation research and discuss the strengths and limitations of landmark analysis and other methods to address immortal time bias, such as Cox regression with time-dependent covariables. © 2017 Steunstichting ESOT.
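A minimal sketch of landmark analysis, with hypothetical column names and simulated kidney-transplant-style data, assuming the lifelines package:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def landmark_cox(df, landmark):
    # 1) Keep only patients still under observation at the landmark.
    at_risk = df[df["followup_months"] > landmark].copy()
    # 2) Group = graft failure occurred BEFORE the landmark (time-fixed now).
    at_risk["graft_failed"] = (at_risk["gf_months"] < landmark).astype(int)
    # 3) Restart the clock at the landmark; only later deaths count.
    at_risk["T"] = at_risk["followup_months"] - landmark
    return CoxPHFitter().fit(at_risk[["T", "death", "graft_failed"]],
                             "T", "death")

rng = np.random.default_rng(1)
n = 3000
gf = np.where(rng.random(n) < 0.3, rng.exponential(40, n), np.nan)
death = np.fmin(rng.exponential(120, n),
                np.where(np.isnan(gf), np.inf, gf + rng.exponential(30, n)))
df = pd.DataFrame({"followup_months": np.minimum(death, 180.0),
                   "death": (death <= 180.0).astype(int),
                   "gf_months": gf})

for lm in (30, 60):   # landmarks as in the kidney-transplant example
    fit = landmark_cox(df, lm)
    print(lm, "HR for prior graft failure:",
          round(float(np.exp(fit.params_["graft_failed"])), 2))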
Nondestructive SEM for surface and subsurface wafer imaging
NASA Technical Reports Server (NTRS)
Propst, Roy H.; Bagnell, C. Robert; Cole, Edward I., Jr.; Davies, Brian G.; Dibianca, Frank A.; Johnson, Darryl G.; Oxford, William V.; Smith, Craig A.
1987-01-01
The scanning electron microscope (SEM) is considered as a tool for both failure analysis as well as device characterization. A survey is made of various operational SEM modes and their applicability to image processing methods on semiconductor devices.
How Analysis Informs Regulation: Success and Failure of ...
How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize the processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.
NASA Technical Reports Server (NTRS)
Mcconnaughey, P. K.; Garcia, R.; Dejong, F. J.; Sabnis, J. S.; Pribik, D. A.
1989-01-01
An analysis of Space Shuttle Main Engine high-pressure oxygen turbopump nozzle plug trajectories has been performed, using a Lagrangian method to track nozzle plug particles expelled from a turbine through a high Reynolds number flow in a turnaround duct with turning vanes. Axisymmetric and parametric analyses reveal that if nozzle plugs exited the turbine they would probably impact the LOX heat exchanger with impact velocities which are significantly less than the penetration velocity. The finding that only slight to moderate damage will result from nozzle plug failure in flight is supported by the results of a hot-fire engine test with induced nozzle plug failures.
Methods, apparatus and system for notification of predictable memory failure
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2017-01-03
A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
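The four claimed steps lend themselves to a compact sketch; the condition model and every coefficient below are invented for illustration and are not the patent's:

import math

def failure_probability(ce_rate_per_hour: float, temp_c: float) -> float:
    # Toy logistic model: more correctable ECC errors and heat -> higher risk.
    z = -6.0 + 0.8 * math.log1p(ce_rate_per_hour) + 0.05 * (temp_c - 40.0)
    return 1.0 / (1.0 + math.exp(-z))

def failure_threshold(checkpoint_cost_s: float, horizon_s: float) -> float:
    # Signal when expected loss from failure exceeds the cost of acting,
    # e.g. a pre-emptive checkpoint/migration in an HPC runtime.
    return min(1.0, checkpoint_cost_s / horizon_s)

p = failure_probability(ce_rate_per_hour=120.0, temp_c=78.0)
thr = failure_threshold(checkpoint_cost_s=30.0, horizon_s=3600.0)
if p > thr:
    print(f"predicted memory failure: p={p:.3f} > threshold={thr:.3f}")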
ERIC Educational Resources Information Center
Simon, Charles W.
An "undesigned" experiment is one in which the predictor variables are correlated, either due to a failure to complete a design or because the investigator was unable to select or control relevant experimental conditions. The traditional method of analyzing this class of experiment--multiple regression analysis based on a least squares…
White Paper: A Defect Prioritization Method Based on the Risk Priority Number
2013-11-01
adapted. The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called the Risk Priority Number (RPN) to quantify the... Table 1 - Time Scaling Factors (recoverable excerpt): "Up to an hour", 16-60 min, factor 1.5; "Brief Interrupt", 0-15 min, factor 1. In the FMEA formulation, RPN is a product of the three categories
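A minimal sketch of an RPN scaled by downtime, using the two recoverable rows of Table 1 (16-60 min gives factor 1.5, 0-15 min gives factor 1); the factor for longer outages is an assumption:

def time_scaling_factor(downtime_min: float) -> float:
    if downtime_min <= 15:      # brief interrupt
        return 1.0
    if downtime_min <= 60:      # up to an hour
        return 1.5
    return 2.0                  # assumed factor for longer outages

def scaled_rpn(occurrence: int, severity: int, detection: int,
               downtime_min: float) -> float:
    return occurrence * severity * detection * time_scaling_factor(downtime_min)

print(scaled_rpn(4, 6, 5, downtime_min=45))   # 120 * 1.5 = 180.0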
NASA Technical Reports Server (NTRS)
Yew, Calinda; Stephens, Matt
2015-01-01
The JWST IEC conformal shields are mounted onto a composite frame structure that must undergo qualification testing to satisfy mission assurance requirements. The composite frame segments are bonded together at the joints using EA 9394 epoxy. The development of a test method to verify the integrity of the bonded structure in its operating environment introduced challenges in terms of requirements definition and the attainment of success criteria. Even though protoflight thermal requirements were not achieved, the first attempt at exposing the structure to cryogenic operating conditions in a thermal vacuum environment resulted in approximately one bonded-joint failure during mechanical pull tests performed at 1.25 times the flight loads. Failure analysis concluded that the failure mode was adhesive cracks that formed and propagated along stress-concentrated fillets as a result of poor bond squeeze-out control during fabrication. Bond repairs were made and the structure was successfully re-tested with an improved LN2 immersion test method to achieve the protoflight thermal requirements.
A Bayesian Framework for Human Body Pose Tracking from Depth Image Sequences
Zhu, Youding; Fujimura, Kikuo
2010-01-01
This paper addresses the problem of accurate and robust tracking of 3D human body pose from depth image sequences. Recovering the large number of degrees of freedom in human body movements from a depth image sequence is challenging due to the need to resolve the depth ambiguity caused by self-occlusions and the difficulty to recover from tracking failure. Human body poses could be estimated through model fitting using dense correspondences between depth data and an articulated human model (local optimization method). Although it usually achieves a high accuracy due to dense correspondences, it may fail to recover from tracking failure. Alternately, human pose may be reconstructed by detecting and tracking human body anatomical landmarks (key-points) based on low-level depth image analysis. While this method (key-point based method) is robust and recovers from tracking failure, its pose estimation accuracy depends solely on image-based localization accuracy of key-points. To address these limitations, we present a flexible Bayesian framework for integrating pose estimation results obtained by methods based on key-points and local optimization. Experimental results are shown and performance comparison is presented to demonstrate the effectiveness of the proposed approach. PMID:22399933
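The integration step can be caricatured as Gaussian fusion by inverse-variance weighting; the sketch below is a 1-D toy (a single joint angle), not the paper's full framework:

def fuse(mu_kp, var_kp, mu_opt, var_opt):
    # Precision-weighted combination of a key-point estimate (robust, less
    # precise) and a local-optimization estimate (precise, may diverge).
    w_kp, w_opt = 1.0 / var_kp, 1.0 / var_opt
    mu = (w_kp * mu_kp + w_opt * mu_opt) / (w_kp + w_opt)
    return mu, 1.0 / (w_kp + w_opt)

# Normal operation: the tight local-optimization posterior dominates.
print(fuse(mu_kp=0.40, var_kp=0.04, mu_opt=0.52, var_opt=0.005))

# After a tracking failure the optimizer's fit residual is large, so its
# variance is inflated and the key-point detector dominates (recovery).
print(fuse(mu_kp=0.40, var_kp=0.04, mu_opt=1.90, var_opt=1.0))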
[Progressive damage monitoring of corrugated composite skins by the FBG spectral characteristics].
Zhang, Yong; Wang, Bang-Feng; Lu, Ji-Yun; Gu, Li-Li; Su, Yong-Gang
2014-03-01
In the present paper, a method for monitoring progressive damage of composite structures using the non-uniform fiber Bragg grating (FBG) reflection spectrum is proposed. Through finite element analysis of corrugated composite skin specimens, the failure process under tensile load and the corresponding critical failure loads were predicted. The non-uniform reflection spectrum of the FBG sensor was then reconstructed, and the correspondence between the ply failure sequence of the corrugated composite skin and the FBG reflection spectra was established. A monitoring system based on the FBG non-uniform reflection spectrum was built to monitor progressive damage of corrugated composite skins, and the skins were stretched under this monitoring system. The results indicate that the real-time spectra acquired by the system show the same trend as the reconstructed reflection spectra. The maximum error between the observed failures and the predicted values is 8.6%, which demonstrates the feasibility of using FBG sensors to monitor progressive damage of corrugated composite skin. In this method, real-time changes in the FBG non-uniform reflection spectrum within the failure range are acquired through combined monitoring and prediction, and the extent of progressive damage and the ply failure sequence of the corrugated composite skin are estimated without destroying the specimen; the method is simple to operate. The measurement and transmission sections of the system are composed entirely of optical fiber, which provides new ideas and an experimental reference for the dynamic monitoring of smart skins.
Kato, Yuko; Suzuki, Shinya; Uejima, Tokuhisa; Semba, Hiroaki; Nagayama, Osamu; Hayama, Etsuko; Arita, Takuto; Yagi, Naoharu; Kano, Hiroto; Matsuno, Shunsuke; Otsuka, Takayuki; Oikawa, Yuji; Kunihara, Takashi; Yajima, Junji; Yamashita, Takeshi
2018-05-01
Background Ventilatory efficiency decreases with age. This study aimed to investigate the prognostic significance and cut-off value of the minute ventilation/carbon dioxide production (VE/VCO2) slope according to age in patients with heart failure. Methods and results We analysed 1501 patients with heart failure from our observational cohort who performed maximal symptom-limited cardiopulmonary exercise testing and separated them into three age groups (≤55 years, 56-70 years and ≥71 years) in total and according to the three ejection fraction categories defined by European Society of Cardiology guidelines. The endpoint was set as heart failure events, hospitalisation for heart failure or death from heart failure. The VE/VCO2 slope increased with age. During the median follow-up period of 4 years, 141 heart failure events (9%) occurred. In total, univariate Cox analyses showed that the VE/VCO2 slope (continuous) was significantly related to heart failure events, while on multivariate analysis, the prognostic significance of the VE/VCO2 slope (continuous) was poor, accompanied by a significant interaction with age (P < 0.0001). The cut-off value of the VE/VCO2 slope increased with age in not only the total cohort but also the sub-ejection fraction categories. Multivariate analyses with a stepwise method adjusted for estimated glomerular filtration rate, peak oxygen consumption, atrial fibrillation and brain natriuretic peptide showed that the predictive value of the binary VE/VCO2 slope separated by the cut-off value varied according to age. There was a tendency for the prognostic significance to increase with age irrespective of ejection fraction. Conclusion The prognostic significance and cut-off value of the VE/VCO2 slope may increase with advancing age.
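For readers unfamiliar with the metric: the VE/VCO2 slope is simply the least-squares slope of minute ventilation against CO2 output over the exercise test. A minimal sketch with invented numbers:

import numpy as np

vco2 = np.array([0.4, 0.7, 1.0, 1.3, 1.6, 1.9])   # CO2 output, L/min
ve = np.array([14., 24., 33., 44., 53., 64.])     # minute ventilation, L/min

slope, intercept = np.polyfit(vco2, ve, 1)
print(f"VE/VCO2 slope = {slope:.1f}")   # ~33 here; cut-offs rise with age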
ERIC Educational Resources Information Center
Marsh, Herbert W.; Hocevar, Dennis
The advantages of applying confirmatory factor analysis (CFA) to multitrait-multimethod (MTMM) data are widely recognized. However, because CFA as traditionally applied to MTMM data incorporates single indicators of each scale (i.e., each trait/method combination), important weaknesses are the failure to: (1) correct appropriately for measurement…
Hao, Shengwang; Liu, Chao; Lu, Chunsheng; Elsworth, Derek
2016-01-01
A theoretical explanation of a time-to-failure relation is presented, with this relationship then used to describe the failure of materials. This provides the potential to predict timing (tf − t) immediately before failure by extrapolating the trajectory as it asymptotes to zero with no need to fit unknown exponents as previously proposed in critical power law behaviors. This generalized relation is verified by comparison with approaches to criticality for volcanic eruptions and creep failure. A new relation based on changes with stress is proposed as an alternative expression of Voight’s relation, which is widely used to describe the accelerating precursory signals before material failure and broadly applied to volcanic eruptions, landslides and other phenomena. The new generalized relation reduces to Voight’s relation if stress is limited to increase at a constant rate with time. This implies that the time-derivatives in Voight’s analysis may be a subset of a more general expression connecting stress derivatives, and thus provides a potential method for forecasting these events. PMID:27306851
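For the common case where Voight's relation holds with exponent alpha = 2, the inverse rate of the precursor signal decays linearly in time, so a straight-line fit extrapolated to zero forecasts the failure time. A synthetic sketch:

import numpy as np

tf_true, k = 100.0, 1.0
t = np.linspace(0.0, 90.0, 60)
noise = 1.0 + 0.05 * np.random.default_rng(2).standard_normal(t.size)
rate = k / (tf_true - t) * noise          # accelerating precursor rate

a, b = np.polyfit(t, 1.0 / rate, 1)       # inverse rate is ~linear in t
print("predicted failure time:", round(-b / a, 1))   # ~100, no exponent fit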
The Stability Analysis Method of the Cohesive Granular Slope on the Basis of Graph Theory.
Guan, Yanpeng; Liu, Xiaoli; Wang, Enzhi; Wang, Sijing
2017-02-27
This paper attempted to provide a method to calculate the progressive failure of cohesive-frictional granular geomaterials and the spatial distribution of the stability of a cohesive granular slope. The methodology can be divided into two parts: the characterization of macro-contacts and the analysis of slope stability. Based on graph theory, vertexes, edges and edge sequences are abstracted to characterize the voids, the particle contacts and the macro-contacts, respectively, bridging the gap between the mesoscopic and macro scales of granular materials. This paper adopts this characterization method to extract a graph from a granular slope and characterize the macro sliding surface; the weighted graph is then analyzed to calculate the slope safety factor. Each edge has three weights representing the sliding moment, the anti-sliding moment and the braking index of the contact bond. The safety factor of the slope is calculated by presupposing a certain number of sliding routes, repeatedly reducing the weights, and counting the mesoscopic failures of the edges. As a slope analysis method built from the mesoscopic perspective, it can present more detail of the mesoscopic properties of the granular slope. At the macro scale, the spatial distribution of the stability of the granular slope is in agreement with the theoretical solution.
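A toy version of the route-based safety factor (not the paper's algorithm, and with invented edge weights): sum the driving and resisting moments along each presupposed sliding route and take the worst ratio.

routes = {
    "toe-to-crest A": [(12.0, 15.0), (8.0, 13.0), (10.0, 11.0)],
    "toe-to-crest B": [(9.0, 16.0), (11.0, 18.0), (7.0, 14.0)],
}  # (sliding moment, anti-sliding moment) per macro-contact edge

def safety_factor(edges):
    driving = sum(s for s, _ in edges)
    resisting = sum(a for _, a in edges)
    return resisting / driving

# The slope's factor of safety is governed by the weakest presupposed route.
fs = {name: safety_factor(e) for name, e in routes.items()}
worst = min(fs, key=fs.get)
print(worst, round(fs[worst], 2))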
Press, Craig A; Morgan, Lindsey; Mills, Michele; Stack, Cynthia V; Goldstein, Joshua L; Alonso, Estella M; Wainwright, Mark S
2017-01-01
Spectral electroencephalogram analysis is a method for automated analysis of electroencephalogram patterns, which can be performed at the bedside. We sought to determine the utility of the spectral electroencephalogram for grading hepatic encephalopathy in children with acute liver failure. Retrospective cohort study. Tertiary care pediatric hospital. Patients between 0 and 18 years old who presented with acute liver failure and were admitted to the PICU. None. Electroencephalograms were analyzed by spectral analysis including total power, relative δ, relative θ, relative α, relative β, θ-to-δ ratio, and α-to-δ ratio. Normal values and ranges were first derived using normal electroencephalograms from 70 children of 0-18 years old. Age had a significant effect on each variable measured (p < 0.03). Electroencephalograms from 33 patients with acute liver failure were available for spectral analysis. The median age was 4.3 years, 14 of 33 were male, and the majority had an indeterminate etiology of acute liver failure. Neuroimaging was performed in 26 cases and was normal in 20 cases (77%). The majority (64%) survived, and 82% had a good outcome with a score of 1-3 on the Pediatric Glasgow Outcome Scale-Extended at the time of discharge. Hepatic encephalopathy grade correlated with the qualitative visual electroencephalogram scores assigned by blinded neurophysiologists (rs = 0.493; p < 0.006). Spectral electroencephalogram characteristics varied significantly with the qualitative electroencephalogram classification (p < 0.05). Spectral electroencephalogram variables including relative δ, relative θ, relative α, θ-to-δ ratio, and α-to-δ ratio all varied significantly with the qualitative electroencephalogram (p < 0.025). Moderate to severe hepatic encephalopathy was correlated with a total power of less than or equal to 50% of normal for children 0-3 years old, and with a relative θ of less than or equal to 50% of normal for children more than 3 years old (p > 0.05). Spectral electroencephalogram classification correlated with outcome (p < 0.05). Spectral electroencephalogram analysis can be used to evaluate even young patients for hepatic encephalopathy and correlates with outcome. Spectral electroencephalogram analysis may allow improved quantitative and reproducible assessment of hepatic encephalopathy grade in children with acute liver failure.
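The spectral quantities involved are standard band powers. A minimal sketch computing relative powers and the θ-to-δ ratio from one synthetic EEG channel via a Welch periodogram, assuming numpy and scipy:

import numpy as np
from scipy.signal import welch

fs = 256.0
t = np.arange(0, 30.0, 1.0 / fs)
rng = np.random.default_rng(3)
eeg = (np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 6.0 * t)
       + 0.2 * rng.standard_normal(t.size))   # synthetic delta + theta + noise

f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
band_mask = (f >= 1) & (f < 30)
total = np.trapz(psd[band_mask], f[band_mask])

rel = {}
for name, (lo, hi) in bands.items():
    m = (f >= lo) & (f < hi)
    rel[name] = np.trapz(psd[m], f[m]) / total

print({k: round(v, 2) for k, v in rel.items()},
      "theta/delta:", round(rel["theta"] / rel["delta"], 2))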
Tenofovir in second-line ART in Zambia and South Africa: Collaborative analysis of cohort studies
Wandeler, Gilles; Keiser, Olivia; Mulenga, Lloyd; Hoffmann, Christopher J; Wood, Robin; Chaweza, Thom; Brennan, Alana; Prozesky, Hans; Garone, Daniela; Giddy, Janet; Chimbetete, Cleophas; Boulle, Andrew; Egger, Matthias
2012-01-01
Objectives Tenofovir (TDF) is increasingly used in second-line antiretroviral treatment (ART) in sub-Saharan Africa. We compared outcomes of second-line ART containing and not containing TDF in cohort studies from Zambia and the Republic of South Africa (RSA). Methods Patients aged ≥ 16 years starting protease inhibitor-based second-line ART in Zambia (1 cohort) and RSA (5 cohorts) were included. We compared mortality, immunological failure (all cohorts) and virological failure (RSA only) between patients receiving and not receiving TDF. Competing risk models and Cox models adjusted for age, sex, CD4 count, time on first-line ART and calendar year were used to analyse mortality and treatment failure, respectively. Hazard ratios (HRs) were combined in fixed-effects meta-analysis. Findings 1,687 patients from Zambia and 1,556 patients from RSA, including 1,350 (80.0%) and 206 (13.2%) patients starting TDF, were followed over 4,471 person-years. Patients on TDF were more likely to have started second-line ART in recent years, and had slightly higher baseline CD4 counts than patients not on TDF. Overall 127 patients died, 532 were lost to follow-up and 240 patients developed immunological failure. In RSA 94 patients had virologic failure. Combined HRs comparing tenofovir with other regimens were 0.60 (95% CI 0.41–0.87) for immunologic failure and 0.63 (0.38–1.05) for mortality. The HR for virologic failure in RSA was 0.28 (0.09–0.90). Conclusions In this observational study patients on TDF-containing second-line ART were less likely to develop treatment failure than patients on other regimens. TDF seems to be an effective component of second-line ART in southern Africa. PMID:22743595
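The fixed-effects pooling used for such hazard ratios is short to write down: reconstruct each study's standard error from its 95% CI and combine by inverse variance. The input estimates below are illustrative, not the paper's:

import math

def pooled_hr(studies):   # studies = [(hr, ci_low, ci_high), ...]
    num = den = 0.0
    for hr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se**2           # inverse-variance weight
        num += w * math.log(hr)
        den += w
    est, se = num / den, math.sqrt(1.0 / den)
    return (math.exp(est), math.exp(est - 1.96 * se), math.exp(est + 1.96 * se))

print(pooled_hr([(0.55, 0.33, 0.92), (0.68, 0.40, 1.15)]))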
FMEA of manual and automated methods for commissioning a radiotherapy treatment planning system.
Wexler, Amy; Gu, Bruce; Goddu, Sreekrishna; Mutic, Maya; Yaddanapudi, Sridhar; Olsen, Lindsey; Harry, Taylor; Noel, Camille; Pawlicki, Todd; Mutic, Sasa; Cai, Bin
2017-09-01
To evaluate the level of risk involved in treatment planning system (TPS) commissioning using a manual test procedure (MTP), and to compare the associated process-based risk to that of an automated commissioning process (ACP) by performing an in-depth failure modes and effects analysis (FMEA). The authors collaborated to determine the potential failure modes of the TPS commissioning process using (a) approaches involving manual data measurement, modeling, and validation tests and (b) an automated process utilizing application programming interface (API) scripting, preloaded and premodeled standard radiation beam data, a digital heterogeneous phantom, and an automated commissioning test suite (ACTS). The severity (S), occurrence (O), and detectability (D) were scored for each failure mode and the risk priority numbers (RPN) were derived based on the TG-100 scale. Failure modes were then analyzed and ranked based on RPN. The total number of failure modes, the RPN scores, and the top 10 failure modes with the highest risk were described and cross-compared between the two approaches. An RPN reduction analysis is also presented and used as another quantifiable metric to evaluate the proposed approach. The FMEA of the MTP resulted in 47 failure modes with an average RPN of 161 and an average severity of 6.7; the highest-risk process, "Measurement Equipment Selection", resulted in a maximum RPN of 640. The FMEA of the ACP resulted in 36 failure modes with an average RPN of 73 and an average severity of 6.7; the highest-risk process, "EPID Calibration", resulted in a maximum RPN of 576. An FMEA of treatment planning commissioning tests using automation and standardization via API scripting, preloaded and premodeled standard beam data, and digital phantoms suggests that errors and risks may be reduced through the use of an ACP. © 2017 American Association of Physicists in Medicine.
Analysis of progressive damage in thin circular laminates due to static-equivalent impact loads
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Elber, W.; Illg, W.
1983-01-01
Clamped circular graphite/epoxy plates (25.4, 38.1, and 50.8 mm radii) with an 8-ply quasi-isotropic layup were analyzed for static-equivalent impact loads using the minimum-total-potential-energy method and the von Karman strain-displacement equations. A step-by-step incremental transverse displacement procedure was used to calculate plate load and ply stresses. The ply failure region was calculated using the Tsai-Wu criterion. The corresponding failure modes (splitting and fiber failure) were determined using the maximum stress criteria. The first failure mode was splitting, which initiated in the bottom ply. The splitting-failure thresholds were relatively low and tended to be lower for larger plates than for small plates. The splitting-damage region in each ply was elongated in its fiber direction; the bottom ply had the largest damage region. The calculated damage region for the 25.4-mm-radius plate agreed with limited static test results from the literature.
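The Tsai-Wu criterion used above reduces to evaluating a quadratic failure index per ply. A minimal sketch with typical carbon/epoxy strengths (not the test plates' values):

import math

Xt, Xc = 1500.0, 1200.0      # fiber-direction tension/compression strengths, MPa
Yt, Yc = 50.0, 200.0         # transverse strengths, MPa
S = 70.0                     # in-plane shear strength, MPa

F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S**2
F12 = -0.5 * math.sqrt(F11 * F22)   # common default interaction term

def tsai_wu(s1, s2, t12):
    # Ply failure is predicted when the index reaches 1.
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * t12**2 + 2 * F12 * s1 * s2)

print(round(tsai_wu(s1=800.0, s2=20.0, t12=30.0), 2))   # ~0.63, no failure yet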
NASA Astrophysics Data System (ADS)
Park, Jung-Yong; Jung, Yong-Keun; Park, Jong-Jin; Kang, Yong-Ho
2002-05-01
Turbine blade failures are identified as the leading cause of unplanned outages for steam turbines, and low-pressure turbine blade failures account for more than 70 percent of turbine component failures. Therefore, prevention of low-pressure turbine blade failures is certainly needed. The procedure is illustrated by a case study and is used to guide and support the plant manager's decisions to avoid a costly, unplanned outage. In this study, we seek the factors behind LP turbine blade failures and take a three-step approach to the solution. The first step is to measure natural frequencies in a mock-up test and compare them with the nozzle passing frequency. The second step is to use FEM to calculate the natural frequencies of groups of 7 and 10 blades in the BLADE code. The third step is to place the natural frequencies of the grouped blades away from the nozzle passing frequency.
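The first step amounts to a resonance screen of measured natural frequencies against the nozzle passing frequency (nozzle count times shaft speed). A sketch with illustrative numbers, not the plant's:

nozzles = 48
rpm = 3600.0
npf_hz = nozzles * rpm / 60.0                 # nozzle passing frequency, 2880 Hz

natural_freqs_hz = [118.0, 2855.0, 5640.0]    # e.g. from a mock-up ping test
margin = 0.05                                 # assumed 5% separation criterion

for f in natural_freqs_hz:
    if abs(f - npf_hz) / npf_hz < margin:
        print(f"mode at {f} Hz is within {margin:.0%} of NPF {npf_hz:.0f} Hz")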
Wang, Yumei; Yin, Xiaoling; Yang, Fang
2018-02-01
Sepsis is an inflammation-related disease, and severe sepsis can induce multiorgan dysfunction, which is the most common cause of death of patients in noncoronary intensive care units. Novel therapeutic strategies have proven to have little impact on the mortality of severe sepsis, and unfortunately, its mechanisms remain poorly understood. In this study, we analyzed gene expression profiles of severe sepsis with failure of the lung, kidney, and liver to identify potential biomarkers. We first downloaded the gene expression profiles from the Gene Expression Omnibus and performed preprocessing of the raw microarray data sets and identification of differentially expressed genes (DEGs) through the R programming software; then, significantly enriched functions of DEGs in lung, kidney, and liver failure sepsis samples were obtained from the Database for Annotation, Visualization, and Integrated Discovery; finally, a protein-protein interaction network was constructed for the DEGs based on the STRING database, and network modules were obtained through the MCODE clustering method. As a result, lung failure sepsis has the highest number of DEGs at 859, whereas the numbers of DEGs in kidney and liver failure sepsis samples are 178 and 175, respectively. In addition, 17 overlaps were obtained among the three lists of DEGs. Biological processes related to immune and inflammatory response were found to be significantly enriched in the DEGs. Network and module analysis identified four gene clusters in which all or most of the genes were upregulated. The expression changes of Icam1 and Socs3 were further validated through quantitative PCR analysis. This study should shed light on the development of sepsis and provide potential therapeutic targets for sepsis-induced multiorgan failure.
Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Hekmat, Somayeh Noori; Esmailzdeh, Hamid
2015-01-01
Introduction: Pediatric emergency is considered a high-risk area, and blood transfusion is a unique clinical measure; therefore, this study was conducted to proactively assess the risks of the blood transfusion process in the Pediatric Emergency of the Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. Methodology: This cross-sectional study analyzed the failure modes and effects of the blood transfusion process with a mixed quantitative-qualitative method. The proactive HFMEA was used to identify and analyze the potential failures of the process. The information for the items in the HFMEA forms was collected after obtaining a consensus of the experts' panel views via interview and focus group discussion sessions. Results: A total of 77 failure modes were identified for the 24 sub-processes within the 8 processes of blood transfusion. In total, 13 failure modes were identified as non-acceptable risks (a hazard score above 8) in the blood transfusion process and were transferred to the decision tree. Root causes of the high-risk modes were discussed in cause-and-effect meetings and were classified based on the UK National Health Service (NHS) approved classification model. Action types were classified as acceptance (11.6%), control (74.2%) and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ ("Theory of Inventive Problem Solving"). Conclusion: Re-engineering the process for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic blood transfusion events, patient identification bracelets, training classes and educational pamphlets for raising personnel awareness, and monthly gatherings of the transfusion medicine committee have all been considered executive strategies in the pediatric emergency work agenda. PMID:25560332
NASA Technical Reports Server (NTRS)
Monaghan, Mark W.; Gillespie, Amanda M.
2013-01-01
During the shuttle era NASA utilized a failure reporting system called Problem Reporting and Corrective Action (PRACA); its purpose was to identify and track system non-conformance. Over the years the PRACA system evolved from a relatively simple way to identify system problems into a very complex tracking and report-generating database. The PRACA system became the primary method to categorize any and all anomalies, from corrosion to catastrophic failure. The systems documented in the PRACA system range from flight hardware to ground or facility support equipment. While the PRACA system is complex, it does possess all the failure modes, times of occurrence, lengths of system delay, parts repaired or replaced, and corrective actions performed. The difficulty is mining the data and then utilizing them to estimate component, Line Replaceable Unit (LRU), and system reliability metrics. In this paper, we identify a methodology to categorize qualitative data from the ground system PRACA database for common ground or facility support equipment. Then, utilizing a heuristic developed for review of the PRACA data, we determine which reports identify a credible failure. These data are then used to determine inter-arrival times and estimate a metric for repairable-component or LRU reliability. This analysis is used to determine failure modes of the equipment, determine the probability of each component failure mode, and support various quantitative techniques for performing repairable system analysis. The result is an effective and concise reliability estimate for components used in manned space flight operations. The advantage is that the components or LRUs are evaluated in the same environment and conditions that occur during the launch process.
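Once credible failures and their inter-arrival times are extracted, the basic repairable-system quantities follow directly. A sketch with illustrative timestamps, including a Laplace trend statistic (u < 0 suggests lengthening gaps, i.e. reliability growth):

import math

failure_days = [12.0, 40.0, 95.0, 180.0, 290.0, 430.0]   # cumulative days
T = 500.0                                                 # observation window
n = len(failure_days)

gaps = [b - a for a, b in zip([0.0] + failure_days[:-1], failure_days)]
print("inter-arrival times:", gaps)                       # 12, 28, 55, ...

# Point MTBF estimate and the Laplace trend statistic for a time-truncated
# observation window.
u = (sum(failure_days) / n - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))
print(f"MTBF ~ {T / n:.0f} days, Laplace u = {u:.2f}")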
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rusu, I; Thomas, T; Roeske, J
Purpose: To identify areas of improvement in our liver stereotactic body radiation therapy (SBRT) program, using failure mode and effects analysis (FMEA). Methods: A multidisciplinary group consisting of one physician, three physicists, one dosimetrist and two therapists was formed. A process map covering 10 major stages of the liver SBRT program, from initial diagnosis to post-treatment follow-up, was generated. A total of 102 failure modes, together with their causes and effects, were identified. The occurrence (O), severity (S) and lack of detectability (D) were independently scored. The ranking was done using the risk priority number (RPN), defined as the product of the average O, S and D numbers for each mode. The scores were normalized to remove inter-observer variability, while preserving individual ranking order. Further, a correlation analysis of the overall agreement on the rank order of all failure modes resulted in positive values for successive pairs of evaluators. The failure modes with the highest RPN values were considered for further investigation. Results: The average normalized RPN value for all modes was 39, with a range of 9 to 103. The FMEA analysis resulted in the identification of the top 10 critical failure modes as: incorrect CT-MR registration, MR scan not performed in treatment position, patient movement between CBCT acquisition and treatment, daily IGRT QA not verified, incorrect or incomplete ITV delineation, OAR contours not verified, inaccurate normal liver effective dose (Veff) calculation, failure of bolus tracking for 4D CT scan, setup instructions not followed for treatment, and plan evaluation metrics missed. Conclusion: The application of FMEA to our liver SBRT program led to the identification and possible improvement of areas affecting patient safety.
Daker-White, Gavin; Hays, Rebecca; Esmail, Aneez; Minor, Brian; Barlow, Wendy; Brown, Benjamin; Blakeman, Thomas; Bower, Peter
2014-01-01
Introduction Increasing numbers of older people are living with multiple long-term health conditions but global healthcare systems and clinical guidelines have traditionally focused on the management of single conditions. Having two or more long-term conditions, or ‘multimorbidity’, is associated with a range of adverse consequences and poor outcomes and could put patients at increased risk of safety failures. Traditionally, most research into patient safety failures has explored hospital or inpatient settings. Much less is known about patient safety failures in primary care. Our core aims are to understand the mechanisms by which multimorbidity leads to safety failures, to explore the different ways in which patients and services respond (or fail to respond), and to identify opportunities for intervention. Methods and analysis We plan to undertake an applied ethnographic study of patients with multimorbidity. Patients’ interactions and environments, relevant to their healthcare, will be studied through observations, diary methods and semistructured interviews. A framework, based on previous studies, will be used to organise the collection and analysis of field notes, observations and other qualitative data. This framework includes the domains: access breakdowns, communication breakdowns, continuity of care errors, relationship breakdowns and technical errors. Ethics and dissemination Ethical approval was received from the National Health Service Research Ethics Committee for Wales. An individual case study approach is likely to be most fruitful for exploring the mechanisms by which multimorbidity leads to safety failures. A longitudinal and multiperspective approach will allow for the constant comparison of patient, carer and healthcare worker expectations and experiences related to the provision, integration and management of complex care. This data will be used to explore ways of engaging patients and carers more in their own care using shared decision-making, patient empowerment or other relevant models. PMID:25138807
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younge, K C; Lee, C I; Feng, M
2015-06-15
Purpose: To improve the safety and quality of a dual-vendor microsphere brachytherapy program with failure mode and effects analysis (FMEA). Methods: A multidisciplinary team including physicists, dosimetrists, a radiation oncologist, an interventional radiologist, and radiation safety personnel performed an FMEA for our dual-vendor microsphere brachytherapy program employing SIR-Spheres (Sirtex Medical Limited, Australia) and Theraspheres (BTG, England). We developed a program process tree and step-by-step instructions which were used to generate a comprehensive list of failure modes. These modes were then ranked according to severity, occurrence rate, and detectability. Risk priority numbers (RPNs) were calculated by multiplying these three scores together. Three different severity scales were created: one each for harmful effects to the patient, staff, or the institution. Each failure mode was ranked on one or more of these scales. Results: The group identified 164 failure modes for the microsphere program. 113 of these were ranked using the patient severity scale, 52 using the staff severity scale, and 50 using the institution severity scale. The highest ranked items on the patient severity scale were an error in the automated dosimetry worksheet (RPN = 297.5), and the incorrect target specified on the planning study (RPN = 135). Some failure modes ranked differently between vendors, especially those corresponding to dose vial preparation, because of the different methods used. Based on our findings, we made several improvements to our QA program, including documentation to easily identify which product is being used, an additional hand calculation during planning, and reorganization of QA steps before treatment delivery. We will continue to periodically review and revise the FMEA. Conclusion: We have applied FMEA to our dual-vendor microsphere brachytherapy program to identify potential key weaknesses in the treatment chain. Our FMEA results were used to improve the effectiveness of our overall microsphere program.
Survivorship analysis of failure pattern after revision total hip arthroplasty.
Retpen, J B; Varmarken, J E; Jensen, J S
1989-12-01
Failure, defined as established indication for or performed re-revision of one or both components, was analyzed using survivorship methods in 306 revision total hip arthroplasties. The longevity of revision total hip arthroplasties was inferior to that of previously reported primary total hip arthroplasties. The overall survival curve was two-phased, with a late failure period associated with aseptic loosening of one or both components and an early failure period associated with causes of failure other than loosening. Separate survival curves for aseptic loosening of femoral and acetabular components showed late and almost simultaneous decline, but with a tendency toward a higher rate of failure for the femoral component. No differences in survival could be found between the Stanmore, Lubinus standard, and Lubinus long-stemmed femoral components. A short interval between the index operation and the revision and intraoperative and postoperative complications were risk factors for early failure. Young age was a risk factor for aseptic loosening of the femoral component. Intraoperative fracture of the femoral shaft was not a risk factor for secondary loosening. No difference in survival was found between primary cemented total arthroplasty and primary noncemented hemiarthroplasty.
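The survivorship machinery involved is the standard Kaplan-Meier estimator with re-revision as the endpoint. A minimal sketch with invented times, assuming the lifelines package:

import numpy as np
from lifelines import KaplanMeierFitter

t = np.array([6, 14, 23, 30, 42, 55, 60, 71, 84, 90, 96, 120.])  # months
e = np.array([1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0])               # 1 = re-revision

km = KaplanMeierFitter().fit(t, e)
print(km.survival_function_.tail(3))   # two-phased failure shows up as steps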
An experimental evaluation of software redundancy as a strategy for improving reliability
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Caglayan, Alper K.; Knight, John C.; Lee, Larry D.; Mcallister, David F.; Vouk, Mladen A.; Kelly, John P. J.
1990-01-01
The strategy of using multiple versions of independently developed software as a means to tolerate residual software design faults is suggested by the success of hardware redundancy for tolerating hardware failures. Although, as generally accepted, the independence of hardware failures resulting from physical wearout can lead to substantial increases in reliability for redundant hardware structures, a similar conclusion is not immediate for software. The degree to which design faults are manifested as independent failures determines the effectiveness of redundancy as a method for improving software reliability. Interest in multi-version software centers on whether it provides an adequate measure of increased reliability to warrant its use in critical applications. The effectiveness of multi-version software is studied by comparing estimates of the failure probabilities of these systems with the failure probabilities of single versions. The estimates are obtained under a model of dependent failures and compared with estimates obtained when failures are assumed to be independent. The experimental results are based on twenty versions of an aerospace application developed and certified by sixty programmers from four universities. Descriptions of the application, development and certification processes, and operational evaluation are given together with an analysis of the twenty versions.
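The role of the independence assumption can be made concrete for three-version majority voting: under independence the system failure probability is roughly 3p^2, while any common-cause probability q adds directly. A sketch with invented numbers:

from math import comb

def majority_fail_independent(p: float, n: int = 3) -> float:
    # System fails when a majority (>= n//2 + 1) of versions fail on a demand.
    k = n // 2 + 1
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

p = 1e-3                    # per-version failure probability on a demand
print(majority_fail_independent(p))      # ~3e-6 if failures were independent

# With correlated design faults, a small probability q of versions failing
# together on "hard" inputs dominates the independent term.
q = 1e-4
print(majority_fail_independent(p) + q)  # correlation sets the reliability floor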
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, T. H.; Robinson, W. R.; Holland, J. W.
1989-12-01
Results and analyses of margin to cladding failure and pre-failure axial expansion of metallic fuel are reported for TREAT in-pile transient overpower tests M5-M7. These are the first such tests on reference binary and ternary alloy fuel of the Integral Fast Reactor (IFR) concept with burnup ranging from 1 to 10 at. %. In all cases, test fuel was subjected to an exponential power rise on an 8 s period until either incipient or actual cladding failure was achieved. Objectives, designs, and methods are described with emphasis on developments unique to metal fuel safety testing. The resulting database for cladding failure threshold and pre-failure fuel expansion is presented. The nature of the observed cladding failure and resultant fuel dispersals is described. Simple models of cladding failures and pre-failure axial expansions are described and compared with experimental results. Reported results include: temperature, flow, and pressure data from test instrumentation; fuel motion diagnostic data, principally from the fast neutron hodoscope; and test remains described from both destructive and non-destructive post-test examination. 24 refs., 144 figs., 17 tabs.
NASA Technical Reports Server (NTRS)
1974-01-01
Future operational concepts for the space transportation system were studied in terms of space shuttle upper stage failure contingencies possible during deployment, retrieval, or space servicing of automated satellite programs. Problems anticipated during mission planning were isolated using a modified 'fault tree' technique, normally used in safety analyses. A comprehensive space servicing hazard analysis is presented which classifies possible failure modes under the categories of catastrophic collision, failure to rendezvous and dock, servicing failure, and failure to undock. The failure contingencies defined are to be taken into account during design of the upper stage.
WE-G-BRA-08: Failure Modes and Effects Analysis (FMEA) for Gamma Knife Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Y; Bhatnagar, J; Bednarz, G
2015-06-15
Purpose: To perform a failure modes and effects analysis (FMEA) study for Gamma Knife (GK) radiosurgery processes at our institution based on our experience with the treatment of more than 13,000 patients. Methods: A team consisting of medical physicists, nurses, radiation oncologists, and neurosurgeons at the University of Pittsburgh Medical Center, together with an external physicist expert, was formed for the FMEA study. A process tree and a failure mode table were created for the GK procedures using the Leksell GK Perfexion and 4C units. Scores for the probability of occurrence (O), the severity (S), and the probability of no detection (D) were assigned to each failure mode by each professional on a scale from 1 to 10. The risk priority number (RPN) for each failure mode was then calculated (RPN = O × S × D) using the average scores from all data sets collected. Results: The established process tree for GK radiosurgery consists of 10 sub-processes and 53 steps, including a sub-process for frame placement and 11 steps that are directly related to the frame-based nature of GK radiosurgery. Of the 86 failure modes identified, 40 are GK-specific, caused by the potential for inappropriate use of the radiosurgery head frame, the imaging fiducial boxes, the GK helmets and plugs, and the GammaPlan treatment planning system. The other 46 failure modes are associated with the registration, imaging, image transfer, and contouring processes that are common to all radiation therapy techniques. The failure modes with the highest hazard scores are related to imperfect frame adaptor attachment, faulty fiducial box assembly, overlooked target areas, inaccurate previous-treatment information, and excessive patient movement during the MRI scan. Conclusion: The implementation of the FMEA approach for Gamma Knife radiosurgery enabled a deeper understanding of the overall process among all professionals involved in the care of the patient and helped identify potential weaknesses in the overall process.
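One reading of "calculated using the average scores from all data sets" is the product of the rater-averaged O, S, and D; a minimal sketch with invented scores for one failure mode follows (the abstract does not publish individual ratings).

```python
import numpy as np

# Hypothetical 1-10 scores from five professionals for one GK failure mode
# (rows: raters; columns: occurrence O, severity S, no-detection D).
scores = np.array([
    [3, 8, 4],
    [2, 9, 5],
    [4, 8, 3],
    [3, 7, 4],
    [2, 9, 4],
], dtype=float)

O, S, D = scores.mean(axis=0)   # average each score across raters
rpn = O * S * D                 # RPN = O × S × D as in the abstract
print(f"O={O:.1f} S={S:.1f} D={D:.1f} -> RPN={rpn:.1f}")
```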
Yang, F; Cao, N; Young, L; Howard, J; Logan, W; Arbuckle, T; Sponseller, P; Korssjoen, T; Meyer, J; Ford, E
2015-06-01
Though failure mode and effects analysis (FMEA) is becoming more widely adopted for risk assessment in radiation therapy, to our knowledge its output has never been validated against data on errors that actually occur. The objective of this study was to perform FMEA of a stereotactic body radiation therapy (SBRT) treatment planning process and validate the results against data recorded within an incident learning system. FMEA on the SBRT treatment planning process was carried out by a multidisciplinary group including radiation oncologists, medical physicists, dosimetrists, and IT technologists. Potential failure modes were identified through a systematic review of the process map. Failure modes were rated for severity, occurrence, and detectability on a scale of one to ten, and a risk priority number (RPN) was computed. Failure modes were then compared with historical reports identified as relevant to SBRT planning within a departmental incident learning system that had been active for two and a half years. Differences between FMEA-anticipated failure modes and existing incidents were identified. FMEA identified 63 failure modes. RPN values for the top 25% of failure modes ranged from 60 to 336. Analysis of the incident learning database identified 33 reported near-miss events related to SBRT planning. Combining both methods yielded a total of 76 possible process failures, of which 13 (17%) were missed by FMEA while 43 (57%) were identified by FMEA only. When scored for RPN, the 13 events missed by FMEA ranked within the lower half of all failure modes and exhibited significantly lower severity relative to those identified by FMEA (p = 0.02). FMEA, though valuable, is subject to certain limitations. In this study, FMEA failed to identify 17% of actual failure modes, though these were of lower risk. Similarly, an incident learning system alone fails to identify a large number of potentially high-severity process errors. Using FMEA in combination with incident learning may provide an improved overview of risks within a process.
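The 76/13/43 bookkeeping amounts to set algebra over normalized failure-mode labels. A sketch with hypothetical labels (not the study's failure modes):

```python
# Treat FMEA failure modes and incident reports as sets of labels.
fmea = {"wrong CT dataset", "dose constraint omitted", "isocenter shift error",
        "plan not approved", "wrong image registration"}
incidents = {"wrong CT dataset", "isocenter shift error", "late plan revision"}

union = fmea | incidents
missed_by_fmea = incidents - fmea   # observed in practice, not anticipated
fmea_only = fmea - incidents        # anticipated, not (yet) observed

print(f"total failures: {len(union)}")
print(f"missed by FMEA: {len(missed_by_fmea)} ({len(missed_by_fmea)/len(union):.0%})")
print(f"FMEA only:      {len(fmea_only)} ({len(fmea_only)/len(union):.0%})")
```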
NASA Astrophysics Data System (ADS)
Vicuña, Cristián Molina; Höweler, Christoph
2017-12-01
The use of AE in machine failure diagnosis has increased in recent years. Most AE-based failure diagnosis strategies use digital signal processing and thus require the sampling of AE signals. High sampling rates are required for this purpose (e.g. 2 MHz or higher), leading to streams of large amounts of data. This situation is aggravated if fine resolution and/or multiple sensors are required. These facts combine to produce bulky data, typically in the range of gigabytes, for which sufficient storage space and efficient signal processing algorithms are required. This situation probably explains why, in practice, AE-based methods consist mostly of the calculation of scalar quantities such as RMS and kurtosis, and the analysis of their evolution in time. While the scalar-based approach offers the advantage of maximum data reduction, it has the disadvantage that most of the information contained in the raw AE signal is irrecoverably lost. This work presents a method offering large data reduction while keeping the most important information conveyed by the raw AE signal, useful for failure detection and diagnosis. The proposed method consists of the construction of a synthetic, unevenly sampled signal which envelops the AE bursts present in the raw AE signal in a triangular shape. The constructed signal - which we call TriSignal - also permits the estimation of most scalar quantities typically used for failure detection. More importantly, it contains the information on the time of occurrence of the bursts, which is key for failure diagnosis. The Lomb-Scargle normalized periodogram is used to construct the TriSignal spectrum, which reveals the frequency content of the TriSignal and provides the same information as the classic AE envelope. The paper includes application examples in a planetary gearbox and a low-speed rolling element bearing.
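A compact sketch of this pipeline, based on our reading of the abstract rather than the authors' code, is given below with synthetic data: bursts above a threshold are replaced by three unevenly spaced samples (onset, peak, end), and the Lomb-Scargle periodogram recovers the burst repetition rate from the reduced signal. It assumes scipy is available.

```python
import numpy as np
from scipy.signal import lombscargle

def trisignal(t, x, thresh, gap=5):
    """Sketch of a TriSignal-style construction: each burst above `thresh`
    becomes three samples tracing a triangle over the burst."""
    idx = np.flatnonzero(np.abs(x) > thresh)
    bursts = np.split(idx, np.flatnonzero(np.diff(idx) > gap) + 1)
    tt, yy = [], []
    for b in bursts:
        if b.size == 0:
            continue
        peak = b[np.argmax(np.abs(x[b]))]
        tt += [t[b[0]], t[peak], t[b[-1]]]
        yy += [0.0, abs(x[peak]), 0.0]
    return np.array(tt), np.array(yy)

# Synthetic demo: triangular bursts repeating at 13 Hz in 1 MHz noise.
fs = 1_000_000
t = np.arange(0, 1.0, 1.0 / fs)
x = np.random.default_rng(1).normal(0.0, 0.05, t.size)
for k in range(13):
    i0 = int(k * fs / 13)
    x[i0:i0 + 200] += np.bartlett(200)          # one burst every 1/13 s

tt, yy = trisignal(t, x, thresh=0.4)            # ~39 samples instead of 1e6
freqs = 2 * np.pi * np.linspace(1.0, 20.0, 2000)  # angular frequencies
power = lombscargle(tt, yy - yy.mean(), freqs)
print("dominant burst rate ~", freqs[np.argmax(power)] / (2 * np.pi), "Hz")
```

Note how the data reduction claimed in the abstract falls out directly: a one-second, 1 MHz record collapses to three samples per burst while the burst timing, and hence the diagnostic frequency content, survives.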
A Micromechanics-Based Method for Multiscale Fatigue Prediction
NASA Astrophysics Data System (ADS)
Moore, John Allan
An estimated 80% of all structural failures are due to mechanical fatigue, often resulting in catastrophic, dangerous and costly failure events. However, an accurate model to predict fatigue remains an elusive goal. One of the major challenges is that fatigue is intrinsically a multiscale process, which depends on a structure's geometric design as well as its material's microscale morphology. The following work begins with a microscale study of fatigue nucleation around non-metallic inclusions. Based on this analysis, a novel multiscale method for fatigue prediction is developed. This method simulates macroscale geometries explicitly while concurrently calculating the simplified response of microscale inclusions, thus providing adequate detail on multiple scales for accurate fatigue life predictions. The methods herein provide insight into the multiscale nature of fatigue while also developing a tool to aid in geometric design and material optimization for fatigue-critical devices such as biomedical stents and artificial heart valves.
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.; Jackson, Wade C.
2008-01-01
A simple analysis method has been developed for predicting the residual compressive strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compressive loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.
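As a rough sketch of the core representation described here, the thickness-direction spring law can be tabulated from flatwise compression data and interpolated at each spring's crush displacement. The numbers below are hypothetical placeholders, not the paper's test data.

```python
import numpy as np

# Illustrative flatwise compression response for a honeycomb core:
# elastic rise to a peak, crush plateau, then densification.
crush_strain = np.array([0.00, 0.02, 0.05, 0.30, 0.60])   # -
crush_stress = np.array([0.0,  4.0,  2.5,  2.5,  8.0])    # MPa

def core_spring_force(delta, cell_area_mm2=25.0, core_thk_mm=12.7):
    """Force in one thickness-direction spring for a local crush
    displacement `delta` (mm), interpolating the tabulated response."""
    strain = delta / core_thk_mm
    stress = np.interp(strain, crush_strain, crush_stress)  # MPa = N/mm^2
    return stress * cell_area_mm2                           # N

print(core_spring_force(1.0))   # spring force at 1 mm local indentation
```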
Failure Forecasting in Triaxially Stressed Sandstones
NASA Astrophysics Data System (ADS)
Crippen, A.; Bell, A. F.; Curtis, A.; Main, I. G.
2017-12-01
Precursory signals to fracturing events have been observed to follow power-law accelerations in spatial, temporal, and size distributions leading up to catastrophic failure. In previous studies this behavior was modeled using Voight's relation for a geophysical precursor in order to perform 'hindcasts' by solving for failure onset time. However, performing this analysis in retrospect creates a bias: we know an event happened, we know when it happened, and we can search the data for precursors accordingly. We aim to remove this retrospective bias, thereby allowing us to make failure forecasts in real time in a rock deformation laboratory. We triaxially compressed water-saturated 100 mm sandstone cores (Pc = 25 MPa, Pp = 5 MPa, strain rate = 1.0 × 10⁻⁵ s⁻¹) to the point of failure while monitoring strain rate, differential stress, AEs, and continuous waveform data. Here we compare current 'hindcast' methods on synthetic data and on our real laboratory data. We then apply these techniques to increasing fractions of the data sets to observe the evolution of the failure forecast time with precursory data. We discuss these results as well as our plan to mitigate false positives and minimize errors for real-time application. Real-time failure forecasting could revolutionize the field of hazard mitigation of brittle failure processes by allowing non-invasive monitoring of civil structures, volcanoes, and possibly fault zones.
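A common concrete instance of such a hindcast (not necessarily the authors' exact implementation) uses the special case of Voight's relation with exponent 2, for which the inverse precursor rate decays linearly in time, so a least-squares line through 1/rate crosses zero at the forecast failure time. A self-contained sketch on synthetic AE-rate data:

```python
import numpy as np

def inverse_rate_forecast(t, rate):
    """Inverse-rate 'hindcast': fit a line to 1/rate and return the time
    at which the fitted line reaches zero (the forecast failure time)."""
    inv = 1.0 / np.asarray(rate, dtype=float)
    slope, intercept = np.polyfit(t, inv, 1)
    return -intercept / slope

# Synthetic AE rate accelerating hyperbolically toward failure at t_f = 100 s
t = np.linspace(0.0, 90.0, 50)
t_f, k = 100.0, 500.0
rate = k / (t_f - t)
rate *= np.random.default_rng(2).lognormal(0.0, 0.05, t.size)  # noise

print(f"forecast failure time: {inverse_rate_forecast(t, rate):.1f} s")
```

Running the fit on increasing fractions of the record, as the abstract describes, shows how the forecast converges (or fails to) as precursory data accumulate.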
Medical versus surgical abortion methods for pregnancy in China: a cost-minimization analysis.
Xia, Wei; She, Shouzhang; Lam, Tai Hing
2011-01-01
Both medical and surgical abortions are popular in developing countries. However, the monetary costs of these two methods have not been compared. 430 women seeking abortions were recruited in 2008; either a medical or a surgical method was used for the abortion. We adopted the perspective of a third-party payer. Cost-minimization analysis was used based on all charges for the overall procedures in an out-patient clinic in Guangzhou, China. 219 subjects (51%) chose a medical method (mifepristone and misoprostol), whereas 211 subjects (49%) chose a surgical method. The efficacy in the surgical group was significantly higher than in the medical group (100 vs. 90%, p < 0.001). Surgical abortion incurred much higher costs than medical abortion on average at initial treatment. When the subsequent costs accumulated within the 2-week follow-up were included, the mean total cost in the medical group increased significantly due to failure of abortion and persistent bleeding. Patients undergoing medical abortion eventually incurred expenses equivalent to those of patients undergoing surgical abortion (p = 0.42). There was no difference in the mean final costs between the two abortion methods. Complications of persistent bleeding and failure to abort (requiring surgical intervention) in the medical treatment group increased the final mean total cost substantially. Copyright © 2011 S. Karger AG, Basel.
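The cost-minimization logic reduces to an expected-cost calculation once failure and complication rates are attached to follow-up charges. All prices below are hypothetical illustrations, not the clinic's charges; the point is only that a cheaper initial treatment with a 10% rescue rate can end up costing the same as the dearer alternative.

```python
# Back-of-envelope expected cost per patient, folding in failures and
# complications (all monetary values invented for illustration).
def expected_total_cost(initial_cost, failure_rate, rescue_cost,
                        complication_rate=0.0, complication_cost=0.0):
    return (initial_cost
            + failure_rate * rescue_cost
            + complication_rate * complication_cost)

medical = expected_total_cost(initial_cost=80.0, failure_rate=0.10,
                              rescue_cost=220.0,        # surgical rescue
                              complication_rate=0.15,
                              complication_cost=60.0)   # persistent bleeding
surgical = expected_total_cost(initial_cost=110.0, failure_rate=0.0,
                               rescue_cost=0.0)
print(f"medical: {medical:.0f}   surgical: {surgical:.0f}")  # ~equal
```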
The effect of fatigue cracks on fastener flexibility, load distribution, and fatigue crack growth
NASA Astrophysics Data System (ADS)
Whitman, Zachary Layne
Fatigue cracks typically occur at stress risers such as geometry changes and holes. This type of failure has serious safety and economic repercussions for structures such as aircraft. The need to prevent catastrophic failure due to fatigue cracks and other discontinuities has led to durability and damage-tolerance methodologies influencing the design of aircraft structures. Holes in a plate or sheet filled with a fastener are common fatigue-critical locations in aircraft structure requiring damage tolerance analysis (DTA). Often the fastener is transferring load, which leads to a loading condition involving both far-field stresses, such as tension and bending, and localized bearing at the hole. The difference between the bearing stress and the tensile field at the hole is known as load transfer. The ratio of load transfer, as well as the magnitude of the stresses, plays a significant part in how quickly a crack will progress to failure. Unfortunately, the determination of load transfer in a complex joint is far from trivial. Many methods exist in the open literature for the analysis of splices, doublers, and attachment joints to determine individual fastener loads. These methods work well for static analyses, but greater refinement is needed for crack growth analysis. The first fastener in a splice or joint is typically the most critical, but different fastener flexibility equations all give different results. The constraint of the fastener head and shop end, along with the type of fastener, affects the stiffness or flexibility of the fastener. This in turn determines the load that the fastener transfers within a given fastener pattern. However, current methods do not account for the change in flexibility at a fastener as the crack develops. It is put forth that a crack does indeed reduce the stiffness of a fastener by changing its constraint, thus lessening the load transfer. A crack growth analysis utilizing reduced load transfer will predict a slower-growing crack than an analysis that ignores this effect.
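The load-shedding argument can be made concrete with a toy spring model of a splice: plate segments as axial springs, fasteners as shear springs, and a crack represented by a reduced fastener stiffness. Everything below (the three-fastener layout, stiffness values, function name) is an illustrative assumption, not the dissertation's analysis.

```python
import numpy as np

def fastener_loads(P, kp, kf):
    """Axial spring model of a two-plate, three-fastener splice.
    kp: plate segment stiffness (EA/L); kf: per-fastener shear stiffness
    per fastener (a 3-element array). Returns the load in each fastener."""
    kf = np.asarray(kf, dtype=float)
    # DOFs: [u0, u1, u2, l0, l1]; lower-plate node l2 is grounded (fixed).
    K = np.zeros((5, 5))
    def spring(i, j, k):            # spring k between DOFs i and j (-1=ground)
        K[i, i] += k
        if j >= 0:
            K[j, j] += k; K[i, j] -= k; K[j, i] -= k
    spring(0, 1, kp); spring(1, 2, kp)        # upper plate segments
    spring(3, 4, kp); spring(4, -1, kp)       # lower plate segments
    spring(0, 3, kf[0]); spring(1, 4, kf[1]); spring(2, -1, kf[2])  # fasteners
    u = np.linalg.solve(K, np.array([P, 0, 0, 0, 0], float))
    return kf * np.array([u[0] - u[3], u[1] - u[4], u[2]])

pristine = fastener_loads(1000.0, kp=5.0e4, kf=[2e4, 2e4, 2e4])
cracked  = fastener_loads(1000.0, kp=5.0e4, kf=[1e4, 2e4, 2e4])  # softer f0
print(pristine, cracked)   # the cracked fastener sheds load to its neighbors
```

The comparison shows the thesis's mechanism in miniature: softening the first fastener (as a growing crack would) lowers its transferred load, which a crack growth analysis holding load transfer fixed would miss.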
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harry, T; Yaddanapudi, S; Mutic, S
Purpose: New techniques and materials have recently been developed to expedite the conventional Linac Acceptance Testing Procedure (ATP). The new ATP method uses the Electronic Portal Imaging Device (EPID) for data collection and is presented separately. This new procedure is meant to be more efficient than conventional methods. While it has not yet been clinically implemented, a prospective risk assessment is warranted for any new technique. The purpose of this work is to investigate the risks and establish the pros and cons of the conventional approach versus the new ATP method. Methods: ATP tests that were modified and performed with the EPID were analyzed. Five domain experts (medical physicists) comprised the core analysis team. Ranking scales were adopted from previous publications related to TG 100. The number of failure pathways for each ATP test procedure was compared, as was the number of risk priority numbers (RPNs) greater than 100. Results: There were fewer failure pathways with the new ATP than with the conventional one: 262 and 556, respectively. There were fewer RPNs > 100 in the new ATP than in the conventional: 41 and 115, respectively. Failure pathways and RPNs > 100 for individual ATP tests were on average 2 and 3.5 times higher in the conventional ATP than in the new one, respectively. The pixel sensitivity map of the EPID was identified as a key hazard of the new ATP procedure, with an RPN of 288 for verifying beam parameters. Conclusion: The significant decrease in failure pathways and RPNs > 100 for the new ATP mitigates the possibility of a catastrophic error occurring. The pixel sensitivity map determining the response and inherent characteristics of the EPID is crucial, as all data, and hence results, depend on that process. Grant from Varian Medical Systems Inc.
dos Reis, Helena França Correia; Almeida, Mônica Lajana Oliveira; da Silva, Mário Ferreira; Rocha, Mário de Seixas
2013-01-01
OBJECTIVE: To evaluate the association between extubation failure and outcomes (clinical and functional) in patients with traumatic brain injury (TBI). METHODS: A prospective cohort study involving 311 consecutive patients with TBI. The patients were divided into two groups according to extubation outcome: extubation success; and extubation failure (defined as reintubation within 48 h after extubation). A multivariate model was developed in order to determine whether extubation failure was an independent predictor of in-hospital mortality. RESULTS: The mean age was 35.7 ± 13.8 years. Males accounted for 92.3%. The incidence of extubation failure was 13.8%. In-hospital mortality was 4.5% and 20.9% in successfully extubated patients and in those with extubation failure, respectively (p = 0.001). Tracheostomy was more common in the extubation failure group (55.8% vs. 1.9%; p < 0.001). The median length of hospital stay was significantly greater in the extubation failure group than in the extubation success group (44 days vs. 27 days; p = 0.002). Functional status at discharge was worse among the patients in the extubation failure group. The multivariate analysis showed that extubation failure was an independent predictor of in-hospital mortality (OR = 4.96; 95% CI, 1.86-13.22). CONCLUSIONS: In patients with TBI, extubation failure appears to lengthen hospital stays; to increase the frequency of tracheostomy and of pulmonary complications; to worsen functional outcomes; and to increase mortality. PMID:23857695
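The reported OR of 4.96 comes from a multivariate model that cannot be reproduced from the abstract alone, but the crude (unadjusted) odds ratio can be reconstructed from the reported percentages; a sketch using Woolf's confidence interval:

```python
import numpy as np

# Crude odds ratio for in-hospital death after extubation failure,
# reconstructed from the abstract's percentages (so it differs from the
# adjusted OR of 4.96 reported by the multivariate model).
n_fail, n_succ = 43, 268                 # ~13.8% of 311 failed extubation
d_fail = round(0.209 * n_fail)           # 20.9% mortality -> 9 deaths
d_succ = round(0.045 * n_succ)           # 4.5% mortality -> 12 deaths

a, b = d_fail, n_fail - d_fail           # deaths / survivors, failure group
c, d = d_succ, n_succ - d_succ           # deaths / survivors, success group
odds_ratio = (a * d) / (b * c)
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf's method
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log)
print(f"crude OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```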
Steinberger, Dina M; Douglas, Stephen V; Kirschbaum, Mark S
2009-09-01
A multidisciplinary team from the University of Wisconsin Hospital and Clinics transplant program used failure mode and effects analysis to proactively examine opportunities for communication and handoff failures across the continuum of care from organ procurement to transplantation. The team performed a modified failure mode and effects analysis that isolated the multiple linked, serial, and complex information exchanges occurring during the transplantation of one solid organ. Failure mode and effects analysis proved effective for engaging a diverse group of invested stakeholders in the analysis and discussion of opportunities to improve the system's resilience against errors during a time-pressured and complex process.
Comprehension and retrieval of failure cases in airborne observatories
NASA Technical Reports Server (NTRS)
Alvarado, Sergio J.; Mock, Kenrick J.
1995-01-01
This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.
The Range Safety Debris Catalog Analysis in Preparation for the Pad Abort One Flight Test
NASA Technical Reports Server (NTRS)
Kutty, Prasad M.; Pratt, William D.
2010-01-01
The Pad Abort One flight test of the Orion Abort Flight Test Program is currently under development with the goal of demonstrating the capability of the Launch Abort System. In the event of a launch failure, this system will propel the Crew Exploration Vehicle to safety. An essential component of this flight test is range safety, which ensures the security of range assets and personnel. A debris catalog analysis was done as part of a range safety data package delivered to the White Sands Missile Range in New Mexico where the test will be conducted. The analysis discusses the consequences of an overpressurization of the Abort Motor. The resulting structural failure was assumed to create a debris field of vehicle fragments that could potentially pose a hazard to the range. A statistical model was used to assemble the debris catalog of potential propellant fragments. Then, a thermodynamic, energy balance model was applied to the system in order to determine the imparted velocity to these propellant fragments. This analysis was conducted at four points along the flight trajectory to better understand the failure consequences over the entire flight. The methods used to perform this analysis are outlined in detail and the corresponding results are presented and discussed.
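A toy version of the kind of energy-balance estimate mentioned here: if some fraction of the burst energy of the overpressurized motor becomes fragment kinetic energy shared uniformly per unit mass, the imparted velocity follows from the kinetic energy relation. The energy, mass, and partition fraction below are invented, not values from the analysis.

```python
import numpy as np

def fragment_velocity(burst_energy_J, total_fragment_mass_kg, ke_fraction=0.2):
    """Velocity imparted to fragments if `ke_fraction` of the burst energy
    becomes kinetic energy distributed uniformly per unit fragment mass."""
    ke = ke_fraction * burst_energy_J
    return np.sqrt(2.0 * ke / total_fragment_mass_kg)

# Hypothetical inputs: 50 MJ burst energy, 900 kg of case/propellant debris
print(f"{fragment_velocity(5.0e7, 900.0):.0f} m/s")
```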
Foreign Object Damage to Tires Operating in a Wartime Environment
1991-11-01
barriers were successfully overcome and the method of testing employed can now be confidently used for future test needs of this type. Data Analysis ... combined variable effects. Analysis consideration involved cut types, cut depths, number of cuts, cut/hit probabilities, tire failures, and aircraft ... November 1988 with data reduction and analysis continuing into October 1989. All of the cutting tests reported in this report were conducted at the ...
An Experimental Study of Launch Vehicle Propellant Tank Fragmentation
NASA Technical Reports Server (NTRS)
Richardson, Erin; Jackson, Austin; Hays, Michael; Bangham, Mike; Blackwood, James; Skinner, Troy; Richman, Ben
2014-01-01
In order to better understand launch vehicle abort environments, Bangham Engineering Inc. (BEi) built a test assembly that fails sample materials (steel and aluminum plates of various alloys and thicknesses) under quasi-realistic vehicle failure conditions. Samples are exposed to pressures similar to those expected in vehicle failure scenarios and filmed at high speed to increase understanding of complex fracture mechanics. After failure, the fragments of each test sample are collected, catalogued and reconstructed for further study. Post-test analysis shows that aluminum samples consistently produce fewer fragments than steel samples of similar thickness and at similar failure pressures. Video analysis shows that there are several failure 'patterns' that can be observed for all test samples based on configuration. Fragment velocities are also measured from high speed video data. Sample thickness and material are analyzed for trends in failure pressure. Testing is also done with cryogenic and noncryogenic liquid loading on the samples. It is determined that liquid loading and cryogenic temperatures can decrease material fragmentation for sub-flight thicknesses. A method is developed for capture and collection of fragments that is greater than 97 percent effective in recovering sample mass, addressing the generation of tiny fragments. Currently, samples tested do not match actual launch vehicle propellant tank material thicknesses because of size constraints on test assembly, but test findings are used to inform the design and build of another, larger test assembly with the purpose of testing actual vehicle flight materials that include structural components such as iso-grid and friction stir welds.
14 CFR 417.309 - Flight safety system analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... system anomaly occurring and all of its effects as determined by the single failure point analysis and... termination system. (c) Single failure point. A command control system must undergo an analysis that... fault tree analysis or a failure modes effects and criticality analysis; (2) Identify all possible...
14 CFR 417.309 - Flight safety system analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... system anomaly occurring and all of its effects as determined by the single failure point analysis and... termination system. (c) Single failure point. A command control system must undergo an analysis that... fault tree analysis or a failure modes effects and criticality analysis; (2) Identify all possible...
14 CFR 417.309 - Flight safety system analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... system anomaly occurring and all of its effects as determined by the single failure point analysis and... termination system. (c) Single failure point. A command control system must undergo an analysis that... fault tree analysis or a failure modes effects and criticality analysis; (2) Identify all possible...